Should machines be tasked with catching future criminals, or are we encoding our own biases?

(OPINION EDITORIAL) Increasingly, law enforcement agencies and courts are using computer algorithms to assess the risk that criminals will re-offend. But is it right?

To catch a (potential) criminal

In the movie Minority Report, a panel of psychics peers into the future to predict who will commit a crime. Today, using machine learning technology, courts, police departments, and parole officers are attempting to do the same.

Recommending sentences and placement

Increasingly, law enforcement agencies and courts are using computer algorithms to assess the risk that criminals will re-offend.

This information is used to sentence defendants in court, to choose where to place inmates in prison, which streets should be more heavily policed, and who should be released on parole.

There are several different risk calculators available. Some fully disclose the kinds of data they use to assess risk, while other companies, such as Northpointe, maker of the risk assessment software Compas, won’t reveal how their products work. The calculators likely draw on data such as arrest records, the type of crime committed, and demographic information, which could include age and where a person lives.
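To make that concrete, here is a minimal sketch of how a score of this kind could be computed from such inputs. The feature names, weights, and cutoff below are entirely hypothetical; Northpointe does not disclose how Compas actually works, and this is not its formula.

```python
import math

# Hypothetical feature weights, purely illustrative: not the actual
# (undisclosed) Compas model or any vendor's real scoring formula.
WEIGHTS = {
    "prior_arrests": 0.35,      # count of prior arrests
    "offense_severity": 0.50,   # 1 = misdemeanor, 2 = felony, 3 = violent felony
    "age_under_25": 0.40,       # 1 if the defendant is under 25, else 0
}
BIAS = -2.0  # baseline log-odds when every feature is zero

def risk_score(features: dict) -> float:
    """Return a 0-1 estimate of re-offense risk from weighted features."""
    log_odds = BIAS + sum(w * features.get(name, 0.0) for name, w in WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-log_odds))  # logistic squash to a probability

defendant = {"prior_arrests": 3, "offense_severity": 2, "age_under_25": 1}
print(f"Estimated risk: {risk_score(defendant):.2f}")  # ~0.61, "high risk" if the cutoff is 0.5
```

Every choice in a sketch like this (which features to include, how to weight them, where to set the cutoff) is a policy decision, and that is exactly what stays hidden when a vendor keeps its model secret.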

Encoding bias and prejudice into another system

Some claim that risk assessment will help alleviate bias in the judicial system, including racial or gender prejudice and the whims and moods of judges and law enforcement officials. Critics counter that these programs actually encode those same biases into computerized systems.

For example, the investigative news organization ProPublica analyzed Compas risk scores for thousands of defendants in a Florida county, then checked which of those defendants actually went on to commit new crimes. The algorithm’s mistakes broke down sharply along racial lines.

Black defendants who did not go on to re-offend were roughly twice as likely as white defendants to be labeled high risk. Meanwhile, white defendants were far more likely to be labeled low risk and then commit a crime anyway.
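That disparity comes down to comparing error rates by group. The sketch below shows the kind of calculation involved, using a handful of invented records rather than ProPublica’s actual data: the false positive rate (labeled high risk but never re-offended) and the false negative rate (labeled low risk but re-offended), computed separately for each group.

```python
# Toy illustration of an error-rate comparison by group.
# These records are invented for demonstration; they are not ProPublica's data.
records = [
    # (group, labeled_high_risk, actually_reoffended)
    ("black", True,  False), ("black", True,  False), ("black", True,  True),
    ("black", False, False), ("black", False, True),
    ("white", True,  True),  ("white", False, True),  ("white", False, True),
    ("white", False, False), ("white", False, False),
]

def error_rates(group: str):
    rows = [r for r in records if r[0] == group]
    # False positive rate: share labeled high risk among those who did NOT re-offend.
    no_reoffend = [r for r in rows if not r[2]]
    fpr = sum(r[1] for r in no_reoffend) / len(no_reoffend)
    # False negative rate: share labeled low risk among those who DID re-offend.
    reoffend = [r for r in rows if r[2]]
    fnr = sum(not r[1] for r in reoffend) / len(reoffend)
    return fpr, fnr

for g in ("black", "white"):
    fpr, fnr = error_rates(g)
    print(f"{g}: false positive rate {fpr:.0%}, false negative rate {fnr:.0%}")
```

In this toy data, the black group absorbs the false positives and the white group the false negatives, which is the shape of the pattern ProPublica reported.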

Because of the controversies surrounding commercially produced risk assessors like Compas, the state of Pennsylvania created a special commission to develop its own risk assessment algorithm. The commission hasn’t made much progress: nearly every variable it has considered has been met with deadlocked debate.

Skewing the numbers or keeping it fair

Even when risk assessors don’t explicitly input racial information, the data is tainted by decades of racial disparity. Given the racial segregation of housing in most cities, using data from home addresses is basically the same thing as revealing someone’s race. Arrest records are going to be skewed as well, since racial bias has led to aggressive policing in predominantly black neighborhoods.
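A small sketch shows how the proxy effect works: a score that never sees race but leans on zip code still lands harder on one group once neighborhoods are segregated. The zip codes, demographic shares, and risk adjustments below are invented for illustration.

```python
# Sketch of the proxy problem: race is never an input, but in a segregated city
# a zip code predicts race, so a model leaning on zip code inherits the correlation.
from collections import Counter

# Hypothetical, heavily segregated neighborhoods: zip code -> racial makeup of residents.
ZIP_DEMOGRAPHICS = {
    "33301": {"white": 0.85, "black": 0.10, "other": 0.05},
    "33311": {"white": 0.08, "black": 0.88, "other": 0.04},
}

# A "race-blind" risk adjustment keyed only on zip code, e.g. learned from
# historical arrest data that itself reflects heavier policing of 33311.
ZIP_RISK_BUMP = {"33301": 0.02, "33311": 0.15}

def expected_bump_by_race() -> dict:
    """Average risk bump residents of each race receive, even though race is never used."""
    totals, weights = Counter(), Counter()
    for zip_code, makeup in ZIP_DEMOGRAPHICS.items():
        for race, share in makeup.items():
            totals[race] += share * ZIP_RISK_BUMP[zip_code]
            weights[race] += share
    return {race: totals[race] / weights[race] for race in totals}

print(expected_bump_by_race())  # black residents absorb most of the 33311 penalty
```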

Sonja Starr, a professor at the University of Michigan Law School, points out that because risk assessors use “every mark of poverty…as a risk factor,” these algorithms will inherently discriminate against the poor.

In response to Pennsylvania’s struggle to come up with a fair algorithm, she says, “if the variables aren’t appropriate, you shouldn’t be using them.”

In other words, you can’t use data from an unfair system to create a fair algorithm. Perhaps all of this machine learning would be put to better use trying to solve the problems, like poverty and racial injustice, that lead to crime in the first place.

Ellen Vessels, a Staff Writer at The American Genius, is respected for their wide range of work, with a focus on generational marketing and business trends. Ellen is also a performance artist when not writing, and has a passion for sustainability, social justice, and the arts.

