Monday, December 22, 2025

Unlock AG Pro Today

Why Now?

AG Pro gives you sharp insights, compelling stories, and weekly mind fuel without the fluff. Think of it as your brain’s secret weapon – and our way to keep doing what we do best: cutting the BS and giving you INDEPENDENT real talk that moves the needle.

Limited time offer: $29/yr (regularly $149)
✔ Full access to all stories and 20 years of analysis
✔ Long-form exclusives and sharp strategy guides
✔ Weekly curated breakdowns sent to your inbox

We accept all major credit cards.

Pro

/ once per week

Get everything, no strings.

AG-curious? Get the full-access version, just on a week-to-week basis.
• Unlimited access, no lockouts
• Full Premium archive access
• Inbox delivery + curated digests
• Stop anytime, no hoops

$7

Get your fill of no-BS brilliance.

Pro

/ once per year

All in, all year. Zero lockouts.

The best deal - full access, your way. No timeouts, no limits, no regrets.
A year for less than a month of Hulu+
• Unlimited access to every story
• Re-read anything, anytime
• Inbox drop + curated roundups

$29

*Most Popular

Full access, no pressure. Just power.

Free
/ limited

Useful, just not unlimited.

You’ll still get the goods - just not the goodest, freshest goods. You’ll get:
• Weekly email recaps + curation
• 24-hour access to all new content
• No archive. No re-reads

Free

Upgrade later - we’ll be here!


Should machines be tasked with catching future criminals, or are we encoding our own biases?

To catch a (potential) criminal

In the movie Minority Report, a panel of psychics peers into the future to predict who will commit a crime. Today, using machine learning technology, courts, police departments, and parole officers are attempting to do the same.

Recommending sentences and placement

Increasingly, law enforcement agencies and courts are using computer algorithms to assess the risk that criminals will re-offend.

Courts and corrections officials use this information to sentence defendants, decide where to place inmates in prison, determine which streets should be more heavily policed, and choose who should be released on parole.

There are several different risk calculators available. Some fully disclose the kinds of data they use to assess risk, while other companies, such as Northpointe, which makes the risk assessment software Compas, won’t reveal how their products work. We know that the calculators likely include data such as arrest records, the type of crime committed, and demographic information, which could include age and where a person lives.
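To make the idea concrete, here is a toy sketch of how such a calculator might combine inputs into a score. The features, weights, and scale are entirely hypothetical, invented for illustration; they are not how Compas or any real product works.

```python
# Hypothetical sketch of a recidivism risk calculator. The features and
# weights below are illustrative only, not those of any real product.

def risk_score(prior_arrests: int, crime_severity: int, age: int) -> float:
    """Toy linear risk score on a 0-10 scale (illustrative only)."""
    score = (0.4 * prior_arrests          # arrest history
             + 0.5 * crime_severity       # type/severity of current offense
             + 0.1 * max(0, 30 - age))    # youth treated as a risk factor
    return min(10.0, score)               # cap at the top of the scale

print(risk_score(prior_arrests=3, crime_severity=4, age=22))
```

Even this trivial version shows the design problem the article raises: each weight is a policy choice, and every input (arrests, address, age) can carry hidden correlations with race or poverty.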

Encoding bias and prejudice into another system

While some claim that risk assessment will help reduce bias in the judicial system, including racial or gender prejudice and the whims and moods of judges and law enforcement officials, critics argue that these programs actually encode those biases into computerized systems.

For example, the investigative newsroom ProPublica analyzed Compas risk scores for defendants in a Florida county over one year. They found that, among people Compas flagged as “high risk,” whites were twice as likely as blacks to actually commit a crime.

In other words, a lot of black people are being labeled as high risk, even though they aren’t committing crimes. Meanwhile, it was much more common for white people to be labeled as low risk, and then commit a crime anyway.
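The kind of check ProPublica's analysis describes can be sketched in a few lines: among people labeled high risk, what fraction of each group actually went on to re-offend? The data below is made up for illustration; only the calculation itself reflects the technique.

```python
# Group-wise accuracy check for a "high risk" label, on made-up data.
from collections import defaultdict

def reoffense_rate_by_group(records):
    """Among records labeled high risk, fraction per group that re-offended."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for group, high_risk, reoffended in records:
        if high_risk:
            totals[group] += 1
            hits[group] += reoffended
    return {g: hits[g] / totals[g] for g in totals}

sample = [  # (group, labeled high risk?, actually re-offended?)
    ("A", True, 1), ("A", True, 1), ("A", True, 0), ("A", True, 0),
    ("B", True, 1), ("B", True, 0), ("B", True, 0), ("B", True, 0),
]
print(reoffense_rate_by_group(sample))  # {'A': 0.5, 'B': 0.25}
```

When these rates differ sharply between groups, as in this toy sample, the "high risk" label is wrong more often for one group than the other, which is the disparity the article describes.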

Because of the controversies surrounding commercially produced risk assessors like Compas, the state of Pennsylvania has created a special commission to come up with its own risk assessment algorithm. They haven’t made much progress, because every variable they’ve considered has been met with a deadlocked debate.

Skewing the numbers or keeping it fair

Even when risk assessors don’t explicitly input racial information, the data is tainted by decades of racial disparity. Given the racial segregation of housing in most cities, using data from home addresses is basically the same thing as revealing someone’s race. Arrest records are going to be skewed as well, since racial bias has led to aggressive policing in predominantly black neighborhoods.
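The proxy-variable problem can be demonstrated on synthetic data: if zip codes are segregated, a model given only the zip code can recover race anyway. Everything below (the zip codes, labels, and the majority-vote "model") is invented for illustration.

```python
# Synthetic demo: a correlated proxy (zip code) leaks the protected attribute.
from collections import Counter

records = [  # made-up, perfectly segregated data
    {"zip": "60601", "race": "A"}, {"zip": "60601", "race": "A"},
    {"zip": "60601", "race": "A"}, {"zip": "60602", "race": "B"},
    {"zip": "60602", "race": "B"}, {"zip": "60602", "race": "B"},
]

def predict_race_from_zip(zip_code: str) -> str:
    """Predict race as the majority label seen for that zip code."""
    labels = Counter(r["race"] for r in records if r["zip"] == zip_code)
    return labels.most_common(1)[0][0]

accuracy = sum(predict_race_from_zip(r["zip"]) == r["race"]
               for r in records) / len(records)
print(accuracy)  # 1.0 — zip code alone fully recovers race here
```

Dropping the race column from the inputs changes nothing in this example, which is why "we don't use race" is a weak defense when address data is included.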

Sonja Starr, a law professor at the University of Michigan law school, points out that because risk assessors use “every mark of poverty…as a risk factor,” these algorithms will inherently discriminate against the poor.

In response to Pennsylvania’s struggle to come up with a fair algorithm, she says, “if the variables aren’t appropriate, you shouldn’t be using them.”

In other words, you can’t use data from an unfair system to create a fair algorithm. Perhaps all of this machine learning would be put to better use trying to solve the problems, like poverty and racial injustice, that lead to crime in the first place.

#MachineLearningCrime

Ellen Vessels, Staff Writer (https://www.linkedin.com/in/ellenvessels)
Ellen Vessels, a Staff Writer at The American Genius, is respected for their wide range of work, with a focus on generational marketing and business trends. Ellen is also a performance artist when not writing, and has a passion for sustainability, social justice, and the arts.