Tech News

AI fights crime and sometimes mistakes sand dunes for porn

(TECH NEWS) Artificial intelligence (AI) is amazing and does so many cool things, but it can get confused from time to time…

There’s a certain melancholy to the end of the holiday season, isn’t there? Whatever your winter festival of choice, it’s easy to be a bit down when the fam heads home and your vacation days, if you have vacation days, dwindle to their end. But as you emerge from your holiday coma and trudge to work through the winter lull, take heart! At least it’s not your job to convince a computer that sand dunes aren’t porn.

Because it could be. That’s a thing. We live in the most ridiculous possible future.

Specifically, it’s a British thing. In their ongoing – and laudable! – campaign against child abuse, the Metropolitan Police of London are testing an algorithm that searches seized data for inappropriate sexual content.

Well, that’s what it’s supposed to do. At the moment, it’s shouting at sand. See, sand comes in curving lines and a variety of (literal!) earth tones. Various other activities are also characterized by curving lines and a variety of earth tones. I trust I don’t need to spell it out.

That’s the trouble with algorithms: they do need me to spell it out. As we’ve written before, AI does not do context, and context is the most important human thing. When all you have to work with is “sort of brown and curvy and all over the place,” it becomes possible to mistake a pitiless desert landscape for naked humans engaged in naked human activities. People don’t do that. I mean, I hope. That sounds scratchy and embarrassing.
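To make that failure mode concrete, here’s a minimal, hypothetical sketch – not the Met’s actual system, whose internals aren’t public – of the kind of context-free color heuristic that gets confused. It just measures how much of an image sits in a warm, tan, “skin-ish” range, which is exactly where a sunlit dune lives.

```python
# Hypothetical sketch only -- not the Metropolitan Police's classifier.
# A crude "skin tone" detector of the sort that mistakes dunes for skin.
from PIL import Image
import numpy as np

def looks_explicit(path: str, threshold: float = 0.4) -> bool:
    """Flag an image if a large share of its pixels sit in a warm, tan range."""
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.int16)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # Classic-style RGB skin rule: warm tones where red dominates green and blue.
    skin_like = (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b) & ((r - g) > 15)
    return skin_like.mean() > threshold

# A photo of sunlit dunes clears this bar about as easily as human skin does,
# which is why context-free features generate false positives.
# print(looks_explicit("sand_dunes.jpg"))
```

Real systems are far more sophisticated than this, but the underlying problem is the same: the features say “brown and curvy,” and nothing in the pipeline knows what a desert is.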


That’s why it’s currently someone’s job to explain to a robot that sand is not sex. Fair play to the Metropolitan Police, they’re doing that correctly. Their AI solution isn’t scheduled to turn its pitiless steel gaze on British sex for two to three years. Programs are supposed to have hilarious fails in the testing phase. That’s why there’s a testing phase.

The private sector has a habit of leapfrogging that and letting the fail happen right out in public. Just in the last 6 months, premature AI implementation has had Google accusing an innocent person of the Las Vegas shooting and Facebook promoting explicit anti-Semitism.

To state the obvious, the stakes are even higher when the cops are involved. Neither Google nor Facebook has the legal right to shoot you. Yet. And alongside the hilarious fail, the Metropolitan Police are discussing non-hilarious fail, including putting potentially incriminating information on public cloud storage, rather than in a dedicated data center. In case you’re time traveling from 2012, putting private information on a publicly accessible system is a really bad idea. Really.

In short, law enforcement’s experiment with Robocop seems to have run smack into the modestly named Salter’s Law: for every implementation of AI in a people-facing role, you will have to hire a minimum of one real person just to handle the fallout when it screws up.

Written By

Matt Salter is a writer and former fundraising and communications officer for nonprofit organizations, including Volunteers of America and PICO National Network. He’s excited to put his knowledge of fundraising, marketing, and all things digital to work for your reading enjoyment. When not writing about himself in the third person, Matt enjoys horror movies and tabletop gaming, and can usually be found somewhere in the DFW Metroplex with WiFi and a good all-day breakfast.

