Tech News

Facial Recognition thinks you might be a toaster, really

(TECH NEWS) Facial recognition is still a long way from being perfect. Ceci n’est pas une toaster. Really. Repeat it with me: I am not a toaster.

Facial recognition failure

Using facial recognition seems pretty seamless; think of unlocking your iPhone. Yet a human face has actually been confused with a toaster, according to a facial recognition technology expert.

If a computer, which is generally thought to be highly reliable, can confuse a human face with a toaster, what might that mean for facial recognition accuracy when seeking out crime suspects? Possibly that it is not so reliable after all.

“Obviously, the technology has immense value in promoting societal interests such as efficiency and security but it also represents a threat to some of our individual interests, particularly privacy,” said Nessa Lynch, associate professor of law at Victoria University of Wellington, New Zealand. Lynch and other experts are part of a research project that will be completed in mid-2020. The researchers presented some of their findings during a panel recently held at the university.

Some of the very first images used as test data were those of convicted felons in Florida. They had abused meth and, as a result, had great cheekbones. That presented problems when using facial recognition on actual real folk without a meth habit.


Those cheekbones are very different from the average person’s, whose face tends to fill out from actually eating food. Data from such a source was not useful when training a system to recognize normal people, said Rachel Dixon, Privacy and Data Protection Deputy Commissioner at the Office of the Victorian Information Commissioner in Australia.

Companies that sell the technology products often claim they are highly reliable, but Dixon said they are often reliable only because of the environments in which they are used, which may be unvarying. The systems are tuned for those specific environments.

“…Picking you out walking randomly down the street can be quite challenging. There’s a whole bunch of environmental factors there that go to essentially reducing the confidence level,” Dixon said in a story published on Ideasroom. “None of this is absolute. There is no one-to-one match. And by perturbing an image even a small amount you can make the machine-learning system think the person is a toaster. I’m not joking.”
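The perturbation attack Dixon describes is a well-documented phenomenon in machine learning, often called an adversarial example. The story gives no details of any particular system, so as a rough, hypothetical illustration, the sketch below flips the prediction of a toy linear two-class classifier ("person" vs. "toaster") by nudging each input value a small, sign-chosen step; the weights and image are random stand-ins, not real face data.

```python
import numpy as np

# Toy linear "classifier" over a flattened 4-pixel image.
# Class 0 and class 1 stand in for "person" and "toaster".
rng = np.random.default_rng(0)
W = rng.normal(size=(2, 4))   # one row of score weights per class
x = rng.normal(size=4)        # a random stand-in for a face image

def scores(img):
    return W @ img

def predict(img):
    return int(np.argmax(scores(img)))

true_class = predict(x)       # what the model says about the clean image
target = 1 - true_class       # the class we want to fool it into

# Step every input value in the direction that raises the target class's
# score relative to the true class's (the sign of the score-margin gradient).
grad = W[target] - W[true_class]
margin = scores(x)[true_class] - scores(x)[target]  # > 0 for the clean image

# Smallest uniform step (plus 1%) that closes the margin: for a linear
# model, moving eps * sign(grad) changes the margin by eps * ||grad||_1.
eps = 1.01 * margin / np.abs(grad).sum()
x_adv = x + eps * np.sign(grad)

print("clean prediction:", predict(x))
print("perturbed prediction:", predict(x_adv), "with max pixel change", eps)
```

In this linear toy the required per-pixel change is tiny relative to the image values, which mirrors Dixon's point: a perturbation too small for a human to notice can be enough to move an input across a model's decision boundary.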

If a computer identifies a face as, for example, a person of interest in a crime, that perception is very hard to change, even when it is wrong, because humans have a hard time believing a machine can make a mistake, especially once it has declared a correct match, Dixon explained.

In the United States, a conservative estimate is that roughly a quarter of the 18,000 law enforcement agencies have access to facial recognition systems, particularly for use in investigations. Yet Georgetown Law professor Clare Garvie said there are no laws, at the state or federal level, governing its use.


Garvie, a senior associate at the Center on Privacy and Technology at Georgetown, said, “As a result, this technology has been implemented largely without transparency to the public, without rules around auditing or public reporting, without rules around who can be subject to a search. As a result, it is not just suspects of a criminal investigation that are the subject of searches. In many jurisdictions, witnesses, victims or anybody associated with a criminal investigation can also be the subject of a search.”

Because there is little reporting and auditing of the use of the technology, it’s unclear whether agencies are checking to determine if it’s being misused or if it is actually a helpful and successful tool, Garvie said. Are law enforcement officials “catching the bad guys,” or is the use of the technology a waste of money? She said she suspects the latter in some jurisdictions.

Meanwhile, it may come as no surprise to some that those most often caught in the crosshairs come from lower socio-economic or marginalized populations.

In one instance, police arrested a person whom the algorithm had ranked 319th as a likely match. The police also failed to provide that ranking evidence to the defense lawyers.
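The story does not describe the system involved, but candidate rankings of this kind are commonly produced by comparing a face "embedding" of the probe image against every enrolled image and sorting by similarity. The hypothetical sketch below, using random vectors in place of real embeddings, shows why a rank-319 candidate is a weak lead: the similarity score at that depth sits far below the top hit.

```python
import numpy as np

# Hypothetical gallery of 1,000 enrolled face embeddings (random stand-ins
# for the vectors a real face-recognition model would produce).
rng = np.random.default_rng(1)
gallery = rng.normal(size=(1000, 128))
gallery /= np.linalg.norm(gallery, axis=1, keepdims=True)  # unit length

probe = rng.normal(size=128)
probe /= np.linalg.norm(probe)

# Cosine similarity of the probe against every enrolled face,
# then rank candidates from best match to worst.
sims = gallery @ probe
ranking = np.argsort(-sims)  # ranking[0] is the best match

top_score = sims[ranking[0]]
rank_319_score = sims[ranking[318]]  # 319th candidate, 0-indexed
print("best match score:", top_score)
print("319th-ranked score:", rank_319_score)
```

Under this toy setup the 319th score is well below the best match, which is the heart of the concern: presenting a deep-in-the-list candidate as if it were a confident identification hides how weak the underlying evidence is.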

In the United Kingdom, the technology has been used extensively, and with mixed results, by law enforcement and businesses to search for people on watch lists, according to Dr. Joe Purshouse from the School of Law at the University of East Anglia in the UK.


“The human rights implications for privacy, freedom of assembly – those are chilling,” Purshouse said, adding that the marginalized are caught in the middle: “Suspects of crime, people of lower socio-economic status who are forced to use public space and rely more heavily on public space than people who have economic advantages, perhaps.”

Mary Ann Lopez earned her MA in print journalism from the University of Colorado and has worked in print and digital media. After taking a break to give back as a Teach for America corps member and teaching science for a few years, she is back with her first love: writing. When she's not writing stories, reading five books at once, or watching The Great British Bakeoff, she is walking her dog Sadie and hanging with her cats, Bella, Bubba, and Kiki. She is one cat short of full cat lady status and plans to keep it that way.

The American Genius is a strong news voice in the entrepreneur and tech world, offering meaningful, concise insight into emerging technologies, the digital economy, best practices, and a shifting business culture. We refuse to publish fluff, and our readers rely on us for inspiring action. Copyright © 2005-2022, The American Genius, LLC.