Macy’s is getting sued for allegedly misusing a facial recognition program to illegally identify shoppers in its stores. A woman filed a class action lawsuit against Macy’s in Chicago, claiming that the retail giant has violated the Illinois Biometric Information Privacy Act.
BIPA is one of the strictest data protection laws in the country. Passed in 2008, it requires companies doing business within Illinois state lines to first get a person’s permission before collecting biometric information about them, such as identifying facial details. Any such information that is collected must be securely stored, and destroyed in a timely manner.
The plaintiff, one Isela Carmine, based her suit on reports from BuzzFeed News and The New York Times that revealed Macy’s had conducted over 6,000 searches using faces captured on store security camera footage. She asserts that storing and using facial data about Macy’s customers without fully disclosing this practice constitutes a clear violation of BIPA, adding that Macy’s could potentially use that information to track customers and exploit their identities for business purposes against their will.
Though it is not named in the lawsuit, the tool that Macy’s used to perform those searches was developed by none other than Clearview AI: the facial recognition company bringing about, as an NYT headline quipped, “the end of privacy as we know it.”
Clearview AI works by collecting facial data from billions of public photographs scraped from websites like Facebook and Google. It organizes them into a searchable database: put in a picture, and you get back a person’s identity. Because the photos are publicly available, they are used without the subjects’ knowledge or consent.
Clearview is primarily marketed as a tool for law enforcement, a use that is already considered (to put it lightly) controversial. But as BuzzFeed News and The New York Times demonstrated, Clearview is also available for private use. Because, of course it is.
Carmine v. Macy’s will determine whether or not that particular use of Clearview’s technology was legal. But the suit invites another big question: under BIPA, is it legal for Clearview AI to operate in Illinois at all? This is a highly pertinent question, considering that when ethical concerns prompted the Canadian Privacy Commissioner to launch a probe into Clearview’s practices, the company abruptly announced plans to stop doing business in Canada entirely.
This industry has existed in a legal grey area for too long, which has allowed businesses like Clearview to grow accustomed to behaving badly. The legal clarity we’ve been waiting for around these applications of facial recognition tech is coming soon; it just remains to be seen exactly whom it will end up benefiting.