Rite Aid’s AI facial recognition has wrongly labeled people of color as shoplifters

Rite Aid, the pharmacy chain, used facial recognition technology to falsely and disproportionately identify people of color and women as likely shoplifters, the Federal Trade Commission said Tuesday, describing a system that embarrassed customers and raised new concerns about the biases ingrained in such technologies.

Under the terms of a settlement, Rite Aid will be barred from using facial recognition technology in its stores for surveillance purposes for five years, the FTC said. The agency, which enforces federal consumer protection law, signaled how seriously it intends to respond to concerns about facial recognition technology.

The FTC’s 54-page complaint also sheds light on how a once theoretical concern, that human biases would seep into artificial intelligence algorithms and amplify discrimination, has become a real-world problem.

Samuel Levine, the director of the FTC’s Bureau of Consumer Protection, said in a statement that “Rite Aid’s reckless use of facial surveillance systems has subjected its customers to humiliation and other harm.”

From October 2012 to July 2020, the complaint said, Rite Aid employees responding to false alerts from the systems followed customers around its stores, searched them, ordered some to leave and, when they refused, sometimes called the police to confront or remove them, at times in the presence of friends and family.

Rite Aid’s actions disproportionately affected people of color, especially Black, Asian and Latino customers, all in the name of keeping “persons of interest” out of hundreds of Rite Aid stores in cities like New York, Philadelphia and Sacramento, according to the FTC’s complaint.

Rite Aid said in a statement that while it disagreed with the FTC’s allegations, it was “pleased to reach an agreement.”

“The allegations relate to a pilot program for facial recognition technology that the company has implemented in a limited number of stores,” the company said. “Rite Aid stopped using the technology in this small group of stores more than three years ago, before the FTC’s investigation into the company’s use of the technology began.”

The settlement with Rite Aid comes about two months after the company filed for bankruptcy protection and announced plans to close 154 stores in more than 10 states.

Rite Aid adopted facial recognition technology at a time when retail chains were raising alarms about shoplifting, particularly “organized retail crime,” in which groups of people steal products from multiple stores and later resell them on the black market.

Those concerns prompted several retailers, including Rite Aid, to protect merchandise by keeping much of it in locked plastic cases.

But those concerns appear to have been exaggerated. This month, the National Retail Federation retracted its erroneous estimate that organized retail crime was responsible for nearly half of the $94.5 billion in retail merchandise lost in 2021. Experts say the true figure is probably closer to 5 percent.

Rite Aid did not tell customers it was using the technology in its stores, and employees were “discouraged from disclosing such information,” the FTC said.

It’s not clear how many other retailers use facial recognition technology for surveillance. Macy’s told Business Insider that it uses the technology in some stores, and at least one home improvement chain says on its website that it “collects biometric information, including facial recognition.”

Alvaro M. Bedoya, an FTC commissioner, said in a statement that “the blunt fact that surveillance can harm people” should not be lost in conversations about how surveillance violates rights and invades privacy.

“It has been clear for years that facial recognition systems can perform less effectively on dark-skinned people and women,” Mr. Bedoya said.

Woodrow Hartzog, a law professor at Boston University who has researched facial recognition technologies and the FTC, said the agency’s complaint against Rite Aid shows that it views AI surveillance technology as a serious threat.

The target of the agency’s complaint was significant, Professor Hartzog said. Although Rite Aid hired two unnamed companies to help create its database of people it believed might shoplift, the FTC went after Rite Aid alone.

The FTC, he said, is essentially saying that “the culpable conduct we are focusing on is the failure to conduct due diligence when working with other suppliers.”

The complaint notes that Rite Aid used the surveillance systems largely in urban areas and along public transportation routes, resulting in a disproportionate impact on people of color.

About 80 percent of Rite Aid stores are located in areas where whites are the largest racial or ethnic group. But 60 percent of Rite Aid stores that used facial recognition technology were in areas where white people were not the largest racial or ethnic group, the FTC said.

Rite Aid trained security guards at its stores to enter images into an “enrollment database” of people it considered “persons of interest,” and employees were told to “pursue as many enrollments as possible.” The database was filled with low-quality images, many taken by closed-circuit television and cellphone cameras or drawn from media reports, the FTC said.

That flawed system, officials said, produced thousands of “false positive matches,” or alerts that incorrectly indicated a customer was a “match” with a person in Rite Aid’s database. Even worse, Rite Aid failed to detect false positives, the complaint said.

“Rite Aid’s failure to properly train or monitor employees who use facial recognition technology further increases the potential for harm to consumers,” the FTC said.

In one case, Rite Aid employees stopped and searched an 11-year-old girl who had been incorrectly flagged by the system as likely to shoplift.

In another example cited in the complaint, a Black man wrote to Rite Aid after being the victim of a false positive facial recognition match.

“When I walk into a store now, it’s weird,” he said, adding, “Every black man is not a thief, and they shouldn’t feel that way.”
