Rite Aid Prohibited from Using Facial Recognition Technology Due to False Shoplifting Identifications

Rite Aid is prohibited from using facial recognition technology for five years, following a Federal Trade Commission (FTC) investigation that revealed the drugstore giant’s “reckless use of facial surveillance systems.” The FTC found that this misuse embarrassed customers and jeopardized their “sensitive information.”

The FTC's order, pending approval from the U.S. Bankruptcy Court after Rite Aid's Chapter 11 filing in October, requires the company to delete all images collected during its facial recognition rollout. Additionally, Rite Aid must eliminate any products developed from these images and establish a strong data security program to protect personal data.

A 2020 Reuters report uncovered that Rite Aid had secretly deployed facial recognition systems in around 200 locations over eight years, testing primarily in “largely lower-income, non-white neighborhoods.” With the FTC intensifying its scrutiny of biometric surveillance, Rite Aid became the focus of several allegations. Chief among them is the creation of a “watchlist database,” built with two contracted firms, containing images of customers accused of wrongdoing in its stores. These images, often of poor quality, were captured via CCTV and employees' mobile phone cameras.

When a customer entered a store and their appearance matched an image in the database, employees received alerts prompting them to take action, most often verifying the customer's identity and asking them to leave. Many of these alerts were false positives, leading to wrongful accusations that caused “embarrassment, harassment, and other harm,” according to the FTC.

The complaint stated, “Employees acted on false alerts by following customers, searching them, requesting their departure, or even calling the police, publicly accusing them of shoplifting or other misdeeds.” Furthermore, the FTC noted that Rite Aid neglected to inform customers about the use of facial recognition technology and instructed staff not to disclose this information.

Facial recognition software has sparked significant controversy in the AI surveillance landscape. In recent years, numerous cities have enacted broad bans on the technology, while politicians push for regulations on its use by law enforcement. Meanwhile, companies like Clearview AI face lawsuits and fines globally for serious data privacy violations related to facial recognition.

The FTC's recent findings regarding Rite Aid also highlight the biases present in AI systems. The commission stated that Rite Aid's technology was “more likely to generate false positives in stores located in plurality-Black and Asian communities than in plurality-White communities.” Additionally, Rite Aid did not test or evaluate the accuracy of its facial recognition system before or after its implementation.

In a press release, Rite Aid expressed satisfaction in reaching an agreement with the FTC but disagreed with the core allegations. “The allegations pertain to a limited pilot program of facial recognition technology that the Company implemented in a small number of stores,” Rite Aid said. “We ceased usage of this technology in these locations over three years ago, prior to the commencement of the FTC's investigation.”
