Shopper wrongfully ejected after facial recognition alert • The Register


A British supermarket says staff will undergo further training after a store manager ejected the wrong man when facial recognition technology triggered an alert.

Warren Rajah was approached by a store manager at Sainsbury’s in London’s Elephant and Castle and instructed to leave after the store’s Facewatch system alerted staff to a match.

Sainsbury’s told The Register that its Facewatch system correctly identified a man on its offenders’ database, and alerted store managers who manually review each flag. However, in responding to the alert, the manager approached the wrong person, Rajah, and escorted him out of the store.

A Sainsbury’s spokesperson said: “We have been in contact with Mr Rajah to sincerely apologise for his experience in our Elephant and Castle store. This was not an issue with the facial recognition technology in use but a case of the wrong person being approached in store.”

Facewatch technology is currently operating in six Sainsbury’s stores in the UK, five of which are in Greater London.

The facial recognition tech was first trialled in September 2025 in Sydenham and Bath Oldfield Park, before being rolled out to Dalston, Elephant and Castle, Ladbroke Grove, Camden, and Whitechapel earlier this year.

The technology has a 99.98 percent accuracy rate, we're told, and has led to a 46 percent reduction in logged incidents of theft, harm, aggression, and antisocial behavior.

The majority of offenders (92 percent) do not return to stores with Facewatch running, and this is the first time a store manager has misidentified a customer after the system issued an alert.

Rajah, who works in sales at tech reseller CDW, told the BBC: “Am I supposed to walk around fearful that I might be misidentified as a criminal?

“Imagine how mentally debilitating this could be to someone vulnerable, after that kind of public humiliation.”

He reported being approached by three store managers holding smartphones. They looked at their phones, then at him, and told him to leave the store, pointing to posters near the entrance informing shoppers that facial recognition tech was in operation.

Rajah had to submit a copy of his passport and head shot to Facewatch so the company could verify he was not on the offenders’ database.

A Facewatch spokesperson said: “We’re sorry to hear about Mr Rajah’s experience and understand why it would have been upsetting. This incident arose from a case of human error in-store, where a member of staff approached the wrong customer.

“Our data protection team followed the usual lawfully required process to confirm his identity and verified that he was not on our database and had not been subject to any alerts generated by Facewatch.”

Facewatch is currently rolled out across other retailers in the UK including B&M, Budgens, Costcutter, Southern Co-op, Spar, and Sports Direct.

Other supermarkets such as Iceland began trialing the tech last year.

Digital rights group Big Brother Watch branded the frozen food purveyor’s trial “Orwellian” and “dystopian,” and said the company’s technology also led to the ejection of a woman in a Home Bargains store after she was wrongfully accused of theft.

Jake Hurfurt, head of research and investigations at Big Brother Watch, said at the time: “Iceland’s decision to deploy dystopian facial recognition technology to monitor its customers is disproportionate and chilling.

“Thousands of people will have their privacy rights violated just to buy basic necessities, and Iceland will turn its shoppers into suspects, making them submit to a biometric identity check as part of their daily lives.”

Big Brother Watch is campaigning against the use of live facial recognition in the UK, especially by London’s Metropolitan Police.

Jasleen Chaggar, Legal & Policy Officer at Big Brother Watch, said: “The idea that we are all just one facial recognition mistake away from being falsely accused of a crime or ejected from a store without any explanation is deeply chilling.

“To add insult to injury, innocent people seeking remedy must jump through hoops and hand over even more personal data just to discover what they’re accused of. In the vast majority of cases, they are offered little more than an apology when companies are finally forced to admit the tech got it wrong.

“This isn’t an isolated incident – Big Brother Watch regularly hears from members of the public who are left traumatised after being wrongly caught in this net of privatised biometric surveillance.

“The government’s promise to regulate this invasive technology will be lip service unless it reins in the unchecked expansion of facial recognition by retailers.”

Big Brother Watch is currently spearheading a legal challenge against the technology, arguing that it is incompatible with human rights laws. ®


