With retail crime at record levels, major chains including Iceland, Asda and Home Bargains are deploying facial recognition systems to identify potential thieves. But as false accusations mount and privacy concerns grow, is the technology doing more harm than good?
Anna was shopping in Home Bargains when staff accosted her, accused her in front of other shoppers of stealing a 39p pack of paracetamol, searched her bag and removed her own medication, and ordered her to leave.
Little did she realise that her facial features – including the distance between her eyes, the width of her nose, the shape of her jawline – had been captured and stored by Home Bargains and its security partner Facewatch on a secretive ‘blacklist’ database.
Anna (not her real name) has never been charged or convicted of the supposed theft and protests her innocence. Home Bargains – having described her telling of the incident as “entirely fabricated and lacking in credibility” – initially refused to supply CCTV footage as proof, then later claimed it had been deleted.
The incident has “had a serious and long-lasting impact on my mum”, says Anna’s daughter. She’s “scared to go shopping alone” and is “constantly worried about being recorded and recognised – she’s even having counselling to deal with the anxiety it caused”.
With more retailers rolling out this type of tech, how reliable is it? What different systems are available and how do they work? What are the legal and reputational risks? And is the toll taken on the likes of Anna worth it to reduce theft and protect staff more widely?
Technology like Facewatch – where individuals’ facial data is added to a blacklist and staff alerted to their presence on entering a store – is becoming increasingly popular. And mistakes are being made. Early last year a 19-year-old woman was wrongly flagged as a suspected shoplifter when shopping in Home Bargains in Manchester. “I’ve never stolen in my life,” she recounts. “I was confused, upset and humiliated.”
Case study: Kmart and Bunnings
The Office of the Australian Information Commissioner launched a probe into department store Kmart and home improvement chain Bunnings in 2022.
While findings on the former have not yet been released, late last year it ruled Bunnings had breached the privacy of shoppers. The OAIC said the tech “was the most intrusive option, disproportionately interfering with the privacy of everyone who entered stores, not just high-risk individuals” and ultimately not justified. Bunnings is appealing the ruling.
Earlier this month, Danielle Horan went public after being falsely accused of stealing £10 worth of toilet rolls (Facewatch later admitted a review of the incident showed the items had been paid for).
Civil liberties group Big Brother Watch has heard from at least 40 people misidentified by or mistakenly added to Facewatch’s list alone. And there are likely to be more, plus people who won’t be aware they’ve been blacklisted until they next walk into a specific store.
“This tech turns shoppers into walking barcodes and makes us a nation of suspects,” says Madeleine Stone, senior advocacy officer at Big Brother Watch, “with devastating consequences for people’s lives when it inevitably makes mistakes.”
Alternative solutions for combating retail crime
The rising number of retailers using facial recognition tech is a response to an onslaught of theft, abuse and violence in the retail sector. The latest BRC crime survey shows that incidents of violence and abuse rocketed by 53% in 2024 to more than 2,000 per day. About 70 of those involved a weapon – more than double the previous year. So retailers are turning to technology to help.
Iceland began trialling cameras from Facewatch in two stores in June, while Asda began piloting a system from rival provider FaiceTech in five Greater Manchester stores in March.
“Organised and targeted retail crime is out of control,” Iceland executive chairman Richard Walker said last week on LinkedIn. “Every week I see the reports from our stores and read about our colleagues being abused, threatened and assaulted simply for doing their job. So I’m proud that we’re taking action.”
His comments echoed those of Liz Evans, Asda chief commercial officer, non-food and retail, who said: “The rise in shoplifting and threats and violence against shopworkers in recent years is unacceptable. We have to look at all options to reduce the number of offences committed in our stores and protect our colleagues.”
Of the two systems, Facewatch is the better known, with more retailers using it. As well as Iceland, it counts Home Bargains, B&M and a number of convenience retailers such as Southern Co-op among its customers.
Facewatch provides retailers with specialist facial recognition cameras, only one of which is necessary per store, according to the company. The camera scans every shopper’s face on entry and checks for a match against a database of ‘subjects of interest’ (SOI) adjudged to have committed a crime previously. Store staff are sent an alert when a match to the blacklist is detected, including suspects believed to have committed a crime in a store operated by another Facewatch retail customer. If a match is not detected, the shopper’s facial recognition data is immediately deleted.
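The workflow Facewatch describes – scan on entry, check against a watchlist, alert staff on a match, delete the capture otherwise – can be sketched in simplified Python. All names, thresholds and the similarity measure here are illustrative assumptions, not Facewatch’s actual implementation:

```python
from dataclasses import dataclass

# Illustrative similarity cut-off; real systems tune this to balance
# false matches against missed ones. Not Facewatch's actual value.
MATCH_THRESHOLD = 0.90

@dataclass
class Subject:
    subject_id: str
    embedding: list[float]  # stored facial features for a 'subject of interest'

def cosine_similarity(a, b):
    """Similarity between two facial feature vectors (1.0 = identical)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b)

def process_entry(face_embedding, watchlist):
    """Check a shopper's face against the SOI watchlist.

    Returns a matched subject_id (which would trigger a staff alert),
    or None -- in which case the capture is discarded, not retained.
    """
    for subject in watchlist:
        if cosine_similarity(face_embedding, subject.embedding) >= MATCH_THRESHOLD:
            return subject.subject_id  # alert sent to store staff
    # No match: the shopper's facial data is deleted immediately
    del face_embedding
    return None
```

The interesting design point is the asymmetry: data for non-matches is transient, while watchlist entries persist, which is why the accuracy of the list itself carries so much weight.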
Case study: Foodstuffs
NZ supermarket group Foodstuffs’ 25-store facial recognition trial has been under investigation by the country’s privacy commissioner, which this month ruled it complied with the Privacy Act.
It is satisfied with safeguards such as no watchlist sharing between stores, and it only being used for customers “engaged in seriously harmful behaviour”. Yet the commissioner said he “can’t be completely confident it has fully addressed bias issues, including the potential negative impact on Māori and Pacific people”.
SOIs are added to the list after retailers upload an image of a suspect along with a formal, signed witness statement. Facewatch then reviews the submission and has the final say – meaning it acts as the data controller and takes on the compliance obligations that come with it.
“I have a cohort of retired police officers who have dealt with shoplifters, antisocial behaviour, and weapons,” says Facewatch data protection officer Dave Sumner. “They will scrutinise the words and make sure what’s described amounts to a crime. That means [we need] somebody who witnessed the crime or took witness testimony from staff or has CCTV evidence, and can describe it.
“The ex-cops will use their experience and judgement and may say: ‘What you’re describing doesn’t amount to a crime.’ If it does, it goes on to the system, accompanied by a specialist image from our camera.”
Rival systems
Altrincham-based crime tech consultancy FaiceTech differs: it acts as the data processor, while its retail clients are the data controllers deciding who’s added to their own blacklist, which is not pooled with other retailers. FaiceTech provides software – called FaiceAlert – that can turn retailers’ existing cameras into facial recognition ones. Like Facewatch, it usually relies on a single camera near the entrance to do the job of face-matching against a blacklist.
Matches require “human verification”, so alerts are sent only to trained security staff, says FaiceTech operations & compliance director David Pain: “The tech simply helps humans identify people who have previously committed abusive, threatening or criminal acts.”
FaiceTech will not name its clients, but says they include other national retailers besides Asda.
Retailer rollouts are set to continue. A 2024 Avery Dennison survey of 300 retail leaders found 44% said they had used facial recognition technology in the past 12 months, with 37% planning to in the next two years.
Case study: Mall of America
In November, two US senators from opposite ends of the political spectrum united against Mall of America’s use of facial recognition, dubbing it a “direct assault on privacy”.
The technology rolled out last summer in response to a spate of gun-related incidents. Tech provider Corsight says its software has undergone rigorous independent testing by the National Institute of Standards and Technology, and that in Department of Homeland Security testing it correctly identified individuals 99.3% of the time.
Iceland is already planning to extend its pilot to up to six stores by October 2025, and intends to expand it further in the long term. It anticipates a 30% reduction in violent incidents in stores using the technology, while Facewatch claims its system is proven to reduce theft by at least 35% in the first year of deployment.
A retail source tells The Grocer that Facewatch makes staff safer by giving them the opportunity to ask repeat offenders to leave before they have committed another crime and “got into that fight or flight mindframe”. But as adoption spreads, protest and legal challenges are likely to follow.
A May report by the Ada Lovelace Institute concluded facial recognition rollouts exist in a “legal grey area”.
“The current governance arrangements are less than ideal for everyone,” says Nuala Polo, UK public policy lead at the institute. “The lack of clarity in the law presents risks for people, who may be subject to unlawful deployments; public trust in these technologies, which are being deployed in the absence of public scrutiny and democratic debate; and crucially, to those wishing to deploy the technologies, because they cannot be certain that what they’re doing is lawful and have to expend a lot of resource trying to ensure it will be.”
The UK has a “fragmented and piecemeal approach” to governing facial recognition, Polo adds, and “is failing to provide legal certainty or safeguard the public”.
A spokeswoman for the Information Commissioner’s Office (ICO), which is responsible for scrutinising how the technology is used in practice to ensure compliance with data protection law, says: “Facial recognition technology’s use must be necessary and proportionate, and the benefits must not outweigh people’s fundamental right to privacy.”
Where that line is drawn isn’t entirely clear. In 2019, the ICO issued guidance that live facial recognition technology was an “intrusive tactic” with a high and “strict necessity threshold” for its use, for example “to locate a known terrorist suspect or violent criminal” but likely not to “identify known shoplifters”. However, following a 2023 investigation into Facewatch by the ICO, the office stated facial recognition “helping businesses prevent crime is in the public interest and a benefit to society”. Having “identified various areas of concern”, it was content to request Facewatch make improvements such as “focusing on repeat offenders or individuals committing significant offences”.
Facewatch’s Sumner says: “The ICO looked at all of my compliance paperwork, and we received considerable legal support in order to achieve a position whereby after four years they were happy we were compliant.”
The ICO is taking “quite a light-touch approach”, argues Alex Lawrence-Archer, solicitor at data rights agency AWO. Its present stance has “definitely fallen short of what the people we act for would like to see, which is something a bit firmer – saying they’ve really got it wrong in a way that affects people in a really serious way. It’s dangerous when private companies start keeping lists of supposed criminals in such low-stakes scenarios and on such a flimsy evidential basis. Yes, it can be lawful, but at the moment it often isn’t.”
Retailers “should carefully consider” its use, the ICO told The Grocer. “The right checks and balances must be in place to ensure the accuracy and integrity of the personal information they collect,” the spokeswoman added.
When facial recognition gets it wrong
For its part, Facewatch acknowledges that, between retailers and itself, the process of adding people to the blacklist is not infallible. Though he does not want to comment directly on the case of Danielle Horan and Home Bargains, Sumner accepts the possibility of a “rare event” in which the retail staff member reporting the alleged crime “has made a mistake and the item was actually paid for on a previous visit”.
“We’ve made enquiries. We’ve established that a crime wasn’t in fact committed. And then we resolve it by communication with the subject and their removal from the system entirely,” Sumner adds. “In those narrow circumstances it’s understandable to anyone, including the ICO, that humans will err.”
Among other things, the Data Protection Act requires that the processing of data is “proportionate”. For Facewatch, this means a retailer will receive an alert to an SOI’s presence only if they are within a set radius of the alleged earlier crime, though the exact distance can depend on the threat.
“If they’ve stolen a bottle of whisky in Glasgow, it wouldn’t be proportionate for them to then experience an alert in Manchester,” says Sumner. “However, if when they stole that bottle of whisky they had a weapon with them, then it would be quite proportionate.”
A similar principle applies to the length of time people are kept on the list. Anyone can check if they’re on it at any time by sending Facewatch a subject access request, with information on doing so on its website.
That compliance burden is not shared by FaiceTech, thanks to its status as data processor rather than controller.
“Really early on we recognised you can make a system as compliant as possible, but it depends on how humans use it,” says FaiceTech’s Pain. “Our model is to provide the software and ensure our customers are going through the appropriate process to show what they’re doing is justifiable and proportionate.
“By asking our clients to take responsibility for that, it means they go through that process, they ensure everybody within their business is properly briefed and trained and that they have all the correct processes and policies to take responsibility for compliance.”
FaiceTech and Facewatch also have some mutually exclusive views on what makes their systems ethical and compliant. FaiceTech questions the ethics and proportionality of a “national database”. “There’s obviously a case for national databases when you’re dealing with organised crime or acquisitive crime,” says Pain. “But if, for example, it’s a person that’s fallen on hard times, arguably you’re making their life harder by barring them from buying food on the high street for something that happened in an electrical store. We don’t think that’s ethically right.”
Facewatch’s Sumner argues its system is compliant precisely because more than one company benefits from the list. Only this passes as a ‘substantial public interest’ in the Data Protection Act, he says.
Case study: Dutch warning
The Dutch Data Protection Authority in 2021 issued a formal warning to an unnamed supermarket for its use of facial recognition tech, forcing it to disable the system. Monique Verdier, DPA deputy chairperson, said: “Use of such tech outside the home is banned in nearly all cases. And for good reason.”
The authority found a customer walking into a shop couldn’t give explicit consent for their data to be processed, and store security did not meet the bar of substantial public interest.
Reputational risks
Beyond the legalities, use of facial recognition tech can significantly damage retailer reputations, argues Big Brother Watch’s Stone. Consumers have been quick to vent on social media about security measures brought in by supermarkets – like Tesco’s ‘VAR’ self-checkouts and trolley weighing gates, or Morrisons’ locked booze cabinets. Facial recognition is even more “dystopian, disproportionate and chilling”, she argues.
“We’ve had awful stories about elderly people now being afraid of shopping alone, absolutely terrified of being publicly accused of being a shoplifter,” Stone says. “If you’re a vulnerable person, if you’re elderly, if you have mental health issues, if you don’t speak English very well – it can really affect your life and willingness to go out in public.”
Stone also says the technology is open to abuse. It’s not a stretch, she argues, to imagine an abusive partner, someone with a personal grudge, or even a discriminatory shopworker, putting an individual on a blacklist with “devastating” consequences.
Big Brother Watch has mounted several campaigns against retailers’ use of facial recognition. When Asda announced its five-store trial in March, the campaign group hired a digital display van to circle the stores taking part, displaying the message ‘Asda: Rolling Back Your Privacy’. It claims its followers sent some 5,425 emails to the supermarket in opposition to the trial.
The Grocer has also seen correspondence from B&M apologising to a customer mistakenly challenged for shoplifting as a result of Facewatch. The retailer’s customer services team told them: “Please accept my sincere apologies for any embarrassment, inconvenience and upset caused. We would never want this for one of our valued customers, and we hope that in time this incident does not deter you from shopping with B&M.” The retailer also sent the customer a £25 voucher.
The concerns of civil liberty campaigners have prompted a stiff response from supermarket chiefs. Asda chairman Allan Leighton, when questioned about the trials, said: “Be very clear, we will do whatever we need to do to make sure our colleagues are safe. Whatever we need to do.”
Iceland’s Walker hit a similar note when he spoke up on this issue in June, making “zero apologies” for protecting staff.
Dialogue between the groups and major retailers appears to have broken down completely. As an Asda spokesperson told The Grocer regarding inquiries about Big Brother Watch: “I’d be keen to understand why you’re even giving them the time of day.”
Case study: Mercadona
In 2021, Spanish supermarket Mercadona launched facial recognition technology in 48 stores to spot individuals with restraining orders or known criminal records.
But the system scanned every customer without informed consent, including children and employees, and so fell foul of Article 9 of the EU General Data Protection Regulation (GDPR), ruled Spanish regulators.
The Spanish Data Protection Authority fined Mercadona €2.5m, calling the programme illegal, disproportionate and unnecessary.
Tony Porter, former surveillance camera commissioner for England and Wales, says the cases exposed by Big Brother Watch are “undoubtedly emotive, but we need balance. One person’s anecdote should not outweigh the demonstrable societal value of proportionate, responsible use of technology,” he says. “The complaint this is ‘chilling’ – is that necessary? Is that proportionate? Given the billions in loss going down the swanee. The debate is several years out of date – I think they’re now on the wrong side of history.”
Porter is now chief privacy officer at facial recognition firm Corsight AI. The company’s technology has been rolled out at Mall of America sites in the US.
“They had guns outside – they don’t now,” he says. “There’s increasing confidence for mums and dads with kids going to stores because they’re not going to be hit by people that are on the latest drugs craze.” He adds: “You’ve got retailers more confident in opening stores. Yet you’ve still got Big Brother Watch saying this is chilling?”
But it’s not campaigners that retailers should be upset with, says Stone. Their attention should be on Government and the lack of response to retail crime from law enforcement.
The BRC survey found dissatisfaction with the police has increased, with 61% of respondents describing the police response to incidents as ‘poor’ or ‘very poor’.
“No doubt there’s a problem with shoplifting in this country,” Stone says. “And retailers should actually be very unhappy with the fact they’re being asked to shoulder this kind of legal responsibility and liability because the police are not responding.”
But as long as they are the ones shouldering that responsibility, retailers look likely to continue adopting facial recognition technology.
As Iceland’s Walker explains: “If I have to choose between upsetting a campaign group or protecting our colleagues from violence, I’ll pick our people every time.”
Auror: the new platform connecting the dots
The Crime & Policing Bill, now at the House of Lords stage of the parliamentary process, will scrap the £200 threshold for shoplifting offences to be tried in Crown Court.
Until it does, shoplifting under that value remains a summary offence, meaning retailers face an uphill battle to get police to engage. So retailers turn to tech such as facial recognition to protect staff and profits.
But there’s a different solution helping retailers link multiple thefts by a single offender, until the total value compels police to act.
New Zealand-based Auror is a crime reporting platform that helps retailers and police connect repeat offenders and organised crime groups to what might otherwise be thought of as one-off offences.
It’s used by 45,000 stores and 3,000 law enforcement agencies across Australia, New Zealand, North America and the UK, where it counts M&S, Home Bargains and Holland & Barrett among its clients.
“You log all your incidents and they can get commonality between you and other retailers,” says a retail source.
“We’ve got to catch someone getting up to the value of £200 before the police will entertain it. It could be in different stores and other people’s stores, but this system allows us to pool it together.”
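The pooling the retail source describes – linking low-value incidents by the same suspected offender until the total crosses the £200 threshold – amounts to a simple aggregation, sketched here. The data structures and field names are illustrative, not Auror’s actual schema:

```python
from collections import defaultdict

# Value above which shoplifting stops being a summary-only offence
CROWN_COURT_THRESHOLD_GBP = 200

def link_incidents(incidents):
    """Group reported thefts by suspected offender and total their value.

    Each incident is a dict with 'offender_id', 'store' and 'value_gbp'.
    Returns the offender IDs whose linked total exceeds the threshold.
    """
    totals = defaultdict(float)
    for incident in incidents:
        totals[incident["offender_id"]] += incident["value_gbp"]
    return {oid for oid, total in totals.items() if total > CROWN_COURT_THRESHOLD_GBP}

reports = [
    {"offender_id": "SOI-17", "store": "Store A", "value_gbp": 85.0},
    {"offender_id": "SOI-17", "store": "Store B", "value_gbp": 70.0},
    {"offender_id": "SOI-17", "store": "Store C", "value_gbp": 60.0},
    {"offender_id": "SOI-22", "store": "Store A", "value_gbp": 40.0},
]
# SOI-17's linked total (£215) crosses the threshold; SOI-22's (£40) does not.
```

The point of the aggregation is that no single incident needs to exceed £200: three sub-threshold thefts across different stores become one Crown Court-eligible case once linked.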
Police have access to the platform, and the forms completed by retailers are similar to witness statements, dispensing with the need for an officer to attend in person.
According to Auror, the platform enables retailers to connect events to repeat offenders but does not let them see each other’s data or evidence. Nick McDonnell, Auror senior director of trust & safety, says: “Through Auror, retailers and law enforcement agencies can better collaborate on addressing high-volume retailer crime.
“Instead of relying on manual reporting and time-consuming evidence gathering, law enforcement can use a secure, digital reporting process to gain greater visibility of the scale of crime in their communities, helping them focus their precious resources on those causing the most harm.”