Innocent man arrested after AI facial recognition said he was a jewellery thief
A police force’s AI-powered facial recognition system mistook an innocent man for a burglar.
Alvi Choudhury, 26, was arrested at about 4pm on January 7 while he was working from his home in Southampton.
Thames Valley Police systems matched his mugshot – taken during a false arrest five years ago – to CCTV footage of a thief who stole £3,000 and jewellery from Milton Keynes Buddhist Vihara in December.
After he waited until midnight to be interviewed, officers, who he said were ‘laughing’, realised within 10 minutes that they had the wrong man for a crime committed 80 miles away.
Choudhury told the Daily Mail: ‘The TVP officer admitted to me that before she even interviewed me, she knew I wasn’t the suspect because she had seen my custody photos and she had seen the footage of the suspect and she knew straight away.’
He added: ‘They said they had officers visually review it. That is even more concerning because that is probably racial discrimination.
‘You’ve probably just seen two brown people, even though they have completely different features and said, “yeah, they look close enough. Let’s arrest them”.’
The software engineer was released at about 2am.
Choudhury claimed that the CCTV footage of the crime featured a younger man with curly hair who did not resemble him.
He explained that his face was in police records because he was arrested in 2021.
The then-University of Portsmouth student was wrongly arrested after a gang of nearly 10 men attacked him and his friends during a night out.
Officers detained Choudhury and his friends even though they were covered in wounds, he said.
Police released him when they discovered another couple had been attacked on the same night.
Choudhury has since launched legal action against the force.
‘It is filled with bugs’
Live Facial Recognition (LFR) tools scan people’s faces and cross-reference them against watchlists of known or wanted criminals.
Police officials have lauded the system as the biggest breakthrough in catching crooks ‘since DNA matching’.
Yet the Home Office admitted in December that this tech returns more false positives for ‘some demographic groups’ on certain settings.
Matches for Black faces are false positives 5.5% of the time, compared with just 0.04% for white faces.
AI software is only as smart as the data that trains it, meaning real-world prejudices can seep into it.
Experts have long said that many commercial facial-recognition systems exhibit racial bias, struggling in particular to identify people of colour.
This is in part because AI works by consuming data to understand patterns – if it’s trained with more faces of white men, it’ll have a tougher time identifying anyone who isn’t a white man.
Critics warn that such technology can lead to falsely matched people being brought into interrogation rooms, ending up on watchlists or losing their jobs.
And this includes Choudhury, who worries he might look ‘suspicious’ to an employer.
Choudhury said: ‘No tech company would ever put a system into production with a failure rate of one in 25. That’s horrific.
‘It is filled with bugs.’
He’s now calling on the government to re-examine AI technology and introduce legislation to regulate it.
Some 25,000 searches using facial recognition systems are carried out every month, according to the National Police Chiefs’ Council.
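To put those figures together: the scale of the disparity becomes clearer when the Home Office's reported false-positive rates are applied to the NPCC's monthly search volume. The sketch below is illustrative arithmetic only; the actual demographic mix of the 25,000 monthly searches is not published, so the per-group totals are hypothetical worst-case scenarios, not real caseloads.

```python
# Illustrative arithmetic using figures reported in this article.
# The demographic breakdown of real searches is not published,
# so each scenario below assumes all searches fall in one group.

MONTHLY_SEARCHES = 25_000   # NPCC: searches per month
FPR_BLACK = 0.055           # Home Office: 5.5% false positives for Black faces
FPR_WHITE = 0.0004          # Home Office: 0.04% false positives for white faces

# Expected false matches if every monthly search involved a Black face
false_matches_black = MONTHLY_SEARCHES * FPR_BLACK

# Expected false matches for the same volume of white faces
false_matches_white = MONTHLY_SEARCHES * FPR_WHITE

# How many times higher the Black false-positive rate is
disparity_ratio = FPR_BLACK / FPR_WHITE

print(f"All-Black sample:  {false_matches_black:,.0f} false matches/month")
print(f"All-white sample:  {false_matches_white:,.0f} false matches/month")
print(f"Disparity: roughly {disparity_ratio:.1f}x")
```

On these reported rates, the same monthly volume of searches would yield on the order of a thousand false matches for Black faces against roughly ten for white faces, which is the gap critics point to.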
Thames Valley Police told Metro: ‘While we apologise for the distress caused to the complainant in this case, their arrest was based on the investigating officers’ own visual assessment that the individual matched the suspect in CCTV footage following a retrospective facial recognition match, and was not influenced by racial profiling.
‘To confirm, retrospective facial recognition technology did initially provide intelligence, but did not determine the arrest.
‘Although later enquiries eliminated the individual from the investigation, this does not make the arrest unlawful.
‘We continue to use policing tools responsibly while striving to improve and build trust in our communities.’