
What’s the problem with facial recognition technology?
Facial recognition technology has multiple flaws. One of its biggest – it’s not very good at recognising faces.
Jennifer Thompson was asleep at home in North Carolina on July 28, 1984, when a man broke into her house and raped her at knifepoint. During the assault, she did everything she could to study her attacker's mannerisms and facial features.
After she escaped her attacker, Thompson contacted the police, who helped to make a composite sketch of the wanted man.
The police then received a tip about a restaurant worker called Ronald Cotton, who had previous convictions for sexual assault, and breaking and entering.
A detective then called Thompson into his office, laid six photos of suspects on the table, and asked her whether any of them was her attacker. She pointed at the picture of Ronald Cotton. She identified him again in an identity parade. She was absolutely sure that Cotton was the man who had brutalised her.
Cotton received a life sentence. But after ten and a half years in jail, he was released when DNA testing on semen recovered from the crime scene matched the DNA of convicted rapist Bobby Poole.
But the story does not end there. Cotton eventually forgave Thompson and the pair now work together on reforming the criminal justice system in America.
Do machines have better eyes than humans?
The case shows the danger of eyewitness testimony. It often persuades juries to return a guilty verdict, but our memories are subject to biases and distortions. We may see things we want to see, or remember only fragments of what we saw.
As psychology professor Stephen L. Chew writes: “Memory doesn’t record our experiences like a video camera. It creates stories based on those experiences.”
Facial recognition technology is supposed to improve on this human fallibility. And in some areas, it is showing promise. Recently, the startup FDNA created DeepGestalt, an algorithm that can identify genetic disorders from facial images.
When Jarrod Ramos killed five employees of The Capital newspaper in Annapolis, Maryland, last year, the authorities apprehended him. But Ramos refused to cooperate, carried no ID, and problems with fingerprint analysis left the police unable to identify him.
When his image was put through the Maryland Image Repository System (MIRS), though, it was compared against photos of ten million people, including known offenders and all driver's license photos in the state. Police then found a match.
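Under the hood, a search like this is typically one-to-many identification: the probe image is reduced to a numerical “faceprint” (an embedding) and compared against every faceprint in the gallery. The Python sketch below illustrates the idea; the embedding model and the similarity threshold are illustrative assumptions, not details of MIRS, whose actual algorithm is not public.

```python
# A minimal sketch of one-to-many face identification, the kind of
# search a system like MIRS runs. The embeddings and the threshold
# are assumptions for illustration; MIRS's algorithm is not public.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (1.0 = same direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe: np.ndarray, gallery: dict, threshold: float = 0.6):
    """Return the gallery ID whose faceprint best matches the probe,
    or None if no candidate clears the similarity threshold."""
    best_id, best_score = None, threshold
    for person_id, faceprint in gallery.items():
        score = cosine_similarity(probe, faceprint)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id
```

In a real deployment the gallery holds millions of vectors and the linear scan is replaced by an approximate nearest-neighbour index, but the matching logic is the same.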
But the police’s use of facial recognition technology is beset with major problems. One of them is that the technology often struggles to identify people from their faces.
A case of mistaken identity
In one experiment conducted by the MIT Media Lab’s Joy Buolamwini, facial analysis algorithms made by IBM, Microsoft and Face++ were tasked with classifying the gender of faces in photos of politicians, drawn from countries with a high proportion of female politicians.
The technology was far more reliable on male politicians from Norway than on female politicians from South Africa. Why? Because the technology is much better at reading white male faces than black female faces.
The findings showed that lighter-skinned males were misclassified at most 1 per cent of the time, but darker-skinned females were misclassified on up to 35 per cent of occasions.
This is likely because the software was trained (perhaps unwittingly) on datasets containing a disproportionate number of lighter-skinned male faces.
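An audit like Buolamwini's boils down to a simple measurement: run the classifier over a labelled benchmark and compute its error rate separately for each demographic subgroup. A minimal sketch, using made-up records rather than the study's data:

```python
# Per-group error audit in the spirit of the MIT study: error rate of
# a classifier, computed separately for each demographic subgroup.
from collections import defaultdict

def error_rates_by_group(records):
    """records: an iterable of (group, predicted_label, true_label)."""
    errors, totals = defaultdict(int), defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        if predicted != actual:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

# Made-up records for illustration, not the study's data.
sample = [
    ("lighter_male", "male", "male"),
    ("lighter_male", "male", "male"),
    ("darker_female", "male", "female"),   # a misclassification
    ("darker_female", "female", "female"),
]
print(error_rates_by_group(sample))
# {'lighter_male': 0.0, 'darker_female': 0.5}
```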
But the technology has a greater problem than racial and gender bias; it’s just not very good at identifying faces.
Research from campaign group Big Brother Watch found that 98 per cent of the matches flagged by the Metropolitan Police’s automated facial recognition (AFR) system were misidentifications of innocent people. South Wales Police fared little better: its AFR matches were wrong on 91 per cent of occasions.
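The arithmetic behind such a figure is straightforward: it is the share of the system's “matches” that turned out to be wrong. The counts below are hypothetical, chosen only to show how a 98 per cent rate arises; see Big Brother Watch's report for the real numbers.

```python
# Illustrative false positive rate calculation for an AFR deployment.
# The counts are hypothetical, not Big Brother Watch's figures.
total_alerts = 104        # hypothetical: people the system flagged
correct_matches = 2       # hypothetical: flags later confirmed correct
false_positive_rate = (total_alerts - correct_matches) / total_alerts
print(f"{false_positive_rate:.0%}")  # -> 98%
```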
The South Wales force is now at the centre of a legal action brought by office worker Ed Bridges. The former councillor alleges that the force’s AFR captured an image of him without his consent or knowledge, and that this infringed his right to privacy.
Before the case began, a lawyer from the civil liberties group Liberty, which is representing Mr Bridges, said:
“Facial recognition…is discriminatory and takes us another step towards being routinely monitored wherever we go, fundamentally altering our relationship with state powers and changing public spaces. It belongs to a police state and has no place on our streets.”
China crisis
We need only look to China to see a forerunner of this dystopian scenario. The Chinese state is building a national facial recognition system that it hopes will be able to identify any of its 1.3 billion citizens within seconds.
The technology is being used in Xinjiang province, home to the persecuted Uighur ethnic minority. The Chinese state has used facial recognition as one of many tools to control the Uighurs, whom it sees as a threat to the Communist Party’s rule.
Uighurs have been detained for displaying even minor signs of piety, such as growing a beard, refusing to serve alcohol in restaurants, and praying.
The persecution of the Uighurs is a warning sign of what might happen as facial recognition systems become more advanced. If the technology is not well regulated, our right to privacy and freedom from discrimination will be at stake.
These concerns are making people more sceptical of the technology. The Californian cities of San Francisco and Oakland, and the Massachusetts city of Somerville, have banned the use of facial recognition software by public agencies.
Our worries about the software should not prevent technologists from working to improve its accuracy. However, the ethics and laws governing facial recognition technology must be rigorously debated and regularly reviewed. Otherwise, the future may look a lot like 1984.