AI Facial Recognition to Identify Criminals: Is It Reliable?
Facial features, along with fingerprints, DNA, and other evidence, are key to criminal identification. With tools such as face detection and recognition, police can speed up investigations by tracing unknown faces and matching them against existing databases.
Yet the use of facial recognition software raises questions about accuracy and privacy. There have been cases in which these tools helped identify criminals, and others that led to wrongful arrests.
Facial Recognition Tools Help Police Catch Criminals
To solve crimes, police departments may use facial recognition technology to identify individuals in doorbell camera photos, live video footage of crowds, and photos taken at crime scenes. These systems can search databases containing public photos from social media, passport and driving license images, and other sources. According to a press release by the UK government, the technology searches its database for possible matches. The results are then reviewed by police officers for accuracy, and the officers decide whether to proceed.

There have been real-life cases where police have caught criminals using security camera images and facial recognition tools. For example, Craig Walters was jailed for life in 2021 for attacking a woman he had followed off a bus. The attack was interrupted by a member of the public, and the predator ran off. South Wales Police arrested Walters within 48 hours of the incident, using images captured by security cameras and analyzed with an automated identification tool. According to a police study, identifying a suspect after an incident with a facial recognition tool typically takes minutes, whereas without automated identification it may take around fourteen days. However, the accuracy of identification depends on factors such as image quality and the algorithms used.
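To illustrate the matching step described above, here is a minimal sketch, assuming faces have already been converted to fixed-length embedding vectors by some recognition model and that candidates are ranked by cosine similarity. The function name, threshold, and toy gallery are hypothetical, and real police systems are far more involved; the point is only that the software proposes a shortlist that a human reviewer then checks.

```python
# Minimal, illustrative sketch of matching a probe face against a database.
# Assumes embeddings already exist; names, threshold, and data are made up.
import numpy as np

def top_candidates(probe: np.ndarray,
                   gallery: np.ndarray,
                   ids: list[str],
                   threshold: float = 0.6,
                   k: int = 5) -> list[tuple[str, float]]:
    """Return up to k gallery identities whose cosine similarity to the
    probe embedding is at or above the threshold, sorted best-first."""
    # Normalize so the dot product equals cosine similarity.
    probe = probe / np.linalg.norm(probe)
    gallery = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    scores = gallery @ probe
    order = np.argsort(scores)[::-1][:k]
    return [(ids[i], float(scores[i])) for i in order if scores[i] >= threshold]

# Toy example: a noisy copy of one gallery record used as the probe image.
rng = np.random.default_rng(0)
gallery = rng.normal(size=(3, 128))          # three 128-dimensional embeddings
ids = ["record-001", "record-002", "record-003"]
probe = gallery[1] + 0.05 * rng.normal(size=128)
print(top_candidates(probe, gallery, ids))   # shortlist for human review
```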
Referring to the benefits of the technology, the UK government states:
Technology such as facial recognition can help the police quickly and accurately identify those wanted for serious crimes, as well as missing or vulnerable people. It also frees up police time and resources, meaning more officers can be out on the beat, engaging with communities and carrying out complex investigations.
An early adopter of facial recognition technology was China. In 2017, the country announced plans to build the world's largest and most advanced camera surveillance network, intending to add 400 million new cameras to the existing 170 million. These cameras, equipped with AI and facial recognition, are used across the country to assist the police in real time by identifying individuals, their family members, and their recent locations. In 2017, BBC reporter John Sudworth tested the system in Guiyang, a city of around 4 million people. Working with the local police, Sudworth had his photo flagged as that of a suspect. For the test, he stopped his car close to the city center and walked toward the bus station. Police identified and apprehended him within 7 minutes.
Times When Automated Recognition Tools Were Wrong
In January 2020, Robert Williams became the first person known to be wrongfully arrested because a facial recognition tool misidentified him as a shoplifter. As the New York Times reports, Williams was suspected of stealing five watches from a Detroit store in 2018 after an automated tool listed his photo among possible matches, finding a resemblance between his old driving license photo and a man recorded by the store's surveillance camera. He spent around 30 hours in custody before a later investigation found him innocent.
In 2021, Williams sued the police for violating his rights; in a legal settlement reached in June 2024, the police were barred from arresting people based solely on facial recognition results. As part of the settlement, Williams received $300,000.
Williams was not the only person misidentified as a criminal by facial recognition tools. In February 2023, Porcha Woodruff, who was eight months pregnant, was arrested in Detroit for robbery and carjacking based on a false facial recognition match. She spent 11 hours in jail being questioned about the crime.
Woodruff was released on a $100,000 personal bond and went straight to the hospital, where doctors diagnosed her with dehydration. A month later, the court dismissed the case, and in August 2023, Woodruff filed a lawsuit against the police. She became the third person wrongfully arrested in Detroit and the sixth in the US; all of them were Black.
Studies, including one by Cambridge University, have found that facial recognition technology can be biased against racial minorities. Specifically, the technology shows lower accuracy for African and Asian faces, with misidentification occurring up to 100 times more often than for white faces.
Research has also found that overrepresentation in police databases increases the likelihood of misidentification. In the US, over three-quarters of Black men are listed in criminal justice databases, raising the risk of errors. Beyond that, the technology's accuracy can be affected by factors such as facial feature similarity, face angles, image quality, and data limitations.
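To make that kind of disparity measurable, the sketch below shows how a false match rate can be computed per group at a fixed decision threshold. The scores are simulated rather than drawn from any real benchmark, and the group labels, score distributions, and threshold are assumptions for illustration only.

```python
# Illustrative only: comparing false match rates across two groups at one
# fixed threshold. All scores below are simulated, not real benchmark data.
import numpy as np

def false_match_rate(impostor_scores: np.ndarray, threshold: float) -> float:
    """Fraction of different-person comparisons scoring at or above the threshold."""
    return float(np.mean(impostor_scores >= threshold))

rng = np.random.default_rng(1)
threshold = 0.6

# Simulated similarity scores for pairs of *different* people in two groups.
# Group B's scores are drawn slightly higher, mimicking a model that is less
# discriminative for that group, so more innocent people cross the threshold.
group_a = rng.normal(loc=0.30, scale=0.12, size=10_000)
group_b = rng.normal(loc=0.42, scale=0.12, size=10_000)

print("group A false match rate:", false_match_rate(group_a, threshold))
print("group B false match rate:", false_match_rate(group_b, threshold))
```

With these made-up distributions, group B's false match rate comes out roughly an order of magnitude higher than group A's even though both groups face the same threshold, which is the shape of the imbalance the studies describe.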
Another issue is that these tools may use people's data and photos from road cameras without their consent, which raises privacy concerns. Because of these concerns, police use of facial recognition is now banned in US cities such as San Francisco, Austin, and Boston, while the European Union is moving toward allowing facial recognition only under certain conditions, such as for law enforcement and public safety. In China, officials have sent notices to hotels telling them to stop scanning and monitoring guests' faces.
The Challenge of Striking a Balance in Crime Investigation
An image of a human face carries information about the person, including identity, emotional state, ethnicity, race, and age, which can be key in criminal identification. So far, however, there is no perfect method for identifying criminals from photos alone. Both eyewitness testimony and facial recognition results can be mistaken. In recent years, facial recognition technologies have improved in accuracy, speed, and the ability to recognize faces under varied conditions. Despite these advances, they are not error-free, and law enforcement agencies must decide how to use these tools in a way that upholds justice.