Ministers are under pressure to implement more robust safeguards for facial recognition technology, as the Home Office has acknowledged that it may mistakenly identify Black and Asian individuals more frequently than white people in certain contexts.
Recent tests conducted by the National Physical Laboratory (NPL) on how this technology functions within police national databases revealed that “some demographic groups are likely to be incorrectly included in search results,” according to the Home Office.
The Association of Police and Crime Commissioners (APCC) stated that the release of the NPL’s results “reveals concerning underlying bias” and urged caution regarding plans for a nationwide rollout.
These findings were made public on Thursday, shortly after Police Minister Sarah Jones characterized the technology as “the most significant advancement since DNA matching.”
Facial recognition technology analyzes individuals’ faces and cross-references the images against a watchlist of known or wanted criminals. It can be employed to scrutinize live footage of people passing in front of cameras, match faces with wanted persons, or assist police in targeting individuals under surveillance.
Images of suspects can be compared against police, passport, or immigration databases to identify them and review their backgrounds.
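In broad terms, systems like this reduce each face image to a numeric embedding and flag a match when its similarity to a watchlist embedding exceeds a threshold. The sketch below is purely illustrative (it is not any vendor's or police force's actual system, and the names and vectors are hypothetical), but it shows the mechanism and why the threshold setting matters:

```python
# Illustrative sketch of watchlist matching (hypothetical data, not a
# real deployed system): faces become embeddings; a candidate "match"
# is any watchlist entry whose similarity clears a threshold.
import math

def cosine_similarity(a, b):
    # Standard cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def search_watchlist(probe, watchlist, threshold=0.8):
    """Return names of watchlist entries similar enough to the probe.

    Lowering the threshold returns more candidates but also raises the
    false positive rate -- the trade-off at issue in the NPL tests.
    """
    return [name for name, emb in watchlist.items()
            if cosine_similarity(probe, emb) >= threshold]

# Toy embeddings (hypothetical):
watchlist = {"suspect_a": [0.9, 0.1, 0.2], "suspect_b": [0.1, 0.95, 0.0]}
print(search_watchlist([0.88, 0.12, 0.21], watchlist, threshold=0.8))
```

Trained personnel then visually review any returned candidates, which is the manual safeguard officials describe later in this article.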
Analysts who evaluated the Police National Database’s retrospective facial recognition tool at lower match-threshold settings discovered that “white subjects exhibited a lower false positive identification rate (FPIR) (0.04%) compared to Asian subjects (4.0%) and Black subjects (5.5%).”
Further testing revealed that Black women experienced notably high false positives. “The FPIR for Black male subjects (0.4%) is lower than that for Black female subjects (9.9%),” the report detailed.
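The FPIR cited in these figures is simply the proportion of searches for subjects who are *not* on the watchlist that nonetheless return a match. An illustrative calculation (the counts below are hypothetical, chosen only to echo the percentages the NPL reported) makes the metric concrete:

```python
# Illustrative (hypothetical counts): computing a false positive
# identification rate (FPIR) per demographic group. FPIR = share of
# searches for non-watchlist subjects that still return a match.

def fpir(results):
    """results: list of booleans, True = a non-watchlist subject
    was wrongly matched. Returns the false positive rate."""
    if not results:
        return 0.0
    return sum(results) / len(results)

# Hypothetical counts per 10,000 searches, echoing the reported pattern:
groups = {
    "white": [True] * 4   + [False] * 9996,  # -> 0.04% FPIR
    "asian": [True] * 400 + [False] * 9600,  # -> 4.00% FPIR
    "black": [True] * 550 + [False] * 9450,  # -> 5.50% FPIR
}

for name, results in groups.items():
    print(f"{name}: FPIR = {fpir(results):.2%}")
```

Note that a gap of this size means that, at the same threshold, a wrongful match was over a hundred times more likely for Black subjects than for white subjects in these tests.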
The Association of Police and Crime Commissioners said these findings reflect underlying bias. “This indicates that, in certain scenarios, Black and Asian individuals are more prone to incorrect matches than their white counterparts. Although the terminology is technical, it is evident that this technology is being integrated into police operations without adequate safeguards,” the statement noted.
The statement, signed by APCC leaders Darryl Preston, Alison Rowe, John Tizard, and Chris Nelson, questioned why these findings were not disclosed sooner and shared with Black and Asian communities.
The statement concluded: “While there is no evidence of adverse effects in individual cases, this is due to chance rather than a systematic approach. System failures have been known for a while, but the information was not conveyed to the communities impacted and key stakeholders.”
The government has initiated a 10-week public consultation aimed at facilitating more frequent usage of the technology. The public will be asked if police should have permission to go beyond records and access additional databases, such as images from passports and driving licenses, to track criminals.
Civil servants are collaborating with police to create a new national facial recognition system that will house millions of images.
Charlie Welton, head of policy and campaigns at Liberty, stated: “The racial bias indicated by these statistics demonstrates that allowing police to utilize facial recognition without sufficient safeguards leads to actual negative consequences. There are pressing questions regarding how many individuals of color were wrongly identified in the thousands of monthly searches utilizing this biased algorithm and the ramifications it might have.”
“This report further underscores that this powerful and opaque technology cannot be deployed without substantial safeguards to protect all individuals, which includes genuine transparency and significant oversight. Governments must halt the accelerated rollout of facial recognition technology until protections are established that prioritize our rights, aligning with public expectations.”
Former cabinet minister David Davis expressed worries after police officials indicated that cameras could be installed at shopping centers, stadiums, and transport hubs to locate wanted criminals. He told the Daily Mail: “Big Brother has arrived in the UK. It is evident that the Government is implementing this dystopian technology nationwide. There is no way such a significant measure could proceed without a comprehensive and detailed discussion in the House of Commons.”
Officials argue that the technology is essential for apprehending serious criminals, asserting that there are manual safeguards embedded within police training, operational guidelines, and practices that require trained personnel to visually evaluate all potential matches derived from the police national database.
A Home Office representative said: “The Home Office takes these findings seriously and has already acted. The new algorithm has undergone independent testing and has shown no statistically significant bias. It will be subjected to further testing and evaluation early next year.”
“In light of the significance of this issue, we have asked His Majesty’s Inspectorate of Constabulary and the Forensic Science Regulator to review the use of facial recognition by law enforcement. They will evaluate the effectiveness of the mitigation measures, and the National Police Chiefs’ Council supports this initiative.”
Source: www.theguardian.com
