Home Office Acknowledges Issues with Facial Recognition Technology for Black and Asian Individuals

Ministers are under pressure to implement more robust safeguards for facial recognition technology, as the Home Office has acknowledged that it may mistakenly identify Black and Asian individuals more frequently than white people in certain contexts.

Recent tests conducted by the National Physical Laboratory (NPL) on how this technology functions within police national databases revealed that “some demographic groups are likely to be incorrectly included in search results,” according to the Home Office.

The Association of Police and Crime Commissioners (APCC) said the release of the NPL’s results “reveals concerning underlying bias” and urged caution regarding plans for a nationwide implementation.

These findings were made public on Thursday, shortly after Police Minister Sarah Jones characterized the technology as “the most significant advancement since DNA matching.”

Facial recognition technology analyzes individuals’ faces and cross-references the images against a watchlist of known or wanted criminals. It can be employed to scrutinize live footage of people passing in front of cameras, match faces with wanted persons, or assist police in targeting individuals on surveillance.

Images of suspects can be compared against police, passport, or immigration databases to identify them and review their backgrounds.
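The matching step described above can be pictured as a search over numeric face “embeddings” compared against a watchlist with a similarity threshold. The sketch below is illustrative only: the embeddings, names, and threshold are invented, and operational police systems use proprietary algorithms and their own settings.

```python
# Minimal sketch of watchlist matching: a probe face is reduced to a numeric
# embedding and compared against every watchlist entry. All data here is
# invented for illustration; real systems are proprietary.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def search_watchlist(probe: np.ndarray,
                     watchlist: dict[str, np.ndarray],
                     threshold: float = 0.64) -> list[tuple[str, float]]:
    """Return watchlist identities whose similarity to the probe exceeds the threshold."""
    hits = []
    for name, template in watchlist.items():
        score = cosine_similarity(probe, template)
        if score >= threshold:  # higher threshold -> fewer alerts, fewer false positives
            hits.append((name, score))
    return sorted(hits, key=lambda x: -x[1])

# Toy example with random 128-dimensional embeddings.
rng = np.random.default_rng(0)
watchlist = {f"suspect_{i}": rng.normal(size=128) for i in range(3)}
probe = watchlist["suspect_1"] + 0.05 * rng.normal(size=128)  # a close match
print(search_watchlist(probe, watchlist))
```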

Analysts who evaluated the Police National Database’s retrospective facial recognition tool at lower settings discovered that “white subjects exhibited a lower false positive identification rate (FPIR) (0.04%) compared to Asian subjects (4.0%) and Black subjects (5.5%).”

Further testing revealed that Black women experienced notably high false positives. “The FPIR for Black male subjects (0.4%) is lower than that for Black female subjects (9.9%),” the report detailed.
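For readers unfamiliar with the metric, FPIR is the share of searches for people who are not on the watchlist that nonetheless return a match. Below is a minimal sketch of the calculation, using invented counts chosen only to land near the percentages quoted above; these are not the NPL’s underlying data.

```python
# Illustrative FPIR calculation per demographic group: the share of probes NOT
# on the watchlist that the system nonetheless flags as a match.
# The counts below are made up to roughly reproduce the quoted percentages.
from collections import defaultdict

def fpir_by_group(results):
    """results: iterable of (group, was_on_watchlist, was_flagged) tuples."""
    searches = defaultdict(int)
    false_pos = defaultdict(int)
    for group, on_watchlist, flagged in results:
        if not on_watchlist:              # FPIR only counts non-mated searches
            searches[group] += 1
            if flagged:
                false_pos[group] += 1
    return {g: false_pos[g] / searches[g] for g in searches}

example = (
    [("white", False, False)] * 2499 + [("white", False, True)] * 1 +
    [("black", False, False)] * 2362 + [("black", False, True)] * 138
)
print(fpir_by_group(example))  # e.g. {'white': 0.0004, 'black': 0.0552}
```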

The APCC said the findings point to an underlying bias. “This indicates that, in certain scenarios, Black and Asian individuals are more prone to incorrect matches than their white counterparts. Although the terminology is technical, it is evident that this technology is being integrated into police operations without adequate safeguards,” the statement noted.

The statement, signed by APCC leaders Darryl Preston, Alison Rowe, John Tizard, and Chris Nelson, raised concerns about why these findings were not disclosed sooner and shared with Black and Asian communities.

The statement concluded: “While there is no evidence of adverse effects in individual cases, this is due to chance rather than a systematic approach. These system failings have been known for some time, but the information was not conveyed to the communities affected and to key stakeholders.”

The government has initiated a 10-week public consultation aimed at facilitating more frequent use of the technology. The public will be asked whether police should have permission to go beyond custody records and access additional databases, such as passport and driving licence images, to track down criminals.

Civil servants are collaborating with police to create a new national facial recognition system that will house millions of images.


Charlie Welton, head of policy and campaigns at Liberty, stated: “The racial bias indicated by these statistics demonstrates that allowing police to utilize facial recognition without sufficient safeguards leads to actual negative consequences. There are pressing questions regarding how many individuals of color were wrongly identified in the thousands of monthly searches utilizing this biased algorithm and the ramifications it might have.”

“This report further underscores that this powerful and opaque technology cannot be deployed without substantial safeguards to protect all individuals, which includes genuine transparency and significant oversight. Governments must halt the accelerated rollout of facial recognition technology until protections are established that prioritize our rights, aligning with public expectations.”

Former cabinet minister David Davis expressed concern after police officials indicated that cameras could be installed at shopping centres, stadiums, and transport hubs to locate wanted criminals. He told the Daily Mail: “Big Brother, welcome to the UK. It is evident that the government is implementing this dystopian technology nationwide. There is no way such a significant measure could proceed without a comprehensive and detailed debate in the House of Commons.”

Officials argue that the technology is essential for apprehending serious criminals, asserting that there are manual safeguards embedded within police training, operational guidelines, and practices that require trained personnel to visually evaluate all potential matches derived from the police national database.

A Home Office representative said: “The Home Office takes these findings seriously and has already acted. The new algorithm has undergone independent testing and has shown no statistically significant bias. It will be subjected to further testing and evaluation early next year.”

“In light of the significance of this issue, we have asked the policing inspectorate and the Forensic Science Regulator to review the police use of facial recognition. They will evaluate the effectiveness of the mitigation measures, and the National Police Chiefs’ Council supports this work.”

Source: www.theguardian.com

Researchers Examine Neanderthal DNA to Gain Insights into Human Facial Development and Evolution

Research led by scientist Hannah Long at the University of Edinburgh has found that specific regions of Neanderthal DNA are more effective at activating genes responsible for jaw development than those in humans, potentially explaining why Neanderthals had larger lower jaws.

Neanderthal. Image credit: Natural History Museum Trustees.

“With the Neanderthal genome being 99.7% identical to that of modern humans, the variations between the species are likely to account for differences in appearance,” Dr. Long said.

“Both human and Neanderthal genomes consist of roughly 3 billion characters that code for proteins and regulate gene expression in cells. Identifying the regions that influence appearance is akin to searching for a needle in a haystack.”

Dr. Long and her team took a targeted approach, focusing on a genomic region linked to the Pierre Robin sequence, a condition marked by an unusually small lower jaw.

“Individuals with the Pierre Robin sequence often have significant deletions or rearrangements in this portion of the genome that affect facial development and restrict jaw formation,” Dr. Long explained.

“We hypothesized that minor differences in DNA could produce more nuanced effects on facial structure.”

Upon comparing human and Neanderthal genomes, researchers discovered that in this segment, approximately 3,000 letters long, there are only three one-letter variations between the species.

This DNA region doesn’t code for genes but regulates when and how certain genes are activated, particularly SOX9, which plays a crucial role in facial development.

To confirm that these Neanderthal-specific differences were significant for facial development, scientists needed to demonstrate that the Neanderthal version could activate genes in the appropriate cells at the right developmental stage.

They introduced both Neanderthal and human versions of this region into zebrafish DNA and programmed the cells to emit different colors of fluorescent protein based on the activation of either region.

By monitoring zebrafish embryo development, researchers observed that both the human and Neanderthal regions were active in the cells responsible for forming the lower jaw, with the Neanderthal region showing greater activity.

“It was thrilling when we first noticed the activity of specific cell populations in the developing zebrafish face, particularly near the forming jaw, and even more exhilarating to see how Neanderthal-specific variations could influence activity during development,” said Dr. Long.

“This led us to contemplate the implications of these differences and explore them through experimental means.”

Having established that the Neanderthal sequence was more effective at activating genes, the authors asked whether this enhanced activity, mediated by SOX9, would affect the shape and function of the adult jaw.

To test this idea, they supplied zebrafish embryos with additional SOX9 and found that the cells involved in jaw formation occupied a larger area.

“Our lab aims to further investigate the effects of genetic differences using methods that simulate various aspects of facial development,” Dr. Long remarked.

“We aspire to deepen our understanding of genetic variations in individuals with facial disorders and improve diagnostic processes.”

“This study demonstrates how examining extinct species can enhance our knowledge of how our own DNA contributes to facial diversity, development, and evolution.”

The findings are published in the journal Development.

_____

Kirsty Utley et al. 2025: Neanderthal-derived variants enhance SOX9 enhancer activity in craniofacial progenitor cells, influencing jaw development. Development 152 (21): dev204779; doi: 10.1242/dev.204779

Source: www.sci.news


Consultation Plans for Live Facial Recognition Usage Ahead of Broader Implementation

The new policing minister told Labour’s annual conference that the government will consult on the deployment of live facial recognition (LFR) technology before any broader rollout across the UK.

Sarah Jones, the policing minister, said she would “establish certain parameters” around future government use of the technology.

Advocates argue that the absence of a legal framework allows police to self-regulate technological applications and utilize algorithms in ways that may be biased against ethnic minorities and women.

The Equality and Human Rights Commission has argued that current police use of the technology is unlawful and incompatible with European human rights law.

Speaking at a Tony Blair Institute fringe event in Liverpool, Jones said:

“There is some guidance on how we should apply it, but further clarification is necessary regarding the appropriate contexts for its use and when it should be avoided, alongside establishing a proper framework.

“Currently, there is minimal structure governing its use. We must assess whether what exists is adequate and whether additional measures are required.”

Jones, the MP for Croydon West, noted that the shadow home secretary, Chris Philp, a fellow Croydon MP, has endorsed its use after deployments in the borough led to arrests.

“Chris Philp is very supportive and frequently discusses its benefits, and the results from Croydon demonstrate its effectiveness.

“We need to ensure clarity about what will be beneficial moving forward. If we are to expand its use, what guidelines should we follow for nationwide deployment?” she inquired.

“It’s crucial to engage in this dialogue, as we want to encourage public input and understand how it will be utilized both in Parliament and among the populace.

“This is a topic we will address; it will hold significant importance going forward. Its use began under the previous administration, and we must now decide how to take it forward.”

A civil liberties group recently urged the Metropolitan Police to cease operating LFR cameras following a high court challenge brought by anti-knife campaigner Sean Thompson. Thompson, a Black man from the UK, was wrongly identified by LFR as a suspect and was stopped by police, who demanded his fingerprints.

Jones was asked about concerns over racial bias.

“I believe we can provide reassurance to the public… We have successfully utilized it to detain hundreds of dangerous offenders, but it is imperative to ensure it is being deployed appropriately.”

Source: www.theguardian.com

Expert Rejects Police Claim That Research Shows Live Facial Recognition Is Unbiased

The Metropolitan Police’s claim that its use of live facial recognition is free of bias is not supported by the research the force cites in its defence, according to a leading expert on the technology.

The Met plans its most high-profile deployment of LFR yet this bank holiday weekend at the Notting Hill carnival in west London.

According to the Guardian, the technology will be used at two locations on the approaches to the carnival, and the Met has insisted on pressing ahead despite the Equality and Human Rights Commission’s position that its use of LFR is unlawful.


The assessment comes from Professor Pete Fussey, who led the only independent academic review of the Met’s use of facial recognition, covering its trials in 2018-19, and who now advises law enforcement agencies in the UK and internationally on its application.

The Met contends that it has reformed its use of LFR since research it commissioned from the National Physical Laboratory (NPL) in 2023, claiming the technology is now effectively free from bias. Fussey responded:

“The sensitivity of the system used for LFR can be adjusted. The more sensitive the setting, the more people it flags, but the potential for bias by race, gender, and age also changes. A setting of zero is the most sensitive and one is the least.”

The NPL report identified bias at a sensitivity setting of 0.56, recording seven instances in which people taking part in the test were wrongly flagged as suspects, all of them from ethnic minority backgrounds.

These findings came from a test in which 178,000 images were loaded into the system and about 400 volunteers walked past the cameras roughly 10 times each, giving around 4,000 opportunities for a correct recognition. The volunteers mingled with crowds estimated at more than 130,000 people at four locations in London and one in Cardiff. The tests were carried out in clear weather over 34.5 hours, which Fussey noted was shorter than tests conducted in some other countries where LFR has been evaluated.

From this dataset, the report concluded that no statistically significant bias existed at settings above 0.6, a finding the Met has repeatedly cited to justify its continued use and expansion of LFR.

Fussey said this was insufficient to substantiate the Met’s claims: “Leaders at the Metropolitan Police Service consistently argue that their systems have been independently tested for bias. An examination of that study shows the data is inadequate to support the claims being made.”

“The definitive conclusions publicly proclaimed by the Met rely on an analysis of merely seven false matches from a system scrutinizing the faces of millions of Londoners. Drawing broad conclusions from such a limited sample is statistically weak.”
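Fussey’s point about sample size can be illustrated with a back-of-envelope calculation. Assuming the seven false matches arose from the roughly 130,000 people in the test crowds (the NPL’s exact denominator may differ), a 95% confidence interval on the underlying false-match rate spans roughly a four-fold range, far too wide to support fine-grained comparisons between demographic subgroups.

```python
# Back-of-envelope Wilson confidence interval for a rate estimated from only
# seven false matches. The denominator (~130,000 crowd members) is taken from
# the article; the NPL's exact figures may differ.
import math

def wilson_interval(successes: int, trials: int, z: float = 1.96):
    p = successes / trials
    denom = 1 + z**2 / trials
    centre = (p + z**2 / (2 * trials)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / trials + z**2 / (4 * trials**2))
    return centre - half, centre + half

low, high = wilson_interval(7, 130_000)
print(f"estimated rate: {7/130_000:.2e}, 95% CI: {low:.2e} to {high:.2e}")
# -> roughly 2.6e-05 to 1.1e-04: too wide to compare subgroups reliably
```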

The Met currently operates LFR at a setting of 0.64, at which it says the NPL study produced no erroneous matches.

Fussey said: “Their own research shows that false matches were not assessed at the settings above 0.6 at which they claim there is no bias.”

“Few in the scientific community would say there is sufficient evidence to support these claims drawn from such a limited sample.”

Fussey added: “The research clearly indicates that bias exists within the algorithm, but claims this can be mitigated through appropriate adjustments to the system settings. The problem is that the system has not been thoroughly tested across those settings.”

Lindsey Chiswick, the Met’s director of intelligence, dismissed Fussey’s criticisms: “This is a factual report from a globally renowned institution. The Met’s commentary is grounded in the findings of an independent study,” she explained.


“If you use LFR at a setting of 0.64, as we currently do, there is no statistically significant bias.”

“We sought research to pinpoint where potential bias lies within the algorithm and employed the results to mitigate that risk.”

“The findings show the degree to which the settings can be used to minimize bias in the algorithm, and we consistently operate well above that threshold.”

During the Notting Hill carnival this weekend, warning signs will notify attendees that LFR is in use. The cameras will operate from vans, with the system checking faces against the suspect watchlist.

Authorities believe utilizing the technology at two sites leading to the carnival will act as a deterrent. At the carnival itself, law enforcement is prepared to employ retrospective facial recognition to identify perpetrators of violence and assaults.

Fussey remarked: “Few question the police’s right to deploy technology for public safety, but oversight is crucial, and it must align with human rights standards.”

The Met says that since 2024 LFR has produced a false-positive rate of one in every 33,000 faces scanned. The exact number of faces scanned has not been disclosed, but it is believed to be in the hundreds of thousands.

There were 26 incorrect matches in 2024, with eight reported so far in 2025. The Met said not all of those individuals were apprehended, as decisions on arrest rest with police officers once the computer system produces a match.
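As rough arithmetic, the stated one-in-33,000 false positive rate can be scaled against hypothetical scan volumes (the true figure is undisclosed) to see how it compares with the 26 false matches reported for 2024.

```python
# Quick arithmetic on the Met's stated rate of one false positive per 33,000
# faces scanned. The scan volumes below are hypothetical; the article says the
# true figure is undisclosed but believed to be in the hundreds of thousands.
FALSE_POSITIVE_RATE = 1 / 33_000

for scans in (300_000, 600_000, 900_000):
    expected = scans * FALSE_POSITIVE_RATE
    print(f"{scans:>9,} scans -> ~{expected:.0f} expected false matches")
# For comparison, the article reports 26 incorrect matches in 2024.
```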

Ahead of the carnival, the Met arrested 100 individuals, recalled 21 to prison, and banned 266 from attending. It also reported seizing 11 firearms and more than 40 knives.

Source: www.theguardian.com

Met Chief Dismisses Proposal to Abandon Live Facial Recognition at Notting Hill Carnival

The commissioner of the Metropolitan Police has rejected calls to halt the use of live facial recognition cameras at this weekend’s Notting Hill carnival, despite concerns about racial bias and an ongoing legal challenge.

In a letter, Sir Mark Rowley said the technology would be used “in a non-discriminatory manner” at Europe’s largest street carnival, employing an algorithm that “is not biased.”

The response came after letters from 11 anti-racist and civil liberties organizations, revealed in the Guardian, urged the Met to stop using the technology at the event celebrating African-Caribbean communities.

The organizations include the Runnymede Trust, Liberty, Big Brother Watch, Race on the Agenda, and Human Rights Watch. In a letter to Rowley on Saturday, they said the technology would only “increase concerns about state authority and racial misconduct within your forces.”

Critics argue that the police lack a legal framework, allowing them to “self-regulate” their technological practices, which leads to the deployment of biased algorithms affecting ethnic minorities and women.

Last month, the Met announced plans to deploy specialized cameras at the exits of the two-day event in west London. Held over the August bank holiday weekend, the carnival attracts more than two million attendees annually, making it the world’s second-largest street festival.

In his correspondence with the NGOs and charities, Rowley acknowledged that earlier deployments of the technology at the 2016 and 2017 carnivals had failed to win public trust: the system used then wrongly flagged 102 people as suspects, none of whom were arrested. The Met’s facial recognition system has since been upgraded.

“We have made significant strides since then. The latest version of the algorithm has undergone substantial improvement, with independent testing and validation, and performs to a much higher standard,” said Rowley.

He said the technology would focus on the small minority of individuals involved in serious crimes, such as violence and sexual offenses.

Rowley noted that in 2024 there were 349 arrests made at the event for serious offenses including murder, rape, and possession of weapons.

“These crimes pose a threat to those wanting to enjoy the carnival safely. The use of LFR is part of a broader strategy to identify, disrupt, and prevent the threat posed by this small minority,” he explained.

A civil liberties group urged the Met to cease using LFR cameras last month following a high court challenge brought by anti-knife campaigner Sean Thompson. Thompson, a Black man from the UK, was wrongly identified by LFR technology as a suspect and was stopped by police, who demanded his fingerprints.

Rowley’s letter did not respond directly to Thompson’s claims, but it rejected the assertion that police operate without a legal framework, noting that the Equality Act 2010 obliges public institutions to eliminate discrimination. He also said the use of LFR technology is covered by the European Convention on Human Rights and the Data Protection Act.

In response to Rowley’s letter, Rebecca Vincent, interim director of the civil liberties group Big Brother Watch, said: “Everyone wants to ensure public safety, but transforming the carnival into a police lineup is not the solution.”

Source: www.theguardian.com

Neck and Facial Massage: A Natural Way to Detoxify Your Brain

Magnetic resonance image scan of the human brain

Phanie/Sipa Press/Alamy

A device that massages the face and neck may enhance the brain’s waste-removal system, potentially alleviating symptoms of conditions such as Alzheimer’s disease.

Cerebrospinal fluid (CSF) surrounds and cushions the brain before draining away through a network of delicate tubes known as lymphatic vessels. Research in mice indicates that this fluid clears waste produced by brain cells, including proteins linked to diseases such as Alzheimer’s and Parkinson’s, among them beta-amyloid.

This has prompted researchers to ask whether increasing CSF flow could promote brain health. However, the lymphatic vessels identified previously lie deep within the neck and are difficult to access, notes Gou Young Koh at the Korea Advanced Institute of Science and Technology.

Recently, Koh and his team identified a network of lymphatic vessels lying just five millimeters beneath the skin of the face and neck in mice and monkeys. They made the discovery by injecting fluorescent dyes that label the CSF and imaging the animals under anesthesia. “We used a different kind of anesthesia from that applied in earlier studies. The previous anesthetic prevented visualization of the vessels close to the skin,” Koh explains.

To find out whether massaging these vessels could boost CSF flow, the researchers built a device with small rods attached to a 1 cm cotton ball, and used it to gently stroke down the face and neck of mice, both aged animals around two years old and younger ones. “A gentle facial and neck massage can compress the vessels and enhance CSF flow,” Koh says.

After 30 minutes of massage, CSF flow in the mice’s brains was nearly three times higher than before the massage. The treatment also appeared to reverse age-related declines in CSF flow. “After stimulation, the CSF flow in older mice appeared comparable to that of younger mice [that hadn’t received the massage],” Koh elaborates.

In unpublished follow-up work, the team observed similar results in monkeys. They have also identified these lymphatic vessels in human cadavers, suggesting that massage could stimulate CSF flow in people too, says Koh.

However, because of anatomical differences between mice, monkeys, and humans, further studies are needed to confirm this, says Vesa Kiviniemi at the University of Oulu in Finland. “It’s a slightly different scenario.”

It also remains uncertain whether increased CSF flow can genuinely slow brain aging or offer protection against neurodegenerative conditions such as Alzheimer’s, says Steven Proulx at the University of Bern in Switzerland. Koh’s team plans to investigate this in mice that show Alzheimer’s-like features.


Source: www.newscientist.com

Police Expansion of Live Facial Recognition Cameras: A Shift Towards ‘General’ Surveillance

Police expect live facial recognition cameras to become “prevalent” across England and Wales, according to internal documents, as figures reveal that nearly 5 million faces were scanned with the technology last year.

A joint investigation by the Guardian and Liberty Investigates shows how rapidly this technology is being integrated into UK policing.

The government is making it easier for police to access a wide range of image databases, including passport and immigration photos, for retrospective facial recognition searches, alongside significant financial investment in new hardware.

Live facial recognition entails real-time identification of faces captured by surveillance cameras, compared against a police watch list.

Conversely, retrospective facial recognition software allows police to match archived images from databases with those recorded on CCTV or similar systems.

The technology is expected to become widespread in urban areas and transport hubs across England and Wales, according to funding documents produced by South Wales police and released by the Metropolitan police under the Freedom of Information Act.

The first fixed live facial recognition cameras are set to be trialled this summer in Croydon, south London.

This expansion comes despite the absence of any mention of facial recognition in an act of parliament.

Critics contend that police are being allowed to “self-regulate” their use of the technology, and earlier algorithms have disproportionately misidentified people from Black communities.

Following a 2020 court of appeal ruling that South Wales police’s use of live facial recognition was unlawful, the College of Policing issued guidance emphasizing that “thresholds must be carefully set to enhance the likelihood of accurate alerts while keeping false alert rates within acceptable limits.”

There remains no statutory framework directing the standards or technology applied in this context.

Earlier this month, the policing minister, Diana Johnson, told parliament that “we must evaluate whether a tailored legislative framework is necessary to govern the deployment of live facial recognition technology for law enforcement,” but further details from the Home Office are still awaited.

Facial recognition cameras have been tested in London and South Wales since 2016; however, the pace at which police have adopted this technology has surged over the past year.

A survey conducted by the Guardian and Liberty revealed:

  • Last year, police scanned nearly 4.7 million faces using live facial recognition cameras, more than double the 2023 figure. Data indicates that live facial recognition vans were deployed at least 256 times in 2024.

  • A mobile unit of 10 live facial recognition vans can be dispatched anywhere in the UK within days to bolster national capability. Eight police forces have deployed the technology so far, and the Met has four vans of its own.

  • Officials have explored creating a “safety zone” across London’s West End using a network of fixed live facial recognition cameras. The Met says this remains an option.

  • Police forces nearly doubled the number of retrospective facial recognition searches of the Police National Database (PND), from 138,720 in 2023 to 252,798 last year. The PND contains custody mugshots, including many held unlawfully of people never charged with or convicted of an offense.

  • Over the past two years, more than 1,000 facial recognition searches have been run against the UK passport database, and officers ran 110 searches against the Home Office immigration database last year. Authorities concluded that using the passport database for facial recognition “presents no risk.”

  • The Home Office is working with police to develop a new national facial recognition system, known as the strategic facial matcher, which will enable searches across multiple databases, including custody images and immigration records.

Lindsey Chiswick, the Met’s director of intelligence and the National Police Chiefs’ Council lead for facial recognition, said survey findings show that a majority of London residents support the police’s use of advanced technologies such as facial recognition cameras.

Recently, registered sex offender David Cheneler, a 73-year-old from Lewisham, was sentenced to two years in prison after live facial recognition technology caught him alone with a six-year-old girl. He had previously served nine years for 21 offenses involving children.


In 2024, the Met made 587 arrests with the support of live facial recognition technology, 424 of which have led to formal charges.

Those arrested included 58 registered sex offenders held for serious breaches of their conditions, 38 of whom were subsequently charged.

Chiswick said: “Resources and time are limited, demand is high, and we see criminals exploiting technology on an expansive scale.

“There’s a chance for law enforcement to evolve. Discussions about leveraging AI are abundant, but we must embrace the opportunities presented by technology and data.”

Chiswick emphasized that the Met’s approach is to “proceed cautiously and evaluate at each phase,” while noting that “there may be advantages to some form of framework or statutory guidance.”

The Met says it operates its facial recognition cameras at settings at which testing found no statistically significant gender or ethnic bias in misidentifications.

Chiswick said: “I am not going to run a biased algorithm in London. Every deployment matters. People rightly ask whether there is a problem with the artificial intelligence.

“When you choose which algorithm to buy, you need to know what training data was used and where the technology comes from, test it thoroughly, and operate it within a specific, tested setting.”

The Home Office did not provide a comment when approached.

Source: www.theguardian.com

Face ID: A Useful Resource or a Source of Concern? The Subtle Integration of Facial Recognition in Law Enforcement

The future is arriving ahead of schedule in Croydon. It may not look like the cutting edge of British policing: North End is a pedestrianized high street filled with the usual pawn shops, fast-food restaurants, and mid-market clothing chains. Yet this area is expected to host some of the UK’s first permanent fixed facial recognition cameras.

Digital images of passersby will be captured discreetly and processed to extract biometric data, including facial measurements. This data will be rapidly compared against a watchlist by an artificial intelligence system, and a match will trigger an alert that may lead to an arrest.

According to the south London borough’s latest violence reduction strategy, North End and its adjacent streets are “major crime hotspots.” They are not, however, among the most dangerous places in the capital.

Croydon’s crime rate is the 20th highest among the 32 London boroughs, excluding the City of London. The plan to trial permanent cameras later this summer is not an emergency measure; instead, North End and nearby London Road may simply be where this kind of surveillance becomes routine.

When approached about the plans, most shopkeepers and visitors on North End were unaware of the police’s intentions or of the underlying technology.

To many, the cameras appear to be just another piece of street furniture alongside signs promoting safe cycling. Some express concern, while others point to surveys suggesting a public wearied by rising crime is broadly supportive.

Police began experimenting with facial recognition cameras in England and Wales in 2016. Documents released under the Freedom of Information Act (FOI) and police statistics shared with the Guardian reveal substantial growth in usage over the past year, as the technology evolves from a niche tool into a routine component of police strategy.

Last year, police scanned almost 4.7 million faces using live facial recognition cameras, more than double the number scanned in 2023. Live facial recognition vans were deployed at least 256 times in 2024, up from 63 the previous year.

There’s speculation that mobile units of 10 live facial recognition vans may operate throughout the country.

Meanwhile, civil servants are working with law enforcement to develop a new national facial recognition system called the strategic facial matcher. The platform will enable searches across multiple databases, including custody images and immigration files.

“The implementation of this technology could become a common sight in city centres and transit hubs across England and Wales,” states one funding document submitted by South Wales police to the Home Office and released by the Metropolitan police under FOI.

Activists warn that this technology may disrupt everyday public life by subjecting individuals to impromptu identity checks facilitated by extensive facial recognition systems. Advocates of the technology acknowledge its risks but emphasize its importance for safety.

Recently, David Cheneler, a 73-year-old registered sex offender from Lewisham who had served nine years for 21 offenses, was sentenced to two years in prison for breaching his probation terms.

Officers were alerted by live facial recognition cameras when Cheneler was seen walking alone with a six-year-old girl.

“He was on the watchlist because of the conditions he was subject to,” said Lindsey Chiswick, the Met’s director of intelligence and the National Police Chiefs’ Council lead for facial recognition.

“He had formed a relationship with the girl’s mother over time and had begun picking the girl up from school. Who knows what might have happened that day. This shows how the technology let police step in. Without facial recognition, spotting him would have been a real challenge.”

Many see this as a compelling argument, but critics raise concerns about the unanticipated ramifications as law enforcement adopts technology without legislative guidance.

Madeline Stone of the NGO Big Brother Watch, who has observed mobile camera deployments, reported witnessing schoolchildren in uniform being misidentified and subjected to “long, humiliating, and unnecessary police stops,” during which they were compelled to verify their identities and provide fingerprints.

In those instances, the people affected were young Black boys, who were left frightened and distressed, she said.


“Accuracy depends on the threshold the police choose to set,” Stone added. “They might not want to use the higher settings in certain environments, and there is no legal requirement for them to do so. The idea that police can write their own rules for how they use this technology is truly alarming.”

A judicial review was brought by Londoner Sean Thompson, with backing from Big Brother Watch, after he was wrongly identified by the technology as a person of interest and detained for 30 minutes while returning from a volunteer shift with the anti-knife group Street Fathers.

Dr. Daragh Murray, who was commissioned to carry out an independent evaluation of the Met’s trials in 2019, highlights the potential “chilling” effect the technology could have on society, arguing that the debate must look beyond how the technology is implemented.

“It’s akin to a police officer following you around, recording your interactions, where you go, how often, and for how long,” he remarked. “I believe most people would be uncomfortable with that reality. Democracy thrives on dissent and discourse; if surveillance stifles that, it risks entrenching the status quo and limiting future opportunities.”

Live facial recognition is already being used to apprehend people for traffic violations, growing cannabis, and breaching community orders. Is that proportionate?

Fraser Sampson, the biometrics and surveillance camera commissioner for England and Wales until the post was abolished in October 2023, now serves as a non-executive director of Facewatch, the leading UK supplier of facial recognition systems used by retailers to prevent shoplifting.

While he acknowledges the technology’s potential, he is concerned that independent oversight of state surveillance has not kept pace with its deployment.

Sampson commented: “There’s an abundance of information about what the technology can do, but in practical terms, how it is applied, why it is used, and what avenues exist for challenge or complaint, that clarity seems lacking.”

Chiswick said she understood the concerns while recognizing the potential advantages of regulatory measures. The Met is taking cautious “small strides” that are continually reviewed, she said. With limited resources, law enforcement needs to adapt and capitalize on the possibilities brought by AI. The force is aware of the potential “chilling effect” on society and has made clear that cameras will not be deployed at protests.

“Will this become commonplace? I can’t say,” Chiswick remarked. “We need to approach that assumption with caution. There are numerous possible scenarios. Somewhere like the West End? It’s conceivable that, instead of the static trial we’re conducting in Croydon, we could use it there. But that’s not our current plan.”

She added: “I believe the use of technology, data, and AI in policing will continue to grow in the coming years because, personally, I think that’s how we improve what we do.”

Source: www.theguardian.com

Ancient hominins discovered in Spain with facial bones dating back 1.1 million years

Who the first inhabitants of Western Europe were, what they looked like, and where they lived are among the open questions in the study of Eurasian settlement during the Early Pleistocene. Evidence of these ancient humans in Western Europe is scarce and largely limited to the Iberian Peninsula. Now, paleontologists have discovered a fragment of a hominin midface at the Sima del Elefante site in the Sierra de Atapuerca, Spain. The fossil, dated to between 1.4 million and 1.1 million years ago, represents the earliest human face from Western Europe yet identified.

Archaeological excavations at the Sima del Elefante site in the Sierra de Atapuerca, Spain. Image credit: Maria D. Guillén / IPHES-CERCA.

“It is suggested that Eurasia was first settled by hominins at least 1.8 million years ago,” said Dr. Rosa Huguet of the Institut Català de Paleoecologia Humana i Evolució Social, Universitat Rovira i Virgili, and the Museo Nacional de Ciencias Naturales, and colleagues.

“Evidence of early human settlements in Western Europe is limited to highly fragmented fossil samples from the Iberian Peninsula, with few clues as to the appearance and classification of these human beings.”

“Fossils from a Spanish site dated to about 850,000 years ago belong to Homo antecessor, an early human species with a slender midface that resembles that of modern humans.”

“In 2007, a hominin jawbone (ATE9-1), dated to between 1.2 and 1.1 million years ago, was found at the Sima del Elefante site in northern Spain, but it could not be definitively assigned to Homo antecessor.”

In the new study, the authors examined the fossil remains of a hominin midface from the Sima del Elefante site.

The fragment, labeled ATE7-1, consists of a substantial portion of the maxilla and zygomatic bone from the left side of the face of an adult individual.

Using both physical reconstruction and 3D imaging techniques, the researchers pieced together the fossil fragments, which are estimated to be between 1.4 million and 1.1 million years old.

They also recovered additional archaeological material at the site: stone tools and the remains of butchered animals.

“These practices demonstrate that the first Europeans had a close understanding of the animal resources available to them and knew how to exploit them systematically,” Dr. Huguet said.

According to the scientists, the ATE7-1 fossil does not display the “modern” midface features seen in Homo antecessor fossils, but it does show some similarities to Homo erectus.

They tentatively assigned the fossil to Homo aff. erectus, a designation that indicates an affinity with Homo erectus pending further evidence.

The finding suggests that at least two Homo species lived in Western Europe during the Early Pleistocene: Homo aff. erectus and, later, Homo antecessor.

“The fossil was assigned to Homo aff. erectus because the evidence is still insufficient for a definitive classification,” said Dr. María Martinón-Torres, a researcher at the Centro Nacional de Investigación sobre la Evolución Humana and University College London.

“This designation recognizes ATE7-1’s affinities with Homo erectus while leaving open the possibility that it belongs to another species.”

“Our findings demonstrate at least two distinct human dispersals into Western Europe during the Early Pleistocene, providing fascinating insights into the evolution of the genus Homo.”

“Although the fragment is small, this pioneering analysis of the facial remains greatly enriches our understanding of the origins and population dynamics of the earliest Europeans on the continent.”

“More research and more fossil samples are needed to investigate the relationships between these populations and to further refine their classification,” the researchers concluded.

The paper was published this month in the journal Nature.

____

R. Huguet et al. The earliest human face of Western Europe. Nature, published online March 12, 2025; doi: 10.1038/s41586-025-08681-0

Source: www.sci.news

Can a person’s name influence their facial features?

First names are social tags that are attached to us early in life. Previous studies have shown that an individual's facial appearance is indicative of their name. A new study explores the origins of this face-name matching effect – whether names are given based on innate facial features or whether an individual's facial appearance changes to match their name over time. Findings using both humans and machine learning algorithms show that while adults show a match between facial appearance and name, this pattern is not seen in children or in children's faces digitally aged to an adult appearance.

Zwebner et al. investigated the possibility of a self-fulfilling prophecy effect, whereby an individual’s facial appearance over time comes to resemble the social stereotype associated with their name. Image credit: Zwebner et al., doi: 10.1073/pnas.2405334121.

“George Orwell famously said that at 50, everyone has the face he deserves,” said Reichman University researcher Dr. Yonat Zwebner and colleagues.

“Research supports Orwell's observations and suggests that changes in facial appearance over the years may be influenced by a person's personality and behaviour.”

“Our current study aims to explicitly test the developmental aspect of facial appearance, focusing on social processes, by taking advantage of a recently identified phenomenon: the face-name congruency effect, which suggests that a name can be manifested in the appearance of a face.”

In the study, the authors asked 9- to 10-year-old children and adults to match people's faces with names.

The findings revealed that both children and adults correctly matched adult faces to their corresponding names at rates well above chance.

However, when it came to children's faces and names, participants were unable to make accurate associations.

In another part of the study, a large database of images of human faces was fed into the machine learning system.

The computer recognized that facial representations of adults with the same name were significantly more similar to each other than to facial representations of adults with different names.

Conversely, no significant similarities were found when comparing children with the same name to children with different names.
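The comparison described above amounts to measuring whether pairwise similarity between face representations is higher within a name group than across name groups. The following sketch uses random vectors and placeholder names purely to show the shape of that analysis; it does not reproduce the study’s data or its actual face representations.

```python
# Sketch of the comparison: are face representations of adults who share a
# first name more similar to one another than to those with different names?
# Random vectors stand in for learned face representations; the names are
# placeholders. The study found higher within-name similarity for adult faces.
import itertools
import numpy as np

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def within_name_similarity(groups):
    sims = [cosine(a, b)
            for faces in groups.values()
            for a, b in itertools.combinations(faces, 2)]
    return sum(sims) / len(sims)

def between_name_similarity(groups):
    sims = [cosine(a, b)
            for (_, g1), (_, g2) in itertools.combinations(groups.items(), 2)
            for a in g1 for b in g2]
    return sum(sims) / len(sims)

rng = np.random.default_rng(1)
groups = {name: [rng.normal(size=64) for _ in range(15)]
          for name in ("Dan", "Josh", "Noa")}

print(f"within-name:  {within_name_similarity(groups):.3f}")
print(f"between-name: {between_name_similarity(groups):.3f}")
# With random vectors the two values are indistinguishable; with real adult
# faces the within-name value was reliably higher.
```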

The researchers concluded that the similarities between people's faces and names are the result of a self-fulfilling prophecy.

Facial appearance changes over time to conform to the social stereotypes associated with names.

These stereotypes can form in a variety of ways, such as when a name is associated with a famous person or because of the connotations that biblical names have.

“Our study highlights the broader importance of this surprising effect – the profound influence of social expectations,” Dr Zwebner said.

“We have demonstrated that social construction, or structuring, does in fact exist, something that has been almost impossible to verify empirically until now.”

“Social constructs are so powerful that they can affect how people look.”

“These findings may suggest the extent to which other personal factors, such as gender and ethnicity, which may be even more significant than a name, shape how a person develops as they grow up.”

The results are published this month in the Proceedings of the National Academy of Sciences.

_____

Yonat Zwebner et al. 2024. Does a name shape the appearance of a face? PNAS 121 (30): e2405334121; doi: 10.1073/pnas.2405334121

Source: www.sci.news

Transforming Facial Appearance: The Impact of ‘Looksmaxxing’ on Men’s Features

For James, it started with muscles. At about 16, he was self-conscious about his body and worried he wasn’t strong enough to attract girls. He went to bodybuilding forums and threw himself into his workouts. He doesn’t remember exactly when it happened, but at some point trolls started invading the forums: visitors from other online communities with a different focus.

“Their whole vibe was pretty mean,” says James, who did not want to give his real name. “They would take pictures of their great physiques that people posted and say, 'You guys forgot to work out your face!'”

Despite the meanness, James was flooded with curiosity and followed them to forums focused primarily on facial aesthetics. He discovered a new world, populated mostly by young men and teenage boys, scouring each other’s photos for possible flaws and fixes.

The forums were filled with brutal judgments, and they gave James a compelling new outlet for his anxiety. “I was learning about issues I wasn’t even aware of,” he says. “I had a short face, a short chin, a nose that was too wide, eyes that were too far apart, and a hairline that was too high. A lot of these things you don’t notice until someone points them out to you. Then you can’t stop seeing them.”

James was hooked on looksmaxxing, an online movement for people who want to “maximise” the attractiveness of their faces. He began learning the strange jargon members use to compare their features: canthal tilt (the angle of the eyes), mewing (a tongue posture said to improve the shape of the jaw). “The ultimate goal is to improve your SMV,” he says. In other words, your sexual market value.




Kareem Shami, who promotes non-surgical “softmaxxing”, pictured in 2020 (left) and in 2023.

Looksmaxxing has been around for at least a decade, but in recent months it has exploded from obscure forums and Reddit pages onto mainstream social media, especially TikTok. An incredibly chiseled chin, pouty lips, and cheekbones as high as the Egyptian pyramids are highly prized, along with “hunter” eyes: a positive canthal tilt, with the outer corners of the eyes sitting slightly higher than the inner corners.

I felt like an outcast and it triggered something in me.

Kareem Shami

“The majority of the groups we work with are now looksmaxxing-conscious,” says Mike Nicholson, a former teacher who runs a schools workshop programme called Progressive Masculinity. He was speaking the day after a report by researchers from University College London and the University of Kent found that TikTok’s algorithm amplifies misogynistic content and helps normalise it in playgrounds. (In response, TikTok said it removes misogynistic content, which it bans, and questioned the report’s methodology.)


“We approach this issue from a very sympathetic perspective,” Nicholson added. “But the world that these young people and boys live in is a world that is increasing their insecurities and leading them down a path that can end in ‘incel’ ideology if they are not careful.”

James, now in his 20s and working in finance in the UK, started participating in the forums around 2015, when they were still niche. He began with “softmaxxing”: tweaking his hair styling, skincare regimen, diet, and exercise routine. But as the site held up an increasingly harsh mirror, he began exploring more extreme fixes known as “hardmaxxing.”

Source: www.theguardian.com

Wrongful Imprisonment Lawsuit Alleges Use of Facial Recognition Technology Following Sunglass Hut Robbery

A 61-year-old man is suing Macy’s and Sunglass Hut’s parent company, alleging that the store’s use of a facial recognition system misidentified him as the perpetrator of an armed robbery, leading to his false arrest. While in prison, he was beaten and raped, according to the complaint.

Harvey Eugene Murphy Jr. was arrested and charged with robbing a Houston-area Sunglass Hut of thousands of dollars worth of merchandise in January 2022, but his lawyers say he was living in California at the time of the robbery. According to his lawyer, he was arrested on October 20, 2023.

According to Murphy’s complaint, employees at EssilorLuxottica, Sunglass Hut’s parent company, worked with its retail partner Macy’s to use facial recognition software to identify Murphy as the robber. The images run through the facial recognition system came from a low-quality camera, the complaint says. While Houston police were investigating the armed robbery, an EssilorLuxottica employee called them and said the technology had identified Murphy as one of the two robbers, so the investigation could be closed. The employee also said the system indicated that Murphy had committed two other robberies, according to the complaint.

When Murphy returned to Texas from California, he went to the Department of Motor Vehicles (DMV) to renew his license. Murphy told the Guardian that within minutes of identifying himself to a DMV employee, he was approached by a police officer and informed that there was a warrant out for his arrest on suspicion of aggravated robbery. Murphy said he was given no details about the alleged crime other than the date on which the robbery occurred; at the time, he had been more than a thousand miles away in Sacramento, California.

“I almost thought it was a joke,” Murphy said.

Still, he was arrested and taken to the local county jail, where he was held for 10 days before being transferred to the Harris County jail for processing.

After several days in Harris County, his alibi was confirmed by both a public defender and a prosecutor, and the charges were ultimately dropped, according to the complaint.

Murphy has never been convicted of a crime. Nevertheless, he says the detention left deep scars. He alleges that, hours before his release, he was brutally beaten and gang-raped by three other men inside the jail, and that he was threatened with death when he tried to alert jail staff. After the alleged attack, Murphy remained in the same cell as his attackers until his release.

“It was kind of scary,” Murphy said. “My anxiety is so high, I’m shaking all the time. And I just stood up in my bunk and faced the wall, just praying that something would happen and get me out of the tank.”

“This attack left him with permanent scars that he will have to live with for the rest of his life,” the complaint states. “All of this happened to Murphy because the defendants relied on facial recognition technology that is known to be error-prone and flawed.”

Murphy did not realize that facial recognition technology had been used against him until two weeks ago, when he began working with his attorney, Daniel Dutko.

Dutko said he discovered in police documents that Sunglass Hut employees shared camera footage with Macy’s, which was used to identify Murphy, and that Macy’s and Sunglass Hut then contacted police together. Although Macy’s has retail partnerships with the eyewear brand at several of its stores, the Sunglass Hut that was robbed is a standalone store, so Macy’s itself was not involved in the robbery, he said.

“We’re very comfortable saying that facial recognition software is the only possible explanation, and that’s the only reason [Sunglass Hut] would have gone to Macy’s to identify him,” Dutko said.


Murphy’s case marks the seventh known instance of a false arrest involving facial recognition in the United States, further highlighting flaws in a technology that is already widely used by police departments and retail stores. Until now, every publicly known case of a false arrest based on facial recognition had involved a Black victim; Murphy’s is the first known case in which a failure of the technology led to the wrongful arrest of a white man. Just last month, Rite Aid settled with the Federal Trade Commission over its use of a facial recognition system that had misidentified Black, Latino, and Asian customers as people “likely to be involved” in shoplifting; as part of the settlement, the pharmacy chain was banned from using facial recognition in its stores for five years. And in the summer of 2023, a woman named Porcha Woodruff was arrested on suspicion of carjacking after a facial recognition system misidentified her.

Macy’s has previously been sued over its use of facial recognition technology. In a 2020 lawsuit, a Chicago woman accused the company of violating Illinois’s biometric privacy law by working with the facial recognition provider Clearview AI without her or other customers’ consent.

Nathan Freed Wessler, deputy director of the American Civil Liberties Union’s Speech, Privacy, and Technology Project, said the case is another example of the “extreme dangers of facial recognition technology.”

“We have seen case after case where police reflexively relied on unreliable facial recognition results, allowing the technology to produce false matches and corrupt witness identification procedures,” Wessler said in a statement. “As the facts alleged in this case demonstrate, the consequences of a wrongful arrest are dire. Lawmakers must stop law enforcement and companies from dangerously relying on facial recognition results to put people behind bars.”

Murphy is seeking $10 million in damages.

Macy’s said it does not comment on pending litigation. EssilorLuxottica did not immediately respond to the Guardian’s request for comment.

Source: www.theguardian.com

Facial Reconstruction of Homo longi

A Brazilian anthropologist has reconstructed the face of the ancient human species Homo longi from a well-preserved skull discovered in northeastern China in the 1930s.



Portrait of Homo longi. Image credit: Cicero Moraes, doi: 10.6084/m9.figshare.24648591.

Homo longi is an extinct species of the genus Homo that lived in Asia during the Middle Pleistocene.

The species, also known as Dragon Man, was identified from a nearly complete skull dating back 148,000 years.

This fossil was discovered in 1933 when a bridge was being built over the Songhua River in Harbin, Heilongjiang Province, China.

Because the recovery was unsystematic and so much time has since passed, information about the exact location and the fossil-bearing layer has been lost.

The Harbin skull is huge, larger than that of any other archaic human. It is also relatively long and low, lacking the globular braincase of modern humans.

It has large, almost square eye sockets, thick brow ridges, a wide mouth, and oversized teeth.

Its intracranial volume is estimated at 1,420 ml, within the range of Homo sapiens and Neanderthals and larger than that of earlier Homo species.

This specimen, also known as the Harbin skull, probably represents a man less than 50 years old.



Front view of the Homo longi portrait. Image credit: Cicero Moraes, doi: 10.6084/m9.figshare.24648591.

“Uranium-series dating gives the fossil a reliable age of 148,000 years, and its remarkable dimensions make the skull larger than all known archaic hominin skulls,” said Cicero Moraes, a researcher at Ortogonline Treinamento em Desenvolvimento Profissional e Consultoria LTDA.

To approximate the face of Homo longi, Moraes used not only the Harbin skull but also computed tomography (CT) scans of Homo erectus, Homo sapiens, and Pan troglodytes.

“Forensic facial reconstruction, or forensic facial approximation, is an auxiliary identification technique that reconstructs or approximates the face over a skull, and it is used when there is little other information to identify an individual from the remains,” Moraes explained.



Side view of the Homo longi portrait. Image credit: Cicero Moraes, doi: 10.6084/m9.figshare.24648591.

“To be able to approximate the face, the jaw and missing teeth had to be reconstructed,” he added.

“For this purpose, the teeth of a complete Homo erectus skull, with some structural modifications and precise adjustments in the dental region, were used.”

“The remaining molars served as parameters for aligning other teeth as well as the alveolus in the upper jaw.”

“Thanks to the wide range of information obtained in previous studies, it was possible to reconstruct the fossil and the face of Homo longi, in addition to undertaking some measurements and comparisons in the context of other species,” Moraes concluded.

The results were published in the online journal OrtogOnLineMag.

_____

Cicero Moraes. 2023. Facial approximation of Homo longi (Harbin, China, c. 148,000 BP). OrtogOnLineMag 4(2); doi: 10.6084/m9.figshare.24648591

Source: www.sci.news

Rite Aid Banned from Using Facial Recognition Software After Falsely Identifying Shoplifters

Rite Aid, the US drugstore giant, has been banned from using facial recognition software for five years after the Federal Trade Commission (FTC) found that its “reckless use of facial surveillance systems” humiliated customers and “compromised confidential information.”

The FTC’s order, which requires approval from the US bankruptcy court because Rite Aid filed for Chapter 11 bankruptcy protection in October, directs Rite Aid to delete the images it collected as part of its facial recognition rollout, along with any products built from those images. The company must also implement a robust data security program to protect the personal data it collects.

A 2020 Reuters report detailed how the drugstore chain secretly installed facial recognition systems in about 200 US stores over an eight-year period starting in 2012, using “primarily low-income, non-white neighborhoods” as testbeds for the technology.

With the FTC increasingly focused on the misuse of biometric surveillance, Rite Aid fell squarely into the agency’s sights. Among the allegations: Rite Aid, working with two contracted companies, created a “watchlist database” containing images of customers the company said had engaged in criminal activity at one of its stores. These images, often of low quality, were captured from CCTV or employees’ mobile phone cameras.

When a customer who appeared to match an existing image in the database entered a store, employees received an automated alert instructing them to take action; in most cases that instruction was to “approach and identify,” meaning verifying the customer’s identity and asking them to leave. According to the FTC, these “matches” were often false positives, leading employees to falsely accuse customers of wrongdoing and causing “embarrassment, harassment, and other harm.”
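The FTC’s account describes the system only at the level of “a database match triggers an automated alert.” As a rough illustration of why such a pipeline produces false positives, here is a minimal, hypothetical sketch of a generic threshold-based watchlist matcher; the similarity threshold and function names are assumptions for illustration only, not details of Rite Aid’s actual system.

```python
from __future__ import annotations

import numpy as np

# Hypothetical threshold-based watchlist matcher. The threshold value and
# names below are illustrative assumptions, not a description of any
# vendor's real system.

MATCH_THRESHOLD = 0.75  # lower values raise alert volume and false positives


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, in the range [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def best_watchlist_match(probe: np.ndarray, watchlist: dict[str, np.ndarray]) -> tuple[str, float]:
    """Return the (entry_id, score) of the closest watchlist embedding."""
    return max(
        ((entry_id, cosine_similarity(probe, emb)) for entry_id, emb in watchlist.items()),
        key=lambda pair: pair[1],
    )


def should_alert(probe: np.ndarray, watchlist: dict[str, np.ndarray]) -> bool:
    """Fire an alert when the best watchlist score clears the threshold.

    Low-quality enrollment images (blurry CCTV frames, phone photos) yield
    noisy embeddings, so scores hovering near the threshold are unreliable
    and the false positive rate climbs.
    """
    if not watchlist:
        return False
    _, score = best_watchlist_match(probe, watchlist)
    return score >= MATCH_THRESHOLD
```

The threshold is the central trade-off in any system of this kind: lowering it catches more genuine matches but multiplies false alerts, and the effect is amplified when the watchlist is built from blurry CCTV stills rather than controlled enrollment photos.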

“Acting on false positive alerts, employees followed consumers around the store, searched them, ordered them to leave, called the police to confront or remove them, and publicly accused them, sometimes in front of friends and family, of shoplifting or other wrongdoing,” the complaint says.

Additionally, the FTC said Rite Aid did not tell customers that facial recognition technology was in use and specifically instructed employees not to reveal this information to customers.

Face-off

Facial recognition software has emerged as one of the most controversial aspects of the AI-powered surveillance era. In recent years, cities have issued broad bans on the technology while politicians have fought to regulate how police use it. Meanwhile, companies like Clearview AI have been hit with lawsuits and fines around the world for massive data privacy violations involving facial recognition technology.

The FTC’s latest findings on Rite Aid also shed light on the biases inherent in AI systems. For example, the FTC says Rite Aid failed to mitigate risks to certain consumers because of their race: the technology was “more likely to generate false positives in stores located in predominantly Black and Asian communities than in predominantly white communities,” the agency noted.

Additionally, the FTC said Rite Aid failed to test or measure the accuracy of its facial recognition system before or after its implementation.

In a press release, Rite Aid said it was “pleased to reach an agreement with the FTC” but disagreed with the core allegations.

“The allegations relate to a pilot program for facial recognition technology that we implemented in a limited number of stores,” Rite Aid said in a statement. “Rite Aid stopped using the technology at this small group of stores more than three years ago, before the FTC’s investigation into the company’s use of the technology began.”

Source: techcrunch.com