Home Office Acknowledges Issues with Facial Recognition Technology for Black and Asian Individuals

Ministers are under pressure to implement more robust safeguards for facial recognition technology, as the Home Office has acknowledged that it may mistakenly identify Black and Asian individuals more frequently than white people in certain contexts.

Recent tests conducted by the National Physical Laboratory (NPL) on how this technology functions within police national databases revealed that “some demographic groups are likely to be incorrectly included in search results,” according to the Home Office.

Police and crime commissioners stated that the release of the NPL’s results “reveals concerning underlying bias” and urged caution regarding plans for a nationwide rollout.

These findings were made public on Thursday, shortly after Police Minister Sarah Jones characterized the technology as “the most significant advancement since DNA matching.”

Facial recognition technology analyzes individuals’ faces and cross-references the images against a watchlist of known or wanted criminals. It can be employed to scrutinize live footage of people passing in front of cameras, match faces with wanted persons, or assist police in targeting individuals on surveillance.

Images of suspects can be compared against police, passport, or immigration databases to identify them and review their backgrounds.
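To make the matching step concrete, the following is a minimal illustrative sketch, not the code of any police system: it assumes faces have already been converted into numerical embedding vectors by a separate face-recognition model, and compares a probe image against a watchlist using cosine similarity and an adjustable threshold. The names, the random data, and the 0.6 threshold are assumptions for illustration only.

```python
# Hypothetical sketch of watchlist matching; not any force's real system.
# Assumes face images have already been converted to fixed-length embedding
# vectors by a separate face-recognition model (not shown here).
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(probe, watchlist, threshold=0.6):
    """Return (best_id, score) if the best score clears the threshold, else None.

    `threshold` plays the role of the 'setting' discussed in these articles:
    lowering it flags more people, at the cost of more false positives.
    """
    best_id, best_score = None, -1.0
    for person_id, reference in watchlist.items():
        score = cosine_similarity(probe, reference)
        if score > best_score:
            best_id, best_score = person_id, score
    return (best_id, best_score) if best_score >= threshold else None

# Toy usage with random vectors standing in for real embeddings.
rng = np.random.default_rng(0)
watchlist = {f"suspect_{i}": rng.normal(size=128) for i in range(5)}
probe = watchlist["suspect_3"] + rng.normal(scale=0.1, size=128)  # noisy re-capture
print(match_against_watchlist(probe, watchlist))
```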

Analysts who evaluated the Police National Database’s retrospective facial recognition tool at lower settings discovered that “white subjects exhibited a lower false positive identification rate (FPIR) (0.04%) compared to Asian subjects (4.0%) and Black subjects (5.5%).”

Further testing revealed that Black women experienced notably high false positives. “The FPIR for Black male subjects (0.4%) is lower than that for Black female subjects (9.9%),” the report detailed.
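For readers unfamiliar with the metric, FPIR is simply the share of searches of people who are not on the watchlist that nonetheless return a match. A minimal illustrative calculation follows; the counts are invented for the example and are not taken from the NPL report.

```python
# Illustrative FPIR calculation; the counts below are made up for the example.
def fpir(false_positive_searches: int, non_mated_searches: int) -> float:
    """False positive identification rate: the fraction of searches of people
    who are NOT on the watchlist that still return at least one candidate."""
    return false_positive_searches / non_mated_searches

# e.g. 55 wrong matches out of 1,000 searches of people not on the watchlist
print(f"{fpir(55, 1000):.1%}")  # -> 5.5%, the rate reported for Black subjects
```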

The Association of Police and Crime Commissioners said the findings reflect underlying bias. “This indicates that, in certain scenarios, Black and Asian individuals are more prone to incorrect matches than their white counterparts. Although the terminology is technical, it is evident that this technology is being integrated into police operations without adequate safeguards,” the statement noted.

The statement, signed by APCC leaders Darryl Preston, Alison Rowe, John Tizard, and Chris Nelson, raised concerns about why these findings were not disclosed sooner and shared with Black and Asian communities.

It concluded: “While there is no evidence of adverse effects in individual cases, this is down to chance rather than to any systematic approach. These system failings have been known for some time, but the information was not conveyed to the affected communities and key stakeholders.”

The government has initiated a 10-week public consultation aimed at facilitating more frequent use of the technology. The public will be asked whether police should have permission to go beyond custody records and access additional databases, such as images from passports and driving licenses, to track down criminals.

Civil servants are collaborating with police to create a new national facial recognition system that will house millions of images.


Charlie Welton, head of policy and campaigns at Liberty, stated: “The racial bias indicated by these statistics demonstrates that allowing police to utilize facial recognition without sufficient safeguards leads to actual negative consequences. There are pressing questions regarding how many individuals of color were wrongly identified in the thousands of monthly searches utilizing this biased algorithm and the ramifications it might have.”

“This report further underscores that this powerful and opaque technology cannot be deployed without substantial safeguards to protect all individuals, which includes genuine transparency and significant oversight. Governments must halt the accelerated rollout of facial recognition technology until protections are established that prioritize our rights, aligning with public expectations.”

Former cabinet minister David Davis expressed worries after police officials indicated that cameras could be installed at shopping centers, stadiums, and transport hubs to locate wanted criminals. He told the Daily Mail: “Brother, welcome to the UK. It is evident that the Government is implementing this dystopian technology nationwide. There is no way such a significant measure could proceed without a comprehensive and detailed discussion in the House of Commons.”

Officials argue that the technology is essential for apprehending serious criminals, asserting that there are manual safeguards embedded within police training, operational guidelines, and practices that require trained personnel to visually evaluate all potential matches derived from the police national database.

A Home Office representative said: “The Home Office takes these findings seriously and has already acted. The new algorithm has undergone independent testing and has shown no statistically significant bias. It will be subjected to further testing and evaluation early next year.”

“In light of the significance of this issue, we have asked the policing inspectorate and the Forensic Science Regulator to review the use of facial recognition by law enforcement. They will evaluate the effectiveness of the mitigation measures, and the National Police Chiefs’ Council backs this initiative.”

Source: www.theguardian.com

Chemical Computers: Mastering Pattern Recognition and Multitasking

Molecules can be utilized for computational tasks

Shutterstock/Imageflow

Chemical computers composed of enzyme networks can carry out a range of functions, including temperature measurement and substance identification, all while avoiding the need for reconstruction after each use. This adaptability resembles biological systems more than traditional digital circuits, indicating a potential merger of computing and biological processes.

In nature, living organisms contain molecular systems that continuously integrate chemical and physical signals. For instance, cells detect nutrients, hormones, and temperature variations, adjusting to survive. Researchers have attempted to create analogs of this biological flexibility for years, including efforts to form logic gates with DNA; however, most artificial systems fall short due to their simplicity, inflexibility, or scalability challenges.

In a novel approach, researcher Wilhelm Huck from Radboud University in the Netherlands focused on allowing enzymes to interact autonomously rather than scripting every chemical step, leading to complex behaviors capable of recognizing chemical patterns.

The research team developed a system utilizing seven distinct enzymes embedded in tiny hydrogel beads found in small tubes. A liquid is introduced to these tubes, injecting short amino acid chains called peptides, which function as the “inputs” for the computer. As the peptides travel through the enzymes, each enzyme endeavours to cleave the peptide at designated sites along its length. When one cleavage occurs, it alters the peptide’s structure and the available cleavage sites, thereby affecting the actions of other enzymes.

This interdependence of reactions means that enzymes form a dynamic chemical network continually evolving, yielding unique patterns for the system to analyze. “Enzymes serve as the hardware while peptides act as the software. We address novel challenges based on the input provided,” noted Lee Dongyang from Caltech, who was not part of the study.

For instance, temperature influences the reaction rates of the enzymes. Elevated temperatures can accelerate certain enzymes faster than others, modifying the output’s mixture of peptide fragments. By employing machine learning algorithms to analyze these fragments, the researchers were able to correlate fragment patterns with specific temperatures.
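As a rough illustration of that analysis step, a simple regression can be fitted to map fragment-abundance patterns to temperature. The sketch below uses scikit-learn on synthetic data standing in for the measured peptide fragments; it only demonstrates the workflow, not the study’s actual model or data.

```python
# Illustrative only: synthetic data stands in for measured peptide-fragment
# abundances; shows the pattern-to-temperature regression step in outline.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(42)
n_samples, n_fragments = 200, 30

# Pretend each run of the enzyme network yields abundances of 30 fragment
# species that vary (noisily) with the temperature at which it was run.
temperatures = rng.uniform(25, 55, size=n_samples)        # degrees C
response = rng.normal(size=n_fragments)                    # per-fragment sensitivity
fragments = np.outer(temperatures, response) + rng.normal(scale=2.0,
                                                          size=(n_samples, n_fragments))

X_train, X_test, y_train, y_test = train_test_split(fragments, temperatures,
                                                    random_state=0)
model = Ridge(alpha=1.0).fit(X_train, y_train)
error = mean_absolute_error(y_test, model.predict(X_test))
print(f"mean absolute error: {error:.2f} °C")  # cf. the ~1.3 °C reported in the study
```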

Different chemical reactions can take place over various timescales, giving these systems a type of “memory” for previous inputs, enabling them to identify patterns over time. For example, they can distinguish between rapid and slow light pulses, allowing for both reactive and adaptive processing of changes in input.

The outcome is a versatile, dynamic chemical computer that interprets signals akin to a living organism rather than a static chemical circuit. “The same network undertook multiple roles seamlessly, including chemical categorization, temperature sensing with an average error margin of around 1.3°C from 25°C to 55°C, pH classification, and even responding to light pulse periodicity,” Lee indicated.

The researchers were astonished by the effectiveness of the compact computer, with Huck expressing hopes for future advancements that might convert optical and electrical signals directly into chemical reactions, mimicking the behavior of living cells. “We started with just six or seven enzymes and six peptides,” he remarked. “Just imagine the possibilities with 100 enzymes.”


Source: www.newscientist.com

Consultation Plans for Live Facial Recognition Usage Ahead of Broader Implementation

The new policing minister has told Labour’s annual conference that the government will consult on the use of live facial recognition (LFR) technology before any broader rollout across the UK.

The policing minister, Sarah Jones, said she would “establish certain parameters” regarding the government’s future use of the technology.

Critics argue that the absence of a legal framework allows police to self-regulate their use of the technology and to deploy algorithms that may be biased against ethnic minorities and women.

The Equality and Human Rights Commission asserts that its implementation is illegal and violates European law.

Speaking at a Tony Blair Institute fringe event in Liverpool, Jones remarked:

“There is some guidance on how we should apply it, but further clarification is necessary regarding the appropriate contexts for its use and when it should be avoided, alongside establishing a proper framework.

“Currently, there is minimal structure governing its use. We must assess whether what exists is adequate and whether additional measures are required.”

Jones, who represents Croydon West, noted that the shadow home secretary, Chris Philp, a fellow Croydon MP, endorsed its use after its successful deployment to apprehend criminals in the south London borough.

“Chris Philp is very supportive and frequently discusses its benefits, and the results from Croydon demonstrate its effectiveness.

“We need to ensure clarity about what will be beneficial moving forward. If we are to expand its use, what guidelines should we follow for nationwide deployment?” she inquired.

“It’s crucial to engage in this dialogue, as we want to encourage public input and understand how it will be utilized both in Parliament and among the populace.

“This is a topic we will address; it will hold significant importance going forward. It was initiated under the previous administration, and we must consider their perspectives.”

A civil liberties group recently urged the Metropolitan Police to stop operating LFR cameras, following a high court challenge brought by the anti-knife campaigner Sean Thompson. Thompson, a Black British man, was mistakenly identified by LFR as a suspect and was stopped by officers, who asked him to provide his fingerprints.

Asked about concerns over racial bias, Jones said:

“I believe we can provide reassurance to the public… We have successfully utilized it to detain hundreds of dangerous offenders, but it is imperative to ensure it is being deployed appropriately.”

Source: www.theguardian.com

Expert Disputes Police Claim That Research Shows Live Facial Recognition Is Unbiased

The Metropolitan Police’s claim that its use of live facial recognition is free of bias, a claim echoed in its public statements, is not substantiated by the research the force cites in its defence, according to a prominent technology specialist.

The Met plans its most high-profile deployment of LFR this bank holiday weekend, at the Notting Hill carnival in west London.

According to the Guardian, the technology will be used at two locations on the approaches to the carnival, and the force has insisted on going ahead even though the Equality and Human Rights Commission has declared its use of LFR unlawful.


This assessment comes from Professor Pete Fussey, who led the only independent academic review of the Met’s use of LFR, covering 2018-19, and who now advises law enforcement agencies in the UK and internationally on its application.

The Met contends that it has reformed its use of LFR, as indicated by 2023 research carried out by the National Physical Laboratory (NPL), and claims it is now virtually free from bias. Nevertheless, Fussey responded:

“The sensitivity at which an LFR system operates can be adjusted. More sensitive settings flag more people, but the potential for bias related to race, gender and age also increases. A setting of zero is the most sensitive and one is the least.”

The NPL report identified bias at a sensitivity level of 0.56, noting seven instances where individuals tested were mistakenly flagged as suspects, all of whom were from ethnic minority backgrounds.

These findings stemmed from a collection of 178,000 images entered into the system, with 400 volunteers passing the cameras roughly 10 times each, providing about 4,000 opportunities for accurate recognition. The volunteers were mixed into estimated crowds of more than 130,000 people at four locations in London and one in Cardiff. The tests were carried out in clear weather over 34.5 hours, though Fussey remarked this was shorter than the testing carried out in some other countries where LFR is evaluated.

From this dataset, the report concluded that no statistically significant bias existed at settings above 0.6. This assertion has been reiterated by the Met to justify its ongoing use and expansion of LFR.
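To see why the threshold setting dominates these figures, here is a small self-contained sketch of the trade-off: as the match threshold rises, false alerts fall but genuine matches are also missed. The score distributions are invented for the example and do not model the NPL trial.

```python
# Invented score distributions; demonstrates only the qualitative effect of
# raising the match threshold, not the NPL trial itself.
import numpy as np

rng = np.random.default_rng(1)
# Similarity scores for passers-by who are NOT on the watchlist ...
non_matches = rng.normal(loc=0.45, scale=0.08, size=100_000)
# ... and for people who ARE on the watchlist and walk past the camera.
true_matches = rng.normal(loc=0.75, scale=0.08, size=1_000)

for threshold in (0.56, 0.60, 0.64):
    false_alerts = int((non_matches >= threshold).sum())
    missed = int((true_matches < threshold).sum())
    print(f"threshold {threshold:.2f}: {false_alerts:5d} false alerts, "
          f"{missed:3d} genuine matches missed")
```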

Fussey said this was insufficient to substantiate the Met’s claims, stating: “Leaders at the Metropolitan Police Service consistently argue their systems undergo independent testing for bias. An examination of this study revealed that the data was inadequate to support the claims made.”

“The definitive conclusions publicly proclaimed by the Met rely on an analysis of merely seven false matches from a system scrutinizing the faces of millions of Londoners. Drawing broad conclusions from such a limited sample is statistically weak.”

The Met currently operates LFR at a sensitivity setting of 0.64, at which, it asserts, the NPL testing yielded no erroneous matches.

Fussey stated: “Their own research indicates that false matches were not evaluated at the settings, including 0.64, at which no bias is claimed.”

“Few in the scientific community suggest sufficient evidence exists to support these claims drawn from such a limited sample.”

Fussey added: “We clearly indicate that bias exists within the algorithm, but we assert that this can be mitigated through appropriate adjustments to the system settings. The challenge arises from the fact that the system has not been thoroughly tested under these varied settings.”

Lindsey Chiswick, the Met’s director of intelligence, dismissed Fussey’s allegations, stating: “This is a factual report from a globally renowned institution. The Met Police’s commentary is grounded in the findings of an independent study,” she explained.


“If you utilize LFR with a setting of 0.64, as I currently am, there is no statistically significant bias.”

“We sought research to pinpoint where potential bias lies within the algorithm and employed the results to mitigate that risk.”

“The findings exemplify the degree to which algorithms can be used to minimize bias, and we consistently operate well above that threshold.”

During the Notting Hill carnival this weekend, warning signs will notify attendees about the use of LFR. The signs will be positioned near the vans containing the cameras, which are linked to a database of suspects.

Authorities believe utilizing the technology at two sites leading to the carnival will act as a deterrent. At the carnival itself, law enforcement is prepared to employ retrospective facial recognition to identify perpetrators of violence and assaults.

Fussey remarked: “Few question the police’s right to deploy technology for public safety, but oversight is crucial, and it must align with human rights standards.”

The Met claims that since 2024, LFR has recorded a false-positive rate of one in every 33,000 cases. Although the exact number of scanned faces remains undisclosed, it is believed to be in the hundreds of thousands.

There have been 26 incorrect matches in 2024, with eight reported so far in 2025. The Met said these individuals were not arrested, because decisions on arrest rest with police officers rather than with the matches produced by the system.

Prior to the carnival, the Met arrested 100 individuals, recalled 21 to prison, and banned 266 from attendance. Additionally, they reported seizing 11 firearms and over 40 knives.

Source: www.theguardian.com

Met Chief Dismisses Proposal to Abandon Live Facial Recognition at Notting Hill Carnival

The Commissioner of the Metropolitan Police has rejected calls to halt the use of live facial recognition cameras at this weekend’s Notting Hill Carnival, amid concerns about racial bias and ongoing legal disputes.

In a letter, Mark Rowley stated that the technology would be used “in a non-discriminatory manner” at Europe’s largest street carnival, employing an algorithm that “is not biased.”

This response came after letters from 11 anti-racist and civil liberty organizations were revealed in The Guardian, urging the Met to discontinue the technology’s use at events honoring African-Caribbean communities.

Among those organizations are the Runnymede Trust, Liberty, Big Brother Watch, Race on the Agenda, and Human Rights Watch. They emphasized in a letter to Rowley on Saturday that such technology would only “increase concerns about state authority and racial misconduct within your force.”

Critics argue that the police lack a legal framework, allowing them to “self-regulate” their technological practices, which leads to the deployment of biased algorithms affecting ethnic minorities and women.

Last month, the Met announced plans to deploy specialist cameras on the approach and exit routes of the two-day event in west London. The carnival attracts over two million attendees annually during the August bank holiday weekend, making it the world’s second-largest street festival.

In his correspondence with the NGOs and charities, Rowley acknowledged that earlier deployments of the technology at the 2016 and 2017 carnivals had failed to foster public trust; the system used then wrongly identified 102 people as suspects, none of whom was arrested. The Met’s facial recognition system has since been enhanced.

“We have made significant strides since then. The latest version of the algorithm has undergone substantial improvements, with independent testing and validation, achieving a much higher standard,” said Rowley.

He said the technology would focus on “the minority of individuals” involved in serious crimes, such as violence and sexual offenses.

Rowley noted that in 2024 there were 349 arrests made at the event for serious offenses including murder, rape, and possession of weapons.

“These crimes pose a threat to those wanting to enjoy the carnival safely. The use of LFR is part of a broader strategy to identify, disrupt and prevent the threat posed by that minority,” he explained.

The civil liberties group Big Brother Watch urged the Met to cease using LFR cameras last month, following a high court challenge by the anti-knife campaigner Sean Thompson. Thompson, a Black British man, was wrongly identified by LFR as a suspect and was stopped by police, who asked him to provide fingerprints.

Rowley’s letter did not respond directly to Thompson’s claims, but it countered the assertion that police operate without a legal framework, noting that the Equality Act 2010 obliges public institutions to eliminate discrimination. He also noted that the use of LFR technology is covered by the European Convention on Human Rights and the Data Protection Act.

In response to Rowley’s letter, Rebecca Vincent, interim director of the civil liberties group Big Brother Watch, said of the “participants in this cultural celebration”: “Everyone wants to ensure public safety, but transforming the carnival into a police lineup is not the solution.”



Source: www.theguardian.com

Police Expansion of Live Facial Recognition Cameras: A Shift Towards ‘General’ Surveillance

Authorities anticipate that live facial recognition cameras may soon be “prevalent” across England and Wales, as indicated by internal documents revealing nearly 5 million face scans conducted last year.

A joint investigation by the Guardian and Liberty Investigates shows the rapid integration of this technology into UK law enforcement practices.

The government is simplifying police access to a wide range of image repositories, including passports and immigration databases, for past facial recognition searches, alongside significant financial investments in new hardware.

Live facial recognition entails real-time identification of faces captured by surveillance cameras, compared against a police watch list.

Conversely, retrospective facial recognition software allows police to match archived images from databases with those recorded on CCTV or similar systems.

The implementation of this technology is believed to be widespread in urban areas and transportation hubs across England and Wales, as noted in funding documents produced by South Wales Police and shared by the Metropolitan Police under the Freedom of Information Act.

The inaugural fixed live facial recognition camera is set to be trialed this summer in Croydon, located south of London.

This expansion comes despite the absence of any mention of facial recognition in legislation passed by parliament.

Critics contend that police are permitted to “self-regulate” this technology, while there have been instances where previous algorithms disproportionately misidentified individuals from Black communities.

Following a 2020 court of appeal ruling that deemed South Wales Police’s live facial recognition practices unlawful, the College of Policing issued guidance emphasizing that “thresholds must be carefully set to enhance the likelihood of accurate alerts while keeping false alert rates within acceptable limits.”

There remains no statutory framework directing the standards or technology applied in this context.

Earlier this month, the policing minister, Diana Johnson, told parliament that “we must evaluate whether a tailored legislative framework is necessary to govern the deployment of live facial recognition technology for law enforcement,” but further details from the Home Office are still pending.

Facial recognition cameras have been tested in London and South Wales since 2016; however, the pace at which police have adopted this technology has surged over the past year.

A survey conducted by the Guardian and Liberty revealed:

  • Last year, police scanned nearly 4.7 million faces using live facial recognition cameras—over double the figure for 2023. Data indicates that live facial recognition vans were deployed at least 256 times in 2024.

  • Mobile units comprising 10 live facial recognition vans can be dispatched anywhere in the UK within a matter of days to bolster national capabilities, with eight police forces having deployed this technology, while the Met has four such vans.

  • Authorities are exploring a fixed infrastructure to establish a “safety zone” by deploying a network of live facial recognition cameras throughout London’s West End. Met officials indicated that this remains a viable option.

  • The force has nearly doubled the number of retrospective facial recognition searches of the Police National Database (PND), from 138,720 in 2023 to 252,798. The PND contains custody mugshots, including many held unlawfully of individuals never charged with or convicted of any offense.

  • Over the past two years, more than 1,000 facial recognition searches have been run against the UK passport database, and officers made at least 110 searches of the Home Office immigration database last year. Authorities concluded that using a passport database for facial recognition “presents no risk.”

  • The Home Office is collaborating with the police to develop a new national facial recognition system termed strategic facial matchers, which will enable searches across various databases, including custody images and immigration records.

Lindsey Chiswick, the Met’s director of intelligence and the National Police Chiefs’ Council lead for facial recognition, said survey findings indicate that most London residents support the police’s use of advanced technologies such as facial recognition cameras.

Recently, registered sex offender David Cheneler, a 73-year-old from Lewisham, was sentenced to two years after being caught alone with a six-year-old girl through live facial recognition technology. He had previously served nine years for 21 offenses involving children.


In 2024, the Met made 587 arrests with the support of live facial recognition technology, 424 of which led to formal charges.

Among those arrested were 58 registered sex offenders in serious breach of their conditions, 38 of whom were subsequently charged.

Chiswick noted: “Given the limited resources and time available, the demand is high, and we see criminals exploiting technology on an expansive scale.

“There’s a chance for law enforcement to evolve. Discussions about leveraging AI are abundant, but we must embrace the opportunities presented by technology and data.”

Chiswick emphasized that the Met’s approach is to “proceed cautiously and evaluate at each phase,” while noting that “there may be advantages to some form of framework or statutory guidance.”

The Met operates its facial recognition cameras at settings at which testing found no statistically significant gender or ethnic bias in misidentifications.

Chiswick remarked: “I refuse to use a biased algorithm in London. Each deployment carries weight. People raise the concern: is there an issue with artificial intelligence?

“When you select which algorithm to purchase, determine what training data was used and assess where the technology comes from, testing it thoroughly is paramount; you are obliged to operate it within a specific context.”

The Home Office did not provide a comment when approached.

Source: www.theguardian.com

Face ID: A Useful Resource or a Source of Concern? The Subtle Integration of Facial Recognition in Law Enforcement

The future is arriving ahead of schedule in Croydon. It may not look like the UK’s technological vanguard: North End is a pedestrianised high street of pawn shops, fast-food outlets and branded clothing stores. Yet it is expected to host one of the UK’s first permanent fixed facial recognition cameras.

Digital images of passersby will be captured discreetly and processed to derive biometric data, which includes facial measurements. This data will be rapidly compared against a watchlist via artificial intelligence, and a match will trigger an alert that might lead to an arrest.

According to the south London borough’s latest violence reduction strategy, North End and its adjacent streets are “major crime hotspots”. They do not, however, rank among the most dangerous streets in the capital.

The crime rate here is the 20th worst among London’s 32 boroughs, excluding the City of London. Plans to launch permanent cameras for a trial phase later this summer are not an emergency measure; rather, North End and nearby London Road might soon see more surveillance.

When approached about the surveillance initiative, most shopkeepers and visitors in the North End were unaware of the police’s plans or the underlying technology.

For many, the cameras appear as just another form of street furniture alongside signs promoting safe cycling. Some express concern, while others point to surveys suggesting a public worn down by rising crime.

The police began experimenting with facial recognition cameras in London and south Wales in 2016. Recent documents released under the Freedom of Information Act (FOI) and police statistics shared with the Guardian reveal substantial growth in use over the last year. The technology is evolving from a niche tool into a regular component of police strategy.

Last year, police scanned almost 4.7 million faces using live facial recognition cameras, more than double the number scanned in 2023. In 2024, live facial recognition vans were deployed at least 256 times, up from 63 the previous year.

A mobile unit of 10 live facial recognition vans can now be dispatched anywhere in the country within days to bolster national capability.

Meanwhile, civil servants collaborate with law enforcement to develop a new national facial recognition system called strategic facial matchers. This platform will enable searches through various databases, including custody images and immigration files.

“The implementation of this technology could become a common sight in city centres and transit hubs across England and Wales,” states one funding document submitted by South Wales police to the Home Office and released by the Metropolitan Police under FOI.

Activists warn that this technology may disrupt everyday public life by subjecting individuals to impromptu identity checks facilitated by extensive facial recognition systems. Advocates of the technology acknowledge its risks but emphasize its importance for safety.

Recently, David Cheneler, a 73-year-old registered sex offender from Lewisham who had served nine years for 21 offenses, was sentenced to two years in prison for breaching his probation terms.

Officers were alerted by live facial recognition cameras to Cheneler walking alone with a six-year-old girl.

“He was on the watchlist due to his compliance conditions,” said Lindsey Chiswick, the Met’s director of intelligence and the National Police Chiefs’ Council lead for facial recognition.

“He had formed a relationship with the girl’s mother over time and had begun picking her up from school. Who knows what might have happened that day. This shows how the technology allowed police to track him; without facial recognition, spotting him would have been a real challenge.”

Many see this as a compelling argument, but critics raise concerns about the unanticipated ramifications as law enforcement adopts technology without legislative guidance.

Madeline Stone from the NGO Big Brother Watch, who has observed mobile camera deployments, reported witnessing misidentifications of schoolchildren in uniforms undergoing “long, humiliating, and unnecessary police stops,” where they were compelled to verify their identities and provide fingerprints.

In these instances, the affected individuals were young Black boys, leaving them frightened and distressed, she noted.


“The effectiveness diminishes as the threshold rises,” Stone added. “The police might prefer not to use it at that setting in certain environments, and there are no legal mandates requiring them to. The notion that police can unilaterally write their own guidelines for its use is truly alarming.”

A judicial review was initiated by Londoner Sean Thompson, with backing from Big Brother Watch, after he was wrongly identified as a person of interest due to the technology and detained for 30 minutes upon returning from a volunteer shift with the anti-knife initiative Street Father.

Additionally, Dr Daragh Murray, who carried out an independent evaluation of the Met’s trials in 2019, highlights the potential “chilling” effect this technology might have on society, suggesting that considerations must go beyond just the technology’s implementation.

“It’s akin to police tailing you, recording your interactions, where you go, how often, and for how long,” he remarked. “I believe most would be uncomfortable with such reality. Democracy thrives on dissent and discourse; if surveillance stifles that, it risks entrenching the status quo and limiting future opportunities.”

Live facial recognition is being utilized to apprehend individuals for traffic violations, growing cannabis, and neglecting community orders. Is this truly justified?

Fraser Sampson, former biometrics and surveillance camera commissioner in England and Wales until his position was dissolved in October 2023, currently serves as a non-executive director for FaceWatch, the leading UK firm in retail security systems designed to prevent shoplifting.

While he acknowledges the technology’s potential, he expresses concern that independent regulations concerning surveillance haven’t kept pace with its deployment by the state.

Sampson commented: “There’s an abundance of information about the technology’s functionalities, yet in practical terms—its application, the reason for its use, and the avenues for challenges or complaints—those clarity elements seem lacking.”

Chiswick noted her understanding of the concerns while recognizing the potential advantages of regulatory measures. The Met is cautiously making “small strides” that are continually reviewed, she stated. With limited resources, law enforcement needs to adapt and capitalize on the possibilities brought by AI. They are cognizant of potential “chilling effects” on society and have made it clear that cameras will not be deployed in protest areas.

“Will this become common? I cannot say,” Chiswick remarked. “We need to approach that assumption with caution. There are numerous possible scenarios; areas like the West End? It’s conceivable, instead of the static trials we’re conducting in Croydon, we could utilize it there. However, that’s not our current plan.”

She added: “I believe the integration of technology, data, and AI will continue to rise in the coming years, as personally, that’s how we can improve our operations.”

Source: www.theguardian.com

Nobel Prize continues to lack diversity in science recognition

Nobel Prize winners are awarded medals

Zhenkong Chen / Alamy

It's the most celebratory time of the year, as some of the brightest minds in science win Nobel Prizes. Recent winners have a few things in common. They definitely have a great body of work. And they're all men, they live in high-income countries, and none of them are black.

Gary Ruvkun and Victor Ambros received the physiology or medicine prize for their discovery of microRNAs and their role in gene regulation. A series of papers led to this discovery, many of which listed Ambros’s wife, Rosalind Lee, as an author. The Nobel committee for physiology or medicine recognized Lee on social media, but did not go as far as awarding her a medal. Perhaps it thinks one prize per family is enough.

Lee's omission may seem familiar. In 1962, James Watson, Francis Crick, and Maurice Wilkins received the award for their discovery of the molecular structure of DNA. This was the opposite Of the three papers published in the same issue nature. One was co-authored by Wilkins, another was co-authored by Watson and Crick, and the third was an image captured by Rosalind Franklin of DNA with two strands. Prior to publishing the image, It ended up in the hands of Watson and Crick.I then told them that their DNA model was a double helix. Franklin was removed from the Nobel Prize trophy.

Perhaps the committee dislikes the name Rosalind. But of the 972 people who have won a Nobel prize since it was founded in 1901, only 64 have been women. Physics, awarded this year to John Hopfield and Geoffrey Hinton for discoveries related to machine learning, has a particularly poor record, with only five women ever winning the prize.

At least women in science are getting some recognition. No Black person has ever won a science Nobel prize, and only 17 Black people have won the peace, literature, and economics prizes combined. Many argue that Charles Drew, the African American who discovered a way to store plasma long-term, should have won the medicine prize, and that Percy Julian, who worked out how to synthesize medicines from plants, was overlooked for chemistry.

Geography also appears to play an important role in determining winners. More than half of the prizes have gone to people in North America, and the few winners born in low-income countries had mostly emigrated to North America or Europe by the time they won.

Some might say that all of this simply reflects the demographics of science: less than 30% of researchers are women, for example. But that is no excuse for failing to give credit where credit is due, especially when the Nobel committee for physiology or medicine itself flagged the paper led by Lee as the key publication behind the success of Ruvkun and Ambros.

The Royal Swedish Academy of Sciences, which administers the prizes in physics and chemistry, at least recognizes that this lack of diversity is a problem. Since 2019, nominators have been asked to pay attention to gender, ethnicity, and geography when choosing candidates, as people cannot put themselves forward. It sounds good in theory, but since then only six women have won in the sciences, and none of the winners have been Black.

You may be wondering why this matters. Awards are great honors, and they shouldn’t be what drives scientists. But being a Nobel prizewinner opens doors for researchers and brings their work into the public consciousness. For many people, the annual Nobel announcements may be the only time they see a scientist’s name in the news headlines, so the awards play a huge role in shaping our perception of science.

Part of the problem is that the prize structure, dictated by Alfred Nobel’s will, tends to enforce a “great man of history” approach to science that does not reflect the realities of modern research. The rules state that no more than three people can share an award, though that does not explain why Lee was left out when Ruvkun and Ambros won as a duo. Prizes also cannot be awarded posthumously; otherwise Franklin, who died of ovarian cancer in 1958 at the age of 37, might have received one by now.

Of course, such issues are not new, and it seems unlikely that the Nobel committees will deviate from the wishes of their founder, but that is no reason to ignore diversity. They need to cast a wider net, not just for the sake of fairness, but to ensure that the awards continue to be taken seriously.

Alexandra Thompson is assistant news editor at New Scientist.


Source: www.newscientist.com

Amazon warehouse workers in Coventry narrowly miss out on union recognition in voting process

The TUC is determined to continue the fight for union recognition at Amazon despite workers at the US retailer’s Coventry warehouse being denied collective bargaining rights by a narrow margin of 29 votes.

In a significant vote that could have compelled Amazon to allow trade unions in the UK for the first time, 50.5% of the voting workers chose to reject GMB union representation, a result that could have been overturned if just 15 people had switched sides.

General secretary of the TUC, Paul Nowak, stated, “Amazon is actively working to prevent workers from having an independent voice at their workplace. This is not the end – our movement remains united to expose bad employers.”

According to union sources, Amazon had instilled a culture of fear and used intimidation tactics to suppress support among the 3,000 workers at the West Midlands site during the over a year-long battle for recognition.

GMB activists were permitted to present their case to workers in a timed meeting before the vote, while managers conducted separate information sessions to argue against approval.

GMB senior organizer Stuart Richards mentioned that the union would explore legal avenues. He added, “Amazon has been relentless in its attacks on its own employees. Workers have been subjected to pressure to attend lengthy anti-union seminars.”

Workers were granted the right to a binding vote by an independent Central Arbitration Committee in April after a GMB campaign, as Amazon had initially refused voluntary recognition. The voting process had oversight from independent advisers appointed by the CAC.

Had employees voted in favor of recognition, GMB would have had the authority to negotiate pay and working conditions on their behalf, marking the first time Amazon would have recognized a trade union in the UK.

Additionally, it would have been the first instance of employees of an internet retailer receiving recognition rights outside the US.

Mr. Richards expressed concerns about Amazon’s conduct and reiterated the ongoing determination of workers to seek justice.

“Although Amazon’s anti-union stance prevailed in this instance, the underlying issues around work intensity and pay that sparked this dispute remain unresolved,” noted Callum Kant, a senior lecturer at the University of Essex who studies the gig economy.

Under current regulations, a union cannot reapply for recognition for the same group of workers for three years after losing a vote. The government has indicated plans to ease the process of securing recognition as part of its new worker-friendly policies, though the impact on cases like Amazon’s remains uncertain.

Amazon responded, stating, “We appreciate all those who participated in this vote. At Amazon, we prioritize direct engagement and regular conversations with our employees. We value these relationships and strive to offer career growth opportunities in a safe environment with competitive pay and benefits.”

Source: www.theguardian.com

Amazon Workers in UK Warehouses Vote for Union Recognition: A Groundbreaking Moment

Located on a traffic island on the outskirts of Coventry, a small yet dynamic organizing team from the GMB union is challenging Amazon with homemade signs and orange bucket hats.

In a historic ballot last week, over 3,000 Amazon workers were given the opportunity to vote on whether the company should recognize a trade union in the UK for the first time. This is part of a series of battles worldwide over union recognition at the retail-to-cloud conglomerate founded by Jeff Bezos in 1994, now valued at over $2 trillion.

If successful, the GMB victory would grant workers the right to negotiate with Amazon on matters such as wages, hours, and holidays, something the famously anti-union company opposes. GMB also believes that recognition would give them an advantage in addressing health and safety concerns within the expansive Coventry warehouse, known as BHX4.

The ongoing campaign at BHX4 sheds light on the barriers unions face, despite promises from the new Labour government to bolster trade union influence.

Results of the vote, held after a month-long process, are expected on Monday. Workers have received support from the local Labour MP Taiwo Owatemi, who commended their efforts.


Kate Gorton, a GMB member. To raise awareness about the vote, the union distributed snacks, drinks and gifts to employees at the end of each shift. Photo: Fabio De Paola/Observer

At the approach to BHX4, a dedicated GMB team welcomes workers as they arrive for the night shift amidst the sounds of dhol drummers creating a festive atmosphere.

These workers are encouraged to vote “Yes!” and many show support by honking or signaling approval from their vehicles. The campaign aims to bring positive change to the work environment at BHX4.

Meanwhile, GMB activists at another entrance offer refreshments and free hats to workers, urging them to vote in favor of recognition.

Despite some workers bypassing the activists, many engage in conversation, accept hats and drinks, and express their support for GMB.

The GMB’s current push for recognition is its second attempt, after it withdrew an initial application last year, citing Amazon’s strategic hiring practices as an effort to influence the outcome.

As the vote results await, GMB and other trade unions remain hopeful for meaningful changes under Labour’s New Deal for Workers, advocating for stronger collective bargaining and workers’ rights.

Amidst Amazon’s resistance to unions globally, the Coventry activists persevere, optimistic that they can achieve historic recognition. The company maintains that employees have the choice to join a union and emphasizes its commitment to direct engagement with employees.

Source: www.theguardian.com

First union recognition vote begins for Amazon UK employees

GMB officials recently visited Amazon UK’s Coventry site, initiating a month-long balloting process that brings workers closer to gaining union recognition for the first time.

This move follows Amazon’s rejection of voluntary recognition, which led the independent Central Arbitration Committee to grant GMB the right to hold a legally binding vote.

If employees vote in favor of recognition, GMB will represent them in negotiations regarding pay and working conditions, a historic first for Amazon in the UK.

Over 2,000 employees will participate in meetings with union representatives and company officials, presenting their cases starting Wednesday. The voting process will take place in workplaces from July 8th, with results announced after July 15th.

Amanda Gearing, senior GMB organiser, noted Amazon’s resistance to unionization efforts but highlighted the determination of Coventry workers to improve their conditions.

The ongoing struggle has included union members in Coventry facing challenges such as QR codes displayed in the warehouse that made it easy to cancel union membership, and strikes demanding higher wages and a seat at the negotiation table.

Black Friday actions last year saw union members from the US and Europe supporting Coventry workers, showcasing international solidarity in the fight for workers’ rights.

The union breakthrough at Amazon’s New York warehouse serves as a model for Coventry workers as they strive for recognition and fair representation.

To win, the GMB needs a majority of votes cast in favor of recognition, with at least 40% of the eligible workforce voting in favor.


The Labour Party’s commitment to empowering trade unions reflects a broader push for worker rights and representation.

The TUC’s general secretary, Paul Nowak, emphasized the importance of workers securing better pay and conditions through unionization, countering Amazon’s anti-union tactics.

In response, an Amazon spokesperson highlighted the company’s minimum wage increase and emphasized employees’ choice in joining a union.

Source: www.theguardian.com

Ed Dwight, America’s first Black astronaut candidate, finally receives recognition.

Ed Dwight spent his childhood in the 1930s on a farm on the outskirts of Kansas City, Kansas. An airfield was nearby, and as a young boy he would often visit to watch the planes and the pilots. Most of the planes were returning from hunting trips, their insides stained with blood and littered with empty beer cans.

“They would say to me, ‘Hey, can you clean my plane? I’ll give you a dime,’” Dwight, 90, recalls. But when he was eight or nine years old, he wanted more than a dime. He wanted to fly.

“My first flight was the most exciting thing in the world,” Dwight says with a laugh. “There were no roads or stop signs. You were free as a bird.”

It took years for Dwight to entertain the thought of becoming a pilot himself. “It was white people’s territory,” he says. But when he saw an image of a black pilot shot down in South Korea on the front page of a newspaper, he immediately decided that he wanted to fly.

With that decision, Dwight set in motion a chain of events that would nearly lead him to become one of the first astronauts. Dwight was hand-picked by President John F. Kennedy’s White House to join Chuck Yeager’s test pilot program at Edwards Air Force Base in California’s Mojave Desert as he rose through the ranks in the Air Force.

US Air Force Colonel Ed Dwight.
Provided by Ed Dwight/National Geographic, via AP

However, after Kennedy’s assassination, Dwight’s path to NASA disappeared and he was not selected for the space program. In recent years, Dwight has finally begun to receive recognition with the release of the new National Geographic documentary “The Space Race,” which tells the story of a pioneer who was nearly one of the first Black astronauts.

When the Soviet Union launched Sputnik into orbit in 1957, it influenced the formation of NASA. Dwight was not initially thinking about becoming an astronaut, but political leaders became conscious of the image that American astronauts could project about American democracy, and Dwight received an unexpected invitation to train to become one. He received acclaim for appearing on the covers of Black magazines such as Jet and Sepia, but faced hostility from some officers and other obstacles during his training. Dwight was the only candidate put forward by the White House for training at that time, yet he was ultimately not among the 14 astronauts selected in 1963, following Kennedy’s assassination.

Air Force Col. Ed Dwight sits in the cockpit at the beginning of flight training in 1954.
Provided by Ed Dwight/National Geographic, via AP

Despite not becoming an astronaut, Dwight accomplished many things, including founding a construction company and earning a Master of Fine Arts degree in sculpture. He has created sculptures of notable black historical figures and had some of his sculptures flown into space aboard the Orion spacecraft. NASA even named an asteroid after him. For subsequent black astronauts, Dwight serves as an influential figure and is honored to be considered a pioneering black astronaut.

NASA astronaut Victor Glover, from left, Ed Dwight and Leland Melvin, on Thursday in Pasadena, California.
Chris Pizzello/AP

Dwight is filled with gratitude for the experiences he had and the opportunity to meet many influential figures throughout his life. He envisions that astronauts and influential leaders should have the chance to view the Earth from above to understand the futility of racism. “We’re flying around the galaxy in this little ball,” Dwight says.


Source: www.nbcnews.com

Wrongful Imprisonment Lawsuit Alleges Use of Facial Recognition Technology Following Sunglass Hut Robbery

A 61-year-old man is suing Macy’s and Sunglass Hut’s parent company, alleging that the store’s use of a facial recognition system misidentified him as the perpetrator of an armed robbery, leading to his false arrest. While in prison, he was beaten and raped, according to the complaint.

Harvey Eugene Murphy Jr. was arrested and charged with robbing a Houston-area Sunglass Hut of thousands of dollars worth of merchandise in January 2022, but his lawyers say he was living in California at the time of the robbery. According to his lawyer, he was arrested on October 20, 2023.

According to Murphy’s complaint, employees at EssilorLuxottica, Sunglass Hut’s parent company, worked with its retail partner Macy’s to use facial recognition software to identify Murphy as the robber. The images run through the facial recognition system came from a low-quality camera, the complaint says. While Houston police were investigating the armed robbery, an EssilorLuxottica employee called them to say the company had identified one of the two robbers using the technology, so that part of the investigation could be discontinued. The employee also said the system indicated that Murphy had committed two other robberies, according to the complaint.

When Murphy returned to Texas from California, he went to the Department of Motor Vehicles (DMV) to renew his license. Murphy told the Guardian that within minutes of identifying himself to a DMV employee, he was approached by a police officer and informed that there was a warrant out for his arrest on suspicion of aggravated robbery. Murphy said he was given no details about the crime he was alleged to have committed other than the day the robbery occurred; he had been more than a thousand miles away in Sacramento, California, at the time.

“I almost thought it was a joke,” Murphy said.

Still, he was arrested and taken to the local county jail, where he was held for 10 days before being transported to the Harris County jail for processing.

After several days in Harris County, the alibi was confirmed by both a public defender and a prosecutor, and the charges were ultimately dropped, according to the complaint.

Murphy has never been convicted of a crime. Nevertheless, he says the detention left deep scars. He claims that he was brutally beaten and gang-raped by three other men inside the jail hours before his release. Murphy said he was threatened with death if he tried to alert jail staff. After the alleged attack, Murphy remained in the same cell as his attackers until his release.

“It was kind of scary,” Murphy said. “My anxiety is so high, I’m shaking all the time. And I just stood up in my bunk and faced the wall, just praying that something would happen and get me out of the tank.”

“This attack left him with permanent scars that he will have to live with for the rest of his life,” the complaint states. “All of this happened to Murphy because the defendants relied on facial recognition technology that is known to be error-prone and flawed.”

Murphy didn’t realize that facial recognition technology had been used as evidence against him until two weeks ago, when he began working with his attorney, Daniel Dutko.

Dutko said he discovered in police documents that Sunglass Hut employees shared camera footage with Macy’s that was used to identify Murphy, and that Macy’s and Sunglass Hut then contacted police together. Although Macy’s has retail partnerships with the eyewear brand at multiple stores, Macy’s was not involved in the robbery itself, because the Sunglass Hut in question is a standalone store, he said.

“We’re very comfortable saying that facial recognition software is the only possible explanation, and that’s the only reason [Sunglass Hut] would have gone to Macy’s to identify him,” Dutko said.


Mr. Murphy’s case is the seventh known case of false arrest involving facial recognition in the United States, further highlighting flaws in a technology that is already widely used by police departments and retail stores. All of the previously known cases of false arrest involving facial recognition had Black victims; Murphy’s is the first known case in which a failure of the technology resulted in the wrongful arrest of a white man. Just last month, Rite Aid settled with the Federal Trade Commission over its use of a facial recognition system that misidentified Black, Latino, and Asian customers as “likely to be involved” in shoplifting. As part of the settlement, the pharmacy chain was banned from using facial recognition in its stores for five years. And in the summer of 2023, a woman named Porcha Woodruff was wrongly arrested on suspicion of carjacking after a facial recognition search identified her as a suspect.

Macy’s has previously been sued over its use of facial recognition technology. In a 2020 lawsuit, a Chicago woman accused the company of violating Illinois’ biometric privacy law by collaborating with facial recognition provider Clearview AI without her or other customers’ consent.

Nathan Freed Wessler, deputy director of the American Civil Liberties Union’s Speech, Privacy, and Technology Project, said this is another example of the “extreme dangers of facial recognition technology.”

“We have seen case after case where police reflexively relied on unreliable facial recognition results, allowing the technology to make false matches and corrupting witness identification procedures,” Wessler said in a statement. “As the facts alleged in this case demonstrate, the consequences of wrongful arrest are dire. Lawmakers must act to stop law enforcement and companies from dangerously relying on facial recognition results to put people behind bars.”

Murphy is seeking $10 million in damages.

Macy’s said it does not comment on pending litigation. EssilorLuxottica did not immediately respond to the Guardian’s request for comment.

Source: www.theguardian.com

Rite Aid Banned From Using Facial Recognition Software After Falsely Identifying Shoplifters

Rite Aid, the US drugstore giant, has been banned from using facial recognition software for five years, after the Federal Trade Commission (FTC) found that its ‘reckless use of facial surveillance systems’ humiliated customers and ‘compromised confidential information’.

The FTC’s order, which requires approval from the US bankruptcy court because Rite Aid filed for Chapter 11 bankruptcy protection in October, directs Rite Aid to delete the images it collected as part of its facial recognition rollout, as well as any products built from those images. The company must also implement a robust data security program to protect the personal data it collects.

A 2020 Reuters report detailed how the drugstore chain secretly installed facial recognition systems in about 200 US stores over an eight-year period starting in 2012, using “primarily low-income, non-white neighborhoods” as testbeds for the technology.

With the FTC’s increasing focus on the abuse of biometric surveillance, Rite Aid became a firm target of the agency. Among the allegations: Rite Aid partnered with two contractors to create a “watchlist database” containing images of customers the company said had engaged in criminal activity at one of its stores. These images, often of low quality, were captured from CCTV or employees’ mobile phone cameras.

When a customer who appeared to match an existing image in the database entered a store, employees received an automated alert instructing them to take action, which in most cases meant “walking closer and identifying”: verifying the customer’s identity and asking them to leave. According to the FTC, these “matches” were often false positives, leading employees to falsely accuse customers of wrongdoing and causing “embarrassment, harassment, and other harm.”

“Acting on false positive alerts, employees followed consumers around the store, searched them, ordered them to leave, called the police to confront or remove them, and publicly accused them, sometimes in front of friends and family, of shoplifting and other misconduct,” the complaint says.

Additionally, the FTC said Rite Aid did not notify customers that facial recognition technology was being used and specifically instructed employees not to reveal this information to customers.

Face-off

Facial recognition software has emerged as one of the most controversial aspects of the AI-powered surveillance era. In recent years, cities have issued broad bans on the technology while politicians have fought to regulate how police use it. Meanwhile, companies like Clearview AI have been hit with lawsuits and fines around the world for massive data privacy violations involving facial recognition technology.

The FTC’s latest findings regarding Rite Aid also shed light on the biases inherent in AI systems. For example, the FTC says Rite Aid failed to reduce risks to certain consumers based on race: the technology was “more likely to generate false positives in stores located in predominantly Black and Asian communities than in predominantly white communities,” the FTC noted.

Additionally, the FTC said Rite Aid failed to test or measure the accuracy of its facial recognition system before or after its implementation.

In a press release, Rite Aid said it was ‘pleased to reach an agreement with the FTC’ but disagreed with the core of the allegations.

“The allegations relate to a pilot program for facial recognition technology that we implemented in a limited number of stores,” Rite Aid said in a statement. “Rite Aid stopped using the technology at this small group of stores more than three years ago, before the FTC’s investigation into the company’s use of the technology began.”

Source: techcrunch.com