AI Firm Secures High Court Victory in Copyright Dispute with Photo Agency

An artificial intelligence company based in London has achieved a significant victory in a High Court case that scrutinized the legality of an AI model using extensive copyrighted data without authorization.

Stability AI, whose board of directors includes Oscar-winning Avatar director James Cameron, successfully defended itself against Getty Images' allegations that it infringed the international photography agency's copyright.

This ruling is seen as a setback for copyright holders’ exclusive rights to benefit from their creations. Rebecca Newman, a legal director at Addleshaw Goddard, cautioned that it suggests “the UK derivative copyright system is inadequate to protect creators”.

There was evidence indicating that Getty Images' photographs were used to train Stability's model, which enables users to generate images via text prompts. In certain instances, Stability was also found to have infringed Getty's trademarks.

Judge Joanna Smith remarked that determining the balance between the interests of the creative industries and AI sectors holds “real social significance.” However, she could only address relatively limited claims as Getty had to withdraw parts of its case during the trial this summer.

Getty Images initiated legal action against Stability AI for violations of its intellectual property rights, claiming the AI company scraped and replicated millions of images with “complete indifference to the content of the training data.”


This ruling comes amid ongoing debates about how the Labour government should legislate on copyright and AI matters, with artists and authors like Elton John, Kate Bush, Dua Lipa, and Kazuo Ishiguro advocating for protections. In contrast, tech firms are seeking broader access to copyrighted material to develop more powerful generative AI systems.

The government is conducting a consultation regarding copyright and AI, stating: “The uncertainty surrounding the copyright framework is hindering the growth of both the AI and creative sectors. This situation must not persist.”

As part of that debate, ministers have been contemplating a "text and data mining exception" to UK copyright law, which would allow copyrighted works to be used for training AI models unless rights holders opt out.

Due to a lack of evidence indicating that the training took place in the UK, Getty was compelled to retract its original copyright claim. Nevertheless, the company proceeded with its lawsuit, asserting that Stability continues to use copies of visual assets, which it describes as the “lifeblood” of its business. The lawsuit alleges trademark infringement and “spoofing,” as some generated images bore Getty’s watermark.

Highlighting the complexities of AI copyright litigation, Getty essentially argued that Stability's image generation model, known as Stable Diffusion, is itself an infringing copy, because its creation would have amounted to copyright infringement had it taken place in the UK.

The judge determined that “AI models like Stable Diffusion that do not (and never have) stored or reproduced copyrighted works are not ‘infringing copies.'” She declined to adjudicate on the misrepresentation claims but ruled in favor of some of Getty’s trademark infringement claims regarding the watermark.

In a statement, Getty Images remarked: "We are profoundly worried that even well-resourced organizations like Getty Images face considerable challenges in safeguarding creative works due to the absence of transparency requirements. We have invested millions pursuing just one provider, and we must continue our pursuit elsewhere."

“We urge governments, including the UK, to establish more robust transparency regulations. This is crucial to avoid expensive legal disputes and ensure creators can uphold their rights.”

Stability AI’s General Counsel, Christian Dowell, stated, “We are pleased with the court’s ruling on the remaining claims in this case. Although Getty’s decision to voluntarily withdraw most of the copyright claims at the trial’s conclusion left the court with only a fraction of the claims, this final decision addresses the core copyright issues. We appreciate the time and effort the court has dedicated to resolving the significant matters in this case.”

Source: www.theguardian.com

New Study Focuses on Historical Floods to Aid Property Owners in Evaluating Climate Risks

New research published on Monday indicates that the number of structures in the United States vulnerable to flooding may be greater than previously estimated.

A study published in the journal Earth's Future revealed that 43% of the North Carolina buildings that flooded from 1996 to 2020 lie outside the Federal Emergency Management Agency's Special Flood Hazard Areas, which FEMA designates as having a 1% chance of flooding in any given year.

Experts and the study's authors believe these findings may help more communities understand their flood risks and insurance needs. As insurers raise premiums and pull back from covering homes in certain regions due to more frequent severe weather, obtaining flood insurance is becoming increasingly difficult, leaving property owners more exposed.

According to NBC News, experts stated that having access to reliable data sources could facilitate the creation of nationwide databases like the one outlined in the study.

FEMA's maps serve as the main reference for identifying flood-prone zones and structures requiring insurance coverage. However, experts argue that the system is outdated and low-resolution. A 100-year floodplain projection doesn't capture all the areas that could be at risk of flooding, and the authors noted that climate change has undermined the reliability of previously authoritative weather and climate forecasts.
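It is worth unpacking what a "1% chance of flooding in any given year" actually implies: the risk compounds over time, so a property in a 100-year floodplain is far from safe over the life of a mortgage. A minimal sketch of that arithmetic (assuming, for simplicity, that flood years are independent, which real flood records may violate):

```python
# Probability of at least one flood over a span of years, given a fixed
# annual chance of flooding. This is the complement of "no flood in any year".

def prob_at_least_one_flood(annual_chance: float, years: int) -> float:
    """1 - P(no flood every year), assuming independent years."""
    return 1 - (1 - annual_chance) ** years

# A "100-year" (1%-annual-chance) floodplain property over a 30-year mortgage:
print(round(prob_at_least_one_flood(0.01, 30), 3))   # roughly a 26% chance
print(round(prob_at_least_one_flood(0.01, 100), 3))  # roughly 63% over a century
```

The point is that "100-year flood" names a probability, not a schedule: such a property is more likely than not to flood at least once over a century, even before accounting for the climate-driven shifts the study describes.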

Using address-level data, the researchers were able to pinpoint specific structures that have flooded multiple times. This information could help property owners decide whether to obtain flood insurance, even if they fall outside FEMA's designated flood hazard zones. FEMA reports that 99% of US counties have experienced flooding at least once, yet only 4% of homeowners hold flood insurance.

"Detailed information about previously flooded areas can empower individuals to make informed decisions," said Helena Margaret Garcia, the lead author of the study and a doctoral candidate at the University of North Carolina at Chapel Hill.

FEMA and the National Flood Insurance Program did not respond to requests for comment.

FEMA is currently working on revamping its flood mapping system through the Future of Flood Risk Data initiative, aimed at employing more efficient, accurate, and consistent technology to assess the country’s flood risks comprehensively.

High-water marks in homes in Ocracoke, North Carolina, illustrate the increasing severity of flooding from successive hurricanes over time.
Courtesy of Helena Garcia

As global temperatures rise, more intense rainfall events are becoming common, since warmer air can hold more moisture. Climate change has also contributed to rising sea levels, increasing vulnerability to flooding in coastal and even some inland areas. Hurricane Helene, the third deadliest hurricane of modern times, inflicted substantial flooding in North Carolina's mountainous regions. Damage from the storm exceeded $78 billion, according to the National Oceanic and Atmospheric Administration.

The recent severe rainfall has caused flash floods across many U.S. towns over the last two weeks, most recently impacting towns in New Jersey and New York City. In North Carolina alone, six deaths have been reported, while over 100 fatalities have occurred in Texas. Flash floods have historically accounted for the highest number of storm-related deaths in the U.S., averaging 125 deaths per year over recent decades, according to the National Weather Service.

A North Carolina climate scientist noted that warm seawater feeding the storm may have contributed to last week's Tropical Storm Chantal, which set rainfall and river-level records in the state.

With the impacts of climate change, having access to up-to-date maps is essential, experts argue.

In their study, researchers mapped 78 flood events using data from 1996 to 2020 from the National Flood Insurance Program, which allows property owners to purchase insurance for potential flood-related losses. They also incorporated emergency service requests and volunteer-based geographic information obtained from social media, creating what they termed the “first of its kind” database. This historical flood data is often inaccessible to the public, which can hinder homeowners’ awareness of prior flood incidents affecting their properties.

Garcia stated that researchers plan to apply the same methodology from their study to create similar historical maps in various regions across the country, identifying areas previously overlooked concerning flood risk.

The research identified roughly 90,000 flooded structures, of which only about 20,000 were linked to NFIP claims, highlighting how many floods go unrecorded in official insurance data. The study's authors are now focused on quantifying the human impact, examining outcomes such as healthcare disruptions resulting from floods and the scale of evacuations caused by repeated flooding, Garcia explained.

On July 7th, Eastgate Mall’s parking lot in Chapel Hill, North Carolina, was inundated following a tropical storm.
Courtesy of Antonia Sebastian/Helena Garcia

Antonia Sebastian, an assistant professor in the Department of Geology at UNC-Chapel Hill, noted that the study aspires to establish a “comprehensive record of past floods” by gathering data on relatively minor flood events that impact homeowners yet do not receive widespread media attention.

“Areas that have previously flooded are likely to flood again; it’s only a matter of time,” Sebastian remarked. “These sites are inherently dangerous.”

Chad Berginnis, executive director of the Association of State Floodplain Managers, who was not involved in the study, stated that FEMA's maps are intended to implement the NFIP, using 100-year floodplains to delineate zones where flood insurance coverage is required. He emphasized that historical flood research highlights the necessity for flood insurance in areas beyond the mapped FEMA flood hazard zones.

“FEMA’s flood map serves as a starting point, not an endpoint,” Berginnis clarified. “The maps present one kind of flood risk, but to fully grasp flood risk, it is crucial to seek out historical flood data.”

He mentioned that the database models illustrated in the study could be adopted in various other regions. A community manager indicated that identifying past flood sites could enhance community awareness regarding flood risks.

“When we consider the history of floods, [like those along the Guadalupe River], it emphasizes the dangers that have persisted over time,” they reflected. “I truly believe that these areas should be acknowledged as perilous, yet awareness of the actual flooding risks is often lacking.”

June Choi, a doctoral student in Earth System Science at Stanford University who was not involved in the study, noted that the finding that many flooded buildings sit outside FEMA-designated hazard areas likely applies to all states. She highlighted that while new databases may have limitations regarding changing building densities over time, the use of historical records and address-specific data still provides valuable insights for assessing future flood risks.

Source: www.nbcnews.com

France imposes a 250 million euro fine on Google for breaching intellectual property agreements

French regulators have fined Google €250m (£213m) for breaching agreements with media companies regarding online content reproduction.

The competition watchdog in France announced the fine on Wednesday, citing violations related to intellectual property rules concerning news media publishers. Regulators also raised concerns about Google’s AI services.

According to authorities, Google's AI-powered chatbot Bard (now called Gemini) was trained on content from publishers and news agencies without their knowledge. This action contributed to the fine.

The watchdog stated that the fine was for Google's failure to fulfill commitments it made in 2022, accusing the company of not negotiating in good faith with news publishers over compensation for the use of their content.

As part of the settlement process, Google has agreed not to dispute the facts and is proposing corrective measures to address the shortcomings highlighted by the authorities.

The EU created “neighboring rights” copyright to enable print media to claim compensation for their content usage. France has been at the forefront of this issue, enacting laws to protect publishing rights against tech giants that monetize news content without sharing revenue.

The recent fine is a result of a copyright dispute in France over online content, initiated by complaints from leading news organizations and the news agency Agence France-Presse (AFP) in 2019.

Google's AI chatbot Bard was criticized for using content from media outlets without permission, undermining fair negotiations between publishers and Google.

Amid efforts to protect content from being scraped by AI services without consent, Google has faced fines for failing to adhere to its commitments and to fair negotiation practices with publishers.

Google responded by highlighting its agreements with over 280 French news publishers under the European Copyright Directive. The company stated its commitment to constructive cooperation with publishers to connect people with quality content sustainably.

The statement emphasized the need for collaborative efforts with publishers but also expressed challenges in navigating regulatory changes. Despite the fine, Google aims to move forward positively in the content ecosystem.

Source: www.theguardian.com