The London-based artificial intelligence firm Stability AI argues that the copyright lawsuit brought by the global photography agency Getty Images poses an “obvious threat” to the generative AI industry.
On Monday in the London high court, Stability AI contested Getty’s claims of copyright and trademark infringement concerning Getty’s extensive collection of photographic works.
Stability’s technology lets users create images from text prompts; its directors include James Cameron, the acclaimed director of Avatar and Titanic. Getty, for its part, has criticized those training AI systems as “tech nerds” who disregard the ramifications of their technology.
Stability countered that Getty is pursuing a “fantasy” legal case, spending around £10m to fight a technology it regards as an “existential threat” to its business.
Getty syndicates the work of around 50,000 photographers to clients in more than 200 countries. It alleges that Stability trained its image generation models on a vast database of copyrighted photographs, and that as a result its program Stable Diffusion still produces images bearing Getty Images watermarks. Getty maintains that Stability is “completely indifferent” to the sources of its training data, asserting that the system associates Getty’s trademarks with pornography and generates “AI garbage.”
Getty’s legal representatives said the dispute over the unauthorized use of thousands of photographs, including well-known images of celebrities, politicians, and news events, “is not a conflict between creativity and technology where a victory for Getty Images spells the end for AI.”
They further stated: “The issue arises when AI companies like Stability wish to use these materials without compensation.”
Lindsay Lane KC, representing Getty Images, commented: “These were a group of tech enthusiasts excited about AI, yet indifferent to the challenges and dangers it poses.”
In its court filing on Monday, Getty contended that Stability had trained an image generation model on a database that included child sexual abuse material.
Stability is contesting Getty’s claims overall, with its attorney characterizing the allegations regarding child sexual abuse material as “abhorrent.”
A spokesperson for Stability AI stated that the company is dedicated to ensuring its technology is not misused. It emphasized the implementation of strong safeguards “to enhance safety standards and protect against malicious actors.”
The case comes amid a broader movement among artists, writers, and musicians — figures including Elton John and Dua Lipa — who are seeking copyright protection against alleged infringement by AI tools that allow users to produce new images, music, and text.
The UK Parliament is grappling with a related issue: the government has proposed that copyright holders should have to opt out of having their material used to train algorithms and generate AI content.
“Of course, Getty Images acknowledges that the entire AI sector can be a formidable force, but that does not justify permitting the AI models they are developing to blatantly infringe on their intellectual property rights,” Lane stated.
The trial is expected to span several weeks and will address, in part, the use of images by renowned photographers. This includes a photograph of former Liverpool soccer manager Jürgen Klopp, captured by award-winning British sports photographer Andrew Livesey, a photo of the Chicago Cubs baseball team by American sports photographer Gregory Shams, and images of actor and musician Donald Glover by Alberto Rodriguez, as well as photographs of actor Eric Dane and film director Christopher Nolan.
The case involves 78,000 pages of evidence, with AI experts from the University of California, Berkeley, and the University of Freiburg in Germany summoned to testify.
Source: www.theguardian.com