Get Paid for Every Post: Scientists Reveal How to Monetize Your Online Content

You could be paid for your online posts, if they are used to train AI.

According to Dr. Margaret Mitchell, the chief ethics scientist at Hugging Face, an open-source AI company, there is a pressing need for AI firms to trace AI-generated content back to its original creators.

“Many creators—including artists, writers, and everyday users—are losing out on compensation for their contributions,” she stated during her talk at AI Everything in Cairo, Egypt.

“I envision a future where we can truly identify the sources of input that make AI outputs possible and adequately reward them.”

Generative AI heavily relies on certain creators more than others. Some AI-generated works exhibit distinct links between input and output, such as a recognizable writing style or an artist’s signature.

Recently, renowned Japanese animator and film director Hayao Miyazaki criticized AI-generated images that mimic the unique style of his Studio Ghibli films.

But the issue extends beyond musicians and artists: large language models (LLMs) like ChatGPT and Google Gemini are trained on vast swathes of online content.

AI companies frequently “harvest” data from the internet to train their LLMs. – Credit: Getty

“We are all creators,” Mitchell emphasized, as reported by BBC Science Focus. Reward models, she argued, should recognize contributions from all online users, whether a poem or a vacation sunset photo taken five years ago.

Fortunately, there are emerging models that can track the relationship between input and output, rewarding creators based on their contributions.

However, such a system is not yet in place, and existing AI business models hinder the funding required to develop it, Mitchell said, although some AI companies are exploring potential solutions.

For instance, in a 2021 document recently made public by court order, Anthropic CEO Dario Amodei proposed a “crazy idea” for a reward-distribution model akin to the monetization platform Patreon.

Mitchell noted that existing LLMs could implement known technological strategies to facilitate this model. Clustering algorithms, for example, could help track similarities and attribute authorship.
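To make the clustering idea concrete, here is a purely illustrative toy sketch — it is not any system described by Mitchell or Hugging Face. It scores an AI output against candidate source texts using bag-of-words cosine similarity, the kind of low-level similarity signal a clustering-based attribution pipeline might build on; all names and texts below are invented for the example.

```python
# Toy attribution sketch (illustrative only): score an AI output against
# candidate creators' texts with bag-of-words cosine similarity.
from collections import Counter
from math import sqrt

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def attribute(output: str, sources: dict[str, str]) -> str:
    """Return the candidate source whose word distribution is closest to the output."""
    out_vec = Counter(output.lower().split())
    scores = {name: cosine(out_vec, Counter(text.lower().split()))
              for name, text in sources.items()}
    return max(scores, key=scores.get)

# Hypothetical creators and texts, for illustration only.
sources = {
    "poet": "the sea at dusk folds silver light into quiet waves",
    "blogger": "today we review the best budget laptops for students",
}
print(attribute("silver waves fold quiet light at dusk", sources))  # -> poet
```

A real attribution system would operate on learned embeddings rather than raw word counts, but the shape of the problem — matching outputs back to the nearest contributing inputs — is the same.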

To maintain user privacy, this model would require consent, allowing users to opt in for their data to be linked to their identity (with compensation) or remain anonymous.

“To foster innovation, we need to pave the way for this kind of research,” Mitchell asserted. “Currently, the path forward is completely closed.”


Source: www.sciencefocus.com

Alabama Paid Millions to Law Firms to Defend Its Prisons: AI-Generated Fake Citations Uncovered

Frankie Johnson, an inmate at William E. Donaldson Prison near Birmingham, Alabama, reports being stabbed approximately 20 times within a year and a half.

In December 2019, Johnson claimed he was stabbed “at least nine times” in his housing unit. Then, in March 2020, after a group therapy session, officers handcuffed him to a desk and exited the unit. Shortly afterward, another inmate came in and stabbed him five times.

In November that same year, Johnson alleged that an officer handcuffed him and transported him to the prison yard, where another prisoner assaulted him with an ice pick and stabbed him “five or six times,” all while two corrections officers looked on. Johnson contended that one officer even encouraged the attack as retaliation for a prior conflict between him and the staff.

In 2021, Johnson filed a lawsuit against Alabama prison officials, citing unsafe conditions marked by violence, understaffing, overcrowding, and significant corruption within the state’s prison system. To defend the lawsuit, the Alabama Attorney General’s office has engaged outside law firms, including Butler Snow, which have received substantial payments from the state to defend its troubled prison system.

State officials have praised Butler Snow for its experience defending prison-related cases, particularly William Lansford, the head of its constitutional and civil rights litigation group. However, the firm now faces sanctions from the federal judge overseeing Johnson’s case, after its lawyers cited cases fabricated by artificial intelligence.

This is just one of several cases reflecting the issue of attorneys using AI-generated information in formal legal documents. A database that tracks such occurrences has noted 106 identified instances globally, where courts have encountered “AI hallucinations” in submitted materials.

Last year, lawyers were suspended for a year from practicing in Florida’s Middle District after it was found that they had cited cases fabricated by AI. Earlier this month, a federal judge in California ordered a firm to pay over $30,000 in legal fees for filings that included erroneous AI-generated research.

During a hearing in Birmingham on Wednesday regarding Johnson’s case, U.S. District Judge Anna Manasco mentioned that she was contemplating various sanctions, such as fines, mandatory legal education, referrals to licensing bodies, and temporary suspensions.

She noted that existing disciplinary measures across the country have often been insufficient. “This case demonstrates that current sanctions are inadequate,” she remarked to Johnson’s attorney. “If they were sufficient, we wouldn’t be here.”

During the hearing, attorneys from Butler Snow expressed their apologies and stated they would accept any sanctions deemed appropriate by Manasco. They also highlighted their firm policy that mandates attorneys seek approval before employing AI tools for legal research.

Reeves, an attorney involved, took full responsibility for the lapses.

“I was aware of the restrictions concerning [AI] usage, and in these two instances, I failed to adhere to the policy,” Reeves stated.

Butler Snow’s lawyers were appointed by the Alabama Attorney General’s Office and work on behalf of the state to defend Jefferson Dunn, a former commissioner of the Alabama Department of Corrections.

Lansford, who is lead counsel on the case, said the firm has begun reviewing all of its previous filings to ensure no other erroneous citations exist.

“This situation is still very new and raw,” Lansford conveyed to Manasco. “We are still working to perfect our response.”

Manasco indicated that Butler Snow would have 10 days to file a motion outlining their approach to resolving this issue before she decides on sanctions.

The fabricated AI citations surfaced amid a dispute over case scheduling.

Lawyers from Butler Snow reached out to Johnson’s attorneys to arrange a deposition for Johnson while he remains incarcerated. However, Johnson’s lawyers objected to the proposed timeline, citing outstanding documents that Johnson deemed necessary before he could proceed.

In a court filing dated May 7, Butler Snow countered that case law necessitates a rapid deposition for Johnson. “The 11th Circuit and the District Court typically allow depositions for imprisoned plaintiffs when relevant to their claims or defenses, irrespective of other discovery disputes,” they asserted.

The lawyers listed four cases that superficially supported their arguments, but all turned out to be fabricated.

While some of the fabricated titles resembled real cases, none were relevant to the matter at hand. One, for instance, was cited as a 2021 case titled Kelly v. Birmingham; Johnson’s attorneys noted, however, that the only existing case they could identify with a similar name, Kelly v. City of Birmingham, was unrelated.

Earlier this week, Johnson’s lawyers filed a motion highlighting the fabrications, asserting they were creations of “generative artificial intelligence.” They also identified another clearly fictitious citation in prior submissions related to the discovery dispute.

The following day, Manasco scheduled a hearing on whether Butler Snow’s lawyers should be sanctioned. “Given the severity of the allegations, the court conducted an independent review of each citation submitted, but found nothing to support them,” she wrote.

In his declaration to the court, Reeves indicated he was reviewing filings drafted by junior colleagues and included a citation he presumed was a well-established point of law.

“I was generally familiar with ChatGPT,” Reeves mentioned, explaining that he sought assistance to bolster the legal arguments needed for the motion. However, he admitted he “rushed to finalize and submit the motions” and “did not independently verify the case citations provided by ChatGPT through Westlaw or PACER before their inclusion.”

“I truly regret this lapse in judgment and diligence,” Reeves expressed. “I accept full responsibility.”

Damien Charlotin, a Paris-based legal researcher and academic who tracks such cases, says incidents of false AI content entering legal filings are on the rise.

“We’re witnessing a rapid increase,” he stated. “The number of cases over the past weeks and months has spiked compared to earlier periods.”

Thus far, the judicial response to this issue has been quite lenient, according to Charlotin. More severe repercussions, including substantial fines and suspensions, typically arise when lawyers fail to take responsibility for their mistakes.

“I don’t believe this will continue indefinitely,” Charlotin predicted. “Eventually, everyone will be held accountable.”

In addition to the Johnson case, Lansford and Butler Snow have contracts with the Alabama Department of Corrections to handle several large civil rights lawsuits, including one filed by the Justice Department in 2020, during Donald Trump’s presidency.

The contract for that matter was valued at $15 million over two years.

Some Alabama legislators have questioned the significant amount of state funds allocated to law firms for defending these cases. However, this week’s missteps have not appeared to diminish the Attorney General’s confidence in Lansford or Butler Snow to continue their work.

On Wednesday, Manasco asked the attorney from the Attorney General’s office present at the hearing whether that confidence still stood.

“Mr. Lansford remains the Attorney General’s preferred counsel,” he replied.

Source: www.theguardian.com

Cybercrime: Record $1.1 billion paid in ransom by hacking victims last year

Ransomware gangs experienced a resurgence last year, with victims paying $1.1 billion to hackers, a record high according to a study.

Following a lull in 2022, cybercriminals intensified operations in 2023, targeting hospitals, schools, and major corporations worldwide.

Chainalysis, a cryptocurrency research firm, reported that ransom payments roughly doubled from 2022, when $567 million was paid out.

The report highlighted the “big game hunting” aspect of attacks last year, with a higher proportion of ransom payments exceeding $1 million as wealthier companies were targeted.

“2023 marked a major resurgence of ransomware, with record payout amounts and a significant increase in the scope and complexity of attacks, a significant reversal from the decline observed in 2022,” Chainalysis said.

In a typical ransomware attack, hackers infiltrate a target’s computer system, infect it with malware, and encrypt files, rendering them inaccessible. A newer trend sees attackers also exfiltrate data, such as staff and customer details, and demand payment both to unlock the files and to delete the stolen copies.

Chainalysis attributed the decline in payments in 2022 partly to Russia’s invasion of Ukraine: most ransomware groups are linked to Russia, Eastern Europe, and the former Soviet Union, and some were disrupted by the conflict or pivoted from ransomware to politically motivated cyberattacks.

The FBI disrupted the Hive ransomware group by obtaining its decryption keys, saving victims an estimated $130 million in ransom payments. Chainalysis also cited research showing a rise over the past year in the number of attackers and ransomware variants involved in attacks.

“The main thing we’re seeing is an astronomical increase in the number of attackers conducting ransomware attacks,” said Allan Liska, an analyst at cybersecurity firm Recorded Future.

According to Recorded Future, 538 new ransomware variants appeared in 2023, indicating the emergence of new, independent groups. The Clop group emerged as a key player last year by claiming responsibility for the hack of payroll provider Zellis, which affected customers including British Airways, Boots, and the BBC.

The British Library is still recovering from a ransomware attack carried out in October by the rebranded group Rhysida.

Two other trends have emerged: the growth of ‘ransomware-as-a-service’, in which malware is rented to criminals in exchange for a share of the profits, and the activity of ‘initial access brokers’, who sell access to vulnerabilities in potential targets’ networks to ransomware attackers.

Ellie Ludlum, a partner specializing in cybersecurity at British law firm Pinsent Masons, anticipates the rise in attacks to continue. “This increase is expected to continue in 2024, with continued focus on mass data exfiltration by threat actor groups, which may result in increased ransom payments by affected companies,” she stated.

Source: www.theguardian.com