Several news outlets have removed articles by freelance journalists suspected of submitting AI-generated content.
On Thursday, Press Gazette reported that at least six publications, including Wired and Business Insider, have taken down articles from their platforms after it was revealed that pieces written under the name Margaux Blanchard were AI-generated.
In May, Wired published an article titled “I fell in love playing Minecraft. The game became a wedding venue.” The piece was later removed, with an editor’s note stating that “after further review, the Wired editorial team determined that this article did not meet editorial standards.”
According to Press Gazette, which reviewed the Wired article, the piece quoted a “Jessica Hu,” described as a Chicago-based officiant. However, neither Press Gazette nor The Guardian was able to verify Hu’s identity.
Press Gazette further reported that in April, Business Insider published two essays by Blanchard, one of which discussed the complexities of remote work for parents. After Press Gazette raised concerns about the author’s credibility, the outlet removed the essays, replacing them with a note that read, “This story has been deleted because it did not meet Business Insider standards.”
A Business Insider representative confirmed the removals in a comment to The Guardian.
In an article of its own, Wired acknowledged the lapse, writing, “If anyone can catch an AI con artist, it’s Wired. Unfortunately, we’ve encountered this issue.”
Wired further explained that one of its editors received a pitch about the “rise of niche internet weddings” that had “all the signs of a great Wired story.”
After initial discussions about framing and payment, the editor assigned the story, which was published on May 7.
However, it soon became evident that the writer was unable to provide the details needed for payment processing. The outlet noted that she insisted on being paid via PayPal or check.
Subsequent investigations revealed the story was fabricated.
In the Thursday article, one Wired editor wrote, “I made an error here. This story did not undergo a proper fact-checking procedure or receive a top edit from a senior editor. I acted promptly upon discovering the issue to prevent future occurrences.”
Press Gazette reported that Jacob Philady, editor of a new magazine named Dispatch, was the first to warn of fraudulent activity related to Blanchard’s work. He said earlier this month that he had received a pitch from Blanchard claiming that “Gravemont, a decommissioned mining town in Colorado, has been repurposed as one of the world’s most secretive training grounds for death investigations.”
In the pitch shared with Press Gazette, Blanchard stated, “I want to tell the story of a scientist, a former cop, and a former miner who now deal with the deceased daily. I explore ethical dilemmas using real individuals in staged environments, not as mourners but as true archivists.”
She asserted, “I’m the right person for this because I’ve previously reported on concealed training sites, have contacts in forensic circles, and know how to navigate sensitive, closed communities with empathy and discretion.”
Philady informed Press Gazette that the pitch sounded AI-generated, and he could not find any information about Gravemont. The Guardian was also unable to confirm the details regarding the dubious town.
When questioned about how she learned of the town, Blanchard replied, “I’m not surprised you couldn’t find much. Gravemont doesn’t promote itself. I first heard of it in passing from a retired forensic pathologist I interviewed.”
She continued, “Over the following months, I further pieced the story together by requesting public records, speaking with former trainees, and sifting through forensic association meeting materials, none of which were mentioned in print.
“This is a location that exists in the collective memory of the industry, but remains under the radar enough to avoid extensive coverage, which is precisely why I believe it resonates with interested readers,” Blanchard added.
Philady told Press Gazette that although the pitch seemed “very convincing,” he suspected she was bluffing. He asked Blanchard for the public records she had cited, along with her standard rates and how long she would need in the field.
In response, Blanchard ignored Philady’s request for the public records, saying instead that she would “ideally spend five to seven days on location” and would require around $670 in payment.
Last Friday, Philady confronted Blanchard by email, warning that he would publish a story about the fabricated pitch if she did not respond. Press Gazette further reported that Blanchard did not reply to his request for evidence of her identity.
The fabricated articles follow an earlier incident in May, when the Chicago Sun-Times ran a section containing a fake reading list produced by AI.
Marcob Scalia, a journalist working with King Features Syndicate, used AI to create the list, saying, “It was silly; 100% my fault. I merely republished this list generated by an AI program… usually, I ensure that everything is sourced and vetted appropriately. I definitely fell short of that task.”
Meanwhile, in June, the Utah Court of Appeals sanctioned an attorney after finding that they had used ChatGPT to cite a non-existent case.
Source: www.theguardian.com
