For Political Cartoonists, the Irony Was That Facebook Didn’t Recognize Irony

As Facebook has become more active at moderating political speech, it has had trouble dealing with satire.
Days later, Facebook sent Mr. Bors a message saying that it had removed “Boys Will Be Boys” from his Facebook page for “advocating violence” and that he was on probation for violating its content policies.
At the same time, misinformation researchers said, Facebook has had trouble identifying the slipperiest and subtlest of political content: satire. While satire and irony are common in everyday speech, the company’s artificial intelligence systems — and even its human moderators — can have difficulty distinguishing them. That’s because such discourse relies on nuance, implication, exaggeration and parody to make a point.
That means Facebook has sometimes misunderstood the intent of political cartoons, leading to takedowns. The company has acknowledged that some of the cartoons it expunged, including those from Mr. Bors, were removed by mistake, and it later reinstated them.
“If social media companies are going to take on the responsibility of finally regulating incitement, conspiracies and hate speech, then they are going to have to develop some literacy around satire,” Mr. Bors, 37, said in an interview.
Emerson T. Brooking, a resident fellow for the Atlantic Council who studies digital platforms, said Facebook “does not have a good answer for satire because a good answer doesn’t exist.” Satire shows the limits of a content moderation policy and may mean that a social media company needs to become more hands-on to identify that type of speech, he added.
In a statement, Facebook did not address whether it has trouble spotting satire. Instead, the company said it made room for satirical content, but only up to a point. Posts about hate groups and extremist content, it said, are allowed only if the posts clearly condemn or neutrally discuss them, because the risk of real-world harm is otherwise too great.
Facebook also created a process so that only verified buyers could purchase political ads, and instituted policies against hate speech to limit posts that contained anti-Semitic or white supremacist content.
Last year, Facebook said it had stopped more than 2.2 million political ad submissions that had not yet been verified and that targeted U.S. users. It also cracked down on the conspiracy group QAnon and the Proud Boys, removed vaccine misinformation, and displayed warnings on more than 150 million pieces of content viewed in the United States that third-party fact checkers debunked.
“At a point, I suspect Facebook got tired of this dance and adopted a more aggressive posture,” Mr. Brooking said.
Political cartoons that appeared in non-English-speaking countries and contained sociopolitical humor and irony specific to certain regions were also tricky for Facebook to handle, misinformation researchers said.
Mr. Hall said his intent was to draw an analogy between how Mr. Netanyahu was treating the U.S. representatives and Nazi Germany. Facebook took the cartoon down shortly after it was posted, saying it violated its standards on hate speech.
“If algorithms are making these decisions based solely upon words that pop up on a feed, then that is not a catalyst for fair or measured decisions when it comes to free speech,” Mr. Hall said.
Adam Zyglis, a nationally syndicated political cartoonist for The Buffalo News, was also caught in Facebook’s cross hairs.
After the storming of the Capitol in January, Mr. Zyglis drew a cartoon of Mr. Trump’s face on a sow’s body, with a number of Mr. Trump’s “supporters” shown as piglets wearing MAGA hats and carrying Confederate flags. The cartoon was a condemnation of how Mr. Trump had fed his supporters violent speech and hateful messaging, Mr. Zyglis said.
Facebook removed the cartoon for promoting violence. Mr. Zyglis guessed that was because one of the flags in the comic included the phrase “Hang Mike Pence,” which Mr. Trump’s supporters had chanted about the vice president during the riot. Another supporter piglet carried a noose, an item that was also present at the event.
“Those of us speaking truth to power are being caught in the net intended to capture hate speech,” Mr. Zyglis said.
The takedowns, which have resulted in “strikes” against his Facebook page, could upend that. If he accumulates more strikes, his page could be erased, which Mr. Bors said would cut off 60 percent of his readership.
“Removing someone from social media can end their career these days, so you need a process that distinguishes incitement of violence from a satire of these very groups doing the incitement,” he said.
Mr. Bors said he had also heard from the Proud Boys. A group of them recently organized on the messaging chat app Telegram to mass-report his critical cartoons to Facebook for violating the site’s community standards, he said.
“You just wake up and find you’re in danger of being shut down because white nationalists were triggered by your comic,” he said.
Facebook has sometimes recognized its errors and corrected them after he has made appeals, Mr. Bors said. But the back-and-forth and the potential for expulsion from the site have been frustrating and made him question his work, he said.
“Sometimes I do think about if a joke is worth it, or if it’s going to get us banned,” he said. “The problem with that is, where is the line on that kind of thinking? How will it affect my work in the long run?”
Category: Technology
Source: New York Times