On Oct. 7, the day Hamas attacked Israel, the hashtag #HitlerWasRight appeared on X, the platform formerly known as Twitter. Over the next month, more than 46,000 posts featured the hashtag, often alongside language that called for violence against Jews.
At the same time, the hashtag #DeathtoMuslims also spiked on X and was shared tens of thousands of times, according to a review by The New York Times.
Antisemitic and Islamophobic hate speech has surged across the internet since the conflict between Israel and Hamas broke out. Academics and researchers who monitor social media say the increases far exceed anything they have seen before, with millions of often explicitly violent posts on X, Facebook, Instagram and TikTok.
Antisemitic content soared more than 919 percent on X and 28 percent on Facebook in the month since Oct. 7, according to the Anti-Defamation League, a Jewish advocacy group. Anti-Muslim hate speech on X jumped 422 percent on Oct. 7 and Oct. 8, and rose 297 percent over the next five days, said the Institute for Strategic Dialogue, a London-based political advocacy group.
On fringe platforms like 4chan, Gab and BitChute, antisemitic and Islamophobic content rose nearly 500 percent in the 48 hours after Oct. 7, according to the Global Project Against Hate and Extremism, a nonprofit that tracks hate speech and extremism. And the surge has been global, with antisemitic posts also widely shared on state-backed social platforms in China.
The outpouring has been driven both by deep-seated emotions over the violence and by extremists looking to further their own agendas, said researchers who study social media. In far-right messaging groups online, users have actively discussed the war as an opportunity to indoctrinate far-left activists into antisemitism, according to messages viewed by The Times. Russia, Iraq and Iran have also spread antisemitic messages alongside misinformation about the war.
“Hate actors have leaped at the chance to hijack social media platforms to broadcast their bigotry and mobilize real-world violence against Jews and Muslims, heaping even more pain into the world,” said Imran Ahmed, director of the Center for Countering Digital Hate, which monitors social media for hate speech.
The discourse online has created a climate of fear and intimidation that may have influenced tense confrontations and violence in the real world, researchers warned, adding that causation can be difficult to prove. In the United States, Europe and Canada, the authorities have documented numerous acts of violence against Jews, Muslims and their places of worship in recent weeks.
Some of the antisemitic and anti-Islam posts have been shared and liked hundreds of thousands of times, even though they appear to violate the rules of social media platforms, many of which ban hate speech.
The content has been most prominent on X, according to the Anti-Defamation League and other researchers. An analysis by the Anti-Defamation League of 162,958 posts on X and 15,476 posts on Facebook from Sept. 30 to Oct. 13 found that the surge in antisemitic content on X far exceeded the one on Facebook. Nearly two million posts with the hashtag #IsraeliNewNazism appeared on X in that period, and another 40,000 posts featured the hashtag #ZionistsAreEvil or #ZionistsAreNazis.
More than 46,000 posts with the hashtag #HitlerWasRight also appeared over the last month on X, according to Memetica, a digital investigations firm. In previous months, the hashtag appeared fewer than 5,000 times a month. Two other hashtags — #DeathtotheJews and #DeathtoJews — showed up more than 51,000 times in the last month, compared with 2,000 the month before.
The hashtag #LevelGaza appeared nearly 3,000 times on X in the week after the Oct. 7 attacks, up from fewer than a dozen in September, Memetica also found. There were also thousands of posts on the platform with the hashtags #MuslimPig and #KillMuslims.
Other sites, including TikTok and Facebook, have also experienced surges in hate speech but have removed the content that was flagged to them, researchers said. The hate speech that remained was often more veiled, such as a TikTok trend of using “Austrian painter” as code for Adolf Hitler.
A TikTok spokeswoman said that the “Austrian painter” videos violated the app’s policies and that videos with the hashtag were removed after The Times brought them to the company’s attention. From Oct. 7 to Oct. 13, she added, TikTok took down 730,000 videos for violating hate speech rules.
X did not respond to a request for comment. Meta, which owns Facebook, Instagram and WhatsApp, referred to a blog post on how the company is enforcing its policies against hate speech.
Messaging apps such as Telegram have also been used to seed hate speech in the conflict. On Oct. 7, a Hamas-linked Telegram channel shared an image of a paraglider descending with a Palestinian flag and the words “I stand with Palestine.” The image referred to the Hamas gunmen who used paragliders to enter the Nova music festival in Israel, where more than 260 people were killed in the Oct. 7 attacks.
Within 24 hours, the image was shared thousands of times on X, Instagram, Facebook and TikTok, according to ActiveFence, a cybersecurity company that advises social media platforms. Underneath some of the posts on Facebook and Instagram were comments such as “they should have killed more” and “kill more Jews.”
By Oct. 9, a group called NatSoc Florida had created a T-shirt with the image, according to ActiveFence. The image soon spread to 4chan and later appeared in variations with Pepe the Frog, a cartoon character that has been appropriated by white supremacists.
The meme quickly spread through organizations that were primed to embrace antisemitic or racist causes, including those not directly involved in the conflict between Israel and Gaza, said Noam Schwartz, ActiveFence’s chief executive.
“The meme is very, very good,” he said. “It’s a terrible thing to say, but it’s recognizable, like an icon.”
Telegram did not respond to a request for comment.
On several far-right Telegram channels and on 4chan, some users have recently discussed the war as an opportunity to spread antisemitic sentiment to people who are normally ideological opposites. One Telegram channel included instructions for far-right users who espouse antisemitism to post sympathetically about the deaths of Palestinians in Gaza to draw in left-wing activists.
“Once you get them there, blame the Jews,” one person wrote.
Adi Cohen, the chief operating officer of Memetica, said the rise in antisemitic posts reflected a convergence of goals by far-right and far-left activists.
“Some of them explicitly say this is an opportunity to gloat and celebrate the killing of Jews online,” he said. “They are trying to lure an audience to their content, and this is a huge growth moment for them.”
Sheera Frenkel is a reporter based in the San Francisco Bay Area, covering the ways technology affects everyday lives with a focus on social media companies, including Facebook, Instagram, Twitter, TikTok, YouTube, Telegram and WhatsApp.
Steven Lee Myers covers misinformation for The Times. He has worked in Washington, Moscow, Baghdad and Beijing, where he contributed to the articles that won the Pulitzer Prize for public service in 2021. He is also the author of “The New Tsar: The Rise and Reign of Vladimir Putin.”