
Real-world events trigger online hate towards unrelated groups, study finds | Online abuse

Real-world events such as murders and political protests can trigger an increase in online hate speech directed at seemingly unrelated groups. According to the researchers, this finding could help online moderators better predict when hateful content is most likely to be posted and what they should be looking for.

Previous research has linked offline events to later spikes in hate speech and violent hate crime, but these studies have largely focused on moderated platforms, such as Twitter and Facebook (whose parent company is now Meta), which have policies to identify and remove such content.

To better understand the triggers, and the relationship between mainstream platforms and less moderated ones, Professor Yonatan Lupu of George Washington University in Washington DC and his colleagues used a machine learning tool to examine conversations among users of 1,150 online hate communities, published between June 2019 and December 2020. Some of these communities were on Facebook, Instagram and VKontakte; others were on the less moderated platforms Gab, Telegram and 4chan.

The study, which was published in PLOS ONE, found that offline events such as elections, assassinations and protests can trigger huge spikes in online hate speech activity.

Sometimes there was a direct relationship between the event and the type of hateful content it triggered, but not always. The assassination of the Iranian general Qassem Suleimani in early 2020, for example, led to an increase in Islamophobic and antisemitic content in the following days.

The biggest spike in hate speech was linked to the murder of George Floyd and the Black Lives Matter protests it sparked. Race-related hate speech increased by 250% after these events, but there was also a more general wave of hate online.

“One interesting thing about this particular event is that the increase [in race-related hate speech] lasted,” Lupu said. “Even through the end of 2022, the frequency with which people use racist hate speech about these communities has not returned to what it was before the killing of George Floyd.

“The other interesting thing is that it also seemed to activate various other forms of hate speech online, where the connection to what’s happening offline isn’t as clear.”

For example, hate speech targeting gender identity and sexual orientation – a topic with little intuitive connection to murder and protests – increased by 75%. Sexist and antisemitic hate speech also increased, as did content related to nationalism and ethnicity.

The research was unable to prove causation, but its findings suggest a more complex relationship between triggering events and online hate speech than previously assumed.

One factor could be the extent of media coverage related to the events in question. “The volume and variety of online reactions to offline events depends, in part, on the visibility of those events in other media,” Lupu said.

He suspects, however, that this is not the only factor. “We can’t say for sure, but I think there’s something about the way hate is being constructed right now in English-speaking societies, so racism is kind of at the heart of it. When racism activates – if it activates strongly enough – then it spreads in all directions.”

Catriona Scholes, director of knowledge at anti-extremism technology company Moonshot, said she noticed a similar pattern related to anti-Semitic hate speech.

For example, protests against a planned storytime event in Columbus, Ohio, in December led to an increase in anti-LGBTQ+ hatred – as well as an increase in threats and hostility towards the Jewish community.

“It’s possible to harness this kind of data to move from a reactive to a proactive approach in protecting individuals and communities,” Scholes said.

Lupu said content moderation teams on mainstream platforms should monitor fringe platforms for emerging trends. “What happens on 4chan doesn’t stay on 4chan. If they talk about something on 4chan, it will happen on Facebook. It also suggests that content moderation teams should think about what’s going on in the news, and what it might trigger, to try to prepare their response.”

A particularly important question for future research is what other types of offline events are likely to be followed by broad, indiscriminate cascades of online hate, he said.

The Guardian
