As a gunman terrorized a supermarket in Buffalo, New York over the weekend, the violence was, once again, broadcast live online. The latest mass shooting in the United States left 10 dead and raised familiar questions about the responsibility social media companies bear, both in amplifying extremism that can lead to violence and in disseminating deeply disturbing footage of these incidents after the fact.
The 18-year-old suspect, currently in custody, has been linked to a manifesto and Discord posts in which he describes his radicalization on 4chan, praises the mosque shooter in Christchurch, New Zealand, and lays out his plan to kill Black residents of Buffalo. The shooter live-streamed the violence on Twitch, opting for the platform over Facebook because it doesn’t require viewers to log in, according to documents reviewed by TechCrunch.
It’s impossible to say whether the ability to livestream a mass shooting could inspire someone to commit violence they otherwise wouldn’t have, but technology offers extremists an audience – and an archive of their actions that has a life far beyond the initial horror. These gruesome legacies can persist, inspiring others to commit similar acts of violence.
Social media platforms have grappled with mass shootings for years, leveraging the usual combination of AI and human moderators, but these systems can still fail to stop viral content from spreading and multiplying when it matters most.
The live stream of Saturday’s shooting was deleted by Twitch within minutes, but the footage had already been copied and uploaded elsewhere. Versions of the video circulated widely on Facebook, where some users who flagged it reported that the social network told them the content did not violate its rules. A widely shared clip was uploaded to the video hosting site Streamable, where it was viewed more than three million times before the company took it down for violating its terms of service. Facebook did not respond to a question from TechCrunch about why its moderation systems allowed the video to stay up.
While the alleged shooter declared his interest in streaming on mainstream social media sites, he described spending the most time on 4chan, an online forum known for its near-total lack of content moderation, where extremist views find a comfortable home. He also spent time documenting his plans on a private Discord server, again raising difficult questions about where platforms should draw the line in moderating private spaces.
In an interview with NPR, New York Governor Kathy Hochul called on social media companies to monitor content more aggressively to intercept extremists. Hochul proposed a “trigger system” that would alert law enforcement when social media users express a desire to harm others.
“Everything is telegraphed. This was written in a manifesto which was posted on social media platforms. The information was there,” Hochul said. “They need to put algorithms in place that will identify that information the second it is released so that it can be tracked down by law enforcement authorities. They have the resources to do it. They need to own this, because otherwise this virus will continue to spread.”
But in the case of the Buffalo mass shooting, the suspect’s plans were privately shared on a messaging app and openly posted on a website known for refusing to moderate any content it is not legally required to delete.
And, as many people have pointed out in the wake of the Buffalo tragedy, the “virus” described by Hochul is already here. The alleged shooter was inspired to action by an ideology known as the “great replacement,” once a fringe belief embraced by avowed white supremacists, which stokes racist fears about non-white populations becoming majorities in countries like the United States.
With these ideas now as easy to find on cable news or in Congress as on 4chan, no algorithm can deliver us from the violence they inspire.