There is a vast ecosystem of toxic online content.
By Chloe Reichel / 10.30.2018
Shortly before the mass shooting in a Pittsburgh synagogue on Oct. 27, which killed 11 people and injured six, police say the suspected shooter posted anti-Semitic content on the social network Gab, a haven for extremists that has billed itself as “the home of free speech online.” Launched in 2016, the site was taken offline Oct. 29 after being banned by its hosting service, Joyent.
A flurry of media coverage focused on the silencing of the site and its uncertain future.
But to focus on the ban misses the point, according to Benjamin T. Decker, a research fellow investigating disinformation and media manipulation at Harvard Kennedy School’s Shorenstein Center for Media, Politics and Public Policy and a former research coordinator at Storyful.
Gab, which has over 300,000 users and 148,000 Twitter followers, and the man authorities identified as the gunman, Robert Bowers, are just two actors in a vast ecosystem of toxic online content, Decker said.
“I think that the biggest element missing from mainstream media reporting on Gab is the larger understanding of the different types of content on the site and the degrees to which toxic content and ideologies are shared and promoted,” he said. “These aren’t limited to rogue actors who commit heinous acts of violence in the real world. They are the social posts of thousands, if not tens of thousands, of users who create, engage with and share this type of content, both on Gab and then across to other social media networks.”
Decker offered the example of #HeroRobertBowers, a hashtag that was shared over 50 times on Gab over the weekend of the shooting and, at the time of this writing, three times on Twitter.
In other words, Decker sees the media’s focus on individual users or platforms as deemphasizing just how widespread these “alternative” ideologies are.
“If it’s not Gab, it’s going to be another platform that rises, because the sentiment and the ideas fostered by these communities are not going to be fazed by technology,” he said. “They’ll find a new space.” Some sites go underground to the so-called “dark web,” accessible only through certain applications and browsers or with prior authorization, he said.
Efforts are underway to identify and thwart toxic content online. Researchers are developing machine learning, natural language processing and other artificial intelligence strategies in pursuit of these goals.
But Decker said this ignores the bigger issue: “Technology may help to put a Band-Aid on the bleeding, but it doesn’t provide a holistic response to improve the overall state of our digital public square. The problems in our digital public square are inherently social problems.”
Decker said there are tens of thousands of posts that are not dissimilar to those made on Bowers’ Gab account in the days and hours before the shooting.
“At the end of the day, when we’re talking about a digital environment like Gab or another fringe media space, these are large portions of American internet users who are consuming this,” he said. “We need to be talking about that far more than we talk about how one user killed people. The fact that one user killed people can give us a lens into which we begin to tell that story to the public.”
He continued: “As journalists and researchers, we need to be aware of the malleability of conversations on the Internet. We need to recognize that some memes need to be taken seriously, and there’s not a lot that we can begin to do beyond talking to each other and establishing some basic parameters for how we report on these types of digital content.”
These parameters echo tips from Decker’s colleague, Cameron Hickey, who spoke with Journalist’s Resource in August about reporting on figures who have been “de-platformed,” or removed from social media sites.
- Consider the consequences of republishing hateful content.
“What’s most important for journalists to understand is the reach that they have through their audiences,” Decker said. “And by including a screenshot of an image or a screenshot of a meme, you have to understand that you are also contributing to the amplification of content. Or by using certain keywords like ‘alt-right,’ or ‘incel,’ or others, you’re actually contributing to the search engine optimization of narratives of hate. So it’s important to recognize where we can limit the further amplification of certain images, keywords, symbols or other dog whistles in order to pivot that narrative into something that is constructive and beneficial for society, rather than feeding the hate machine more.”
- Don’t jump to conclusions, especially not from isolated examples of users or sites. Consider their place within a complex, broad social sphere.
Decker said that posts attributed to Bowers’ Gab account link President Donald Trump with the actions of the government in a way that other fringe communities do not — often blaming rather than praising him.
“I wouldn’t say it necessarily separates him from the rest of these communities, but it is a particularly interesting case study to look further into, because the nuanced complexity is something that we haven’t really seen before,” Decker added. “So again, it’s important not to jump to conclusions. Take a step back and think about how these types of messages or use cases fit into a larger framework of understanding.”
Originally published by the Journalist’s Resource under the terms of a Creative Commons Attribution-NoDerivatives 4.0 International license.