Facebook funds research on spread of hate speech
Dr. Jeff Damasco, Science Writer, Illinois Physics
Like so many Americans, Bradlyn could not believe his eyes when he saw footage of the Unite the Right rally in Charlottesville, Virginia, back in August 2017. Richard Spencer, the American neo-Nazi who coined the term “alt-right” to describe the white-supremacist movement, was a featured speaker at the rally and led the riotous march through the University of Virginia campus. Spencer is president of the National Policy Institute (NPI), a white-supremacist think tank promoting racial cleansing and the establishment of a “white empire.” During the rally and march, white supremacists carried Confederate battle flags or flags emblazoned with the Nazi swastika and chanted their ideology, “You will not replace us!” and “Blood and soil!” Many of the protesters resorted to violence when face-to-face with counter-protesters. Some of the counter-protesters met violence with violence—but not Heather Heyer, a counter-protester who practiced nonviolent resistance.
During the rally, white supremacist James Alex Fields Jr. deliberately drove his car into a crowd of counter-protesters, injuring at least 19 and fatally injuring Heather Heyer. Fields was later convicted of first-degree murder, five counts of aggravated malicious wounding, three counts of malicious wounding, and one count of hit and run; he is now serving a life-plus-419-year prison sentence. The arresting officer described his car as “splattered” with blood and flesh.
Heyer was an avid advocate for equality among all people. Her last Facebook post before dying in that fateful rally was, “If you’re not outraged, you’re not paying attention.”
The Unite the Right rallying cry had also been amplified on Facebook and other social-media platforms. But since that fateful day, Facebook and several other social-media platforms have actively worked to minimize hate speech.
In May 2019, representatives from Facebook, Twitter, and Alphabet (the parent company of Google) hosted a summit in Paris with leaders from the United Kingdom, Canada, France, New Zealand, and other nations, making a commitment to develop and implement rules, algorithms, and direct-intervention policies to curb the uploading, promotion, amplification, and distribution of violent extremism on social-media platforms. The initiative calls for hate speech to be immediately and permanently taken down. Most recently, Facebook banned self-described alt-right extremists from its platform. Some point out that this “deplatforming” also comes at a cost: it pushes white supremacists to segregated social-media sites, not tempered by the voices of everyone else.
By 2016, Bradlyn, Blackburn, and colleagues had started looking deeply at hate speech on social media. They noted that research on hate speech in social media had focused primarily on improving the accuracy of labeling, but had not analyzed its spread and propagation as a function of time. In their research, the team applied tools commonly used in the field of physics—namely machine learning and mathematical and statistical analyses—to trace the origination and propagation of racist language and white-supremacy ideology.
Bradlyn and his collaborators recently released a series of articles, quantitative studies of how white-supremacist communities on Twitter, Reddit, Gab, and 4chan have spread hate and misinformation via digital messages and memes. Memes are humorous images, gifs, or text that are copied, often with slight variations, and spread rapidly by Internet users. The alt-right, whose humor relies on a shared ideology of hate, anti-Semitism, and racism, has its own brand of memes, like the Happy Merchant, a meme depicting a caricature of a Jewish man based on stereotypical negative attributes.
In one of their recent articles, “A Quantitative Approach to Understanding Online Antisemitism,” Bradlyn and his collaborators track the propagation of hate speech as a function of time and demonstrate how the spread of memes labeled as hate speech is associated with major events outside of the internet. The researchers found that communities like 4chan’s /pol/ exert a surprisingly large influence on mainstream communities, despite their relatively small user base. Studying the forum posts and memes generated by fringe communities that self-identify as alt-right, the researchers performed change-point analyses to ascertain the amount of influence particular fringe communities have on social media.
Change-point analysis starts from the basic idea that on any given day, the number of posts on a particular topic is expected to approximately follow a particular pattern of mean and variance—a statistically normal distribution. By finding the days with the most significant changes to the distribution’s mean and variance, change points can be identified. Change-point analyses revealed how significant change points in the use of words like “white” on social media were associated with the election of President Trump in 2016 and the Unite the Right rally of August 2017, and how change points in words like “Jew” were associated with the start of the partial travel ban and Israeli Prime Minister Benjamin Netanyahu’s attack on a peace conference, both in 2017.
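The idea behind change-point detection can be illustrated with a minimal sketch (not the researchers’ actual method or data): given a series of daily post counts, try every candidate split point and pick the one that best divides the series into two segments with different means, measured by total squared error. The toy counts below are invented for illustration.

```python
from statistics import mean

def sse(xs):
    """Sum of squared deviations from a segment's own mean."""
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs)

def find_change_point(counts, min_seg=3):
    """Return the index that best splits the series into two segments
    with different means (minimum combined squared error), or None if
    no split beats treating the series as a single segment."""
    best_t, best_cost = None, sse(counts)
    for t in range(min_seg, len(counts) - min_seg):
        cost = sse(counts[:t]) + sse(counts[t:])
        if cost < best_cost:
            best_t, best_cost = t, cost
    return best_t

# Toy daily counts: baseline ~10 posts/day, jumping to ~30 after day 7
counts = [9, 11, 10, 12, 8, 10, 11, 29, 31, 30, 28, 32, 30, 29]
print(find_change_point(counts))  # → 7
```

Real analyses test whether a candidate change is statistically significant and search for multiple change points, but the core step—locating where the distribution’s parameters shift—is the same.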
To analyze the influence of fringe communities, the researchers used a Hawkes process model, which enabled them to identify the 4chan forum /pol/ as the primary source from which the Happy Merchant meme spread to communities on other social media, including Twitter, The_Donald subreddit, Gab (an extremist-friendly social media site), and others. A Hawkes process is a mathematical model of “self-exciting” events: occurrences that trigger more of the same occurrences shortly thereafter. Examples include earthquakes followed by aftershocks, market shifts that trigger stock trading, or mafia violence that invokes retaliations—the occurrence of these events increases the probability of near-future events.
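Self-excitation can be made concrete with a small simulation (a generic illustration, not the researchers’ model). In a standard univariate Hawkes process, the event rate is a baseline plus an exponentially decaying bump contributed by each past event; the sketch below simulates one using Ogata’s thinning algorithm, with parameter values chosen arbitrarily for illustration.

```python
import math
import random

def simulate_hawkes(mu, alpha, beta, horizon, seed=0):
    """Simulate a univariate Hawkes process by Ogata's thinning.
    Intensity: lambda(t) = mu + sum over past events s of
    alpha * exp(-beta * (t - s)), so each event raises the rate
    and events arrive in bursts rather than uniformly."""
    random.seed(seed)
    events, t = [], 0.0
    while t < horizon:
        # Intensity only decays until the next event, so its current
        # value is a valid upper bound for the thinning step.
        lam_bar = mu + sum(alpha * math.exp(-beta * (t - s)) for s in events)
        t += random.expovariate(lam_bar)
        if t >= horizon:
            break
        lam_t = mu + sum(alpha * math.exp(-beta * (t - s)) for s in events)
        if random.random() <= lam_t / lam_bar:
            events.append(t)  # accepted event; it excites future ones
    return events

# alpha/beta < 1 keeps the process stable (each event spawns
# fewer than one expected "offspring" event on average)
bursty = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, horizon=50)
print(len(bursty), "events over the horizon")
```

In the researchers’ setting the model is multivariate: each community is its own process, and events in one community (a meme posted on /pol/) can raise the event rate in others (the same meme appearing on Twitter or Gab).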
For a given collection of Hawkes processes, each unique process may excite others, in addition to itself. The measure of this ability to excite other processes is called a weight. To demonstrate that /pol/ was the primary source of the spread of hateful memes, the researchers showed that the weights describing /pol/ as the source community surpassed other weights describing other social media platforms as the source.
Bradlyn notes, “Within the context of our model, this shows that /pol/ has a surprisingly large influence on the propagation of antisemitic memes through the mainstream internet.”
The research by Bradlyn and Blackburn demonstrates how hate speech is seeded and catalyzed across social media, information that’s vital to developing the most effective and responsive anti-hate policies. In their recent article, the scientists pose questions regarding one method for reducing hate speech: deplatforming. Deplatforming is the practice of banning controversial individuals or groups from a platform. In May 2019, Facebook banned Alex Jones, a conspiracy theorist and founder of Infowars; Louis Farrakhan, a black nationalist minister with anti-Semitic views; and Milo Yiannopoulos, a British political commentator and former editor for Breitbart News (https://www.nytimes.com/2019/05/02/technology/facebook-alex-jones-louis-farrakhan-ban.html).
But, Bradlyn and Blackburn point out, deplatforming hateful voices may actually engender new white-supremacist digital communities.
“We have to see what other communities pop up,” Bradlyn notes. “New fringe communities certainly will not advertise on mainstream social media—not that they could if they were deplatformed. It’s an open question how deplatforming will affect the dissemination of hateful speech and memes on mainstream platforms.”
Blackburn adds, “Deplatforming seems like a great idea on the surface, but it might actually make the problem worse by strengthening communities’ conspiratorial ideology, and also by pushing them towards decentralized platforms that are harder to monitor and contain.”
On July 4th, 2019, Gab migrated to Mastodon, a decentralized social networking service.
The Facebook research award will allow Bradlyn and Blackburn to dig deeper into this important socio-technological issue.
Illinois Physics Professor Barry Bradlyn studies viscosity in quantum Hall systems, topological photonic crystals, and other topological systems. He received his bachelor’s degree from MIT in 2009 and his doctoral degree from Yale in 2015. He completed his postdoctoral work at the Princeton Center for Theoretical Science, studying topological insulators and Weyl semimetals.
Alongside his theoretical condensed matter research, Bradlyn has developed an expertise in the propagation of hate speech in fringe communities and its spread to mainstream communities on social media.
During his time at Princeton, Bradlyn met one of his collaborators in social-media research, Jeremy Blackburn, a professor of computer science formerly at the University of Alabama at Birmingham and now newly appointed to the faculty of the State University of New York at Binghamton. Blackburn also studies toxic behavior in popular online video games.
Blackburn introduced Bradlyn to two collaborations of researchers working on socio-technological issues. The first, the Network Contagion Research Institute, works to expose hate speech on digital social networks. Its co-founder and director, Joel Finkelstein, a psychologist and neuroscientist, has collaborated with Bradlyn and Blackburn on past social-media research projects. The second is the iDRAMA Lab (the International Data-driven Research for Advanced Modeling and Analysis Lab), through which Bradlyn was introduced to Savvas Zannettou, a Ph.D. candidate at Cyprus University of Technology studying the relationship between technology and people.