Keith Burghardt ’08
GDS Communications Staff

Analyzing Online Hate Speech and Extremism

Georgetown Days Magazine, Spring 2024

Keith Burghardt ’08 seeks to answer questions at the heart of a polarized world: What makes people join online extremist groups and, just as importantly, what might convince them to leave?

Keith, a computer scientist at the University of Southern California Information Sciences Institute since 2018, creates AI tools that analyze human behavior and social networks. By analyzing messages on Reddit and X (formerly Twitter), he studies extremism and misinformation, including how people become radicalized online and how that affects their actions offline. He also examines how they might be encouraged to stop spreading extremist content, including racist or homophobic comments.

Keith said GDS fostered his love of research, while his father, John Burghardt, helped spark his interest in STEM. (John taught English at GDS for 49 years before retiring in June.)

Georgetown Days spoke with Keith about his current research. The interview has been edited for clarity and length.

WHAT WAS IT LIKE AS A STEM KID GROWING UP WITH AN ENGLISH TEACHER DAD?

My dad is very math-focused as well. For example, he led a philosophy club at GDS, and he would raise questions such as how to interpret deep math-based philosophy like Gödel’s incompleteness theorems or [the theory of] relativity. It was through my dad that I got interested in STEM, although I wanted to become anything from an archeologist to an engineer. My dad, as an English teacher, also put a lot of emphasis on making sure I could talk about complex things in layman’s terms and that I could write well; for many years I was anything but a good writer. I only later appreciated how critical these skills are in STEM.

HOW DID YOU GET INTERESTED IN EXTREMISM AND HATE SPEECH?

It really started in 2020 with the incredible anti-vaccine backlash [early in the COVID-19 pandemic]. There’s been a lot of research into why people become anti-vaccine. I worked with other researchers to create an algorithm to predict which social media users would become anti-vaccine in the future. We found this algorithm could make surprisingly accurate predictions up to a year before people shared anti-vaccine posts. After that, we started to focus more on users at the periphery of mainstream social networks, especially people who are more likely to believe extremist viewpoints. This finally led us to study questions like: What drives people to join hate groups? What is the effect of joining? And are users recruited to a harmful online group, or are they self-motivated to join?

WHY DO PEOPLE JOIN ONLINE HATE GROUPS? 

That’s a question we’re only beginning to understand. In my research on why people become anti-vaccine, we saw people who share such posts are angrier, and they tend to believe in conspiracy theories. We suspect these intrinsic behaviors may play a role in why people also join extremist groups, such as hate groups.

Conspiracy theories drive some of the oldest types of hate, including antisemitism. Other theories can also help explain these decisions. For example, the uncertainty-identity theory holds that uncertainty about what to do or think can drive people to join these groups. The rise of Nazism in Germany is such a case study, as Hitler came into power during an economic depression. While these are risk factors, we don’t know which factors are key to the very first steps along the path toward extremism, such as joining online hate groups, or how these online groups act as gateways to offline extremism, like domestic terrorism. There is some really strong evidence, however, that social media can be a driver of terrorism. So, the short answer is, I don’t know. The long answer is, there could be a few different causes, and more research is critical to test them.

WHY MIGHT SOCIAL MEDIA BE SO HOSPITABLE TO HATE SPEECH?

I think it’s the low barrier to entry, as well as the anonymity. We might expect that people who are truly hateful and are really out to hurt someone are relatively few and scattered. And most people avoid hate speech in real life due to obvious consequences. Online groups, where a lot of this hate speech spreads, are substantially easier to join than offline groups like the KKK, and you can connect with, or antagonize, anyone in the world anonymously, without as many social repercussions.

HOW HOPEFUL ARE YOU THAT WE CAN CURTAIL ONLINE HATE?

Some people are much easier to deradicalize than others. There’s evidence that countering hate speech—even just saying that what people are doing is wrong—can be effective, as can moderating social media. Making it difficult or socially costly to form these hate groups seems to be a potentially useful strategy. For example, when Reddit simply banned hateful forums, hate speech on Reddit dropped dramatically. The ability to talk in these groups makes people more hateful than they would otherwise be. Even if it’s difficult to deradicalize individuals, you can at least keep them at a lower level of hate with the proper tools.

YOU ANALYZED MESSAGES ON X AFTER ELON MUSK ACQUIRED IT IN OCTOBER 2022. WHAT DID YOU FIND CAUSED A SURGE IN HATE SPEECH?

We found hate speech roughly doubled in the month after Elon Musk bought Twitter. But we saw an increase in hate speech even before then. My suspicion is that this increase was driven largely by people believing that they could get away with it. Why hate speech continued to be high for at least the several weeks we studied it could be due to a lack of effective moderation. While we don’t completely know why it increased, I believe that more moderation can help reduce hate on the site, especially given recent posts I’ve read.

HOW DOES YOUR GDS EDUCATION AFFECT YOU TODAY? 

GDS really influenced how I appreciate math and physics. For example, Andy Lipps, who retired recently, didn’t just teach the required math for AP Calculus but went into depth about its history and its various fields. This sparked my interest in math. I was also influenced by Dr. Kevin Cornell, who taught physics. He was instrumental in my decision to pursue a degree in physics after high school. I was also affected by friends from diverse backgrounds and by my father, who at one time served as a GDS diversity co-coordinator. Both my dad and my friends gave me a greater awareness of the harmful experiences many students faced, including what it meant to be living in an entire system that wasn’t built for you.

HOW DOES GDS INFORM WHAT YOU DO NOW?

Kevin Cornell probably informed me the most. He told me, “You can use simple math to understand the world around you.” This idea really stuck with me. I can trace a direct line from his classes to my decision to get a physics PhD [at the University of Maryland]. While my main research has moved away from physics, I still write physics papers on the science of cities. I was also a high school student at GDS when social media became important. What I noticed, both good and bad, informed the research I do to this day regarding the benefits of social media, as well as its risks. 

WHAT ARE YOU MOST EXCITED TO RESEARCH NEXT?

There’s been work understanding the harms of users joining extremist groups and the potential risk factors for people joining them. But how to reduce extremism and keep people away from these groups in the long run is really unexplored. When we think about extremism, it’s intuitive to ask what drove people to commit those actions. Who would believe in things that are so outlandish that they would harm other citizens to perpetuate their false beliefs? Why would they believe that? We want to find out how to drive people away from extremism, potentially with the help of new AI tools.
