USC study focuses on hate speech and online extremism


Researchers at the USC Dornsife College of Letters, Arts and Sciences had theorized that a high degree of overlap in moral concerns within online communities is associated with increased radical intent and extremism – that is, a greater willingness to engage in illegal or violent political action.

In a study published this week in Social Psychological and Personality Science, they found that the level of shared moral concerns, or “moral convergence,” within an online cluster predicts the amount of hate speech posted by its members.

“Our research team examined how morality motivates people to behave in a variety of ways, from donating during a disaster to taking extreme measures, even violence, to protect their group,” said the study’s lead author, Mohammad Atari, who recently defended his doctoral dissertation in the Department of Psychology at USC Dornsife and is now a postdoctoral fellow at Harvard University. “They feel that others are doing something morally wrong and that it is their sacred duty to do something about it, even if that means posting hate speech and committing hate crimes.”

Scientists first analyzed posts on an alternative social media network called Gab, which is popular with alt-right and far-right extremists. The platform, which promotes itself as a champion of free expression and does not moderate hate speech, provided researchers with a unique opportunity to examine the dynamics that can lead to radicalization.

They found that Gab users who shared a similar moral profile with their immediate group – meaning they held similar values and thought alike about basic moral concerns such as care, fairness, loyalty, purity, and authority – were more likely to spread hate speech and to use language designed to dehumanize members of the outgroup or even incite violence against them.
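The article does not say exactly how this similarity was computed. As a rough, purely illustrative sketch (in Python, with an invented follower graph and made-up scores), one could represent each user’s moral profile as a five-dimensional vector over those foundations and measure convergence as the cosine similarity with the average profile of the user’s immediate neighbors:

```python
# Hypothetical sketch: quantify "moral convergence" as the cosine similarity
# between a user's moral-foundation profile and the mean profile of their
# immediate neighbors. The study's actual metric may differ; all numbers
# and the graph below are invented for illustration.
import numpy as np

FOUNDATIONS = ["care", "fairness", "loyalty", "authority", "purity"]

# Per-user scores on the five foundations (e.g., averaged over posts).
profiles = {
    "u1": np.array([0.1, 0.2, 0.9, 0.8, 0.7]),
    "u2": np.array([0.2, 0.1, 0.8, 0.9, 0.6]),
    "u3": np.array([0.9, 0.8, 0.1, 0.2, 0.1]),
}
# Toy follower graph: who each user's immediate group is.
neighbors = {"u1": ["u2", "u3"], "u2": ["u1"], "u3": ["u1"]}

def moral_convergence(user):
    """Cosine similarity between a user's profile and the mean neighbor profile."""
    me = profiles[user]
    group = np.mean([profiles[n] for n in neighbors[user]], axis=0)
    return float(me @ group / (np.linalg.norm(me) * np.linalg.norm(group)))

for u in profiles:
    print(u, round(moral_convergence(u), 3))
```

Under this kind of operationalization, u1 and u2 (whose profiles closely match) would score as highly morally convergent, while u3 would not.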

Social media extremism linked to shared values and morals

The researchers replicated the Gab study’s observations by examining another extremist network, this one on the online community Reddit. They analyzed a subreddit called “Incels” – populated by involuntarily celibate men who blame women for their inability to find sexual partners – and found that those who were more morally like-minded produced more hateful, misogynistic speech.

A few years ago, scientists from USC and other institutions jointly developed a model for detecting moralized language. It builds on an earlier deep learning framework that can reliably identify texts expressing moral concerns tied to different types of moral values and their opposites. Those values, as defined by Moral Foundations Theory, center on care/harm, fairness/cheating, loyalty/betrayal, authority/subversion, and purity/degradation.
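The article does not describe the framework’s architecture. Purely as an illustration of the task, here is a minimal sketch that scores text against the five foundations with a toy multi-label classifier; the corpus, labels, and model choice are all hypothetical and are not the USC team’s actual deep learning system:

```python
# Minimal, hypothetical sketch of moralized-language detection.
# NOT the USC team's actual model; it only illustrates the general idea of
# scoring text for each moral foundation, using a tiny invented corpus.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier
from sklearn.preprocessing import MultiLabelBinarizer

FOUNDATIONS = ["care", "fairness", "loyalty", "authority", "purity"]

# Toy training data; a real system would train on thousands of annotated posts.
texts = [
    "We must protect the vulnerable from harm.",          # care
    "Everyone deserves an equal share.",                  # fairness
    "Stand by your own people and never betray them.",    # loyalty
    "Respect the chain of command and obey the law.",     # authority
    "Keep the body and the soul pure and undefiled.",     # purity
]
labels = [["care"], ["fairness"], ["loyalty"], ["authority"], ["purity"]]

mlb = MultiLabelBinarizer(classes=FOUNDATIONS)
y = mlb.fit_transform(labels)

vec = TfidfVectorizer().fit(texts)
model = OneVsRestClassifier(LogisticRegression(max_iter=1000))
model.fit(vec.transform(texts), y)

# Probability that a new post invokes each foundation.
probs = model.predict_proba(vec.transform(["Defend our people at all costs."]))
print(dict(zip(FOUNDATIONS, probs[0].round(2))))
```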

Moral Foundations Theory is a social and cultural psychological theory that explains the evolutionary origins of human moral intuitions as innate, gut-level feelings rather than products of logical reasoning.

Morality connects us and gives our society structure and direction … But morality also has a dark side.

Morteza Dehghani, USC Dornsife Associate Professor of Psychology and Computer Science

“Morality connects us and gives our society structure and direction: caring for those in need and a vision of a just and prosperous future for the group. But morality also has a dark side, because extreme forms of it can lead to the opposite of many of these positive principles,” said Morteza Dehghani, associate professor of psychology and computer science. He heads USC Dornsife’s Computational Social Science Lab, where he and others examine how morality is intertwined with prejudice and hatred.

Social media platforms help generate extremism and enable extremists to find each other and, as Dehghani describes it, “nourish each other’s worldviews and anger against the outgroup.”

Experimental studies further show the role of morality in online extremism

In three controlled experimental studies, the research team also showed that when people believe that others in their hypothetical or real-world group share their views on moral issues, their radical intent to protect the group at all costs, even through violent means, increases. When participants in the U.S.-based study were led to believe that other Americans shared their moral views, they became more willing to “fight and die” for their country and the values it represents.

“These results underscore the role of moral convergence and family ties in radicalization and emphasize the need for a diversity of moral worldviews within social networks,” said Atari.

But that’s easier said than done, he admitted. More research is needed to identify the most effective interventions for bringing diverse views into online communities, which may be key to stopping radicalization.

#StoptheSteal had its roots in online radicalization

The real-life threat of online radicalization was recently illustrated by the storming of the U.S. Capitol on January 6. Those who believed the 2020 presidential election had been stolen from former President Donald Trump organized online under the hashtag #StoptheSteal, with Facebook and Gab serving as hubs for organizing the insurrection.

When people are motivated by morals, regardless of their political affiliation, it clouds their judgment.

Mohammad Atari, lead author of the study

These radicalization studies were already under way before the January 6 insurrection. Even so, Atari said, the events of that day further motivated the research team in its effort to understand online radicalization.

He added that identifying as conservative or liberal doesn’t necessarily predict who is prone to radicalization. “When people are motivated by morals, regardless of their political affiliation, it clouds their judgment,” said Atari.

USC researchers across many disciplines examine political polarization and radicalization – how they start and how they can be mitigated.


The study was funded by National Science Foundation CAREER award BCS-1846531.
