Emotional Misinformation - The Interplay of Emotion and Misinformation Spreading on Social Media
This project investigated how emotions and group identities shape attitudes toward misinformation and its spread on social media. Emotions capture attention and drive sharing, while beliefs are strongly anchored in the social groups we identify with. Because emotional and identity-driven dynamics dominate social media, we asked how they interact with believing and sharing misinformation. We examined whether emotions increase susceptibility, whether misinformation triggers emotions, and whether psychological or algorithmic interventions that address emotions or group identity can curb misinformation. To explore these questions, we used online experiments, two large Twitter datasets containing links to trustworthy and untrustworthy news, and computer simulations of politicians’ news-sharing behavior.
Our Twitter data showed that anger is frequent in discussions of political news, and even more frequent around untrustworthy news. Yet only about 5% of posts linking to news point to untrustworthy sources, and a small minority of users accounts for most of that traffic. Compared to trustworthy news sources, misinformation had small but causal effects on anger in online discussions, and it increased retweets but decreased likes and comments. However, in our experiments, emotions did not necessarily make people gullible. Instead, participants stuck to their own views, which are strongly linked to the groups they identify with. Anger arose when a headline clashed with those views: most readers correctly recognized false claims and felt angry because they disagreed with them.
Group identity-based psychological interventions proved effective at changing attitudes. When we reminded people that trust in complementary and alternative medicine is historically tied to right-extremist groups, their attitudes shifted – a relevant result because such trust is linked to vaccine skepticism. Finally, our analysis of news sharing between politicians shows that both in- and outgroup mentions, as well as anger, increase the probability of retweeting, while sadness decreases it – similarly for trustworthy and untrustworthy news. A computer simulation based on these results closely reproduced the real-world sharing dynamics between politicians on Twitter, making it possible to test algorithmic changes in the future.
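The sharing dynamic described above – in- and outgroup mentions and anger raising the probability of a retweet, sadness lowering it – can be sketched as a simple logistic simulation. This is a minimal illustration of the idea, not the study’s actual model: all coefficient values, function names and feature encodings here are hypothetical assumptions chosen only to reproduce the reported directions of effect.

```python
import math
import random

# Hypothetical coefficients (NOT the study's fitted estimates), chosen so that
# in-/outgroup mentions and anger raise retweet probability and sadness lowers it.
BETA = {"intercept": -2.0, "ingroup": 0.8, "outgroup": 0.6,
        "anger": 0.5, "sadness": -0.7}

def retweet_probability(features):
    """Logistic probability that a politician retweets a post.

    `features` maps feature names (e.g. "anger") to values such as
    0/1 indicators or emotion intensities.
    """
    z = BETA["intercept"] + sum(BETA[name] * value
                                for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

def simulate_retweets(posts, seed=0):
    """Count simulated retweets over a list of post feature dicts."""
    rng = random.Random(seed)
    return sum(rng.random() < retweet_probability(post) for post in posts)
```

Under these illustrative coefficients, an angry post mentioning the outgroup is retweeted more often than an otherwise identical sad one; running `simulate_retweets` over many such posts yields aggregate cascades whose relative sizes follow the reported effect directions.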
Converging with evidence from recent studies, our results call for a strategic shift: worry less about misinformation and invest more in making high-quality journalism visible, appealing and trustworthy. Most people are not misinformed but uninformed and skeptical of new information; emotions usually serve a social purpose rather than making people irrational. Because the vast majority neither believes nor shares false claims, interventions should focus on politicians, media outlets and the small set of “super-spreader” accounts that drive most misinformation. These actors are harder to reach but have a much larger influence on misperceptions and polarization. Misinformation is mainly a symptom of polarization – one tool opposing groups use against each other. It will persist as long as societies remain divided, and lasting progress therefore depends on broader reforms: fair political representation, reduced elite polarization and power struggles, and genuinely trustworthy media and institutions.