As political polarization deepens in the United States, the language people use to discuss politics online increasingly reflects an exaggerated black-and-white mindset. A recent analysis of millions of social media posts found that markers of distorted thinking, along with political extremism, increased between the 2016 and 2020 presidential elections. The study, published in Communications Psychology, highlights the growing overlap between extreme ideological views and the rigid thinking patterns often treated in psychotherapy.
Psychologists use the term cognitive distortion to describe thought patterns in which “individuals think about themselves, their future, and the world in inaccurate and overly negative ways.” These habits include overgeneralizing, catastrophizing, and viewing situations in absolute terms. For example, if a high school student fails one test and immediately decides that their entire academic future is ruined, they are catastrophizing. If a person assumes that a co-worker who ignores them in the hallway did so out of malice rather than distraction, that person is mind reading.
In clinical practice, mental health professionals target these distortions through treatments such as cognitive behavioral therapy. Recognizing and adjusting these stubborn beliefs can help patients break negative mental habits. By teaching individuals to replace absolutist thoughts with objective facts, therapists help patients manage emotional disorders such as depression and anxiety.
The way people express their extreme political views often reflects this same psychological habit. Partisans may apply blanket negative labels to political opponents. They can also make dire and unfounded predictions about the future of the country if a particular candidate wins the election. To investigate whether these two phenomena are related, a team of researchers analyzed voters’ digital language habits.
Andy Edinger, a researcher at Indiana University, led the study along with colleagues from Indiana University and the City University of New York. They wanted to understand what it actually meant to think from a polarized perspective. The research team sought to determine whether psychological concepts used in clinical therapy can help explain the growing ideological divide observed in recent public discussions.
To conduct their analysis, Edinger and colleagues examined a large collection of posts from the social media platform Twitter (now known as X). The data included messages discussing presidential candidates in the weeks leading up to the 2016 and 2020 elections. The researchers focused specifically on a core group of about 100,000 users who actively posted during both election cycles. This shared group allowed the team to track how individual people’s behavior changed over a four-year period.
To measure distorted thinking, the researchers applied a specialized linguistic analysis tool developed independently by mental health experts. This tool scans text for a recognized dictionary of 241 specific word sequences. These phrases serve as markers for different types of cognitive distortions. By aggregating how often these phrases appeared in users’ posts, the team calculated an overall prevalence score for distorted language.
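As a rough illustration of how such a marker-based score might work, the sketch below checks each post against a fixed phrase list and reports what fraction of posts contain at least one marker. The phrases here are invented stand-ins, not the study's actual tool or its 241 expert-curated n-grams.

```python
# Hypothetical distortion-marker phrases (illustrative only; the real
# dictionary contains 241 n-grams curated by mental health experts).
DISTORTION_MARKERS = [
    "will never",       # catastrophizing / fortune-telling
    "everyone knows",   # overgeneralization
    "they want us to",  # mind reading
]

def distortion_prevalence(posts):
    """Fraction of posts containing at least one distortion marker."""
    if not posts:
        return 0.0
    flagged = sum(
        any(marker in post.lower() for marker in DISTORTION_MARKERS)
        for post in posts
    )
    return flagged / len(posts)

posts = [
    "This country will never recover if they win.",
    "Had a nice walk in the park today.",
    "Everyone knows what they want us to believe.",
]
score = distortion_prevalence(posts)  # 2 of 3 posts contain a marker
```

A real implementation would tokenize the text and match n-grams at word boundaries rather than using raw substring search, but the aggregation step is the same: a per-user or per-group prevalence score over a window of posts.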
The team also needed a mathematical method to measure political ideology and polarization. They achieved this by analyzing users’ social networks and the content they choose to share. The researchers estimated both users’ political leanings and degree of ideological extremism by mapping which political influencers individuals consistently echoed through retweets.
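The idea of inferring ideology from retweet behavior can be sketched as follows, with invented influencer accounts and ideology scores on a left (-1) to right (+1) scale. The study's actual network-based estimation is more involved; this is only the core intuition: a user's ideology is the average position of the accounts they echo, and extremism is its distance from the center.

```python
# Hypothetical influencer ideology scores (illustrative, not from the study).
INFLUENCER_IDEOLOGY = {
    "@left_pundit": -0.8,
    "@center_news": 0.0,
    "@right_pundit": 0.9,
}

def estimate_ideology(retweeted_accounts):
    """Return (ideology, extremism) from a user's retweet history."""
    scores = [INFLUENCER_IDEOLOGY[a] for a in retweeted_accounts
              if a in INFLUENCER_IDEOLOGY]
    if not scores:
        return None, None
    ideology = sum(scores) / len(scores)   # signed left/right position
    extremism = abs(ideology)              # distance from the center
    return ideology, extremism

# A user who mostly echoes a right-leaning influencer:
ideology, extremism = estimate_ideology(
    ["@right_pundit", "@right_pundit", "@center_news"]
)
```

Tracking this score for the same users in 2016 and 2020 is what lets the researchers relate changes in extremism to changes in distorted language.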
The results showed widespread and significant changes in the way social media users communicate. Between 2016 and 2020, the average prevalence of distorted language across user groups increased by more than 43%. Looking at changes within individual accounts, the average user saw a 76% increase in the frequency of posts containing at least one cognitive distortion marker.
This escalation was not limited to one or two bad mental habits. The increase persisted across all categories of cognitive distortions measured in the study. The biggest spikes appeared in emotional reasoning, overgeneralization, catastrophizing, and mind reading. In the context of political debate, mind reading often takes the form of assuming that opposing voters harbor secret malicious intentions.
The researchers then looked at how these changes were related to political ideology. They showed that users who became more politically isolated and radicalized between 2016 and 2020 were also more likely to use distorted language. As political polarization has intensified, so has the reliance on rigid and exaggerated language.
The data revealed somewhat different patterns for users at different ends of the political spectrum. Among left-leaning individuals, there was a consistent relationship between the level of ideological extremism and the use of distorted language. As their views became more polarized over the four-year period, their use of cognitive distortions rose in step.
Right-leaning users showed a different trajectory. In 2016, users on the political right already had higher baseline rates of distorted language than users on the left. Because they started from a higher point, the continued slide toward extreme polarization had a less pronounced effect on their likelihood of using cognitive distortions in 2020. The researchers’ model suggests this may reflect a ceiling effect: right-leaning users’ language was already highly distorted at the outset, leaving less room for further increase.
The team also examined the timeline of these changes to see which behaviors typically emerge first. They found that people who frequently used distorted language in 2016 were more likely to be politically polarized by 2020. In contrast, being highly polarized in 2016 was not a strong predictor of adopting new distorted language in 2020. This dynamic demonstrates how rigid, black-and-white thinking patterns do not simply reflect ideological divisions, but can actively foster ideological divisions over time.
Although the timeline suggests a directional relationship, the authors note that their findings are strictly correlational. Observational data from a single social media platform cannot confirm that cognitive distortions cause political polarization. Additionally, changes to the platform’s own moderation policies and algorithms over the four years between elections may have influenced what types of content were promoted or suppressed.
The researchers also make clear that their analysis does not mean politically vocal people experience clinical mental health disorders. The study does not diagnose depression or anxiety, but rather measures specific styles of communication and thinking. The relationship between actual mental illness and political behavior remains an open area of research.
Despite these limitations, the study provides a new way to view social rifts. The authors point to existing theory suggesting that today’s digital environment may inadvertently teach people to internalize the very thought patterns that therapists work to undo. If society at large is, in effect, practicing cognitive behavioral therapy in reverse, the consequences could extend beyond individual distress to threaten broader democratic institutions.
Going forward, the team hopes to incorporate data from a wider variety of digital platforms. Exploring these thought patterns in different online environments could help confirm the long-term trends seen in this study. Recognizing these psychological habits in public discussions may ultimately help develop targeted interventions to reduce hostility in online spaces.
The study, “Cognitive distortions are associated with increased political polarization,” was authored by Andy Edinger, Johan Bollen, Hernan A. Makse, and Matteo Serafino.

