Social media algorithms are not politically neutral and can actively shape an individual’s political opinions. Recent research published in the journal Nature presents evidence that turning on the algorithmic feed on the platform X shifted users’ political views to the right. Turning off the algorithm did not reverse this effect, suggesting that algorithms can leave a lasting footprint on people’s information environments.
X, formerly known as Twitter, is a leading platform for political news and public conversation. The platform offers two main ways of displaying content. A chronological feed simply shows posts from the accounts a user follows, in the order they were posted, most recent first.
An algorithmic feed, by contrast, recommends and ranks content according to complex mathematical rules. It surfaces posts from accounts the user does not follow and prioritizes items designed to keep users engaged, such as posts with the most likes and comments. The scientists behind the new study wanted to understand whether these customized feeds actually change the way people see the world.
Previous research on other social networks found that temporarily turning off feed algorithms does not change political attitudes. The researchers speculated that this may be because the initial exposure to the algorithm leaves a lasting mark on users’ behavior. They also wanted to conduct independent research without the direct involvement of technology companies; that independence allowed them to observe exactly how the algorithm affects real users on X.
“Feed algorithms determine what billions of people see on social media every day. Whether feed algorithms also shape people’s thoughts is one of the most important unanswered questions in the social sciences,” said study author Philine Widmer, assistant professor at the Paris School of Economics.
“A large-scale prior study conducted in collaboration with Meta during the 2020 US election found that turning off algorithms had no measurable impact on political attitudes. So perhaps algorithms don’t matter for politics? Our study suggests the situation is more nuanced.”
“Previous studies tested only one direction: turning off the algorithm for users who may have been exposed to it for years. But what happens when the algorithm is turned on? And can exposure to algorithmic content leave a lasting trace that persists even after the algorithm is removed? These questions had not been quantitatively tested.”
Rather than looking at general party loyalty, Widmer and her colleagues focused on specific policy preferences and views on current events. To test the algorithm’s influence, they conducted a seven-week field experiment in the summer of 2023.
They recruited 4,965 active X users based in the US. All participants completed an initial survey that collected baseline data on political affiliation, social media habits, and overall happiness. The study sample was diverse, but skewed toward a highly educated demographic.
Approximately 46% of participants identified as Democrats and 21% as Republicans. Participants were randomly assigned to use either the algorithmic feed or the chronological feed for the entire seven weeks, and they received monetary compensation for adhering to the assigned feed setting.
At the end of the study, participants completed a final survey measuring changes in their opinions on specific policies and current news events, including questions about the criminal investigation into Donald Trump and the ongoing war in Ukraine. The researchers also measured affective polarization, the extent to which people dislike or distrust those who belong to the opposing political party.
In addition to the surveys, the scientists collected data on the exact accounts participants chose to follow during the study. A subset of participants also installed a browser extension that let the researchers securely record the exact posts appearing in their feeds, without relying on the platform to share internal data.
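To make the data-collection idea concrete, the sketch below shows how a feed-logging content script of this general kind could work. It is a minimal, hypothetical illustration in TypeScript, not the study’s actual extension: the DOM selector (X renders timeline posts as article elements), the FeedPost fields, and the in-memory log are all assumptions made for the sake of the example.

```typescript
// Hypothetical content script for a feed-logging browser extension.
// Selectors and field names are illustrative guesses, not the study's code.

type FeedPost = {
  author: string;     // profile link of the posting account, if found
  text: string;       // visible text of the post
  observedAt: string; // ISO timestamp when the post entered the feed
};

const seen = new Set<string>(); // avoid recording the same post twice
const recorded: FeedPost[] = [];

function recordPost(article: Element): void {
  // X's timeline markup changes often; production code would need
  // sturdier selectors and error handling than this sketch.
  const text = article.textContent ?? "";
  if (text.length === 0 || seen.has(text)) return;
  seen.add(text);
  recorded.push({
    author:
      article.querySelector('a[role="link"]')?.getAttribute("href") ?? "unknown",
    text,
    observedAt: new Date().toISOString(),
  });
}

// Watch for posts rendered as the user scrolls the timeline.
const observer = new MutationObserver((mutations) => {
  for (const mutation of mutations) {
    for (const node of mutation.addedNodes) {
      if (node instanceof Element) {
        node.querySelectorAll("article").forEach(recordPost);
      }
    }
  }
});

observer.observe(document.body, { childList: true, subtree: true });
```

In a real study, the recorded posts would presumably be encrypted and uploaded to a research server with the participant’s consent rather than kept in memory, but the core mechanism, observing the rendered page instead of querying the platform’s servers, is what makes the approach independent of the company.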
Switching users from the chronological feed to the algorithmic feed increased overall engagement with the platform. For example, posts that appeared in algorithmic feeds received significantly more likes, reposts, and comments than posts that appeared in chronological feeds.
After exposure to this highly engaging feed, users’ political opinions measurably shifted in a conservative direction. Users who switched to the algorithmic feed began prioritizing issues favored by Republicans, such as immigration and inflation, over issues favored by Democrats.
These users also became more critical of the criminal investigation into Donald Trump that was making headlines at the time, tending to view it as unacceptable or contrary to the rule of law. Their views on the war in Ukraine also shifted toward a relatively pro-Kremlin position.
“The main point is that social media feed algorithms are not politically neutral,” Widmer told PsyPost. “In an experiment with US-based X users in the summer of 2023, turning on X’s algorithmic feed shifted their political opinions to the right.”
“In terms of standardized effect sizes, our estimates for political opinions range from about 0.08 to 0.12 standard deviations, which is a small effect by Cohen’s convention. However, it is noteworthy that these effects emerged after just seven weeks. People have been exposed to algorithmic feeds for years, whereas as researchers we rarely have the opportunity to observe participants for more than a few weeks; even in this relatively short period, we found a change in opinions about current politics.”
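For context on the numbers in that quote: a standardized effect size expresses a between-group difference in units of the outcome’s standard deviation. The article does not spell out the paper’s exact estimator, so the formula below is simply the textbook Cohen’s d definition, shown to make “0.08 to 0.12 standard deviations” concrete; the group labels are illustrative.

```latex
d = \frac{\bar{x}_{\text{algorithmic}} - \bar{x}_{\text{chronological}}}{s_{\text{pooled}}},
\qquad
s_{\text{pooled}} = \sqrt{\frac{(n_1 - 1)\,s_1^2 + (n_2 - 1)\,s_2^2}{n_1 + n_2 - 2}}
```

Under Cohen’s convention, a d of about 0.2 counts as small, so estimates of 0.08 to 0.12 are modest in magnitude, which is why the authors emphasize how quickly the shift appeared rather than how large it was.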
Browser data helped explain the mechanism behind this change. After analyzing the collected posts with language processing tools, the researchers found that X’s algorithm actively promoted conservative content and posts by political activists. Specifically, posts classified as conservative were more likely to appear in algorithmic feeds than posts classified as liberal.
At the same time, the algorithm buried or demoted posts from traditional news outlets: posts from news organizations appeared significantly less frequently in algorithmic feeds than in chronological feeds.
Surprisingly, switching from the algorithmic feed back to the chronological feed had little effect on political opinions.
“The most striking finding for us was the asymmetry,” Widmer said. “We expected the algorithm to have some effect on political attitudes, but we didn’t expect the effect to be so clearly unidirectional. Opinions changed when we switched the algorithm on, but they didn’t change back when we switched it off. This asymmetry had not been documented before.”
The researchers suggest that this one-sided effect is driven by newly followed accounts. The algorithm placed highly engaging right-wing activists directly in front of users, and people began following those specific profiles. Even when participants switched back to the chronological setting, their feeds still showed posts from the accounts they were now following.
Because users had started following new conservative activists while the algorithm was active, their chronological feeds remained filled with that perspective. The change in their daily information intake took hold, leaving a lasting imprint on what they saw every day.
“So the mechanism we propose is that users continue to follow accounts surfaced by the algorithm even when the algorithm is turned off,” Widmer told PsyPost. “In this way, the algorithm may leave a permanent footprint on the user’s information environment.”
“For readers, this means that what you see on social media is not a neutral reflection of the world or of the accounts you chose to follow. Algorithms actively shape your information diet, and those changes can become entrenched, meaning they are not easily reversed.”
As with all studies, there are limitations to consider. The experiment was conducted specifically on X, in the summer of 2023, among users based in the United States. “We don’t want readers to think that every algorithm on every platform produces the same effect, or that the same effect will always be found in different time periods or in different situations,” Widmer said. Different algorithms are built with different goals and may have different effects in other time periods or under different company ownership.
The study also focused on active users who log in regularly; the impact is likely smaller for people who use the platform only occasionally and see little content. Ideally, future studies would follow people for months or years to determine whether long-term exposure ultimately changes deeper political identities.
“This research was carried out completely independently of X, with funding from public research money (the Swiss National Science Foundation),” Widmer pointed out. “We did not work with the platform and had no access to its internal data.”
The study, “Political Implications of the X Feed Algorithm,” was authored by Germain Gauthier, Roland Hodler, Philine Widmer, and Ekaterina Zhuravskaya.

