Recent research published in the journal Computers in Human Behavior suggests that automated news chatbots programmed to present a balanced perspective can gain the trust of people across a variety of ideological backgrounds. The study provides evidence that people who strongly believe in conspiracy theories tend to respond well to such chatbots, seeing them as useful tools for reading diverse news. These findings point to new ways technology could help reduce social divisions by breaking through information bubbles and exposing people to multiple perspectives.
In recent years, generative artificial intelligence has changed the way people interact with information online. Generative artificial intelligence refers to computer systems that can process large amounts of text and generate human-like responses. News chatbots use similar technology to act as automated conversational agents: programs that let users browse topics and read real-time text summaries of news articles in a chat window.
The authors of the new study wanted to see if these chatbots could help solve a growing problem in the modern media environment. People often engage selectively, meaning they only click on news that aligns with their existing beliefs. Over time, this practice tends to create echo chambers and increase political and social polarization.
When people are exposed to only one side of a story, they often become defensive or ignore other points of view. The scientists wanted to know whether a neutral, automated chatbot could encourage people to step outside their comfort zone. They suspected that people might view machines as more objective than human journalists.
“People who believe in conspiracy theories tend to distrust mainstream media, believing it to be biased or agenda-driven,” says study author Shreya Dubey (@sdubey03), a postdoctoral fellow at the Amsterdam School of Communication Research at the University of Amsterdam.
“We wanted to test whether a chatbot that is perceived as more neutral than traditional news outlets would be better received by this group. We designed a chatbot that presented both mainstream and alternative news stories, and investigated whether people who believe in conspiracy theories would trust and use the chatbot more compared to those who do not hold such beliefs.”
Specifically, the scientists developed a custom chatbot named Infobot. The program was designed to show users eight different news headlines about climate change.
Four of the headlines represented mainstream scientific views supporting climate action. The other four represented alternative viewpoints, including arguments against climate action and theories that climate change is a hoax. Users could scroll through the headlines and click on any article to read a quick summary generated by the chatbot.
After a participant read a summary, the article disappeared and they were prompted to select another. The software tracked which articles users selected and how long they spent reading them. For the first study, the scientists recruited a sample of 177 adults living in the United States.
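The tracking behavior described above can be illustrated with a short sketch. This is a hypothetical reconstruction, not the authors' actual code: the class name, method names, and category labels are assumptions. The idea is simply to log which article a user opens and how long each summary stays on screen before the next selection.

```python
import time

MAINSTREAM, ALTERNATIVE = "mainstream", "alternative"

class InfobotLog:
    """Hypothetical logger for article selections and reading times."""

    def __init__(self):
        self.events = []   # completed reads: (article_id, category, seconds)
        self._open = None   # currently open summary: (article_id, category, opened_at)

    def open_article(self, article_id, category, now=None):
        # Opening a new summary implicitly closes the previous one,
        # mirroring how the chatbot replaced one summary with the next.
        now = time.monotonic() if now is None else now
        if self._open is not None:
            self.close_article(now)
        self._open = (article_id, category, now)

    def close_article(self, now=None):
        now = time.monotonic() if now is None else now
        article_id, category, opened_at = self._open
        self.events.append((article_id, category, now - opened_at))
        self._open = None

    def time_by_category(self):
        # Total reading time per category, e.g. to compare how long a user
        # dwelled on mainstream versus alternative summaries.
        totals = {MAINSTREAM: 0.0, ALTERNATIVE: 0.0}
        for _, category, seconds in self.events:
            totals[category] += seconds
        return totals

# Simulated session with explicit timestamps instead of the wall clock:
log = InfobotLog()
log.open_article(1, MAINSTREAM, now=0.0)
log.open_article(5, ALTERNATIVE, now=12.0)   # closes article 1 after 12 s
log.close_article(now=42.0)                  # article 5 read for 30 s
print(log.time_by_category())  # {'mainstream': 12.0, 'alternative': 30.0}
```

Logging dwell time per category is what would let the researchers compare, for each participant, time spent on mainstream versus alternative summaries.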
They divided these participants into two groups based on their responses to a questionnaire about common conspiracy theories. The final sample included 93 individuals with low general conspiracy theory beliefs and 84 individuals with high general conspiracy theory beliefs. Participants were instructed to interact with the Infobot and read summaries of at least four articles.
They then answered survey questions assessing the chatbot’s ease of use, usefulness, and potential risks. The survey also assessed overall trust in the program, general attitude toward it, and intention to use such a tool in the future. The data showed that participants who found the chatbot useful and trustworthy were more likely to hold positive attitudes toward it.
This positive attitude, in turn, predicted their intention to use the news chatbot again. Unexpectedly, the scientists found that people high in general conspiracy theory beliefs trusted the chatbot more than those low in such beliefs. The high-belief group also reported more positive attitudes and stronger intentions to use the program in the future.
Both groups read similar numbers of mainstream and alternative articles. However, the tracking data revealed that people with stronger conspiracy theory beliefs spent significantly less time reading mainstream summaries than alternative summaries. The researchers then noticed a potential flaw in their initial study.
They had grouped people by general conspiracy beliefs rather than by specific beliefs about climate change. In fact, the two groups did not differ significantly in their actual beliefs about anthropogenic global warming. To address this, the scientists conducted a second study.
For the second study, the researchers recruited 58 participants, this time screening specifically for beliefs about climate change. The sample included 35 individuals with low climate change conspiracy beliefs and 23 with high climate change conspiracy beliefs.
The procedure was almost the same as the first experiment. However, participants had to enter a special code from the chatbot to prove they had paid attention to the summary. The second study replicated the first study’s findings.
Again, trust and perceived usefulness predicted positive attitudes toward the chatbot. Participants with high conspiracy theory beliefs about climate change trusted it more and showed stronger intentions to use it than participants with low conspiracy beliefs. The scientists noted that both groups responded positively to the program overall, but the high-belief group was consistently more enthusiastic.
The researchers believe this may occur because people who strongly believe in conspiracy theories often feel that mainstream media is biased against them. Participants may have viewed the chatbot as a fair and unbiased source of information because it presented alternative views on equal footing with mainstream science.
“Most of us, regardless of our beliefs, tend to think that our opinions are formed objectively and based on good information,” Dubey told SciPost. “Our findings suggest that chatbots that present multiple points of view will feel refreshingly balanced to all people, including those who don’t trust mainstream media.”
“But this raises an uncomfortable question: Is balance always desirable? Although climate change is not truly contested among scientists, our chatbot juxtaposed mainstream and alternative views. This approach has led to widespread acceptance of the tool, but it also risks creating false equivalencies. That means giving fringe or misleading views the same weight as scientific consensus. The very features that make our chatbots so appealing can end up legitimizing misinformation.”
“The takeaway, then, is that this is a tension worth grappling with. Tools that feel balanced and neutral may be the best way to reach people across ideological divides, but ‘balancing’ on issues like climate change is not itself a neutral act,” Dubey said.
While the findings offer hope for reducing polarization, the researchers noted several limitations. First, this study only compared people at the extremes of the conspiracy theory belief spectrum. Individuals with moderate beliefs were excluded from the main analysis, so the results may not be representative of the entire population. Second, participants only interacted with the chatbot once in a controlled research environment.
It is unclear whether their positive attitudes persist even after repeated use over weeks or months. It also remains to be seen whether people will voluntarily choose to use a balanced news chatbot in the real world if they have access to a highly personalized social media feed.
Future research should investigate exactly which features of chatbots make them attractive to different groups. Scientists could also investigate whether giving users some control over the ratio of mainstream to alternative news increases their willingness to engage with opposing viewpoints.
The study, “Examining the perceived credibility and usefulness of balanced news chatbots among individuals with a variety of conspiracy theories,” was authored by Shreya Dubey, Paul E. Ketelaar, Tilman Dingler, Hannah K. Peetz, and Hein T. van Schie.

