A series of three surveys conducted in the US revealed that Americans believe, on average, that 43% of Reddit users post extremely harmful comments and that 47% of Facebook users share false news online. In reality, only about 3–8.5% of users create such content. The paper was published in PNAS Nexus.
Social media contains many posts that share misleading or completely untrue content. Some users also post harmful comments on other people's posts, meaning insulting, hateful, or offensive remarks. Sharing false news and posting harmful comments are two types of behavior that matter because they hurt real people, damage reputations, and provoke fear and anger.
False news can spread very quickly because people often share dramatic information before checking to see if it’s true. Harmful comments can make online spaces hostile and prevent rational discussion. This behavior can also increase conflict between groups, as people begin to see others as enemies rather than as people.
What's interesting is that research shows both of these behaviors are driven by a small number of highly active users who post frequently. A recent study found that the 1% of Reddit communities that seek out conflict generate 74% of all conflict content across the platform. Similarly, another study found that 60% of hate speech on Twitter comes from a small community of users. These findings reflect what appears to be a broader pattern across social media platforms: the majority of problematic content is created by a small but vocal minority of users.
Study author Angela Y. Lee and colleagues examined Americans' beliefs about the number of social media users contributing harmful content, along with the consequences of such beliefs. They hypothesized that people would overestimate the prevalence of harmful users on social media, and that this misperception could foster excessive cynicism toward one's fellow citizens. The authors suggest that when people believe that many of their fellow Americans post harmful content, they may develop more negative views of society and perceive greater moral decline than actually exists.
To explore this further, the study authors conducted three surveys of American adults via CloudResearch Connect, with quotas matching the U.S. population on age, gender, race, and ethnicity. The total number of participants across the three studies was 1,090.
The first study asked participants to read about two research studies that measured how often Reddit accounts posted harmful content and how often Facebook users shared false news on the platform. Participants then estimated the share of social media users who create such content.
In Study 2, participants read about a Google system used to detect harmful language. They also viewed 20 comments from actual Reddit users, half of which were severely harmful and half of which were not. They were then asked to identify which comments Google's system would classify as harmful.
Study 3 was an experiment in which participants in one condition read a text explaining that scientists had discovered that most people do not share harmful content online. This was the misperception correction condition. The other condition was a control, in which participants read about how Reddit was founded; this text did not mention online toxicity. Afterwards, participants in both conditions completed measures of social media use, cynicism, generalized trust, perceptions of moral decline, and beliefs about the type of content that should be shared on social media.
The results showed that, on average, participants believed that 43% of all Reddit users post extremely harmful comments and that 47% of Facebook users share false news online. In fact, platform-level data show that most of these forms of harmful content come from between 3 and 8.5 percent of users, a small but highly active group.
The experiment revealed that participants in the misperception correction condition perceived less moral decline among their fellow U.S. citizens than participants in the control condition did. They also felt more positive and were more likely to believe that others do not want harmful content online. However, there were no differences between the two groups in cynicism or generalized trust in humanity.
“Our results revealed that people are unaware that most of the harmful content on social media is created by a small, prolific group of users. Rather, they believe that the amount of harmful content on social media is the result of large numbers of users participating in harmful behavior,” the study authors concluded.
This study contributes to scientific knowledge about Americans’ perceptions of social media and its users. However, it is important to note that this study only included participants in the United States and focused only on two types of harmful behavior on two platforms. Therefore, the findings may not be fully generalizable to other countries, other cultures, or other social media platforms.
The paper, “Americans overestimate the number of social media users who post harmful content,” was written by Angela Y. Lee, Eric Newman, Jamil Zaki, and Jeffrey Hancock.

