In 2022, following Elon Musk’s acquisition of Twitter, now rebranded as X, the platform underwent significant changes that reshaped its approach to content moderation and misinformation control. Most notably, X reduced its content moderation team by 80%, pivoting instead to a crowdsourced fact-checking system known as Community Notes. Originally introduced as a pilot program on Twitter, Community Notes lets everyday users propose contextual annotations or corrections on posts suspected of containing inaccurate or misleading information. When a note receives affirmative votes from a diverse group of users, as validated by X’s proprietary algorithm, it is displayed alongside the original post. Since their rollout on X, Community Notes have attracted attention and been adopted by other major social media platforms, including Meta and YouTube, reflecting a growing trend toward collaborative content verification across the digital landscape.
In a groundbreaking study led by researchers at the University of Washington’s Information School, the real-world impacts of Community Notes on X were rigorously analyzed. Tracking 40,000 posts that had notes proposed between March and June 2023, the team sought to understand how the presence of Community Notes influenced user engagement and the diffusion—or spread—of content through social networks. A crucial finding emerged: posts appended with Community Notes were significantly less likely to go viral. Specifically, within 48 hours of a note being added, reposts dropped by 46%, while likes decreased by 44%. These results suggest that Community Notes play a measurable role in mitigating the viral spread of potentially false information, signaling an important advancement in the fight against misinformation online.
Martin Saveski, the study’s senior author and an assistant professor at the University of Washington, emphasized the multifaceted nature of misinformation spread on social media platforms. According to Saveski, while Community Notes effectively reduce engagement metrics that commonly signal endorsement—such as reposts and likes—they represent only one tool within a broader strategy needed to combat the complex dynamics of false content dissemination. The team’s research, published in the prestigious journal Proceedings of the National Academy of Sciences in September 2025, provides empirical evidence supporting the value of Community Notes but also underscores the necessity of integrating diverse approaches for a comprehensive solution.
The study’s methodology involved a detailed comparative analysis between posts with helpful notes attached and those without. Of the 40,000 posts with suggested notes, 6,757 were rated helpful and subsequently displayed on the platform. The researchers monitored these posts for a 48-hour period after a note was attached, tracking key indicators such as likes, reposts, replies, and views. The metrics break down as follows: reposts and likes saw the most substantial declines (46% and 44%, respectively), while replies fell by 22% and views by 14%. These figures show that engagement signaling active support or endorsement of the content diminishes far more sharply than passive consumption, such as viewing.
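The comparison above amounts to measuring the relative difference in engagement between noted posts and a comparison group. The study’s actual analysis is more sophisticated (matched controls and causal estimation), but the core computation can be sketched as follows, with hypothetical post records and field names; the toy numbers are chosen only to reproduce the percentages reported above:

```python
# Illustrative sketch only: the paper's real analysis uses matched controls
# and causal methods. Post records and field names here are hypothetical.

def engagement_reduction(noted, control,
                         metrics=("reposts", "likes", "replies", "views")):
    """Percent difference in mean engagement, noted vs. comparison posts."""
    out = {}
    for m in metrics:
        mean_noted = sum(p[m] for p in noted) / len(noted)
        mean_ctrl = sum(p[m] for p in control) / len(control)
        out[m] = 100 * (mean_noted - mean_ctrl) / mean_ctrl
    return out

# Toy data chosen to mirror the reported declines.
noted = [{"reposts": 54, "likes": 112, "replies": 39, "views": 4300}]
control = [{"reposts": 100, "likes": 200, "replies": 50, "views": 5000}]
print(engagement_reduction(noted, control))
# → {'reposts': -46.0, 'likes': -44.0, 'replies': -22.0, 'views': -14.0}
```

The sign convention makes reductions negative, so the reported "46% drop in reposts" appears as -46.0.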
A critical aspect of misinformation’s virality lies in how widely and quickly content spreads within and beyond an account’s immediate follower network. Isaac Slaughter, lead author and UW doctoral student, explained that Community Notes significantly altered the diffusion patterns of posts. Posts flagged with notes saw reduced interaction from users distant in the social graph, meaning those who do not follow the original poster or are several degrees removed. Conversely, users closer to the source, such as direct followers, showed smaller reductions in engagement, implying that misinformation’s core support base may remain somewhat resilient despite fact-checking efforts. This finding points to the role of social network structure in moderating the effectiveness of interventions like Community Notes.
Another notable finding from the research was the differential impact of Community Notes by content type and popularity. Notes added to posts containing altered media, such as doctored photos and videos, had a stronger suppressive effect on engagement than those appended to text-only posts. Community Notes attached to highly popular posts likewise showed even greater reductions in engagement and diffusion. Timing also proved essential: notes that appeared promptly, within hours of posting, were markedly more effective, while delays approaching or exceeding 48 hours diminished the notes’ capacity to alter user behavior, reflecting the rapid lifecycle typical of viral misinformation.
Through examining these granular dynamics, Saveski’s research lab is now focusing on developing technological solutions to expedite the note attachment process. Accelerating the speed at which Community Notes appear could enhance their capacity to curb the spread of false information before it gains irreversible momentum. This work aligns with the broader imperative in social media governance to deploy real-time, scalable moderation tools that can operate effectively in fast-moving digital environments.
Despite these promising findings, the researchers also acknowledged limitations tied to data accessibility and platform policy changes. Since the study focused strictly on posts with notes proposed in early 2023, its scope does not encompass recent upgrades to Community Notes mechanisms or variations implemented on other social networks. Furthermore, X’s decision to end free access to its application programming interface (API) poses a significant obstacle for ongoing academic research, restricting independent scrutiny and longitudinal studies that are essential for understanding these moderation tools’ evolving efficacy.
The research team called attention to the broader challenges facing distributed moderation systems across platforms. Saveski raised pivotal questions about the sustainability of separate, siloed fact-checking communities operating independently on platforms such as X, TikTok, and Instagram. The motivation and bandwidth of users to contribute may wane if moderation efforts become fragmented. Additionally, cross-platform collaboration and data sharing could amplify the impact and scale of these systems, yet such cooperation remains limited. While X has made strides in opening its code and datasets, other major platforms have yet to commit to similar transparency, highlighting a critical bottleneck in efforts to collectively tackle misinformation.
At its core, this study provides a striking validation of Community Notes as a valuable asset in the arsenal against online misinformation, showcasing quantifiable reductions in user engagement with misleading content. The work by Saveski, Slaughter, and their colleagues—including co-authors Axel Peytavin from Stanford University and Johan Ugander from Yale University—represents a significant step forward in understanding how participatory moderation technologies shape information ecosystems. Funded in part by the University of Washington Information School’s Strategic Research Fund and the Army Research Office’s Multidisciplinary University Research Initiative, their findings pave the way for further innovation in real-time fact-checking and community-driven content governance.
As the digital information landscape continues to evolve, these insights could inform policymakers, platform designers, and the global public about the continuing effort needed to preserve information integrity. Community Notes may not be a panacea, but their demonstrated ability to curb engagement with misinformation highlights the promise of harnessing collective social intelligence to foster healthier online discourse.
Subject of Research: The effectiveness of Community Notes in reducing engagement and diffusion of false information on social media, specifically on X (formerly Twitter).
Article Title: Community notes reduce engagement with and diffusion of false information online
News Publication Date: 18-Sep-2025
Web References:
- DOI link
- Community Notes Wikipedia
- NYTimes article on social media fact-checking
- The Verge article on API changes
References:
Slaughter, I., Saveski, M., Peytavin, A., & Ugander, J. (2025). Community notes reduce engagement with and diffusion of false information online. Proceedings of the National Academy of Sciences, 10.1073/pnas.250341312
Keywords:
Social media, misinformation, content moderation, fact-checking, Community Notes, online virality, information diffusion, algorithmic governance, user engagement, altered media, digital communication, social network analysis