In the sprawling digital landscapes of social media, the rapid dissemination of misinformation remains a formidable challenge for society. A study by Wu, Wu, and Xiao, published in Humanities and Social Sciences Communications, dives deeply into the mechanisms behind why users propagate misinformation online. Integrating two powerful theoretical frameworks, affordance theory and flow theory, the research sheds new light on the intricate psychological and technological factors that prompt individuals to share misleading or false content on social media platforms.
At the heart of this novel investigation lies the concept of social media affordances—functional properties of digital platforms that enable specific user behaviors. Unlike traditional views that have largely celebrated the benefits of these affordances, such as enhanced engagement, Wu and colleagues’ work highlights a darker facet: these very features that empower user interaction may paradoxically facilitate the spread of misinformation. The researchers argue that examining social media through the lens of affordances offers more than a superficial understanding; it provides a critical framework for discerning how platform design can inadvertently foster misinformation sharing.
One of the most striking findings emerges around the role of ‘metavoicing,’ a distinctive affordance encapsulating information interaction. Metavoicing equips users not only to express their own opinions but also to react actively to others’ views. The study reveals that this affordance exerts considerably stronger effects on users’ cognitive and affective involvement than traditional factors such as information accessibility (the mere availability of data) and social association (the relational ties between users). This suggests that the dynamic and interactive qualities of metavoicing amplify users’ mental engagement and emotional investment in the content, thereby increasing the likelihood of misinformation proliferation.
Delving deeper into the psychological pathways, the authors differentiate between cognitive involvement—the rational processing and evaluation of information—and affective involvement, which encompasses emotional engagement and feelings inspired by the content. Their rigorous analysis shows that affective involvement exerts a more substantial influence on users’ decisions to share misinformation. This emotional impetus resonates with broader scholarly assertions that social media consumption is often driven more by hedonic, pleasure-seeking motives than by utilitarian, objective considerations. Emotional contagion—wherein feelings expressed in online discourse flow rapidly from sender to receiver—can heighten users’ affective involvement, intensifying their sharing behavior.
Interestingly, the anticipated moderating role of users’ cognitive ability—the extent of their capacity to process and scrutinize information—proved insignificant in reducing misinformation sharing. This counterintuitive outcome may stem from the prevailing hedonic motivations underpinning social media use. As users often seek entertainment or relaxation, they might bypass rigorous evaluation processes, particularly when immersed in a ‘flow experience’—a state of deep engagement characterized by effortless attention and intrinsic enjoyment. The fragmented and fast-paced nature of social media further complicates the application of cognitive filtering, rendering even cognitively adept individuals vulnerable to misinformation.
This nuanced differentiation of flow experience into cognitive and affective involvement constitutes a major theoretical advance. Previous studies tended to treat flow as a monolithic construct, but the present research illuminates its multidimensionality. By separately examining flow’s cognitive and emotional facets, the authors enrich our understanding of how immersive online experiences shape users’ propensity toward misinformation sharing. Specifically, the pronounced predictive power of affective involvement underscores the critical role emotions play in driving online behavior, especially in environments saturated with sensationalist and emotionally charged content.
The theoretical implications of merging affordance and flow theories transcend mere academic interest; they forge a comprehensive conceptual framework to decode misinformation dynamics on social media. Affordance theory elucidates which platform features trigger user behaviors, while flow theory explains the psychological processes—how users cognitively and emotionally engage with digital content—which ultimately influence their sharing decisions. Together, these lenses form a holistic model that captures the complexity of misinformation sharing more effectively than either theory alone.
Practically, the findings deliver urgent messages for social media designers and platform managers. While features like metavoicing and easy information accessibility enhance user engagement, they also inadvertently heighten cognitive and affective involvement, serving as tinder for misinformation spread. The study recommends designing technological nudges, such as periodic reminders fostering rational reflection, to temper users’ emotional arousal and encourage more careful assessment of content before sharing. This delicate balance between fostering user engagement and safeguarding information quality poses a significant design challenge.
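To make the idea of a "periodic reminder" nudge concrete, here is a minimal sketch of how a platform might rate-limit such a prompt so it stays periodic rather than intrusive. The class name, cooldown value, and API are illustrative assumptions, not details from the study:

```python
import time

# Hypothetical nudge: prompt a user to pause and reflect before sharing,
# but at most once per cooldown window so the reminder stays periodic
# rather than intrusive. All names and thresholds are illustrative.

NUDGE_COOLDOWN_SECONDS = 3600  # at most one reminder per hour


class ReflectionNudge:
    def __init__(self, cooldown=NUDGE_COOLDOWN_SECONDS):
        self.cooldown = cooldown
        self._last_nudged = {}  # user_id -> timestamp of the last reminder

    def should_nudge(self, user_id, now=None):
        """Return True if this share attempt should trigger a reminder."""
        now = time.time() if now is None else now
        last = self._last_nudged.get(user_id)
        if last is None or now - last >= self.cooldown:
            self._last_nudged[user_id] = now
            return True
        return False


nudge = ReflectionNudge()
print(nudge.should_nudge("alice", now=0))     # first share attempt -> True
print(nudge.should_nudge("alice", now=100))   # within the cooldown -> False
print(nudge.should_nudge("alice", now=4000))  # cooldown elapsed -> True
```

The cooldown is the design lever the study's balance concern points at: too frequent and the nudge erodes engagement, too rare and emotionally charged content slips through unexamined.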
Beyond platform design, managing users’ emotional competence emerges as a powerful lever in curbing misinformation. The research identifies emotional ability—the capacity to regulate and manage one’s feelings—as a negative moderator in the link between affective involvement and misinformation sharing. In other words, users with higher emotional intelligence are less prone to share misinformation when emotionally involved. This insight suggests that social media managers and policymakers should invest in educational programs and provide in-app tools that cultivate emotional awareness and self-regulation among users, thereby bolstering digital literacy and resilience against emotionally charged misinformation.
Fact-checking mechanisms aligned with users’ emotional competencies could also enhance misinformation mitigation. Platforms could implement algorithms to identify users with relatively low emotional regulation capacities and subject their shared content to additional scrutiny or serve them fact-checked information proactively. Such personalized interventions, informed by behavioral insights, promise more targeted and effective misinformation management strategies.
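The personalized routing the article describes could look something like the following sketch, in which content from users with a low estimated emotional-regulation score is held for extra fact-checking. The score scale, threshold, and field names are hypothetical assumptions for illustration only:

```python
# Hypothetical routing rule for the personalized intervention described
# above: posts shared by users whose estimated emotional-regulation score
# falls below a threshold are queued for additional fact-checking.
# The score, threshold, and field names are illustrative assumptions.

EMOTION_REGULATION_THRESHOLD = 0.4  # illustrative cutoff on a 0-1 scale


def route_share(user_profile, post):
    """Decide how a shared post is handled before wider distribution."""
    score = user_profile.get("emotion_regulation_score", 1.0)
    if score < EMOTION_REGULATION_THRESHOLD:
        # Low regulation capacity: hold for review, surface fact-checks.
        return {"action": "queue_for_fact_check", "post": post}
    # Otherwise publish normally.
    return {"action": "publish", "post": post}


low_reg_user = {"emotion_regulation_score": 0.25}
typical_user = {"emotion_regulation_score": 0.8}
print(route_share(low_reg_user, "claim A")["action"])  # queue_for_fact_check
print(route_share(typical_user, "claim A")["action"])  # publish
```

Note the conservative default: a user with no score on record is published normally rather than flagged, reflecting that such behavioral scoring would need careful validation before driving interventions.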
However, the study’s reliance on cross-sectional survey data brings limitations. While relationships among social media affordances, flow experiences, and misinformation sharing are compellingly established, the data cannot definitively prove causal pathways or capture temporal changes in users’ behaviors. Longitudinal research is called for to observe how misinformation sharing evolves over time and in response to shifting motivations and technological updates.
Additionally, though social media affordances explain roughly one-third of the variance in misinformation sharing behavior, the authors acknowledge that broader contextual dimensions—including technological, political, and societal factors—play influential roles in the diffusion of misinformation. The interplay of these macro-level variables with individual psychological processes remains an important area for future exploration, potentially through multidisciplinary approaches that merge insights from political science, sociology, and communication studies.
Moreover, the partial mediation of cognitive and affective involvement in the affordance-misinformation relationship signals that other intermediary variables may be at play. Future studies should explore additional mediators such as trust, perceived credibility, and normative influences, offering a more intricate map of misinformation sharing mechanisms. Qualitative investigations could complement these efforts, providing richer narratives of users’ decision-making in complex digital ecosystems.
In framing social media as a “double-edged sword,” this research highlights the paradox in digital connectivity: functionalities designed to foster community and communication can simultaneously act as conduits for misinformation. The deeper understanding cultivated here emphasizes that technology’s sociopsychological impacts cannot be disentangled from its design features, underscoring an urgent need for interdisciplinary collaboration among technologists, psychologists, and policymakers.
Ultimately, the fusion of affordance and flow theories pioneered by Wu and colleagues produces not only academic insights but actionable pathways for addressing one of the most pressing dilemmas of our time: how to stem the tide of misinformation while preserving the vibrant, interactive essence of social media. As platforms continue to evolve and users deepen their digital immersions, such integrated theoretical frameworks will be critical for navigating the tangled realities of online information ecosystems.
The emerging picture is clear: misinformation sharing is not merely a failure of facts or cognitive lapses but a complex behavioral phenomenon anchored in users’ immersive experiences and interactive capabilities afforded by social media platforms. Recognizing this complexity equips society with better tools to design interventions, tailor educational initiatives, and ultimately foster a healthier digital public sphere.
This insightful work paves the way for a broader and more balanced discourse about social media’s role in shaping information flows. By combining technical affordances with human emotional and cognitive processes, Wu et al. provide a foundational model that future researchers and practitioners can build upon to cultivate more trustworthy, engaging, and emotionally intelligent social media environments.
Article References:
Wu, M., Wu, T. & Xiao, Y. Why people share misinformation on social media? An integration of affordance and flow theories.
Humanit Soc Sci Commun 12, 1129 (2025). https://doi.org/10.1057/s41599-025-05511-6