As artificial intelligence continues to reshape sector after sector, the integration of AI-generated content (AIGC) tools in medical education is attracting growing attention. A recent comprehensive study sheds light on the factors influencing medical students’ propensity to embrace these emerging technologies. Employing rigorous analytical methods and a multi-theory perspective, the research unpacks how medical master’s and doctoral students navigate their acceptance and use of AIGC tools, which are rapidly transforming scientific research and learning in the healthcare domain.
One of the pivotal findings from this study underscores the critical role of social influence as a powerful catalyst driving medical students’ adoption of AIGC. Through mechanisms such as compliance, internalization, and identification, social networks and opinion leaders exert significant sway over usage intentions. In an age where social media campaigns and expert endorsements proliferate, the collective sentiment around AIGC technology acts as a decisive force, encouraging students to integrate AI-generated content into their academic and research workflows. This dynamic highlights how communal dialogues within medical education ecosystems can accelerate technological acceptance in a field traditionally anchored in established methodologies.
Contrary to prevalent concerns surrounding technology adoption, this research reveals a somewhat counterintuitive insight: perceived risk does not pose a significant barrier to AIGC usage among medical students. While these individuals are acutely aware of potential drawbacks—ranging from data privacy issues to inaccuracies inherent in AI-generated information—their decision-making balances these risks against tangible benefits. Convenience, efficiency in completing academic assignments, and the practical utility of AIGC often outweigh apprehensions. This phenomenon exemplifies the complex interplay between rational risk assessment and behavior, where theoretical knowledge of potential hazards coexists with pragmatic choices favoring innovative tools.
Performance expectancy and effort expectancy emerge as the most influential determinants shaping medical students’ willingness to engage with AIGC. The former captures the anticipated benefits and usefulness of AI tools in enhancing work efficiency and learning outcomes, while the latter reflects how easy the tools are to use and access. This dual emphasis suggests that students are focused not just on what AIGC can achieve but also on how effortlessly it can be integrated into their existing workflows. The streamlined design and user-friendly interfaces of many AIGC platforms lower adoption barriers, enabling more rapid diffusion across medical educational settings.
Interestingly, acceptance of innovative technologies, though positively correlated with AIGC usage, exerts a less pronounced impact than might be expected. For medical students, the embrace of cutting-edge technologies is intertwined with the unique demands and traditions of medical education and clinical practice. The gradual incorporation of AI tools into curricula requires ongoing cultural shifts and adjustments in pedagogical approaches to fully harness their transformative potential. This nuanced relationship indicates that willingness to adopt technology is not merely a function of general openness but is intricately connected to deep-rooted professional values and standards.
The study’s examination of perceived trust reveals an unexpected disconnect: although trust is a crucial facilitator in many technology acceptance frameworks, here it does not significantly predict AIGC usage intentions among medical students. This finding reflects the cautious, evidence-based mindset ingrained in medical professionals in training. Given AIGC’s current limitations—such as incomplete understanding of complex clinical contexts and concerns about reliability—students tend to rely more heavily on traditional knowledge and clinical judgment. Furthermore, because AI integration within medical education is still at a nascent stage, the structured training and exposure needed to build confident trust in these tools remain limited.
Facilitating conditions represent a positive and vital influence on medical students’ predisposition towards adopting AIGC technologies. These conditions encompass supportive factors such as availability of tailored assistance, real-time performance feedback, automation of tedious tasks, and widespread accessibility. By effectively lowering hurdles and enhancing user experience, facilitating conditions create an enabling environment that promotes smooth integration of AI into academic and clinical workflows. This underscores the importance of not only developing cutting-edge AI tools but also ensuring infrastructure and support systems are robust enough to support their widespread use.
The study further highlights significant interrelations among various predictors of AIGC adoption. For instance, effort expectancy directly enhances performance expectancy, especially among traditional Chinese medicine students. This relationship suggests that when students perceive AI systems as easy to use, their confidence in the technology’s utility and benefits naturally increases. Such synergy is crucial for fostering a positive feedback loop, where intuitive interfaces encourage exploration, which in turn enhances performance perceptions, thereby driving higher adoption rates.
While acceptance of innovation influences effort expectancy substantially, its effect on performance expectancy is comparatively moderate. Medical students, despite being generally receptive to novel technologies, often encounter a gap between theoretical acceptance and practical impact. The complexity of clinical workflows, limited AIGC training, and cultural factors all contribute to this disparity. Students recognize AI’s potential but remain cautious, as AIGC tools cannot yet fully replicate the nuanced clinical reasoning required in medical practice. This highlights pressing educational challenges that need to be addressed for AIGC to achieve deeper traction in healthcare education.
The ethical implications of integrating AIGC within medical education and practice warrant careful consideration. Accuracy and reliability are paramount, as reliance on erroneous information can lead to clinical misjudgments with serious consequences. Consequently, AI systems must be iteratively trained and updated with high-quality medical datasets reflecting the latest evidence. The risk of excessive dependence on AI also raises concerns around the potential erosion of clinical judgment and hands-on skills among emerging healthcare professionals. Balancing these technologies as assistive tools rather than replacements is essential for safeguarding both educational quality and patient safety.
Notably, the findings indicate a complex landscape where multiple factors, including social influence, perceived utility, and contextual facilitating conditions, converge to shape AIGC adoption. This multi-dimensional ecosystem suggests that simplistic interventions targeting a single variable may fall short. Instead, holistic strategies that incorporate social dynamics, educational reforms, robust infrastructure, and clear ethical guidelines are needed to drive meaningful adoption and optimize benefits.
Furthermore, the study’s insights into the unique perceptions held by medical students underscore the importance of tailoring AI-driven educational innovations to specific disciplinary contexts. Medical education’s rigorous knowledge requirements, entrenched clinical standards, and professional uncertainties pose unique challenges not found in other disciplines. Bridging this gap requires integrating AI literacy into medical curricula, providing systematic hands-on training, and fostering cultural shifts that emphasize AI as a complementary resource.
This research also opens avenues for future exploration, particularly around how educational institutions can foster stronger links between acceptance of innovation and practical performance expectations. Developing targeted interventions aimed at skill-building and enhancing technological fluency among medical students may help overcome current disparities. Additionally, longitudinal studies tracking how evolving career trajectories influence AI adoption patterns will better inform policy and instructional design.
In conclusion, this landmark study offers invaluable perspectives on the nuanced interplay of psychological, social, and contextual factors governing medical students’ engagement with AI-generated content tools. It highlights the substantial role social influence and facilitating conditions play while drawing attention to areas where perceived risk and trust diverge from expected patterns. Crucially, it calls for concerted efforts to embed AIGC more seamlessly within medical education, ensuring ethical standards, practical usability, and cultural acceptance are all adequately addressed. As these transformations unfold, the medical community stands on the cusp of a profound shift, where artificial intelligence may become an indispensable partner in shaping the future of healthcare education and delivery.
Subject of Research:
Exploring the factors influencing medical students’ willingness to adopt artificial intelligence-generated content (AIGC) tools in academic research and education.
Article Title:
Unlocking medical students’ adoption of AIGC tools: a multi-theory perspective.
Article References:
Zhou, Z., Qi, H., Li, Z. et al. Unlocking medical students’ adoption of AIGC tools: a multi-theory perspective. Humanit Soc Sci Commun 12, 1870 (2025). https://doi.org/10.1057/s41599-025-06078-y

