In an era where digital education has become ubiquitous, the challenge of maintaining student engagement and enhancing learning outcomes remains paramount. A groundbreaking study conducted at The Hong Kong University of Science and Technology (HKUST) offers fresh insight into this critical issue by exploring how brief, personalized pre-lecture interactions can modulate the neurological and cognitive processes underpinning online learning. Led by Prof. LI Ping, Dean of the School of Humanities and Social Science and Chair Professor of Psychology and Cognitive Science, this research reveals that even short one-on-one conversations—whether with human instructors or AI-based systems—can significantly improve neural synchrony and cognitive gains during subsequent video lectures.
Massive open online courses (MOOCs) and pre-recorded video lectures have become foundational components of higher education worldwide. Despite their scalability and accessibility, however, these modalities often suffer from diminished learner engagement and suboptimal outcomes. The widespread shift to video-based learning, accelerated by the COVID-19 pandemic, has amplified these concerns and underscored the need for interventions that enhance student attention and retention in virtual learning environments. The HKUST study examines the role of preparatory social interactions in priming the brain for effective knowledge acquisition.
The experimental design involved 57 university students randomly allocated to three groups: one with no pre-lecture interaction, one engaging briefly with a human instructor, and one interacting with an AI-powered digital instructor. The human and AI sessions lasted 8 to 10 minutes and aimed to scaffold the students’ cognitive readiness for the material. Notably, the AI instructor, driven by the GPT-4 model, incorporated multimodal components including speech recognition, dynamic content generation, synthesized speech output, and real-time animated facial expressions designed to closely mimic natural human interaction. Participants were fully informed that they were interacting with an AI, ensuring an authentic experimental context regarding social perception and engagement.
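The multimodal loop described above (listen, generate, speak, animate) can be sketched schematically. Every function body below is a hypothetical stand-in; the study's actual speech-recognition, GPT-4, and speech-synthesis stack is not specified here, so the stubs only illustrate the data flowing through one conversational turn.

```python
# Hypothetical sketch of one turn of a pre-lecture AI-instructor loop.
# All component functions are stand-ins, not the study's real systems.

def transcribe(audio: bytes) -> str:
    """Stand-in for speech recognition on the student's utterance."""
    return "What should I focus on in today's lecture?"

def generate_reply(history: list[dict]) -> str:
    """Stand-in for the GPT-4 call producing a scaffolding reply."""
    return "Good question: watch for how the three key terms relate."

def synthesize(text: str) -> bytes:
    """Stand-in for text-to-speech plus facial-animation cues."""
    return text.encode("utf-8")

def one_turn(history: list[dict], audio: bytes) -> bytes:
    """One conversational turn: hear the student, think, speak back."""
    history.append({"role": "student", "text": transcribe(audio)})
    reply = generate_reply(history)
    history.append({"role": "instructor", "text": reply})
    return synthesize(reply)

history: list[dict] = []
speech = one_turn(history, b"<raw student audio>")
print(len(history), len(speech))
```

The point of the sketch is the turn structure itself: each exchange appends both sides to a shared history, so the generator can condition its scaffolding on the whole conversation so far.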
Following these interventions, all participants viewed the same 14-minute video lecture while undergoing simultaneous functional magnetic resonance imaging (fMRI) and eye-tracking assessments. This rigorous neuroscientific setup enabled the research team to map the intricate correspondence between gaze behavior, synchronized neural activity across distributed brain networks, and behavioral learning outcomes measured through recall, comprehension, and knowledge transfer tasks. The data revealed compelling evidence for enhanced neural alignment in the human and AI interaction groups compared to those without prior interaction.
Neuroimaging results showed elevated synchronization in brain regions integral to information processing, cognitive control, and socio-emotional integration in both interaction groups. These areas included the superior temporal sulcus (STS), known for its role in social cognition and language comprehension, and the posterior cingulate cortex (PCC), a central node in the brain’s default mode network that orchestrates top-down attentional regulation. Such neural synchrony indicates that preparatory conversations create a shared cognitive framework, optimizing the brain’s receptivity to upcoming educational content.
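Neural synchrony of this kind is commonly quantified as inter-subject correlation (ISC) of regional BOLD time courses across viewers of the same stimulus. The following is an illustrative sketch only, not the paper's actual analysis pipeline: a leave-one-out ISC, where each subject's time course is correlated with the average of everyone else's.

```python
import numpy as np

def inter_subject_correlation(ts):
    """Leave-one-out inter-subject correlation (ISC).

    ts: array of shape (n_subjects, n_timepoints) holding one brain
    region's BOLD time course per subject. Returns one ISC value per
    subject: the Pearson correlation between that subject's time
    course and the mean time course of all the others.
    """
    n = ts.shape[0]
    isc = np.empty(n)
    for i in range(n):
        others = np.delete(ts, i, axis=0).mean(axis=0)
        isc[i] = np.corrcoef(ts[i], others)[0, 1]
    return isc

# Toy example: a shared stimulus-driven signal plus per-subject noise.
# 420 timepoints roughly matches a 14-minute lecture at a 2 s TR.
rng = np.random.default_rng(0)
shared = np.sin(np.linspace(0, 8 * np.pi, 420))
data = shared + 0.5 * rng.standard_normal((57, 420))
print(inter_subject_correlation(data).mean())
```

Groups whose members track the lecture in a similar way (here, the shared sine component) yield high ISC; idiosyncratic processing pushes it toward zero, which is how an intervention's effect on synchrony can be compared across conditions.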
Eye-tracking data complemented this picture by revealing a notable difference in gaze behavior. Students who interacted with a human instructor exhibited significantly greater gaze alignment, reflecting coordinated visual attention that is tightly linked to social engagement and joint-attention mechanisms. Intriguingly, although students in the AI group showed lower gaze alignment and reported reduced feelings of social closeness, their learning outcomes were statistically indistinguishable from those of the human-interaction group. This dissociation suggests that fundamentally different neural pathways can converge on similar educational outcomes.
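Gaze alignment can be operationalized in several ways; one simple illustrative metric (not necessarily the one used in the study) scores, for every pair of viewers, how often their gaze points land near each other on screen while watching the same video.

```python
import numpy as np

def gaze_alignment(gaze, radius=50.0):
    """Mean pairwise gaze alignment for one group of viewers.

    gaze: array of shape (n_subjects, n_timepoints, 2) with on-screen
    (x, y) gaze coordinates in pixels. For each pair of subjects,
    score the fraction of timepoints where their gaze points fall
    within `radius` pixels of each other, then average over pairs.
    """
    n = gaze.shape[0]
    scores = []
    for i in range(n):
        for j in range(i + 1, n):
            dist = np.linalg.norm(gaze[i] - gaze[j], axis=1)
            scores.append((dist < radius).mean())
    return float(np.mean(scores))

# Toy comparison: a tightly clustered group vs. a scattered one,
# both following the same underlying focus path across the screen.
rng = np.random.default_rng(1)
target = rng.uniform(100, 900, size=(200, 2))
tight = target + rng.normal(0, 15, size=(5, 200, 2))
loose = target + rng.normal(0, 120, size=(5, 200, 2))
print(gaze_alignment(tight), gaze_alignment(loose))
```

On synthetic data like this, the tightly clustered group scores near 1 and the scattered group near 0, which is the kind of contrast the study reports between the human-interaction and AI-interaction groups.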
The implications are profound: human-led interactions appear to scaffold learning through both cognitive and social-emotional processing channels mediated by visual attention dynamics. In contrast, AI-led engagements predominantly activate top-down cognitive control mechanisms, leveraging computational prowess to enhance knowledge retrieval and personalized support without replicating full social mimicry. This finding reshapes assumptions about the necessity of authentically human social presence in digital pedagogy and highlights AI’s potent role as an educational facilitator.
Perhaps the most novel conceptual advancement is the elucidation of a multi-stage “eye-brain-behavior correspondence” cascade. The study shows bidirectional feedback loops whereby synchronous eye movements reinforce neural alignment, which in turn sustains coordinated attention. This recursive dynamic fosters a learning-specific brain state conducive to memory encoding and conceptual assimilation. The superior temporal sulcus functions as a nexus linking externally observable gaze patterns with internal social-linguistic processing, while the posterior cingulate cortex exerts control over attentional focus and integration within intrinsic brain networks.
Dr. PENG Yingying, HKUST Postdoctoral Fellow and the study’s lead author, highlights that these findings represent a critical bridge between cognitive neuroscience and educational technology. They provide empirical validation for AI’s capacity to evoke meaningful brain states supportive of deep learning, despite the absence of traditional social cues like mutual gaze or empathic engagement. This insight opens opportunities for designing AI systems that strategically blend emotional resonance with adaptive cognitive scaffolding tailored to individual learners.
From an educational technology perspective, this research portends a future where AI tutors can dynamically monitor subtle neural and behavioral indicators, modulating instructional delivery with human-like sensitivity. Prof. Li envisions AI systems capable of perceiving a student’s fluctuating engagement, detecting nuanced shifts in attention or affect, and responding with calibrated interventions to foster a sense of being “seen” and “heard” in the virtual classroom. By achieving this synthesis, AI could enhance not only cognitive performance but also social connectedness—a linchpin of effective pedagogy.
Moreover, the comparable learning gains produced by AI and human interaction point toward scalable solutions for global educational challenges. As digital curricula expand to serve mass audiences, integrating AI-mediated social scaffolding could mitigate the isolation and disengagement endemic to online platforms. The research provides a neuroscientific foundation assuring educators and policymakers that AI-based pre-lecture conversations are more than mere novelty; they represent a substantive advancement in the science of teaching and learning.
Future research trajectories will need to extend this paradigm by exploring longitudinal effects, diverse content domains, and adaptive AI systems incorporating multimodal affective and cognitive monitoring. Understanding how the human brain adapts over repeated AI interactions will be crucial for refining these pedagogical tools to balance efficiency with empathetic responsiveness. This work lays conceptual and methodological groundwork for bridging human-centered educational psychology with cutting-edge artificial intelligence.
In summary, the HKUST study challenges and refines our understanding of social interaction’s role in learning by empirically demonstrating that AI can effectively replicate critical cognitive pathways involved in human instruction. The convergence of eye-tracking, fMRI data, and behavioral outcomes documents a nuanced neurocognitive architecture supporting online education’s effectiveness, irrespective of the instructor’s biological substrate. This research marks a milestone in the evolving landscape of education, neuroscience, and AI, heralding a future of enriched, socially attuned digital learning environments.
Subject of Research: Not applicable
Article Title: Scaffolding human and AI instruction: Neural alignment and learning gains in online education
News Publication Date: 30-Apr-2026
Web References: https://www.sciencedirect.com/science/article/pii/S0896627326002746
References: DOI 10.1016/j.neuron.2026.04.005
Image Credits: HKUST
Keywords: Social sciences, AI in education, neural synchrony, cognitive neuroscience, online learning, eye-tracking, fMRI, GPT-4, educational technology, brain alignment