In a bustling fourth-grade classroom in Chicago, a novel experiment is quietly rewriting how social-emotional learning (SEL) is taught. This is not a story of robots mimicking human emotions or embedding themselves in children’s fictional social worlds. Rather, it is an inquiry into how robots using straightforward, factual dialogue, free of emotional pretense, can influence empathy, conflict resolution, and problem-solving skills among children. The outcomes promise to challenge deep-seated assumptions within educational technology circles.
At the forefront of this investigation is a research team from the University of Chicago’s Department of Computer Science, led by PhD student Lauren Wright under the mentorship of Assistant Professor Sarah Sebo. This partnership was forged with the vital cooperation of Chicago Public Schools (CPS), providing real-world access to classrooms and educational practitioners, alongside strategic support from Kiljoong Kim at Chapin Hall, who facilitated key institutional collaborations essential for the project’s cross-disciplinary success.
What distinguishes this research is its foundational approach: rather than starting with a preconceived robotic prototype, the team first immersed themselves in the dynamics of SEL within CPS classrooms. Interviews with and observations of educators revealed a significant gap: group lessons on SEL, typically held once weekly, often fail to engage every student because of the sheer scale and varied learning needs within the classroom. Teachers lamented the absence of effective one-on-one interaction, which is crucial for deepening students’ understanding and internalization of SEL concepts.
Addressing this gap, the research posed a provocative question: can robots supplement teachers in providing individualized SEL instruction without the need for simulated human-like emotional engagement? To explore this, an experimental study was designed involving fifty-two fourth-grade students divided into three groups. One cohort interacted with robots delivering SEL lessons imbued with fictional, emotion-rich dialogue. A second cohort engaged with robots employing strictly factual language, openly acknowledging their lack of feelings or social experiences. A control group continued with the standard classroom curriculum without robotic involvement.
The methodology leveraged established SEL frameworks, primarily adapting lessons from Second Step, a curriculum well integrated into CPS classrooms. Robots transformed these group lessons into personalized, context-sensitive dialogues, allowing teachers to focus their attention on other students or areas of instruction. This setup enabled a naturalistic evaluation of robotic efficacy in augmenting the teachers’ efforts rather than replacing them.
Intriguingly, results indicated that both groups interacting with robots demonstrated significant improvements in their grasp of SEL vocabulary and problem-solving frameworks compared to peers receiving traditional instruction alone. More surprising was the finding that students exposed to the factual robot dialogue, devoid of emotional mimicry, often exhibited deeper engagement and more authentic application of lesson language. This finding runs counter to the dominant paradigm that robots must simulate human-like emotion to be effective SEL tutors.
Lauren Wright reflects on this counterintuitive outcome, highlighting that the conventional strategy of endowing robots with fictional personalities may inadvertently distract students or inhibit their comfort in using SEL language. The straightforward honesty of factual robots, by contrast, appears to create a clearer pedagogical pathway, emphasizing conceptual understanding over emotional simulation. This challenges assumptions long held by designers of educational robotics.
From a technological standpoint, these findings signal a shift in how robotic agents might be designed for educational purposes. Instead of complex affective computing architectures aimed at mimicking emotional expressions and social backstories, simpler dialogue systems that maintain transparency and factual accuracy could deliver superior learning outcomes. This approach also alleviates ethical concerns surrounding children’s attachments to anthropomorphized machines, aligning educational technology with emerging societal demands for responsible AI deployment.
Furthermore, this research foregrounds the vital role of robots as supplements, not substitutes, in education. Assistive robotic tools can extend teachers’ capacity to provide individualized attention—a scarce resource in classrooms marked by increasing student-to-teacher ratios. Importantly, these technologies do not undermine the indispensability of human educators, whose empathetic and nuanced interactions remain irreplaceable in fostering holistic child development.
Sarah Sebo emphasizes this balance, cautioning against overreliance on technology, especially after the educational disruptions of the pandemic. Instead, she envisions robotic tutors as scalable augmentations, giving teachers breathing room and enabling targeted support for students who might otherwise be overlooked in group settings.
The broader implications of this Chicago-based experiment extend beyond local classrooms. The study exemplifies a model of partnership-driven innovation, synthesizing academic research, public school expertise, and social policy connectors. Its success invites other school districts and educational technology developers to reconsider entrenched assumptions about the roles and modalities of robot tutors, especially within the sensitive domain of social-emotional learning.
As schools globally grapple with integrating technology responsibly into pedagogy, this research provides a roadmap emphasizing transparency, ethical design, and empirical validation. Honest, factual robots represent a promising frontier where educational technology truly supplements and enhances human teaching rather than attempting to imitate or replace it.
In conclusion, the Chicago experiment signals a shift in educational robotics, advocating for designs grounded in authenticity and straightforward communication. It suggests that the future classroom need not be inhabited by simulations of human emotion but by honest allies that empower teachers and nurture every child’s emotional intelligence with clarity and purpose.
Subject of Research:
Robotic tutors’ impact on child social-emotional learning, comparing fictional emotional dialogue versus factual, emotion-free dialogue in classroom settings.
Article Title:
Fictional vs. Factual Robot Tutor Dialogue Can Shape Child Social-Emotional Learning
News Publication Date:
16-Mar-2026
Web References:
DOI: 10.1145/3757279.3785596
2026 ACM/IEEE International Conference on Human-Robot Interaction
References:
University of Chicago Department of Computer Science research led by Lauren Wright and Sarah Sebo; collaboration with Chicago Public Schools and Chapin Hall.
Keywords:
Robotics, Social-Emotional Learning, Educational Technology, Human-Robot Interaction, Transparency in AI, Personalized Education, Emotion-Free Robot Dialogue, SEL Curriculum, Teacher Support Tools, Responsible AI in Education.