Why AI Lacks the Human Touch in Understanding Flowers

June 4, 2025
in Technology and Engineering

Despite advancements in artificial intelligence, a recent study indicates that AI tools like ChatGPT fall short of representing concepts as richly and deeply as humans do. This research, conducted by a group of psychologists led by Qihui Xu from The Ohio State University, highlights the fundamental differences in how humans and AI understand the world, particularly regarding sensory and motor experiences. The study found that while AI models excelled in representing words devoid of sensory connections, they struggled significantly with concepts that rely on the richness of human experiences.

The core of the issue lies in the architecture of large language models (LLMs), which predominantly harness linguistic data. Unlike humans, who draw on multisensory experience—sight, sound, touch, taste, and smell—AI models learn from extensive text-based datasets. This discrepancy leaves a significant gap in the understanding of concepts such as flowers or food, where sensory engagement is critical. As Xu articulates, “A large language model can’t smell a rose, touch the petals of a daisy or walk through a field of wildflowers.” This inability to engage the senses limits the AI’s conceptual framework, which lacks the rich, embodied experiences humans draw upon.

The study, published in Nature Human Behaviour, examined how humans and four state-of-the-art AI models—OpenAI’s GPT-3.5 and GPT-4, and Google’s PaLM and Gemini—represent a range of concepts. The researchers focused on nearly 4,500 words, analyzing the degree of alignment between human and AI conceptual understanding. Assessments were based on two distinct measures: the Glasgow Norms, which rate words along dimensions such as arousal and imageability, and the Lancaster Norms, which scrutinize how strongly concepts are tied to sensory and motor information.

In one part of their analysis, the researchers examined how well human and AI ratings correlated across concepts. They aimed to discern whether the two agreed on a concept’s emotional weight, or on how it is perceived along other dimensions. The results revealed an intriguing pattern: while AI performed admirably on abstract concepts lacking sensory connection, it faltered significantly when confronted with sensory-rich terminology.
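At its core, this kind of alignment analysis amounts to correlating two sets of per-word ratings. The sketch below is a minimal, hypothetical illustration—the word list, the scores, and the choice of Spearman rank correlation are all assumptions for the example, not the study's actual pipeline or data:

```python
def ranks(values):
    """Assign 1-based ranks to a list of values, averaging ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # extend j over any run of tied values
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank for the tied run
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(xs, ys):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = ranks(xs), ranks(ys)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Invented example: human vs. model "imageability" ratings on a 1-7 scale.
# The real study compared nearly 4,500 words against the Glasgow and
# Lancaster Norms; these four words and scores are illustrative only.
human = {"flower": 6.8, "justice": 2.1, "lemon": 6.5, "theory": 2.4}
model = {"flower": 5.0, "justice": 2.3, "lemon": 5.9, "theory": 2.2}
words = sorted(human)
rho = spearman([human[w] for w in words], [model[w] for w in words])
print(f"Spearman rho = {rho:.2f}")  # prints: Spearman rho = 0.60
```

A higher rho means the model's ratings order the words the way human raters do; running such a correlation separately for sensorimotor and non-sensorimotor dimensions is one way the gap the study describes would surface.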

Words that relate to human touch, taste, or sight posed considerable challenges for AI. For instance, the term ‘flower’ encompasses a multitude of experiences beyond its mere linguistic definition; it invokes vivid memories from scent, texture, and emotional context. Xu highlighted that the representation of a flower in human thought encompasses diverse sensory experiences that AI fails to integrate adequately. In essence, human cognition creates a multifaceted tapestry of experiences that words alone cannot encapsulate.

The researchers further explored the implications of these findings for future interactions between AI and humans. If AI processes the world differently, it could lead to misunderstandings or diminished effectiveness in communication. As AI technologies become more integrated into our daily lives, the nuances in their understanding of concepts can significantly affect their interactions with human users.

The study also spotlighted an evolving trend: while AI has a long way to go in replicating human-like conceptualization, there are improvements on the horizon. Models trained not just on text but also on images grasped vision-related concepts better than their text-only counterparts. This suggests a pathway for AI to enrich its representations by incorporating additional sensory modalities.

As the domain of artificial intelligence continues to evolve, it is conceivable that future advancements may incorporate more sophisticated forms of understanding, potentially enriched through sensory data directly linked to robotic interactions within the physical world. Xu anticipates that as LLMs become more integrated with sensor technologies, their ability to emulate human-like understanding could dramatically improve.

Despite the current limitations, researchers and developers maintain an optimistic outlook for the next generation of AI models. By embracing a more holistic approach that combines language with sensory experiences, the gap between human and AI understanding could narrow, leading to more intuitive and effective interaction paradigms.

The findings of this study emphasize the multifaceted nature of human understanding, underscoring that our experiences are significantly shaped by our direct engagement with the world. In contrast, AI’s reliance on text alone renders it an incomplete mimic of human cognition, illustrating that there is much room for growth and improvement in developing AI technologies.

In conclusion, the study by Xu and her colleagues paves the way for critical discussions surrounding the future of AI and its evolving relationship with human users. As advancements are made, the integration of sensory and motor experiences into AI frameworks could herald a new era in artificial intelligence, where AI not only understands language but also experiences the richness of life similarly to humans.

Subject of Research: People
Article Title: ‘Large language models without grounding recover non-sensorimotor but not sensorimotor features of human concepts’
News Publication Date: 4-Jun-2025
Web References: http://dx.doi.org/10.1038/s41562-025-02203-8

Keywords

AI, language models, human cognition, sensory experience, concept representation, emotional arousal, multimodal learning, artificial intelligence, robotics, human-computer interaction.

© 2025 Scienmag - Science Magazine
