AI-powered ‘sonar’ on smartglasses tracks gaze, facial expressions

April 18, 2024
in Technology and Engineering

ITHACA, N.Y. – Cornell University researchers have developed two technologies that track a person’s gaze and facial expressions through sonar-like sensing. The technology is small enough to fit on commercial smartglasses or virtual reality or augmented reality headsets, yet consumes significantly less power than similar tools using cameras.

Both use speakers and microphones mounted on an eyeglass frame to bounce inaudible soundwaves off the face and pick up reflected signals caused by face and eye movements. One device, GazeTrak, is the first eye-tracking system that relies on acoustic signals. The second, EyeEcho, is the first eyeglass-based system to continuously and accurately detect facial expressions and recreate them through an avatar in real time.
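
In signal-processing terms, this works like a miniature sonar: the speaker emits an inaudible probe signal, and correlating each microphone’s recording against that signal yields an “echo profile” whose peaks correspond to reflecting surfaces; eye or skin movement changes the profile from one emission to the next. The sketch below illustrates the general idea, assuming a chirp-style sweep and matched filtering; every parameter (sample rate, sweep band, chirp length) is an illustrative stand-in, not a value from the GazeTrak or EyeEcho papers.

```python
# Minimal sketch of chirp-based acoustic sensing, assuming a near-ultrasonic
# frequency sweep and matched filtering; all parameters are illustrative
# stand-ins, not values from the GazeTrak/EyeEcho papers.
import numpy as np

FS = 48_000               # sample rate in Hz (assumed)
F0, F1 = 18_000, 22_000   # inaudible sweep band in Hz (assumed)
CHIRP_SAMPLES = 480       # 10 ms chirp (assumed)

def make_chirp() -> np.ndarray:
    """Linear frequency sweep from F0 to F1."""
    freq = np.linspace(F0, F1, CHIRP_SAMPLES)
    return np.sin(2 * np.pi * np.cumsum(freq) / FS)

def echo_profile(received: np.ndarray, chirp: np.ndarray) -> np.ndarray:
    """Correlate the mic signal with the emitted chirp. Peaks mark
    reflecting surfaces at different distances from the glasses."""
    return np.abs(np.correlate(received, chirp, mode="valid"))

def motion_profile(prev: np.ndarray, curr: np.ndarray) -> np.ndarray:
    """Difference of consecutive echo profiles: static reflections cancel,
    leaving only the changes caused by eye or skin movement."""
    return curr - prev

# Demo: simulate a reflection from a surface ~2 cm away and recover its delay.
chirp = make_chirp()
delay = round(FS * (2 * 0.02) / 343)   # round-trip time at the speed of sound
rx = np.concatenate([np.zeros(delay), chirp])
rx += 0.01 * np.random.default_rng(0).normal(size=rx.size)
print(echo_profile(rx, chirp).argmax())  # ~= delay
```

A wearable system would run this continuously, one profile per emitted chirp, feeding the stream of differential profiles to the learned models described below.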

The devices can last for several hours on a smartglass battery and more than a day on a VR headset.

“It’s small, it’s cheap and super low-powered, so you can wear it on smartglasses every day – it won’t kill your battery,” said Cheng Zhang, assistant professor of information science. Zhang directs the Smart Computer Interfaces for Future Interactions (SciFi) Lab that created the new devices.

“In a VR environment, you want to recreate detailed facial expressions and gaze movements so that you can have better interactions with other users,” said Ke Li, a doctoral student who led the GazeTrak and EyeEcho development.

For GazeTrak, researchers positioned one speaker and four microphones around the inside of each eye frame of a pair of glasses to bounce and pick up soundwaves from the eyeball and the area around the eyes. The resulting sound signals are fed into a customized deep learning pipeline that continuously infers the direction of the person’s gaze.
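
As a rough illustration of what such a pipeline might look like, the hypothetical model below maps a short window of echo profiles from the eight microphones (four per eye) to a two-dimensional gaze direction. The architecture, input dimensions, and all names are assumptions made for the sketch, not the authors’ published model.

```python
# Hypothetical gaze-regression sketch in PyTorch. Channels = microphones,
# height = time (consecutive chirps), width = range bins of the echo profile.
# Architecture and dimensions are illustrative, not the published pipeline.
import torch
import torch.nn as nn

N_MICS = 8      # four microphones around each of the two eye frames
N_FRAMES = 16   # temporal window of consecutive echo profiles (assumed)
N_RANGE = 64    # range bins per echo profile (assumed)

class GazeRegressor(nn.Module):
    def __init__(self) -> None:
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(N_MICS, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, 2)  # (yaw, pitch) of the gaze direction

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))

model = GazeRegressor()
echo_windows = torch.randn(4, N_MICS, N_FRAMES, N_RANGE)  # a simulated batch
print(model(echo_windows).shape)  # torch.Size([4, 2])
```

Treating the microphone channels as image channels lets standard 2-D convolutions pick up structure across both time and echo distance at once.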

For EyeEcho, one speaker and one microphone are located next to the glasses’ hinges, pointing down to catch skin movement as facial expressions change. The reflected signals are likewise interpreted using AI.
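
Conceptually, the expression-tracking loop is similar but runs continuously: each emitted chirp yields a two-channel echo frame (one microphone per hinge), the change between consecutive frames captures skin motion, and a learned model turns that motion into expression parameters that drive an avatar. The stub below sketches that loop; the blendshape output format and the stand-in “model” are hypothetical.

```python
# Illustrative streaming loop for EyeEcho-style expression tracking. The
# echo-frame capture and the regression model are stubs; the ARKit-style
# blendshape output is an assumed representation, not from the paper.
import numpy as np

N_MICS, N_RANGE = 2, 64   # one mic per hinge; range bins per profile (assumed)
N_BLENDSHAPES = 52        # ARKit-style expression coefficients (assumed)

rng = np.random.default_rng(0)
W = 0.1 * rng.normal(size=(N_BLENDSHAPES, N_MICS * N_RANGE))  # stub weights

def read_echo_frame() -> np.ndarray:
    """Stand-in for hardware capture: one echo profile per hinge microphone."""
    return rng.normal(size=(N_MICS, N_RANGE))

def regress_blendshapes(frame: np.ndarray, prev: np.ndarray) -> np.ndarray:
    """Stand-in for the learned model: maps the change between consecutive
    echo frames (i.e., skin motion) to expression weights in [0, 1]."""
    motion = (frame - prev).ravel()
    return 1.0 / (1.0 + np.exp(-W @ motion))  # sigmoid squash to [0, 1]

prev = read_echo_frame()
for _ in range(3):            # in practice, one iteration per emitted chirp
    frame = read_echo_frame()
    weights = regress_blendshapes(frame, prev)
    prev = frame
    # An avatar renderer would consume `weights` here to mirror the
    # wearer's expression in real time.
    print(weights[:4].round(2))
```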

With this technology, users can have hands-free video calls through an avatar, even in a noisy café or on the street. While some smartglasses can recognize faces or distinguish between a few specific expressions, none currently tracks expressions continuously the way EyeEcho does.

These two advances have applications beyond enhancing a person’s VR experience. GazeTrak could be used with screen readers to read out portions of text for people with low vision as they peruse a website.

GazeTrak and EyeEcho could also potentially help diagnose or monitor neurodegenerative diseases such as Alzheimer’s and Parkinson’s. With these conditions, patients often have abnormal eye movements and less expressive faces, and this type of technology could track the progression of the disease from the comfort of a patient’s home.

Li will present GazeTrak at the Annual International Conference on Mobile Computing and Networking in the fall and EyeEcho at the Association for Computing Machinery CHI Conference on Human Factors in Computing Systems in May.

For additional information, see this Cornell Chronicle story.

Media note: Pictures can be viewed and downloaded here: https://cornell.box.com/v/sonarsmartglasses.

-30-