Scienmag

How the Brain Uses Eye Movements to Perceive 3D Vision

February 5, 2026
in Social Science

When we walk down a street and observe the world around us, our brain performs a remarkable feat: distinguishing between objects that are stationary and those in motion. Consider the challenge of telling apart a parked car from one zipping past at high speed. It might seem trivial, but the mechanisms that allow this perception are highly intricate. This difficulty arises primarily because the motions of our eyes themselves induce apparent movement of the entire visual scene across the retina—a phenomenon long regarded as visual “noise” that the brain must filter out to perceive true object motion.

Traditional neuroscience has held that the visual system must subtract out the retinal motion generated by eye movements to isolate the motion of objects relative to the environment. However, this longstanding notion has been challenged by new research from the University of Rochester. Their groundbreaking investigation reveals that the visual motion caused by our eye movements is far from meaningless interference. Instead, these specific patterns of image motion are valuable clues that the brain actively analyzes to decipher how objects move and, critically, how they occupy three-dimensional space.

Leading this inquiry is Professor Greg DeAngelis, a distinguished figure in brain and cognitive sciences, neuroscience, and biomedical engineering. According to DeAngelis, the assumption that image motion produced by eye movements is merely a nuisance variable to be discarded is a misconception. The team's findings show that the brain harnesses these global patterns of image flow to infer how the eyes are moving relative to the surrounding scene. This insight reframes eye movement-induced image motion as an essential component of depth and motion interpretation, not a problem to be erased.

To systematically investigate these dynamics, the research team devised an advanced theoretical framework predicting human perception of motion and depth under different eye movement conditions. This model accounts for the complex interplay between target object motion, eye fixation, and the accompanying retinal image displacement. By simulating multiple scenarios with varying object trajectories and gaze directions, they formulated precise predictions on observers’ perceptual errors regarding depth and motion.
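The study's model itself is not reproduced here, but the geometric intuition it builds on can be sketched in a few lines. Under standard pinhole-projection assumptions (the focal length, point coordinates, and rotation axis below are illustrative choices, not values from the study), image motion caused by a pure eye rotation depends only on where a point falls in the image, not on its depth, whereas motion caused by translation scales with inverse depth. This asymmetry is part of what makes the global flow pattern informative about 3D layout:

```python
import numpy as np

def project(p):
    """Pinhole projection (focal length 1): 3D point -> 2D image coordinates."""
    return p[:2] / p[2]

def rot_y(theta):
    """Rotation matrix about the vertical axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s, 0.0, c]])

def flow_from_eye_rotation(p, theta):
    """Image displacement of point p when the eye rotates by theta.
    Rotating the eye by R moves scene points by R^T in eye coordinates."""
    return project(rot_y(theta).T @ p) - project(p)

def flow_from_translation(p, t):
    """Image displacement of point p when the eye translates by t."""
    return project(p - t) - project(p)

# Two points that land on the SAME image location but at different depths.
near = np.array([0.1, 0.0, 2.0])   # depth Z = 2,  image x = X/Z = 0.05
far  = np.array([0.5, 0.0, 10.0])  # depth Z = 10, image x = X/Z = 0.05

# 1) Rotation-induced flow is identical regardless of depth: it depends only
#    on image position, giving the global "eye movement" signature.
f_rot_near = flow_from_eye_rotation(near, 1e-3)
f_rot_far  = flow_from_eye_rotation(far, 1e-3)

# 2) Translation-induced flow scales with 1/Z: the depth-bearing component.
t = np.array([1e-3, 0.0, 0.0])
f_tr_near = flow_from_translation(near, t)
f_tr_far  = flow_from_translation(far, t)

print(np.allclose(f_rot_near, f_rot_far))  # rotation flow carries no depth cue
print(f_tr_near[0] / f_tr_far[0])          # ~5.0 = Z_far / Z_near
```

Because the two flow components have such different depth signatures, an observer (or a model) that measures the full pattern of image motion during an eye movement can, in principle, recover both the eye's motion and the scene's 3D layout rather than discarding the signal as noise.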

The team validated these predictions through controlled experiments in immersive 3D virtual reality environments. Participants maintained fixation on a stable point while observing target objects moving through the scene. In one task, subjects adjusted a dial to align a secondary object's motion direction with their perceived trajectory of the target; in a second, depth-perception task, they indicated whether the target appeared closer or farther than the fixation point. The consistent, systematic perceptual biases observed in both tasks matched the theoretical predictions remarkably well, underscoring the model's robustness.

Importantly, this body of work demonstrates that the brain integrates multiple streams of information—especially the image motions generated by eye movements—when constructing its representation of the three-dimensional world. Rather than suppressing these retinal signals as noise, the visual system evaluates their spatial patterns to infer the real-world layout accurately. This nuanced understanding challenges canonical perspectives in vision science that have dominated for decades.

The implications of these findings extend beyond basic neuroscience, touching on real-world applications such as technological interfaces and virtual reality. DeAngelis points out that current VR systems largely ignore the dynamic relationship between eye movements and the visual scene when rendering images. This disconnect may produce visual conflicts causing discomfort or motion sickness among users, as the artificial image motion does not align with the brain’s expected sensory input during eye movements.

By incorporating models of how the brain processes eye movement-induced image motion, future VR technologies could render more naturalistic and stable visual environments. Such advancements have the potential not only to enhance user comfort and reduce motion sickness but also to improve immersion and accuracy in virtual spaces. This line of research opens pathways toward a new generation of visually intelligent systems that harmonize with the brain’s perceptual strategies.

Furthermore, these discoveries inform our understanding of neurological disorders affecting visual perception and motion processing. Conditions that impair the brain’s ability to integrate eye movement signals might underlie difficulties in spatial navigation, object recognition, or depth perception. By elucidating how the healthy brain solves these challenges, this research sets the stage for targeted therapies and diagnostic tools.

The study involved contributions from graduate and postdoctoral researchers, reflecting a collaborative endeavor across multiple domains of expertise. Zhe-Xin Xu, formerly a doctoral student and now a postdoc at Harvard, and Jiayi Pang, currently continuing graduate studies at Brown University, brought critical insights. Akiyuki Anzai, a research associate at Rochester, also played a key role, underscoring the multidisciplinary nature of the investigation within neuroscience and visual cognition.

Supported by the National Institutes of Health, this research underscores the value of integrating theoretical modeling with immersive experimental paradigms to unravel complex brain functions. The fusion of computational and behavioral approaches emerges as a powerful tool for deciphering perception mechanisms that govern human experience of space and motion.

Ultimately, this paradigm-shifting work not only redefines our conception of how eye movements affect visual perception but also paves the way for innovations across health sciences and technology. By revealing that eye movement-induced image motion serves as an informative signal rather than unwanted noise, this study illuminates the sophisticated strategies the brain employs to interpret and navigate the three-dimensional world around us.


Subject of Research: Neuroscience, Visual Perception, Eye Movement, 3D Spatial Interpretation

Article Title: The Brain’s Use of Eye Movement-Induced Image Motion to Interpret 3D Space and Object Motion

Web References:

  • University of Rochester: http://www.rochester.edu/
  • Greg DeAngelis Lab: https://www.sas.rochester.edu/bcs/people/faculty/deangelis_greg/index.html
  • Nature Communications article: https://www.nature.com/articles/s41467-025-67857-4
  • DOI: http://dx.doi.org/10.17605/OSF.IO/ZY8W6

References:
DeAngelis, G.C., Xu, Z.-X., Pang, J., Anzai, A. (2025). Patterns of visual motion produced by eye movements inform the brain’s perception of 3D motion and depth. Nature Communications. https://doi.org/10.17605/OSF.IO/ZY8W6

Image Credits: John Schlia Photography, University of Rochester

Keywords: Neuroscience, Visual Perception, Eye Movements, 3D Vision, Depth Perception, Motion Perception, Virtual Reality, Cognitive Psychology, Brain and Cognitive Sciences

Tags: 3D vision perception, brain and cognitive sciences, eye movement analysis, Greg DeAngelis contributions, neuroscience of perception, object motion differentiation, perception of stationary objects, retinal motion filtering, University of Rochester research, visual motion processing, visual noise interpretation, visual system mechanisms
© 2025 Scienmag - Science Magazine