In a groundbreaking study that challenges longstanding assumptions about sensory processing in the brain, researchers have revealed that complex statistical representations of visual scenes arise as early as the primary visual cortex (V1), rather than emerging only in higher-order brain regions. This discovery, led by scientists Lee Doyun and Kim Yee-Joon at the Institute for Basic Science’s Center for Memory and Glioscience, illuminates how the brain rapidly distills vast, variable sensory data into coherent summaries, enabling efficient perception and decision-making. Published in Advanced Science on March 23, 2026, the research charts a hierarchical process in which early sensory areas compress detailed motion inputs into summary statistics that are then further abstracted in downstream regions such as the posterior parietal cortex (PPC).
Traditionally, the primary visual cortex has been conceptualized as a region dedicated to processing elementary visual features such as edges or the motion direction of individual elements. The new evidence, however, shows that V1 simultaneously encodes not only the average direction of complex motion patterns but also their statistical variance, a measure of the scatter, or uncertainty, among the individual motions. The retention of both mean and variance at this first cortical stage suggests that V1 performs a sophisticated summarization function, combining fluctuating sensory inputs into stable ensemble statistics that are robust to the noise inherent in single-neuron variability.
To probe how the brain achieves such an intricate computational feat, the researchers devised a novel experimental paradigm involving head-fixed mice trained to classify random-dot motion stimuli. Unlike conventional motion experiments that use coherent movement across dots, this study leveraged motion stimuli with independently varying directions for each dot—sampled from controlled distributions. This design enabled precise manipulation of both the mean direction and the spread (variance) of the motion cues, allowing investigators to disentangle how these statistical aspects are represented neurally and behaviorally.
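The core of this stimulus design, sampling every dot's direction independently from a distribution with a separately controllable mean and spread, can be sketched as follows. The exact distribution used in the study is not stated here, so a wrapped normal is assumed purely for illustration:

```python
import numpy as np

def random_dot_stimulus(n_dots=200, mean_dir_deg=45.0, spread_deg=30.0, seed=None):
    """Sample an independent motion direction (in degrees) for every dot.

    Directions are drawn from a wrapped normal distribution, so the mean
    direction and the spread of the ensemble can be set independently,
    mirroring the study's manipulation of motion mean and variance.
    """
    rng = np.random.default_rng(seed)
    return rng.normal(mean_dir_deg, spread_deg, size=n_dots) % 360.0

# Example: 500 dots centered on 90 degrees with a 20-degree spread.
dots = random_dot_stimulus(n_dots=500, mean_dir_deg=90.0, spread_deg=20.0, seed=0)

# The circular mean of the sampled directions lands near the nominal 90 degrees,
# even though no single dot is constrained to move that way.
circ_mean = np.degrees(np.angle(np.mean(np.exp(1j * np.radians(dots))))) % 360.0
```

Note the circular mean rather than a plain average: directions wrap at 360 degrees, so averaging unit vectors is the appropriate summary.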
Despite the variability of the local motion signals, the mice learned to categorize the stimuli into broad directional groups, demonstrating an ability to extract a holistic statistical summary rather than track a subset of prominent individual movements. This behavioral capability highlights a form of ensemble perception in which the brain rapidly captures the “gist” of a dynamic scene, pointing to neural mechanisms tuned to global statistical properties rather than isolated features.
Neural recordings obtained via miniscope calcium imaging provided compelling insights at both cellular and population levels. While only a minority of individual neurons in V1 exhibited marked selectivity for the global mean motion direction, the collective activity across the cortical population reliably encoded both the mean and variance statistics of the motion stimuli. This population-level encoding underscores the importance of distributed neural coding strategies where seemingly unselective neurons contribute to the overall fidelity of sensory representations when considered as part of an integrated ensemble.
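As a toy illustration of how a single noisy population pattern can carry both statistics, consider a hypothetical bank of cosine-tuned neurons whose modulation depth shrinks as the stimulus spread grows (averaging cosine tuning over a wrapped-normal direction distribution attenuates it by exp(-spread²/2), a standard circular-statistics result). The tuning model and all numbers are assumptions for illustration, not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population: 100 neurons with evenly spaced preferred
# directions and cosine tuning to the stimulus mean direction.
n = 100
pref = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)

def population_response(mean_dir, spread):
    # Wider stimulus spread -> shallower population modulation.
    gain = np.exp(-spread ** 2 / 2.0)
    rates = 1.0 + gain * np.cos(pref - mean_dir)
    return rates + 0.1 * rng.standard_normal(n)   # single-trial noise

mean_dir, spread = 1.2, 0.4                        # radians
r = population_response(mean_dir, spread)

# Population vector: its angle reads out the mean direction, and its
# length reads out the modulation gain, hence the stimulus spread --
# two summary statistics from one population pattern.
vec = np.sum(r * np.exp(1j * pref))
decoded_dir = np.angle(vec) % (2.0 * np.pi)
decoded_gain = np.clip(2.0 * np.abs(vec) / n, 1e-9, 1.0)
decoded_spread = np.sqrt(-2.0 * np.log(decoded_gain))
```

The point of the sketch is the division of labor: direction lives in the phase of the population pattern, variance in its amplitude, so both survive averaging over noisy single neurons.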
The hierarchical nature of this processing stream was further elucidated by recordings in the PPC, a higher-order cortical area implicated in perceptual decision-making. In contrast to V1’s encoding of summary statistics, PPC neural activity transformed these sensory summaries into more abstract, task-relevant category representations. This progressive abstraction from raw statistical encoding to categorical decision signals exemplifies the brain’s capacity to compress environmental complexity into manageable cognitive constructs that guide behavior in real time.
An intriguing aspect of the findings is the malleability of early sensory representations based on task demands. During active categorization, the V1 representation of mean motion direction exhibited systematic biases aligning with learned category centers. Such top-down influences reveal that primary sensory cortex is not a passive recipient of raw stimuli but is dynamically shaped by learning and cognitive context, which optimizes sensory coding for behaviorally relevant distinctions.
Further analysis illuminated that neurons traditionally classified as “untuned” due to weak individual selectivity nevertheless played a crucial role in ensemble coding. Their distributed contribution was essential for maintaining an accurate population code for global motion direction, highlighting the value of looking beyond classical tuning curves to understand sensory representation mechanisms fully. This insight challenges simplistic views of neuronal selectivity and promotes a more holistic perspective on cortical information processing.
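A minimal simulation of this point uses hypothetical neurons whose tuning depth is on the order of their trial-to-trial noise, so that no single cell would pass a classical selectivity test, yet a population-vector readout across them still recovers the global direction. All parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# 90 weakly modulated neurons: tuning depth 0.1 against trial noise of
# standard deviation 0.2, so each cell looks "untuned" on its own.
n, depth, noise_sd = 90, 0.1, 0.2
pref = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
true_dir = 2.0   # radians

estimates = []
for _ in range(300):
    # Single-trial responses dominated by noise at the single-cell level.
    r = 1.0 + depth * np.cos(pref - true_dir) + noise_sd * rng.standard_normal(n)
    # Population vector pooled across all weak cells.
    vec = np.sum(r * np.exp(1j * pref))
    estimates.append(np.angle(vec))

# Circular mean of the per-trial estimates: individually unreliable
# cells nonetheless yield a consistent ensemble readout.
decoded = np.angle(np.mean(np.exp(1j * np.array(estimates)))) % (2.0 * np.pi)
```

The weak per-neuron signal does not vanish in the pooled readout because the noise is independent across cells while the tuning is systematically organized, which is the essence of the distributed-code argument above.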
Co-corresponding author Kim Yee-Joon emphasized the broader implications: the study reveals fundamental principles underlying the brain’s efficient interpretation of complex scenes through hierarchical reorganization of visual information. From statistical summaries in early cortex to category representations downstream, the brain implements an elegant compression scheme to distill sensory noise into reliable perceptual signals. Such mechanisms are likely generalizable across various sensory modalities and cognitive functions.
The impact of these findings extends beyond neuroscience, offering valuable paradigms for artificial intelligence and computer vision. Understanding how biological systems robustly categorize variable, noisy sensory inputs into stable perceptual categories can inspire algorithms that better mimic human-like scene comprehension and decision-making. The demonstration that statistical summaries emerge early and are dynamically influenced by task context suggests new pathways to enhance machine learning models with hierarchical and context-sensitive representations.
This study’s integration of behavioral training, precise stimulus control, and advanced population-level neural imaging sets a new standard for dissecting the neural basis of ensemble perception. By revealing how the primary visual cortex and posterior parietal cortex collaborate to encode and transform statistical information, the research opens fresh avenues for exploring sensory coding, learning, and cognition in the mammalian brain.
As the brain encounters continuous streams of complex visual input, its capacity to abstract meaningful patterns swiftly and reliably is paramount. This work not only deepens our understanding of early sensory cortex functions but also places primary visual areas centrally in the cognitive machinery of perception, learning, and decision-making. Ultimately, unraveling these processes furnishes critical insights into the neural algorithms behind the brain’s remarkable ability to see beyond details and grasp the big picture.
Subject of Research: Animals
Article Title: Hierarchical summary statistics encoding across primary visual and posterior parietal cortices
News Publication Date: 23-Mar-2026
Web References: DOI link
Image Credits: Institute for Basic Science
Keywords: Visual cortex; Primary visual cortex (V1); Posterior parietal cortex (PPC); Ensemble perception; Motion processing; Statistical summary representation; Population coding; Calcium imaging; Neural population codes; Sensory coding; Hierarchical processing; Visual perception

