A new study published in iScience by researchers at the Ivcher Institute for Brain, Cognition, and Technology at Reichman University offers a new perspective on the brain's plasticity and the integration of the senses, paving the way for improved prosthetics and assistive technologies, and even richer interactions with virtual and augmented realities.
The team developed a novel Touch-Motion Algorithm (TMA) to explore how tactile sensation with a spatial component might allow participants to perceive externalized 3D information. Four experiments were performed in a higher-order ambisonics facility containing 97 speakers, which produces a 360-degree auditory scene and allowed the researchers to compare performance with the TMA against purely auditory capabilities. The results demonstrated that individuals can rapidly learn to localize the trajectories of sounds moving around them on a 360-degree plane using vibrotactile input to the fingertips, and can successfully integrate audio and tactile information under challenging environmental conditions.

"They were able to do this when receiving combined audio-tactile feedback, with an accuracy significantly exceeding chance levels and comparable to auditory capabilities, without any prior training," explains Dr. Adi Snir, first author of the paper and senior researcher at the institute.

"We wanted to test the potential of using multisensory integration in complex real-world environments," says Dr. Katarzyna Ciesla-Seifer, first author of the paper and senior researcher at the Ivcher Institute and the Institute of Physiology and Pathology of Hearing, Warsaw. "Performing auditory source separation is something we all do all the time, yet it is particularly challenging for people with hearing impairments, partly because their sense of sound location is often impaired," she adds.

Collectively, the findings illustrate the remarkable capacity of the human brain both to adapt to and to integrate novel sensory information. "Not only does this open new pathways and possibilities for enhancing human perception and interaction with the world around us, it also has serious implications for rehabilitation and for the proper development of assistive devices for individuals with hearing impairments."

"We currently have extremely promising results addressing some of the biggest challenges that cochlear implant patients face. At the same time, the technology holds great promise for enhancing the performance of people with normal hearing, and even for developing supra-human capabilities by linking the algorithm developed in the lab with other technologies such as virtual reality or brain implants similar to Elon Musk's Neuralink," Prof. Amir Amedi further elaborates.
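The paper does not reproduce the TMA itself, so the sketch below is only a hypothetical illustration of the general idea it describes: a sound source's position on a 360-degree plane is translated into vibration intensities on a small ring of fingertip actuators, with the vibration "moving" across the fingers as the source moves. The actuator count, the simple two-neighbor panning scheme, and the function name are assumptions for illustration, not the published algorithm.

```python
import numpy as np

def azimuth_to_vibration(azimuth_deg, n_actuators=5, max_amplitude=1.0):
    """Hypothetical mapping from a source azimuth (0-360 degrees) to
    vibration amplitudes on a ring of fingertip actuators.

    The two actuators nearest the source vibrate, weighted by angular
    distance (a simple panning scheme). This is an illustrative sketch,
    not the Touch-Motion Algorithm from the study.
    """
    spacing = 360.0 / n_actuators                  # angular gap between actuators
    azimuth = azimuth_deg % 360.0
    lower = int(azimuth // spacing) % n_actuators  # actuator just before the source
    upper = (lower + 1) % n_actuators              # actuator just after the source
    frac = (azimuth - lower * spacing) / spacing   # 0 at lower actuator, 1 at upper

    amplitudes = np.zeros(n_actuators)
    amplitudes[lower] = (1.0 - frac) * max_amplitude
    amplitudes[upper] = frac * max_amplitude
    return amplitudes

# A source sweeping around the listener drives a moving "bump" of
# vibration across the fingertips.
for az in (0, 45, 90, 180, 300):
    print(az, np.round(azimuth_to_vibration(az), 2))
```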
Journal
iScience
Method of Research
Experimental study
Subject of Research
People
Article Title
Localizing 3D motion through the fingertips: following in the footsteps of elephants
Article Publication Date
26-Apr-2024