(Santa Barbara, Calif.) — Birds migrating. Your cat, returning home from a day of roaming. Bees taking pollen to their hives. You, finding yourself back home without actually remembering the drive from work. Animal navigation is a fundamental behavior, so innate that most of the time we don’t notice that we’re doing it. And yet, so many times a day we (and the animals around us) unerringly find our ways to our target locations whether they be old haunts or new venues, from different directions and even in the dark.
How do we do it? That’s the question UC Santa Barbara neurobiologist Sung Soo Kim seeks to answer; more specifically, his work involves mapping networks of neurons involved in wayfinding. “My ultimate goal is to understand how the brain processes visual information and generates navigational commands to move,” he said.
With a 2024 Scholar Award from the McKnight Foundation, Kim is that much closer to his goal. He is one of 10 neuroscientists selected by the organization to receive the prestigious early-career award, which consists of $75,000 per year for three years. Kim is the first UCSB researcher to receive the award.
“I’m honored to receive this award,” Kim said. “It will help me to make solid progress in my research. It also gives me an opportunity to connect with top scientists in the country.”
Turns out, animals have a variety of ways to gather location information that they must then interpret to make decisions about where to go. Some rely on landmarks, others on smells and yet others on the Earth’s magnetic field. These inputs and others are transformed into a neural representation of the world that we are thought to hold in our minds to help us make navigation decisions based on our goals. It may be why we are able to avoid obstacles as we walk through a room even after the lights have been turned off.
How does this happen in the brain? To find out, Kim built a virtual reality arena for the fruit fly, a model organism in which approximately 50 “compass neurons” are arranged along the outside of a donut-shaped structure in the brain called the ellipsoid body. This simple structure encodes the fly’s sense of direction, and through the highly controlled environment of the arena, researchers can apply visual stimuli such as outdoor scenes, simulate motion and wind, and observe the tethered fly’s brain as it navigates the virtual scenes. It’s research that, according to the McKnight Foundation, “looks at the combined role of perception, cognition and motor control, three subfields of systems neuroscience that are rarely connected in a single research program.”
Of particular interest to Kim for this project are cues that humans cannot see but that abound in the fruit fly’s world.
“Humans use a lot of landmarks like buildings and signs as cues to navigate the world,” Kim said. “For the flies, it’s a little bit different.” For one thing, he explained, fruit flies are partial to vertically oriented objects, like trees. “But the more important thing is the polarization of light in the sky. There is a very specific pattern of light in the sky that is invisible to humans but insects can see,” he said.
Just as polarized sunglass lenses filter horizontal light waves to reduce glare from sunlight reflecting off water or snow, or from the headlights of an oncoming car, the atmosphere also filters out part of the incoming sunlight, creating patterns that insects are particularly attuned to. “Polarization, light patterns and the color of the stimulus are things that insects use for navigation,” Kim added, “and I want to understand how that information is processed in the visual system and how it is conveyed to the central brain where the navigation task is performed.”
The answers to Kim’s questions will go a long way toward gaining a basic understanding of how navigation works in animals and ultimately humans, but the results may also have some immediate practical applications. For instance, autonomous vehicles and robots that lose their link to GPS could shift into an alternative mode of navigation that collects information from their immediate surroundings and makes decisions on where to go based on that input. Looking farther out, this kind of research may also lead to insights on neurological conditions, such as Alzheimer’s disease, that often affect the parts of the brain involved in one’s sense of location.
“Of course, the fly brain is very far from the human brain,” Kim said. “But still, understanding how the navigation system works may potentially help us understand how and why the brain is malfunctioning in these cases.”