In a notable advance for robotics and autonomous navigation, researchers at Worcester Polytechnic Institute (WPI) have unveiled a palm-sized aerial robot that uses ultrasound and artificial intelligence to traverse environments previously considered too hostile or complex for small drones. The project, led by Assistant Professor Nitin J. Sanket, takes a bioinspired approach: borrowing from bats, masters of echolocation, the robot navigates visually degraded environments such as fog, smoke, and darkness that often confound conventional drone sensors.
Traditional aerial robots rely heavily on sensors such as lidar and radar, which emit light or radio waves to map their surroundings and inform navigation decisions. Despite their effectiveness in many contexts, these systems carry substantial drawbacks: they are heavy, consume considerable power, and often drive up the cost of the drone platform. More critically, their performance degrades rapidly in adverse weather, poor lighting, and noisy environments. This is particularly problematic in emergency scenarios such as search-and-rescue missions, where visibility can be minimal and agility and reliability are paramount.
The WPI team’s system pairs milliwatt-level ultrasound emissions with a form of artificial intelligence known as deep learning to mimic the echolocation abilities of bats, which deftly navigate cluttered caves using minimal neural resources. The robot’s design integrates two ultrasound sensors inside an acoustic shield that dampens intense propeller noise, a perennial obstacle in aerial robotics, preserving the fidelity of the weak echo signals crucial to spatial perception.
Central to this approach are AI algorithms trained to decipher nuanced ultrasound echo patterns against background noise, enabling the robot to effectively “hear” its way through complicated obstacle courses. Unlike lidar and radar, the ultrasound system is low-power, less expensive, and independent of ambient lighting, the kind of flexibility essential for real-world deployments in challenging environments where human rescuers might be hindered by smoke, dust, or poor lighting.
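To make the idea concrete, the sketch below shows one way a learned echo-interpretation pipeline could be structured: a raw ultrasound echo is converted to a log-magnitude spectrogram and passed through a tiny convolutional network that outputs an obstacle-range estimate. This is a conceptual illustration only, not the WPI team's model; the sample rate, network layout, and single-range output are all assumptions made for the example.

```python
# Illustrative sketch of a learned echo-interpretation pipeline.
# NOT the WPI implementation; sample rate, architecture, and output are assumed.
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import spectrogram

FS = 200_000  # assumed ultrasound sampling rate, in Hz

def echo_to_spectrogram(echo: np.ndarray) -> torch.Tensor:
    """Convert a raw 1-D echo trace into a log-magnitude spectrogram tensor."""
    _, _, sxx = spectrogram(echo, fs=FS, nperseg=256, noverlap=128)
    logspec = np.log1p(sxx).astype(np.float32)
    return torch.from_numpy(logspec).unsqueeze(0)  # shape: (1, freq_bins, time_bins)

class EchoNet(nn.Module):
    """Toy CNN mapping an echo spectrogram to a single obstacle-range estimate (meters)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(16, 1)

    def forward(self, spec: torch.Tensor) -> torch.Tensor:
        x = self.features(spec.unsqueeze(0))  # add a batch dimension
        return self.head(x.flatten(1))

# Usage example: push a synthetic echo through the (untrained) toy model.
echo = np.random.randn(4096)  # stand-in for a recorded echo trace
model = EchoNet()
range_estimate = model(echo_to_spectrogram(echo))
print(f"predicted range (untrained, illustrative only): {range_estimate.item():.2f} m")
```

In a real system the network would be trained on echoes recorded against known obstacle layouts, and the propeller-noise suppression described above would happen before the signal ever reaches the model.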
The physical platform itself is an X-shaped quadrotor drone, compact at approximately six inches in width and weighing about one pound, optimized to be lightweight yet strong enough for stable flight. During trials, the drone autonomously navigated a range of scenarios, including outdoor wooded areas and indoor courses populated with obstacles such as transparent plastic poles and metal rods, as well as runs in near-total darkness with black obstacles. Researchers further tested resilience by introducing simulated fog and snow to replicate the adverse weather often encountered in disaster zones.
The results were striking: across 180 test flights, the ultrasound-guided aerial robot demonstrated success rates between 72% and 100% in effectively maneuvering through the courses. The system excelled at detecting and avoiding larger obstacles, although challenges remained in reliably identifying thin structures like slender metal poles and tree branches that provide weaker ultrasound signal reflections. These limitations highlight areas for future refinement in sensor sensitivity and AI pattern recognition algorithms.
The acoustic shield proved to be a key enabling element: by reducing the confounding noise generated by the drone’s own propellers, it allowed the ultrasound system to detect subtle echoes that would otherwise be masked by mechanical sound. This matters because propeller noise has historically complicated sonar-based navigation and limited the effectiveness of traditional ultrasonic sensors on aerial platforms.
Professor Sanket emphasized the potential implications of this research for real-world applications: “Small aerial robots equipped with low-power ultrasound navigation could extend flight duration and improve autonomy in cluttered, hazardous environments. This would be invaluable for search-and-rescue teams working in smoke-filled buildings, disaster rubble, or subterranean caves where visibility is poor and time is critical.”
The ultrasound navigation system’s low power requirement and minimal computational burden, enabled by efficient AI, mean these drones can operate longer on limited battery capacity. Extending flight time by even a few crucial seconds during a search-and-rescue mission could greatly enhance the chances of locating survivors and providing timely assistance.
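As a rough illustration of that trade-off, the back-of-the-envelope sketch below uses purely hypothetical numbers (the battery capacity, hover power, and sensor power draws are assumptions, not figures from the study) to show how cutting sensor power from watt level to milliwatt level translates into extra seconds of hover time.

```python
# Back-of-the-envelope endurance sketch with purely illustrative numbers
# (not taken from the study): sensor power draw vs. hover endurance.
battery_wh = 10.0      # assumed battery capacity, watt-hours
hover_power_w = 55.0   # assumed power required to hover, watts

def endurance_min(sensor_power_w: float) -> float:
    """Hover endurance in minutes for a given sensor power draw."""
    return battery_wh / (hover_power_w + sensor_power_w) * 60.0

print(f"with ~1 W lidar-class sensing:     {endurance_min(1.0):.2f} min")
print(f"with ~0.01 W ultrasound sensing:   {endurance_min(0.01):.2f} min")
```

Under these assumed numbers the difference amounts to roughly ten extra seconds of flight, which is the scale of gain the researchers argue can matter in time-critical rescue work.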
Looking ahead, the research team envisions shrinking the ultrasound components further to enable even lighter platforms that can stay aloft longer and maneuver with greater agility. Scaling down hardware alongside improving deep learning models could open doors to higher flight speeds and more complex pathfinding capabilities in challenging environments.
This research is part of a growing trend in robotics where natural biological systems inspire technological innovation, creating machines that can perform tasks humans find difficult or impossible. Prior work in Sanket’s lab includes bioinspired robots modeled on the flight and navigation mechanisms of bees and bats, illustrating the power of interdisciplinary science in solving complex engineering problems.
Supported by a grant from the U.S. National Science Foundation, this work represents a significant contribution to the fields of robotics, autonomous systems, and sensor technology. By demonstrating the feasibility of ultrasound-based navigation for palm-sized aerial robots, the study opens exciting avenues for practical deployment, particularly in life-saving search-and-rescue missions and other safety-critical operations.
For the scientific community and industry alike, these findings suggest a paradigm shift in the design and operation of compact aerial vehicles capable of autonomous flight in adverse and visually degraded settings, freeing drones from the limitations of conventional optical and radio-frequency sensors.
Ultimately, the success of this ultrasound-enabled drone not only signifies a technological milestone but also exemplifies how lessons from nature’s own navigators can redefine the future of robotics—transforming small flying machines into perceptive, intelligent explorers capable of saving lives.
Subject of Research: Palm-sized aerial robots using milliwatt ultrasound and AI for navigation in visually degraded environments
Article Title: Milliwatt ultrasound for navigation in visually degraded environments on palm-sized aerial robots
News Publication Date: March 25, 2026
Web References:
- Science Robotics Article DOI
- Worcester Polytechnic Institute
References: Published in Science Robotics, supported by U.S. National Science Foundation grant
Image Credits: Professor Nitin J. Sanket / Worcester Polytechnic Institute
Keywords
Aerial robots, Robotics, Autonomous robots, Artificial intelligence, Artificial neural networks, Robotic sensors, Robot navigation, Robotic designs, Robot flight, Robots and society, Ultrasound sensing, Search-and-rescue robotics

