The discovery’s potential applications include self-driving cars, smartphones that can guide the visually impaired and drones that can help firefighters at night
As the industry for self-driving cars, robots and other unmanned vehicles quickly evolves – and engineers work to overcome the limitations of sensors that use visual, infrared or thermal information – math experts at Purdue University and the Technical University of Munich have proven there’s another, equally viable solution: using sound.
Mireille “Mimi” Boutin, associate professor of mathematics at Purdue University, and Gregor Kemper, professor of algorithmic algebra at the Technical University of Munich, found that a drone equipped with four microphones and a loudspeaker can precisely reconstruct the wall configuration of a room by listening to echoes, much as bats use echolocation to orient themselves.
Their work is significant because it demonstrates the feasibility of using sound for navigation in unmanned systems, with potential applications in cars, drones, underwater vehicles and even portable devices such as smartphones.
Kemper says: “The idea is to give the drone, or any other unmanned vehicle, the ability to navigate using sound.
“Cars are already equipped with cameras. A new approach might be to include an acoustic sensor as well, to enhance the visual information already available and give a better picture of reality.”
The research done by Boutin and Kemper – published in the current issue of SIAM Journal on Applied Algebra and Geometry – is based on a minimal setup: four microphones, arranged in a rigid, non-planar configuration, that record the sound emitted by a loudspeaker.
When a microphone hears an echo, the delay between the moment the sound was emitted and the moment it was heard is recorded; multiplied by the speed of sound, that delay gives the distance the sound traveled after bouncing off a wall.
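The time-of-flight calculation described above can be sketched in a few lines. A minimal illustration (the speed-of-sound constant and the function name are illustrative, not taken from the paper):

```python
SPEED_OF_SOUND = 343.0  # metres per second in air at roughly 20 °C

def echo_path_length(t_emitted: float, t_heard: float) -> float:
    """Total distance the sound traveled: loudspeaker -> wall -> microphone."""
    return (t_heard - t_emitted) * SPEED_OF_SOUND

# An echo heard 10 ms after emission corresponds to about 3.43 m of travel.
print(echo_path_length(0.0, 0.010))
```

Note that this gives the full path length of the bounce, not the distance to the wall itself; relating the four path lengths measured by the microphones to an actual wall position is where the mathematics comes in.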
Their novel method – called echo sorting – accurately determines which distance corresponds to which wall, ensuring that every wall that is heard is truly there and eliminating the phenomenon of “ghost walls”: walls that appear in the reconstruction but do not exist.
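The kind of consistency test that makes this possible can be illustrated with a simplified sketch. A flat wall acts like a mirror, so its echo behaves as if it came from a virtual copy of the loudspeaker behind the wall; four measured echo distances can belong to the same real wall only if a single virtual-source point reproduces all four of them. The code below is a hypothetical illustration of that check using basic multilateration, not the authors’ published algorithm:

```python
import numpy as np

def consistent_virtual_source(mics, dists, tol=1e-6):
    """Check whether four echo distances could all come from one wall.

    A flat wall mirrors the loudspeaker, so its echo looks like sound from a
    single 'virtual source' behind the wall.  We look for one point whose
    distances to the four microphones match all four measurements.
    mics: (4, 3) array of non-coplanar microphone positions.
    dists: four measured virtual-source-to-microphone distances.
    Returns the virtual-source point, or None if no consistent point exists.
    """
    mics = np.asarray(mics, dtype=float)
    d = np.asarray(dists, dtype=float)
    # Subtract the sphere equation of mic 0 from those of mics 1..3 to get a
    # linear system: 2*(m0 - mi) . x = d_i^2 - d_0^2 + |m0|^2 - |mi|^2.
    A = 2.0 * (mics[0] - mics[1:])
    b = (d[1:] ** 2 - d[0] ** 2
         + mics[0] @ mics[0]
         - np.einsum('ij,ij->i', mics[1:], mics[1:]))
    x = np.linalg.solve(A, b)
    # Verify the candidate point actually reproduces all four distances.
    if np.allclose(np.linalg.norm(mics - x, axis=1), d, atol=tol):
        return x
    return None
```

In this simplified setting, four distances that cannot be matched by any single point – for example, echoes mistakenly combined from two different walls – fail the verification step, which is how a “ghost wall” would be rejected.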
Applications for smartphones and cars
By solving the problem, the team has opened the door for several possible applications for echolocation, including:
- a smartphone that could be carried by a visually impaired person as a guide;
- a sensor that could be added to a self-driving or camera-equipped car to help when cameras don’t perform well, such as in the glare of the sun or a snowstorm; or
- a drone capable of seeing in the dark that could serve as an aid to firefighters.
Having more than one type of signal input means a system does not have to rely on a single sensor, improving the chances that objects can be detected accurately and under a wider variety of conditions, explains Boutin.
Boutin says: “Our algorithm shows that sound adds a level of reliability to existing approaches and therefore engineers should consider pursuing their work to build navigational systems that listen.”
The area of mathematics applied by Boutin and Kemper is based on methods from commutative algebra, some of which date back to the 1800s.
The pair was motivated to work on the problem of ghost walls when they saw the need to prove the reliability of proposed acoustic sensing solutions.
Boutin says: “We live in a world of big data and simulations where it’s easy to let the computer do the work to come up with an answer, and some people have become very good at it.
“But sometimes the only way to ensure that a computer-generated answer is reliable is to look at the problem analytically and show that we can trust it, and that’s something that can only be done with math.”
The next steps for the researchers will be to consider other scenarios, such as when the movement of the drone is restricted or when the drone hears echoes of consecutive sounds as it flies.
In the meantime, the algorithm is publicly available for mathematicians and engineers to use.