Autonomous vehicles have eyes: cameras, lidar, radar. But ears? Here's what researchers at the Fraunhofer Institute for Digital Media Technology's Oldenburg branch for Hearing, Speech and Audio Technology, in Germany, built with The Hearing Car. The idea is to field microphones and AI for detecting, localizing, and classifying environmental sounds, helping cars react to dangers they cannot see. So far, that means approaching emergency vehicles; eventually it could mean pedestrians, a punctured tire, or failing brakes.
“It's about giving the car another sense so that it can understand the acoustic world around it,” says Moritz Brandes, project manager for The Hearing Car.
In March 2025, the Fraunhofer researchers hauled a Hearing Car prototype 1,500 kilometers from Oldenburg to a test site in northern Sweden. Brandes says the trip put the system through mud, snow, slush, road salt, and freezing temperatures.
How to build a car that listens
The team had several key questions to answer: What happens if the microphone housings get dirty or ice over? How does that affect localization and classification? Testing showed that performance degraded less than expected, and recovered once the modules were cleaned and dried. The team also confirmed that the microphones can survive a car wash.
Each external microphone module (EMM) contains three microphones in a 15-centimeter package. Mounted at the rear of the car, where wind noise is lowest, the modules capture sound, digitize it, convert it into spectrograms, and pass those to a region-based convolutional neural network (RCNN) trained to detect audio events.
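The preprocessing step, turning a digitized waveform into a spectrogram for the classifier, can be sketched roughly as follows. The sample rate, frame length, and hop size here are illustrative placeholders; Fraunhofer has not published its actual parameters, and the synthetic two-tone sweep merely stands in for a real siren recording.

```python
import numpy as np

def spectrogram(signal, frame_len=512, hop=256):
    """Short-time Fourier magnitude spectrogram on a log (dB-like) scale."""
    window = np.hanning(frame_len)
    n_frames = 1 + (len(signal) - frame_len) // hop
    frames = np.stack([signal[i * hop:i * hop + frame_len] * window
                       for i in range(n_frames)])
    spec = np.abs(np.fft.rfft(frames, axis=1))  # one row of frequency bins per frame
    return 20 * np.log10(spec + 1e-10)          # avoid log(0)

# Synthetic siren-like frequency sweep as a stand-in for microphone input
fs = 16_000                                     # assumed sample rate, Hz
t = np.arange(0, 1.0, 1 / fs)
siren = np.sin(2 * np.pi * (600 + 300 * np.sin(2 * np.pi * 1.5 * t)) * t)

S = spectrogram(siren)
print(S.shape)  # (time frames, frequency bins) -> the classifier's input image
```

The resulting time-frequency "image" is what makes convolutional architectures a natural fit for audio event detection.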
If the RCNN classifies an audio signal as a siren, the result is cross-checked against the car's cameras: Is there a blue flashing light? Combining “senses” like this increases the vehicle's reliability by reducing the chance of false positives. Audio signals are localized through beamforming, though Fraunhofer declined to provide hardware specifics.
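The cross-check logic can be sketched as a minimal fusion gate. The threshold value and the strict AND-gating are assumptions for illustration; a production system would more likely fuse probabilities from several sensors in a weighted or learned way.

```python
def confirm_siren(audio_prob: float, blue_light_seen: bool,
                  audio_threshold: float = 0.8) -> bool:
    """Fuse the audio classifier's siren score with camera confirmation.

    Requiring agreement between two independent senses suppresses false
    positives: a siren-like sound alone (say, from a passing car's radio)
    does not trigger an alert without the visual cue.
    """
    return audio_prob >= audio_threshold and blue_light_seen

# Classifier is confident AND the camera sees a blue flashing light:
print(confirm_siren(0.93, True))   # True -> raise the alert
# Siren-like audio but no visual confirmation:
print(confirm_siren(0.93, False))  # False -> suppress
```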
All processing happens on board to minimize latency. It also “eliminates worries about what happens in an area with a poor Internet connection or a lot of [radio-frequency] interference,” says Brandes. The workload, he adds, can be handled by a modern Raspberry Pi.
According to Brandes, early tests of The Hearing Car system included detecting sirens at distances of up to 400 meters in quiet, low-speed conditions. That figure drops to about 100 meters at highway speeds, amid wind and road noise, he says. Alerts are triggered after about two seconds, still enough time for drivers or autonomous systems to react.
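A rough back-of-the-envelope check shows why those numbers still leave time to react. The closing speeds below are assumptions chosen only to illustrate the two scenarios reported above, not figures from Fraunhofer:

```python
def reaction_margin_s(detection_range_m: float, closing_speed_kmh: float,
                      alert_latency_s: float = 2.0) -> float:
    """Seconds left to react after the alert fires, assuming the sound
    source closes at a constant relative speed."""
    time_to_arrival = detection_range_m / (closing_speed_kmh / 3.6)  # km/h -> m/s
    return time_to_arrival - alert_latency_s

# Quiet low-speed case: 400 m range, assumed 50 km/h closing speed
print(round(reaction_margin_s(400, 50), 1))   # 26.8 s to react
# Highway case: 100 m range, assumed 120 km/h closing speed
print(round(reaction_margin_s(100, 120), 1))  # 1.0 s to react
```

Even in the pessimistic highway case, the two-second alert latency still leaves a usable margin, though a thin one.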
This display doubles as a control panel and dashboard, allowing the driver to activate the car's “hearing.” Fraunhofer
The story of listening to cars
The roots of The Hearing Car stretch back more than a decade. “We have been working on making cars hear since 2014,” says Brandes. Early experiments were modest: detecting a nail in a tire by its rhythmic ticking against the pavement, or opening the trunk with a voice command.
A few years later, support from a Tier 1 supplier (a company that provides complete systems or major components, such as transmissions, brake systems, batteries, or advanced driver-assistance systems (ADAS), directly to automakers) pushed the work toward automotive-grade development, and a major automaker soon joined. With EV adoption rising, automakers began to understand why ears matter as much as eyes.
“A person hears a siren and reacts, even before seeing where the sound is coming from. An autonomous car must do the same if it is to coexist safely with us.” –Eoin King, University of Galway Sound Lab
Brandes recalls one telling moment: Sitting on a test track inside an electric car so well insulated from road noise, he could not hear an emergency siren until the vehicle was nearly on top of him. “It was a big ‘aha!’ moment that showed how important The Hearing Car will become as EV adoption increases,” he says.
Eoin King, professor of mechanical engineering at the University of Galway in Ireland, considers the jump from physics to AI transformative.
“My team took a very physics-based approach,” he says, recalling his 2020 work in this research field at the University of Hartford in Connecticut. “We looked at direction of arrival, measuring delays between microphones to triangulate where a sound is coming from. It demonstrated feasibility. But today, AI can go much further. Machine listening really changes the game.”
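The physics-based approach King describes, measuring inter-microphone delays to triangulate a source, can be sketched for a two-microphone, far-field case. The 15-centimeter spacing echoes the module size mentioned earlier; the sample rate and the simulated 10-sample delay are illustrative assumptions.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # meters per second, at roughly 20 degrees C

def direction_of_arrival_deg(mic_a, mic_b, fs, spacing_m):
    """Estimate a sound's bearing from the time delay between two mics.

    Cross-correlation finds how many samples mic_b lags mic_a; under a
    far-field assumption, delay = spacing * sin(theta) / c.
    """
    corr = np.correlate(mic_a, mic_b, mode="full")
    delay_s = (len(mic_b) - 1 - int(np.argmax(corr))) / fs
    sin_theta = np.clip(delay_s * SPEED_OF_SOUND / spacing_m, -1.0, 1.0)
    return float(np.degrees(np.arcsin(sin_theta)))

# Simulated broadband source arriving 10 samples later at the second mic
fs, spacing = 48_000, 0.15
rng = np.random.default_rng(0)
src = rng.standard_normal(4096)
mic_a = src
mic_b = np.concatenate([np.zeros(10), src[:-10]])
print(round(direction_of_arrival_deg(mic_a, mic_b, fs, spacing), 1))
```

With three microphones per module, as in the Fraunhofer design, pairwise delays like this can resolve the ambiguity a single pair leaves.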
Physics still matters, King adds: “It's almost like physics-informed AI. Traditional approaches showed it was possible. Now, machine-learning systems can generalize much better across different environments.”
The future of sound in autonomous vehicles
Despite the progress, King, who leads the Galway Sound Lab's research in acoustics, noise, and vibration, is cautious.
“Five years out, I see this as a niche,” he says. “It takes time to become standard technology. Automatic transmissions were a niche once, but now they are everywhere. Hearing technology will get there, but step by step.” Large-scale deployment is likely to appear first in premium vehicles or autonomous fleets, with mass adoption further out.
King doesn't mince words about why sound perception matters: Autonomous vehicles must coexist with people. “A person hears a siren and reacts, even before seeing where the sound is coming from. An autonomous car must do the same if it is to coexist safely with us,” he says.
King's vision is vehicles with multisensory awareness: cameras and lidar for vision, microphones for hearing, perhaps even vibration sensors for road monitoring. “Smell,” he jokes, “might be a step too far.”
Fraunhofer's Swedish road test showed that durability is not a major obstacle. King points to another area of concern: false alarms.
“If you train the car to stop when it hears someone shouting, what happens when kids do it as a joke?” he asks. “We must carefully validate these systems before putting them on the road. This is not consumer electronics, where if ChatGPT gives you a wrong answer you can simply rephrase the question. People's lives are at stake.”
Cost is less of a problem: microphones are cheap and rugged. The real challenge is ensuring the algorithms can make sense of noisy urban soundscapes filled with horns, garbage trucks, and construction.
Fraunhofer is currently refining its algorithms with broader datasets, including sirens from the United States, Germany, and Denmark. Meanwhile, King's lab is improving sound detection in indoor contexts, work that could be repurposed for cars.
Some scenarios, such as a hearing car detecting a fire engine before it comes into view, may be years away, but King insists the principle holds: “It's theoretically possible with the right data. The challenge is getting that data and training on it.”
Both Brandes and King agree that no single sense is enough. Cameras, radar, lidar, and now microphones work together. “Autonomous vehicles that rely only on vision are limited to the line of sight,” says King. “Adding acoustics adds another degree of safety.”