What should electric vehicles sound like?
- ntyler31

There is a lot of concern being expressed about how quiet electric vehicles are and what this means for pedestrian safety. This has always puzzled me: why are we worried about cars being so quiet, when there is an overarching sense that cities are too noisy? I think the answer is actually quite deep. We explored the relationship between sound and moving vehicles when we created the alert sound for e-scooters in London. In that case, the issue was the surprise resulting from the sudden appearance of an e-scooter that had not been anticipated. Why was it not anticipated? Because the sound it generated was quieter than the surrounding urban noise, so the brain's natural alert system was not being triggered. The auditory cue, which evolved to tell our vision system where to look for the source of a sound, never fired, so the vision system was not expecting anything to appear. When the e-scooter did appear, it was therefore a shock. The response to this shock was powerful, along the lines of "that could have killed me!", and the overall result was to raise stress levels as a protection against this unexpected threat: when might it happen again? This is all part of the brain's survival mechanism, driven by the process of Active Inference.
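As a toy illustration of that mechanism (the levels, the Gaussian prior and all the numbers below are my own assumptions for the sketch, not PEARL's model), one can express the alert as a spike of "surprise" when an observation violates the brain's predictions:

```python
# Toy sketch: the brain's alert as "surprise" (negative log-probability)
# under a simple Gaussian prior. All values here are illustrative assumptions.
import math

def surprisal(observed, expected, sd):
    """Surprise of an observation under a Gaussian prior belief."""
    return 0.5 * ((observed - expected) / sd) ** 2 + math.log(sd * math.sqrt(2 * math.pi))

AMBIENT_DB = 70.0    # assumed urban background level
ESCOOTER_DB = 62.0   # assumed e-scooter level: below the ambient, so masked

# Auditory channel: the e-scooter's sound is barely distinguishable from the
# prior expectation of "background noise", so it generates little surprise...
print(surprisal(ESCOOTER_DB, expected=AMBIENT_DB, sd=5.0))   # ~3.8: no alert

# ...whereas its sudden visual appearance violates a strong prior of
# "empty footway", producing a large spike of surprise: the shock.
print(surprisal(observed=1.0, expected=0.0, sd=0.1))         # ~48.6: alarm
```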
We resolved the problem with the e-scooters by creating a sound that would trigger the internal alert system without being yet another addition to the already-too-noisy soundscape in the city. I discussed this in an earlier post (https://www.pearl.place/post/how-pearl-helps-us-see-the-world-in-a-new-way-or-is-it-how-we-are-meant-to-see-it).
The issue for electric cars is in some ways similar, but actually rather different. Approaching the question of alerts for electric vehicles means working with the evolved capabilities of the human being, not just creating a loud noise. Emergency vehicles are notorious for their loud sirens, yet in every case there is a problem of knowing exactly where the vehicle is so that we can decide what to do. The sound is loud, yes, but it is unlocatable, especially in the hard-surfaced, built-up surroundings of a modern city. Often we only actually know where the emergency vehicle is when we manage to see its flashing lights. For a car without flashing lights the problem matters because a pedestrian needs to know where the vehicle is in order to judge, for example, whether or not to cross the road. But also, do we really want to live in a city infested with auditory noise at this level?
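A rough sketch shows why loudness does not equal locatability, using the standard interaural-time-difference (ITD) model of localization; the head geometry and the siren angles below are illustrative assumptions:

```python
# A minimal sketch of ITD-based localization and how hard surfaces defeat it.
import math

SPEED_OF_SOUND = 343.0   # m/s, air at ~20 °C
HEAD_WIDTH = 0.18        # m, approximate distance between the ears

def itd(azimuth_deg):
    """Interaural time difference (seconds) for a source at a given azimuth."""
    return (HEAD_WIDTH / SPEED_OF_SOUND) * math.sin(math.radians(azimuth_deg))

# A siren 40 degrees to the right arrives at the near ear ~0.34 ms early:
print(f"direct path ITD:    {itd(40) * 1e3:.2f} ms")

# But in a hard-surfaced street, a reflection from a facade on the OTHER side
# arrives almost as loud, carrying a contradictory cue:
print(f"reflected path ITD: {itd(-40) * 1e3:.2f} ms")
# Conflicting cues like these are why a loud siren can still be unlocatable.
```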
The fact is that we do not cope well with loud noises. The city soundscape is now at a level that leaves little scope for subtlety in warning sounds, which means that even as vehicles themselves become quieter, we might be subjected to increasing noise levels. What can we do about this?
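A back-of-the-envelope sketch makes the arithmetic of this escalation concrete; the levels and the "10 dB above ambient" rule of thumb below are assumptions for illustration, not measurements:

```python
# Sound levels combine on a logarithmic scale, so alerts ratchet upward.
import math

def combine_db(*levels):
    """Total level of several incoherent sources, each given in dB."""
    return 10 * math.log10(sum(10 ** (l / 10) for l in levels))

BACKGROUND = 70.0                      # assumed busy-street background, dB

# To be reliably heard over the background, a conventional alert must beat it:
alert = BACKGROUND + 10                # crude "10 dB above ambient" assumption
print(combine_db(BACKGROUND, alert))   # ~80.4 dB: the alert now dominates

# And every vehicle that adds such an alert pushes the whole soundscape up:
print(combine_db(BACKGROUND, alert, alert, alert))  # ~84.9 dB with three alerts
```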
I think the answer is to examine much more closely how we interpret smaller sounds in the urban context, so that we can design a sound that indicates the approach of a vehicle by working with how the human hearing system works, rather than simply blasting it with high-energy pressure waves. The e-scooter work succeeded in this way through careful design, but the situation of a vehicle is different: its position in the streetscape is easier to determine (it is unlikely to be travelling on the footway, for example), but it might be travelling faster, and it is certainly heavier and bigger than an e-scooter, so the consequences of an impact with a person will be greater.
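One plausible version of "working with the hearing system", sketched below purely as an assumption (this is not the method PEARL used), is to place the alert's energy where the background spectrum is weakest rather than out-shouting it across the board:

```python
# A minimal sketch: find the quietest band of a (synthetic) background
# spectrum within the range where human hearing is most sensitive, and
# place the alert's energy there. The background here is fake, for illustration.
import numpy as np

rng = np.random.default_rng(0)
FS = 16_000                                    # sample rate, Hz

# One second of fake "urban background": low-frequency-weighted noise.
noise = rng.normal(size=FS)
freqs = np.fft.rfftfreq(FS, 1 / FS)
spectrum = np.abs(np.fft.rfft(noise)) / (1 + freqs / 200)

# Average the background energy in coarse bands across ~1-5 kHz and
# pick the quietest one as the candidate slot for the alert.
bands = np.arange(1000, 5000, 250)
energy = [spectrum[(freqs >= lo) & (freqs < lo + 250)].mean() for lo in bands]
quiet_band = bands[int(np.argmin(energy))]
print(f"place alert energy near {quiet_band}-{quiet_band + 250} Hz")
```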
As with the e-scooters, we need to do some detailed research here. We could do this at PEARL, where we can set up scenarios to test the effectiveness of different sounds under different soundscape regimes, within the controlled conditions of the laboratory (including varying noise, lighting and visibility), so that we can see how a person's preconscious brain deals with an approaching vehicle. With the e-scooters, it was clear that a crucial element of the sound was its start (the "onset"), and in particular the way the sound transitioned from the onset to the body of the sound. Designing that onset and its transition was really important in making the sound useful as an alert. But how should this arrangement be designed for a vehicle, as opposed to an e-scooter?
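To make the onset/transition idea concrete, here is a minimal synthesis sketch; the frequency, levels and timings are illustrative assumptions, not the published e-scooter sound:

```python
# A sketch of an alert whose envelope has three parts: a fast onset
# (the attention-grabber), a transition, and a quieter sustained body.
import numpy as np

FS = 16_000
DURATION = 1.0
t = np.linspace(0, DURATION, int(FS * DURATION), endpoint=False)

def alert(onset_ms, transition_ms, freq=2000.0):
    """A tone with a designed onset, onset-to-body transition, and body."""
    onset_n = int(FS * onset_ms / 1000)
    trans_n = int(FS * transition_ms / 1000)
    env = np.full_like(t, 0.4)                       # quieter sustained body
    env[:onset_n] = np.linspace(0.0, 1.0, onset_n)   # fast rise: the onset
    env[onset_n:onset_n + trans_n] = np.linspace(1.0, 0.4, trans_n)  # ease in
    return env * np.sin(2 * np.pi * freq * t)

sharp = alert(onset_ms=10, transition_ms=100)    # abrupt onset: strong alerting
gentle = alert(onset_ms=200, transition_ms=300)  # slow swell: easily missed
```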
What we do know is that solving this challenge will be vital for the large-scale introduction of electric vehicles, and even more so if these are autonomous. This is because an autonomous vehicle does not have the benefit of the human preconscious brain driving it. It cannot anticipate potential hazards, especially ones as potentially random as people, as well as a human can, because a human is capable of assessing a whole situation and thus judging the predictability of a pedestrian's movements.
At PEARL we can explore this with both human-driven and autonomous vehicles in a safe, controlled environment under different conditions. The image shows an example of such work for the European Institute of Innovation and Technology (EIT) Urban Mobility programme. What you can't hear in this image, of course, is the sound of the city surrounding this space and the sounds being tested by the vehicle. What we learn from this can help ensure that the implementation of such systems takes into account what actually works with the human brain, in order to keep everyone safe and living well. This includes people with different cognitive and sensory capabilities, so that we can make sure the results are inclusive. Our overarching aim is to ensure that everyone can thrive in cities in a mutually beneficial, safe, equitable and healthy way, and work like this enables us to help cities to ensure that the introduction of new technology (and the continuing operation of existing technology) meets this challenge. Something for city authorities to think about, perhaps?



