How do gaming headsets allow us to differentiate distance and location of noises, such as if footsteps are on the floor above or below you, or where and how far gunshots are coming from in a BR type game based on where you are standing and which way your camera is facing?

In: Technology

It’s basically the same as how you do it normally. We can work out where sounds are coming from based on tiny differences in timing and level of sounds arriving at the two ears. There’s also some filtering of sounds based on the shape of your ears and the position of the sound. The underlying idea for virtual audio is that you reproduce those tiny differences in the ears using a bunch of computer calculations.
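To make the "tiny differences in timing" concrete, here's a small sketch of the classic spherical-head approximation (Woodworth's formula) for the interaural time difference. The head radius and the exact formula are simplifying assumptions, not what any particular game engine uses:

```python
import math

HEAD_RADIUS_M = 0.0875   # rough average human head radius (assumption)
SPEED_OF_SOUND = 343.0   # m/s in air at ~20 degrees C

def interaural_time_difference(azimuth_deg: float) -> float:
    """Woodworth's spherical-head approximation of ITD, in seconds.

    azimuth_deg: 0 = source straight ahead, 90 = directly to one side.
    The sound must travel around the head to reach the far ear, so the
    delay grows with the angle.
    """
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS_M / SPEED_OF_SOUND) * (theta + math.sin(theta))

# A source directly to the side arrives at the near ear well under a
# millisecond before the far ear -- yet that's enough for the brain.
print(f"{interaural_time_difference(90) * 1000:.2f} ms")
```

Virtual audio works by delaying (and filtering) the two earphone channels by exactly these kinds of amounts, so the signal *in* your ears matches what a real source at that position would produce.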

Really, it’s not all that different from how VR headsets let you see in 3D. You see in 3D because of the differences that your two eyes see. In VR, the headset sends a slightly different picture to each eye to let you perceive depth etc.

It doesn’t matter how the signals get to your eyes or ears, whether they are arriving naturally or via a screen/earphone. The important thing is what the light/sound is like when it’s *in* your eyes/ears.

Source: hearing scientist with a PhD in sound localisation 🙂

If you want to know more about how we work out where sounds are, I’ll find a link to another thread 🙂

Technically it’s not just the headset that allows this but also how the game is programmed.

In games you have sound files or clips that are played on a trigger, for example a footstep or a gunshot. Based on your location and the location of the sound’s source, different algorithms can modify that clip so you perceive it as coming from a particular direction and distance.
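As a rough illustration of what such an algorithm might do, here is a minimal sketch of distance attenuation plus equal-power stereo panning based on where the camera is facing. The function name and the inverse-distance model are assumptions for illustration; real middleware (FMOD, Wwise, etc.) does far more, including HRTF filtering and occlusion:

```python
import math

def spatialise(listener_pos, listener_yaw_deg, source_pos):
    """Return (left_gain, right_gain) for a mono sound clip.

    listener_pos / source_pos are (x, z) positions on the ground plane;
    listener_yaw_deg is the camera's facing direction in degrees.
    """
    dx = source_pos[0] - listener_pos[0]
    dz = source_pos[1] - listener_pos[1]
    distance = max(math.hypot(dx, dz), 1.0)   # clamp so gain stays finite
    attenuation = 1.0 / distance              # simple inverse-distance law

    # Horizontal angle of the source relative to where the camera faces.
    source_angle = math.degrees(math.atan2(dx, dz))
    relative = math.radians(source_angle - listener_yaw_deg)

    # Equal-power pan: -1 = hard left, +1 = hard right.
    pan = max(-1.0, min(1.0, math.sin(relative)))
    left = attenuation * math.cos((pan + 1) * math.pi / 4)
    right = attenuation * math.sin((pan + 1) * math.pi / 4)
    return left, right
```

A source straight ahead gets equal gain in both ears; one off to the right gets a louder right channel; and everything gets quieter with distance, which is what lets you judge "just around the corner" versus "30 m away".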

Some games with tighter development budgets or with limited development teams might not have the resources to implement this type of sound design. This leads to some games having worse sound and making it really difficult to tell how close a sound source is or which direction it’s coming from.

In most games this doesn’t really matter too much, but in first person shooters it can be really frustrating when it sounds like an enemy is just around the corner when they’re actually on the other side of the building, or 30–40m away.

Humans have 2 ears on different sides of their heads. Based on the differences in loudness and arrival time as a sound reaches each ear, your brain can approximate where that sound is coming from. This is something human brains evolved to do to help us hunt and avoid predators. Gaming headsets have multi-channel sound and deliver a different signal to each ear, allowing game designers to take advantage of this ability. The designers run some reasonably complex physics simulation of the game environment to figure out how loud each sound effect needs to be in each ear in order to trick the brain into thinking it’s in a specific position.
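The loudness-difference part of this can be sketched very simply: put the two ears at slightly different positions and apply an inverse-distance law to each. This is only the path-length effect; the function name and ear spacing are illustrative assumptions, and real headsets also simulate head shadow and ear-shape filtering (HRTFs), which this omits:

```python
import math

EAR_OFFSET_M = 0.0875  # ears sit roughly +/- 8.75 cm from head centre (assumption)

def per_ear_levels(source_x: float, source_z: float):
    """Relative per-ear amplitudes for a source on the horizontal plane.

    Listener at the origin facing +z, left ear at -x, right ear at +x.
    Each ear's level falls off with its own distance to the source, so
    the nearer ear receives the louder signal.
    """
    d_left = math.hypot(source_x + EAR_OFFSET_M, source_z)
    d_right = math.hypot(source_x - EAR_OFFSET_M, source_z)
    return 1.0 / d_left, 1.0 / d_right

left, right = per_ear_levels(1.0, 0.0)  # source 1 m directly to the right
```

Feeding each earphone channel a level computed this way (plus the matching timing delay) is, in miniature, the trick the answer describes.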