Tuesday, 28 May 2013

The Future of Acoustic Space Simulation?

I don't think one can avoid the fact that 2013 will see the 4th generation of games consoles hitting homes come winter. PC fans are right to point out that the specs of both the Xbox One and PlayStation 4 only match the capabilities of a mid-range PC, but we cannot disregard that this will push the standard of gaming a lot higher than that of the 3rd generation.

So let's go back... way back to Half-Life 2 and the days of 2004. HL2 received 39 Game of the Year awards and is considered one of, if not the, best games of all time. It took us limping cave gamers into First Person Shooter (FPS) galore. Not only were the visuals like nothing we had ever seen; the audio had actually had a decent budget assigned to it.
We hear our protagonist's footsteps ebb out from under us, varying as we walk from concrete to wood, to concrete, to cardboard. I've said it before and I'll say it again... it's the little things.
We're not stuck in mono. As we look around, sounds radiate from their point source, increasing and decreasing in amplitude as we change our distance to and from said source. Now we're wielding a handgun and pulling off shots in a room. The walls are made of plasterboard and we're getting the slap-back of our gunfire coming back to us (these are referred to as early reflections), followed by a decaying reverb (formed from the combination of the late reflections off the room's surfaces). It's 2004, you're in awe, and for what feels like the first time, an immersive experience is upon us.
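To make the distinction concrete, here is a minimal sketch (nothing to do with Valve's actual implementation; all delay times and gains are made-up values) that models a gunshot impulse as a few discrete early reflections plus a Schroeder-style feedback comb filter for the decaying late-reverb tail:

```python
# Hypothetical sketch: early reflections = a few delayed, attenuated
# copies of the dry sound; late reverb = an exponentially decaying
# feedback comb filter. Parameter values are illustrative only.

def render_reverb(dry, sample_rate=44100,
                  early_taps=((0.015, 0.6), (0.027, 0.4), (0.041, 0.3)),
                  comb_delay=0.05, feedback=0.5, length=1.0):
    """Return dry signal + early reflections + late reverb tail."""
    n = int(sample_rate * length)
    out = [0.0] * n

    # Direct sound (the shot itself)
    for i, s in enumerate(dry[:n]):
        out[i] += s

    # Early reflections: the "slap-back" off nearby surfaces
    for delay_s, gain in early_taps:
        d = int(delay_s * sample_rate)
        for i, s in enumerate(dry):
            if i + d < n:
                out[i + d] += gain * s

    # Late reverb: feedback comb filter, each echo half as loud
    d = int(comb_delay * sample_rate)
    for i in range(d, n):
        out[i] += feedback * out[i - d]
    return out

# A single-sample impulse stands in for the gunshot
wet = render_reverb([1.0])
```

Real engines use far denser reflection patterns and allpass diffusion on top, but the split into a handful of distinct early echoes followed by a smooth decaying tail is the same one described above.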

Cue another one of the greats: Battlefield 3, released in October 2011 by DICE on the Frostbite 2 game engine. Having received over 60 awards to date, BF3 is also considered one of the best FPS games of all time, not forgetting the praise that has gone to its sound design. I've played BF3... a lot! My reference headphones are Sony MDR7506s, and I made my analysis of BF3 with them on. The gunshots are incredible, the sound localisation is terrific, the voice acting superb and the distance effects great. My only beef is that... the acoustic space simulation isn't far beyond that of Valve's Half-Life 2.


Although I do not have direct insight into the methods DICE use for the audio in the Frostbite 2 engine, I can assume it uses a cell-based approach to compute the reverberation. This means the game's levels are broken up into sections called cells: if you have an alleyway and an auditorium, for example, these will be split into two cells. From there, the acoustic simulation is processed on each cell individually, creating an appropriate reverberation for that environment. Here is an image, courtesy of Crytek, looking at the cell-based approach in Crysis, their incredible release of 2007:

Crysis, 2007
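As a rough sketch of how such a lookup might work (hypothetical code, not Crytek's or DICE's; the cell bounds and preset values are invented), each cell is just a region of the level with a precomputed reverb preset, and the preset applied is whichever cell currently contains the listener:

```python
# Hypothetical cell-based lookup: axis-aligned boxes, each with a
# precomputed reverb preset. All names and values are made up.

CELLS = [
    # (name, (x_min, x_max, y_min, y_max), reverb preset)
    ("alleyway",   (0, 10,  0, 40), {"decay_s": 0.8, "wet": 0.3}),
    ("auditorium", (10, 60, 0, 40), {"decay_s": 2.5, "wet": 0.6}),
]

def reverb_preset_for(pos):
    """Return (cell name, reverb preset) for the listener position."""
    x, y = pos
    for name, (x0, x1, y0, y1), preset in CELLS:
        if x0 <= x < x1 and y0 <= y < y1:
            return name, preset
    # Fallback for positions outside every cell
    return "outdoors", {"decay_s": 0.2, "wet": 0.1}

name, preset = reverb_preset_for((5, 20))   # listener in the alleyway
```

The appeal is obvious: all the expensive acoustic analysis happens offline per cell, and at runtime the engine only has to answer "which box am I in?"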

This has been a highly popular technique in video games over the last decade, and you're thinking:

"yes this is a good idea, it calculates the correct reverberations appropriate in each sector."

"It's sounding really good in Battlefield 3 as I'm shooting inside this big room"

But the issue comes when we transition between two cells. In real life, if your front door is open you can hear the birds and road traffic coming in through the open doorway. Within cell-based games this rarely occurs: as you pass through the door, the environment switches from acoustic space A to acoustic space B. In BF3, a crossfade effect occurs as you pass through the portal into the new acoustic space. This was obscenely noticeable for me and brought me right back to reality as I realised I was actually playing a video game.
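The crossfade itself is simple to sketch (again hypothetical, not DICE's code; preset values are invented): as the listener moves through the doorway, the engine linearly interpolates between the two cells' reverb parameters rather than switching hard, which smooths the jump but still can't represent sound genuinely leaking through the opening:

```python
# Hypothetical portal crossfade: linearly interpolate reverb
# parameters between cell A and cell B as the listener crosses.

def lerp(a, b, t):
    return a + (b - a) * t

def blend_presets(preset_a, preset_b, t):
    """t = 0.0 means fully in cell A; t = 1.0 means fully in cell B."""
    return {k: lerp(preset_a[k], preset_b[k], t) for k in preset_a}

alley = {"decay_s": 0.8, "wet": 0.3}  # small, dry space
hall  = {"decay_s": 2.5, "wet": 0.6}  # large, reverberant space

halfway = blend_presets(alley, hall, 0.5)  # standing in the doorway
```

Standing in the doorway you get an average of two rooms, not the sound of one room heard from inside another, and that averaging is exactly what I could hear in BF3.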

So what is the future of acoustic space simulation? 

Well, the big game companies are holding their cards very close to their chests. The only responses I seem to be able to get are from indie developers, who are more than happy to share their information. Where would we be without them?
But as my undergraduate thesis is based on dynamic acoustic space simulation, I have come across some great examples of work from Masters and PhD students in my research. One of these is a 'directionally varying reverb', meaning that anywhere the player stands in the game will have a different reverberation effect, just as it does in real life. I would gladly delve deeper into this, but I fear this article has dragged on. So check out this video and skip along to 3:35 to hear the comparison between their reverb algorithm and the one currently in the Half-Life 2 engine.

I leave you with a quote from a contact in the audio department at Krillbite Studio:

"I see the gameplay and physics as the inner body, skeleton and gooey stuff, lighting and visuals as the appearance and audio as its personality (or the twinkle in the eye)."
                                                                                - Martin Kvale

Thanks for reading,

Rob Tyler

V | G | A
