Game Audio Learning Portal

How To Mix & QA Test Your Game

Overview Of Mixing

There are very few resources out there on how to mix video games, so the process can seem very mysterious. My aim is to demystify it and give you a framework that allows you to approach your mixes with confidence.

One of my main resources for this section is Rob Bridgett's amazing book Leading With Sound, which I highly recommend: https://www.amazon.co.uk/dp/0367535874/

What Is A Mix?

The mix of a game, similar to that of music or other linear media, is all about adjusting the relative volume of individual elements to get a clear and powerful end result that’s pleasing to listen to. The challenge in games, however, is that the elements in the mix are constantly changing as the game is played, with sounds being introduced or removed at a fast pace.

When we mix a game we generally build around one central question: “What are the important sounds that we need the player to hear right now?” As an example, let’s consider a game like Overwatch: a multiplayer first-person shooter with up to 12 players on a single map. You’re likely going to hear a lot of shooting, explosions, special ability SFX, footsteps and VO lines, as well as the map’s music and ambience.

If we heard all of these elements at the same time it would be overwhelming and impossible to focus on what matters. To avoid this, the developers had to come up with some rules and systems to set the priority of sounds. Based on that priority, we can start getting rid of sounds that are less important (known as “culling”), and focus on the most important ones such as the footsteps of nearby players.

Some of the questions that the Overwatch mix is based on include:

  • Who is the player's greatest threat?

  • Who is the player looking at?

  • Who is looking at them?

  • Who is firing their weapon nearby?

With the tools available to us (which we’ll discuss below), we can then start to adjust the volume of sounds based on their priority, and mute/stop playing other sounds that aren’t important or are further away from the player.
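
To make that a little more concrete, here is a minimal sketch (in C++, with hypothetical names rather than any particular engine’s API) of how a priority-based culling rule might decide which sounds survive when too many want to play at once:

```cpp
// Minimal sketch of priority-based voice culling (hypothetical, engine-agnostic).
#include <algorithm>
#include <iostream>
#include <string>
#include <vector>

struct Voice {
    std::string name;
    float priority;  // higher = more important (e.g. nearby enemy footsteps)
    float distance;  // metres from the listener
};

// Keep only the N highest-priority voices; everything else is culled (not played).
std::vector<Voice> cullVoices(std::vector<Voice> voices, size_t maxVoices) {
    std::sort(voices.begin(), voices.end(), [](const Voice& a, const Voice& b) {
        if (a.priority != b.priority) return a.priority > b.priority;
        return a.distance < b.distance;  // tie-break: closer sounds win
    });
    if (voices.size() > maxVoices) voices.resize(maxVoices);
    return voices;
}

int main() {
    std::vector<Voice> active = {
        {"distant_gunfire", 2.0f, 80.0f},
        {"enemy_footsteps", 9.0f, 4.0f},
        {"own_weapon",     10.0f, 0.0f},
        {"far_explosion",   5.0f, 60.0f},
    };
    for (const Voice& v : cullVoices(active, 3))
        std::cout << v.name << " survives the cull\n";
}
```

In a real game the priority values themselves would be driven by questions like the Overwatch examples above (who is a threat, who is in view, who is shooting), rather than being hard-coded.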

Check out this awesome explainer video from Marshall McGee on the mix of Overwatch: https://youtu.be/MbV_wKScrHA

If you are interested in learning more about the Overwatch mix from the audio team at Blizzard then check out this 15-minute talk from Wwise Tour 2016: https://youtu.be/2M5cWHswQsM

How Do You Mix?

As I mentioned above, mixing for video games is a lot more complex than for linear media. However, we have a broader toolset at our disposal, some of which I’ll introduce you to below, and we can create custom systems to suit our needs.

Mixing games comes down to a combination of refining the automatic systems that do a lot of the heavy lifting for us, and then manually tweaking volumes and other parameters to fine-tune certain sections or moments of the game. In order for us to start, we need to find out what to focus on and be introduced to the tools available to us.

The Anchor & Pillars Of The Mix

Game mixes typically have an ‘anchor’ - a single element that is the most important part of the mix at all times, and is always priority number one regardless of whatever else is happening. The anchor acts like a benchmark, and the level and intensity of other elements of the mix are all balanced against it.

The choice of element to use as the anchor often depends on the game’s genre, and usually revolves around the core mechanic of the game - shooting in an FPS, movement in a platformer, dialogue in a narrative game, etc. Some broad examples are:

  • Narrative-focused games (Until Dawn, Detroit: Become Human) will usually prioritise dialogue - what are characters saying and feeling?

  • FPS titles (DOOM, Call of Duty, Battlefield) will put guns ahead of everything else - what does the player’s gun sound like, and where are the enemies shooting from?

  • MOBAs (League of Legends, Dota) will focus on champion abilities - what is my character doing and what abilities are other champions using?

  • Strategy games (Civilization, Humankind, Age of Empires) will prioritise UI and unit sounds - what are the player’s units doing, and what are the consequences of the player’s decisions and actions?

  • Platformers (Super Mario, Hollow Knight) will emphasise character movement and interactions - what is my character doing and how are they interacting with their environment?

These are just a few very broad examples, and I encourage you to listen closely to the next game you play and think about what you can hear as you play - what elements are in the foreground and what is pushed to the back? What elements does the game think it’s most important for you to hear?

After the anchor, the pillars of the mix are the next most important elements. These are typically the most common sounds, or the ones that contribute most to the game’s overall soundscape, but aren’t necessarily in the foreground at all times.

Identifying the pillars of a mix can be done using questions similar to those in the Overwatch example above. The pillars are the most important things the player needs to hear - the sounds that give them information and feedback, and immerse them in the experience. In Call of Duty, for example, these high-importance elements would be weapons, of course, as well as the Foley and GUI elements that inform the player of what is happening at any given moment (who is shooting at me, is there a grenade close by, did I kill this player, etc.).

Focusing on these pillars ensures that the most important sounds get the attention they deserve, and helps make the experience of playing the game feel responsive and satisfying.

Tools To Mix

I’ve listed some of the most common tools with short descriptions below. It may seem a little overwhelming at first, but to begin with, try selecting a few that are particularly useful to whatever game you’re working on and learning more about them. Knowing what tools are available to you in the first place will give you the upper hand when it comes to mixing.

This article by Ronny Mraz is also a great applied example of these tools and theories: https://splice.com/blog/dynamic-game-audio-mix/

Volume

The most obvious and basic tool for mixing is controlling the volume of your assets. You can save yourself a lot of headaches if you import assets of the same category with a consistent volume during the development process, e.g. ensuring that all footsteps are at the same volume, all guns, all dialogue, etc. This will make life easier later as you can focus on adjusting the volume of entire groups of assets rather than individual ones, which leads us to the next tool - buses.

Buses

A bus (also known as a group) is a track that takes inputs from multiple tracks and sums them into one combined stream of audio. We can then use the fader on the bus to control the volume of all of the input tracks at once, or add plug-ins and effects to the bus to process multiple sounds together.

Buses are often used to group similar elements together, such as all of a game’s ambient sounds, or dialogue. This makes it very easy to adjust the volume of key parts of a mix quickly, either using automation and programming, or in-game settings - the volume controls in a game’s settings menu are effectively just adjusting the faders on various buses.
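
As a rough sketch of the idea (hypothetical and engine-agnostic - in practice your middleware handles this routing for you), a sound’s final level is simply its own volume plus the fader of every bus it is routed through:

```cpp
// Minimal sketch of bus routing: each fader in the chain adds to the final level.
#include <iostream>
#include <string>
#include <vector>

struct Bus {
    std::string name;
    float faderDb;  // bus fader in decibels
};

// The level a sound actually plays at is its own volume plus every bus fader above it.
float effectiveDb(float soundDb, const std::vector<const Bus*>& routing) {
    float total = soundDb;
    for (const Bus* b : routing) total += b->faderDb;
    return total;
}

int main() {
    Bus master{"Master", -3.0f};
    Bus sfx{"SFX", -6.0f};
    Bus footsteps{"Footsteps", -2.0f};

    // A footstep asset at -12 dB, routed Footsteps -> SFX -> Master.
    std::cout << "Footstep plays at "
              << effectiveDb(-12.0f, {&footsteps, &sfx, &master}) << " dB\n";  // -23 dB
}
```

This is also why a game’s volume sliders are so easy to hook up: each slider just moves one bus fader, and everything routed into that bus follows.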

Auxiliary Inputs (Sends)

An auxiliary input (also known as an aux bus or a send) functions similarly to a regular bus, except that instead of sending the entire signal to the aux input, we split the signal and send a copy in parallel. This can be a useful way to apply effects, in particular delay or reverb, as we can add the effect to multiple tracks at once (like with a bus), but control how strong the effect is for each track individually, by changing the level that we send each signal to the aux input.
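
Here is a tiny illustrative sketch of the concept (hypothetical values): each track keeps its own dry level going to the main mix, plus its own send level feeding one shared reverb, so a single effect instance can serve many sounds with different amounts of ‘wet’:

```cpp
// Minimal sketch of aux sends: one shared reverb, per-track send levels.
#include <iostream>
#include <string>
#include <vector>

struct Track {
    std::string name;
    float dryDb;   // level sent to the main mix
    float sendDb;  // level of the parallel copy sent to the shared reverb aux
};

int main() {
    // All of these sends are summed and processed by a single reverb instance.
    std::vector<Track> tracks = {
        {"dialogue",  0.0f, -18.0f},  // a touch of room reverb
        {"gunshot",  -3.0f,  -6.0f},  // big reflective tail
        {"footstep", -9.0f, -30.0f},  // almost completely dry
    };
    for (const Track& t : tracks)
        std::cout << t.name << ": dry " << t.dryDb
                  << " dB, reverb send " << t.sendDb << " dB\n";
}
```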

Snapshots (States)

Similar to presets in our audio plugins, snapshots (also known as states) can save certain values and then recall them when required. For example, imagine that when you open the pause menu in a game, all of the sound effects and ambient sounds are paused and the music is low-passed. This can be done by recalling a mix snapshot for the pause menu, with preset volume levels and effects.

When you unpause the game, the game will recall a snapshot for regular gameplay and set the affected values back to where they were before you paused. Switching between the pause menu and gameplay states with snapshots allows us to make these transitions very easily and ensure that they are consistent.
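
As a rough illustration (hypothetical bus names and a made-up applyToBus() hook - in practice your middleware’s states or mixer snapshots do this for you), a snapshot is really just a saved set of bus settings that can be applied in one go:

```cpp
// Minimal sketch of mix snapshots (states), assuming a hypothetical applyToBus() hook.
#include <iostream>
#include <map>
#include <string>

struct BusSettings {
    float volumeDb;   // target fader level for the bus
    float lowpassHz;  // 20000 Hz = effectively no filtering
};

// A snapshot is just a named set of bus settings that can be recalled at once.
using Snapshot = std::map<std::string, BusSettings>;

// Placeholder: a real implementation would call into your middleware here
// (e.g. a state change or a mixer snapshot transition).
void applyToBus(const std::string& busName, const BusSettings& s) {
    std::cout << busName << " -> " << s.volumeDb << " dB, LPF " << s.lowpassHz << " Hz\n";
}

void applySnapshot(const Snapshot& snap) {
    for (const auto& [busName, settings] : snap)
        applyToBus(busName, settings);
}

// Gameplay: everything open and unfiltered.
const Snapshot kGameplay = {
    {"Music",    {-6.0f, 20000.0f}},
    {"SFX",      { 0.0f, 20000.0f}},
    {"Ambience", {-9.0f, 20000.0f}},
};

// Pause menu: SFX and ambience silenced, music quieter and low-passed.
const Snapshot kPauseMenu = {
    {"Music",    {-14.0f, 1200.0f}},
    {"SFX",      {-96.0f, 20000.0f}},
    {"Ambience", {-96.0f, 20000.0f}},
};

int main() {
    applySnapshot(kPauseMenu);  // player opens the pause menu
    applySnapshot(kGameplay);   // player unpauses
}
```

Most tools also let you set a transition time per snapshot, so the values fade smoothly rather than jumping.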

Sidechains

A sidechain is a tool that allows one sound or group of sounds to control others in real time. This allows us to create space for important sounds that need to cut through the mix, but only when the space is needed.

For example, we might add a compressor to the music bus that is triggered by the signal from the dialogue bus. When dialogue plays, the compressor lowers the volume of the music to make space, before letting the volume return to normal when the dialogue is finished. This form of sidechained volume control is also known as ‘ducking’.

Sidechains can be used to control pretty much any parameter you like, and are useful both for utility effects like ducking and for more creative modulation effects.
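
A minimal sketch of the idea (hypothetical values, not a real compressor) might look like this - the music fader is pushed down while dialogue is audible, then eases back up afterwards:

```cpp
// Minimal sketch of sidechain ducking: dialogue drives the music fader down
// and lets it recover when the dialogue stops (hypothetical per-frame update).
#include <algorithm>
#include <iostream>

struct Ducker {
    float duckDb = -9.0f;        // how far to push the music down
    float attackPerSec = 30.0f;  // dB per second when ducking in
    float releasePerSec = 6.0f;  // dB per second when recovering
    float currentDb = 0.0f;      // current offset applied to the music bus

    // Call once per frame with whether dialogue is currently audible.
    float update(bool dialoguePlaying, float dt) {
        float target = dialoguePlaying ? duckDb : 0.0f;
        float rate = dialoguePlaying ? attackPerSec : releasePerSec;
        if (currentDb > target)
            currentDb = std::max(target, currentDb - rate * dt);
        else
            currentDb = std::min(target, currentDb + rate * dt);
        return currentDb;  // apply this offset to the music bus fader
    }
};

int main() {
    Ducker ducker;
    for (int frame = 0; frame < 90; ++frame) {
        bool dialogue = frame < 30;  // dialogue plays for ~0.5 s at 60 fps
        float offset = ducker.update(dialogue, 1.0f / 60.0f);
        if (frame % 15 == 0)
            std::cout << "frame " << frame << ": music offset " << offset << " dB\n";
    }
}
```

In practice you would usually let the actual level of the dialogue signal drive the amount of ducking rather than a simple on/off flag, but the shape of the behaviour is the same.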

Attenuation

Attenuation allows us to change the volume and/or frequency content of a sound (through lowpass and highpass filters) depending on how far away the sound source is from the player (audio listener). This is great not only because it creates a more believable soundscape, but also because it allows us to keep the mix clear by not having far-away sounds take up space and clutter the mix.
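
As a simple illustration (a hypothetical curve - middleware like Wwise lets you draw attenuation curves rather than code them), both the volume and the low-pass cutoff can be derived from where the distance sits between a minimum and maximum radius:

```cpp
// Minimal sketch of distance attenuation: volume and low-pass cutoff both
// fall off between a minimum and maximum distance (hypothetical curve shape).
#include <algorithm>
#include <iostream>

struct Attenuation {
    float minDistance = 2.0f;   // full volume inside this radius
    float maxDistance = 50.0f;  // inaudible beyond this radius
    float maxCutDb = -60.0f;    // volume reduction at maxDistance
};

// Returns 0..1 based on where the distance sits between min and max.
float normalised(float distance, const Attenuation& a) {
    float t = (distance - a.minDistance) / (a.maxDistance - a.minDistance);
    return std::clamp(t, 0.0f, 1.0f);
}

float volumeDb(float distance, const Attenuation& a) {
    return normalised(distance, a) * a.maxCutDb;  // linear dB roll-off
}

float lowpassHz(float distance, const Attenuation& a) {
    // Far-away sounds lose high-frequency content: 20 kHz down to 1 kHz.
    float t = normalised(distance, a);
    return 20000.0f + t * (1000.0f - 20000.0f);
}

int main() {
    Attenuation att;
    float distances[] = {1.0f, 10.0f, 25.0f, 50.0f};
    for (float d : distances)
        std::cout << d << " m: " << volumeDb(d, att) << " dB, LPF "
                  << lowpassHz(d, att) << " Hz\n";
}
```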

LOD

LOD stands for “Level Of Detail” and refers to how simple or complex a soundscape is. Just as we can see more detail in a painting when we stand close to it than when we stand far away, we can change the level of detail of a game’s soundscape depending on how close the player (or camera) is to the action.

A great example of the application of LOD is in strategy games like the Total War or Civilization series, where you can zoom in and hear individual sounds like soldiers fighting or a horse’s hooves. As you zoom out, these sounds get quieter and you hear less detail, until you’re left with only a general ambience and the wind. This works by turning groups of sounds on and off based on parameters like distance from the camera or audio listener.
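
A rough sketch of the idea (hypothetical group names and thresholds) might simply switch groups of sounds on or off as the camera height changes:

```cpp
// Minimal sketch of audio LOD: groups of sounds are enabled or disabled
// based on camera height/zoom (hypothetical thresholds and group names).
#include <iostream>
#include <string>
#include <vector>

struct SoundGroup {
    std::string name;
    float maxCameraHeight;  // group is audible only below this zoom level
};

void updateLOD(float cameraHeight, const std::vector<SoundGroup>& groups) {
    for (const SoundGroup& g : groups) {
        bool enabled = cameraHeight <= g.maxCameraHeight;
        // A real implementation would mute/unmute a bus or stop/start emitters here.
        std::cout << g.name << ": " << (enabled ? "on" : "off") << "\n";
    }
}

int main() {
    std::vector<SoundGroup> groups = {
        {"unit_foley (hooves, swords, shouts)",  50.0f},
        {"battle_crowd (massed fighting walla)", 200.0f},
        {"map_ambience (wind, distant birds)",   10000.0f},  // always on
    };
    std::cout << "-- zoomed in --\n";
    updateLOD(20.0f, groups);
    std::cout << "-- zoomed out --\n";
    updateLOD(500.0f, groups);
}
```

Real systems usually crossfade between these layers rather than hard-switching, but the underlying logic is the same.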

HDR

HDR stands for “High Dynamic Range” audio, and is the most advanced of the various tools on this page. To explain it in very oversimplified terms, the goal of HDR is to ‘de-clutter’ the mix and make loud sounds seem louder and quiet ones quieter, essentially achieving a higher dynamic range.

It does this by muting quieter sounds like footsteps or a voice when very loud sounds like explosions are playing. It’s a fairly advanced system, and definitely not something that you need to worry about understanding fully early on. If you are interested in reading more, however, check out this article from Designing Sound on the topic: https://designingsound.org/2013/06/21/finding-your-way-with-high-dynamic-range-audio-in-wwise
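
To give a heavily simplified sketch of the core idea (a real HDR system is far more sophisticated than this), you can think of it as a ‘window’ below the loudest active sound - anything that falls outside the window is culled:

```cpp
// Heavily simplified sketch of the HDR 'window' idea (hypothetical values).
#include <algorithm>
#include <iostream>
#include <string>
#include <vector>

struct Sound {
    std::string name;
    float loudnessDb;  // authored loudness of the sound
};

// Keep only sounds within windowDb of the loudest currently-playing sound.
std::vector<Sound> applyHdrWindow(const std::vector<Sound>& playing, float windowDb) {
    if (playing.empty()) return {};
    float loudest = std::max_element(playing.begin(), playing.end(),
        [](const Sound& a, const Sound& b) { return a.loudnessDb < b.loudnessDb; })->loudnessDb;

    std::vector<Sound> kept;
    for (const Sound& s : playing)
        if (s.loudnessDb >= loudest - windowDb)  // inside the window: keep it
            kept.push_back(s);
    return kept;
}

int main() {
    std::vector<Sound> playing = {
        {"explosion", 120.0f}, {"rifle", 110.0f},
        {"footsteps", 70.0f},  {"voice", 80.0f},
    };
    for (const Sound& s : applyHdrWindow(playing, 40.0f))
        std::cout << s.name << " stays audible\n";  // the footsteps are culled
}
```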

When Do You Mix?

Planning

The first step of mixing is the planning phase, which happens during pre-production. At this stage, the goal is to figure out what kind of tools and systems you will use to maintain clarity, and establish what your mix anchor and pillars will be. Having a plan will make the execution easier and allow you to adapt.

Execution

Mixing starts properly during production; you’ll usually work on the mix at the same time as you’re adding and implementing all of your sounds and effects, tweaking systems as needed. Remember: always focus on what the most important thing is for the player to hear at any given time.

Polish

The very end of production is when you finalise the mix. This is an incredibly important step as you take some time specifically to play the game, assess the mix and make changes where necessary. During your playtests, it’s important to ‘zoom out’ to see all of the individual parts in context, in order to make sure that everything sounds cohesive.

Reflection

The last step, which is often forgotten, is reflection, sometimes called a ‘post mortem’. This happens after the game is released, and the goal is to look back at what went well, what went wrong, and what can be learned from these moments.

Reflecting on completed projects is an extremely valuable process, letting you learn from your failures and wins and take those lessons into the next project. You can read more about it here: https://www.gamedesigning.org/learn/postmortem/

Overview Of QA Testing

QA stands for “quality assurance,” and as the name suggests, is all about making sure that the game is in a suitable state to be released, and doesn’t have any game-breaking bugs. QA testers have to find bugs, identify and document how to reliably reproduce them, and then verify that the bug has been fixed once the developers have addressed it.

QA testing may seem like a dream job, as you get to play video games all day, but in reality it’s a tough job that requires a lot of skill and patience. QA testers play games throughout their development, including early builds with missing assets and textures, and have to focus on specific tasks like bumping into all of the walls to check that the player can’t clip through them and fall off the map.

If you don’t have a dedicated QA team, then it’s even more important that you playtest your game a lot to make sure that the audio plays back correctly, and find any bugs. Some examples of common audio bugs are footsteps not switching correctly when the material changes, dropping an object triggering the impact sound 10+ times due to the physics, a loop not stopping when it should, etc.

Audio QA testing is a viable career path; you would most likely start off as a regular QA tester to gain experience and then move into the specialised role after a while.

Check out this short summary of what an audio QA tester does: https://gameaudioresource.com/2019/08/28/chapter-17-a-qa-what-is-qa-audio-testing/

Becoming A Good QA Tester

As I previously mentioned, QA testing is a skill that can be learned and improved, so the best way to get better at it is to practise. However, there are some ways to maximise your time in QA testing which mainly come down to two things:

  1. Good debug tools

  2. Good documentation

Debug Tools

Debug tools allow you to find issues more quickly and easily by providing additional information about game parameters while you play. This can include things like visual indicators of the surface material the player is on, or when enemies are hit, etc. Creating these tools is a collaboration between the audio team and the programmers, or if you’re lucky, your dedicated audio programmer.

Debug tools also often give you more information on why a sound isn’t playing. For example, if a footstep isn’t triggering, a debug tool might tell you whether the material tag is incorrect, the animations haven’t been tagged, or it simply can’t find the correct audio file. These tools are there to enable the audio team to find and solve bugs and problems faster.
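
As a small illustrative sketch (hypothetical hooks and file names - the real information would come from your engine or middleware), a footstep debug readout might report exactly which link in the chain failed:

```cpp
// Minimal sketch of an audio debug readout for footsteps (hypothetical data).
#include <iostream>
#include <string>

struct FootstepDebugInfo {
    std::string surfaceTag;     // e.g. "concrete", "grass", or "" if missing
    std::string resolvedAsset;  // audio file the engine picked, or "" if none
    bool animationTagged;       // did the animation actually fire the event?
};

void logFootstepDebug(const FootstepDebugInfo& info) {
    if (!info.animationTagged)
        std::cout << "[AudioDebug] footstep event never fired: check animation tags\n";
    else if (info.surfaceTag.empty())
        std::cout << "[AudioDebug] footstep fired but surface material tag is missing\n";
    else if (info.resolvedAsset.empty())
        std::cout << "[AudioDebug] surface '" << info.surfaceTag
                  << "' has no footstep asset assigned\n";
    else
        std::cout << "[AudioDebug] footstep on '" << info.surfaceTag
                  << "' -> " << info.resolvedAsset << "\n";
}

int main() {
    logFootstepDebug({"concrete", "fs_concrete_03.wav", true});  // healthy case
    logFootstepDebug({"", "", true});                            // missing material tag
    logFootstepDebug({"grass", "", true});                       // no asset assigned
}
```

Even a simple on-screen or log-based readout like this can save hours of guesswork compared to listening alone.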

Documentation

Creating a bug list is absolutely vital to keep track of what’s broken and needs to be fixed. It’s important to detail exactly how to reproduce a bug, as identifying the precise steps involved can sometimes be the most challenging aspect. There is specialised software like Jira designed for tracking bugs, but a spreadsheet is always very helpful as it is highly customisable and you can design templates that best fit your workflow.

I recommend putting the following information in a bug-tracking spreadsheet:

  • What the bug is - description of the issue and what is potentially causing it.

  • Where the bug occurs - give as much detail as you can, so it can be found easily (level, area, character, animation, etc.).

  • How to recreate it - it’s important that you can reliably recreate the bug, otherwise, you won’t be able to fix it.

  • The status of the bug (incomplete, work in progress, complete, blocked, etc.)

  • Additional notes - further information about the bug and potential ways to fix it

  • Tracking number - this is useful if you’ve got a very large list of bugs and you need to quickly communicate or delegate bugs to the rest of the team.

These are just some of the most common things that go into my bug list spreadsheets, but each list is customised to the game I’m working on, depending on what I need. You’ll learn over time what information is useful and what is unnecessary - too much info can be counterproductive, as it makes the list harder to read.