How To Make Music For Video Games

Before You Start

Music is a huge topic that can be more or less important depending on the career path you choose. That’s why I’ve split this topic into a basic overview that’s relevant to everyone, and a more advanced deep dive for people interested in the composer career path specifically, as well as a dive into the technical world of implementing music.

This chapter also contains a short summary of Stingers & Cinematic Sounds, which are related to music, and important for sound designers and composers alike.

What Does Video Game Music Do?

Music in video games is hugely important and can play a lot of different roles depending on context and the game’s genre.

  • Music can enhance the impact of narratives through emotional storytelling, and by creating or relieving tension.

  • It can provide positive or negative feedback to the player (e.g. the ‘puzzle solved’ jingle in the Legend of Zelda series).

  • It can dictate or reinforce the rhythm of gameplay (e.g. Doom Eternal’s adaptive soundtrack that edits and remixes on the fly in response to the player’s actions).

  • It can be a part of a game’s ambience, helping to build immersion or provide cues to the player (e.g. ‘exploration music’ playing when navigating an open world, ‘combat music’ being initiated when spotted by enemies).

  • It can also be used for world building and immersion as a diegetic element (e.g. Assassin’s Creed IV: Black Flag’s sea shanties or Grand Theft Auto’s car radio stations).

  • Music can be a dedicated game mechanic, either as an optional element within a larger game (e.g. Minecraft’s note blocks), or as the primary gameplay loop (e.g. rhythm games like Beat Saber or Guitar Hero).

  • And of course, it can also be used as a sound effect, such as the Saints Row series’ Dubstep Gun.

As you can see, music often plays a huge and varied role in a game’s soundscape, and is able to dramatically influence the way players experience and interact with the game. The key to getting the most out of a soundtrack is to be creative and innovative not only when composing, but also when implementing your music.

Like all parts of game development, music is a powerful tool that can be further enhanced through collaboration, and working with sound designers and audio programmers can open up all kinds of possibilities for you as a composer.

Combining Sound & Music

Music and sound design might be different disciplines, but it’s important for them to work together well in order to create a satisfying and cohesive soundscape. Of course, you can purposefully create dissonance between music and sound design, but that is usually reserved for specific situations and genres such as horror games. In most cases, it’s better for the music to work together with the sound design than against it.

So how do we marry the sound design and music together in a mix? Well, there are three main ways to bring them to the altar, two of which we have already discussed in the Music Theory chapter (in the Sound Design Basics section of the Roadmap).

  1. Frequency

    As you know, humans can hear sounds from roughly 20Hz-20,000Hz, which means we have a limited range of frequencies into which all of a game’s sounds must fit. There is a phenomenon called “frequency masking”, in which multiple elements playing in the same frequency range obstruct one another and become hard to hear clearly - think of a tall person standing in front of you at a concert, so that you can no longer see the lead singer, only half the band.

    If we have a loud guitar in our soundtrack and dialogue playing at the same time, both occupying the same frequency ranges, we’ll have a hard time hearing the dialogue. To avoid this, it’s smart to ‘make space’ for elements that you definitely want the player to hear by using an EQ to remove unnecessary frequencies from competing sounds.

    Here’s a cool little ‘frequency cheat sheet’ that shows the rough frequency ranges of various instruments and some notes on what frequencies are worth paying close attention to: http://virtualplaying.com/interactive-frequency-chart/

  2. Pitch

    If there are sound effects in the game that are very tonal (meaning that they sound a specific note, like a musical instrument), such as UI sounds, then it’s usually a good idea to make sure they are tuned to match the key of the music. This ensures that the sounds complement the music and the overall vibe of the gameplay. Of course, you can also deliberately tune sounds to clash with the music, if the mood calls for it.

  3. Volume

    If everything is loud all of the time then you won’t hear anything clearly, so pick and choose where to bring up the music and where to bring it down again. Changing the volume of the score also has a big impact on how we perceive it, which is a great tool that allows us to tell stories and shape the player’s perception and response to music and gameplay.
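To make the pitch idea concrete, here’s a minimal sketch (the scale and input frequencies are hypothetical examples) of snapping a tonal sound effect to the key of the music: given the sound’s fundamental frequency and the scale of the current track, we find the nearest in-scale note and the semitone shift needed to reach it.

```python
import math

A4 = 440.0  # reference tuning frequency in Hz

def freq_to_midi(freq):
    """Convert a frequency in Hz to a (fractional) MIDI note number."""
    return 69 + 12 * math.log2(freq / A4)

def snap_to_scale(freq, scale_pitch_classes):
    """Snap a frequency to the nearest note whose pitch class is in the scale.

    scale_pitch_classes: e.g. {0, 2, 4, 5, 7, 9, 11} for C major (C = 0).
    Returns (snapped_frequency_hz, semitone_shift_to_apply).
    """
    midi = freq_to_midi(freq)
    # Look at nearby notes and keep only those that belong to the scale.
    candidates = [n for n in range(round(midi) - 6, round(midi) + 7)
                  if n % 12 in scale_pitch_classes]
    nearest = min(candidates, key=lambda n: abs(n - midi))
    snapped = A4 * 2 ** ((nearest - 69) / 12)
    return snapped, nearest - midi

C_MAJOR = {0, 2, 4, 5, 7, 9, 11}
# A UI blip detected slightly sharp of A4 gets pulled back onto the scale:
snapped, shift = snap_to_scale(450.0, C_MAJOR)
```

The `shift` value could then be fed to a sampler or pitch-shifter as a playback-rate adjustment; this is only one way of framing the problem, not a standard API.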

Stingers & Cinematic Sounds

In addition to a game’s more conventional music and sound effects, it’s common to use stingers and cinematic sounds to accentuate or embellish the soundscape at key moments.

Stingers are short musical flourishes that are often used to transition into, out of, or between musical segments. They can also be played back in isolation to add tension or release to certain situations.

Cinematic sounds (sometimes shortened to ‘cine sounds’) are very similar to stingers, and can often be put in the same category. The main difference is that they are usually less musical or tonal, and are more akin to one-shot sound effects than stingers. This is not an absolute distinction, however, and the lines can get very blurred. Examples of cinematic sounds are things like risers, impacts or the classic “braaaaam” sound that we hear in trailers.

Linear cinematic games make heavy use of stingers and cine sounds, as they are a fantastic tool to highlight story elements, give the player feedback (e.g. finding an important item), and create or release tension.

Check out this great overview of the topic by Game Design with Michael: https://youtu.be/x2gwY9ukXN4

This video from Audiokinetic dives into some additional examples: https://www.audiokinetic.com/learn/videos/YFC8gV_bcwc/

How Are They Implemented?

There are a lot of different ways to implement stingers and cine sounds, but here are a couple of common examples:

  • Cutscenes

    Stingers can often be found in cutscenes, in which they are triggered on the timeline at a specific pre-determined point. This can simply be a “play scary stinger” event which then plays a generic scary stinger from a random container at a specific point in the cutscene.

  • IF statements

    In video games we can make things happen that are dependent on external factors through IF statements. For example “IF the player's health is below 50% then play a heartbeat sound”. We can use the same logic to trigger stingers and cine sounds: “IF the player is spotted by an enemy, then play x cine sound”.

    These statements generally have to be hard-coded, meaning you’ll have to collaborate with programmers to get them working.

  • Trigger volumes

    An invisible volume that triggers an action when the player enters or leaves it, such as spawning enemies, changing the music and ambience states, or in our case playing a stinger. A trigger volume is basically an IF statement saying “IF the player is in x location”, but using trigger volumes means that the logic is usually automatically set up in the engine, meaning we don’t have to write a bunch of custom code.
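The trigger-volume idea (which, as noted above, is really just an IF statement about the player’s position) can be sketched in a few lines. The following is a hypothetical, engine-agnostic Python version: an axis-aligned box that fires callbacks when the player crosses its boundary, with the callbacks standing in for “play x stinger”.

```python
class TriggerVolume:
    """An invisible axis-aligned box that fires on_enter/on_exit callbacks.

    A simplified sketch of what engines like Unreal or Unity set up for you;
    the callbacks are hypothetical hooks (e.g. playing a stinger).
    """
    def __init__(self, min_corner, max_corner, on_enter=None, on_exit=None):
        self.min_corner, self.max_corner = min_corner, max_corner
        self.on_enter, self.on_exit = on_enter, on_exit
        self.player_inside = False

    def contains(self, pos):
        # True if pos lies within the box on every axis.
        return all(lo <= p <= hi for lo, p, hi in
                   zip(self.min_corner, pos, self.max_corner))

    def update(self, player_pos):
        # Called once per frame; fires only on the inside/outside transition,
        # so the stinger doesn't retrigger every frame the player stays inside.
        inside = self.contains(player_pos)
        if inside and not self.player_inside and self.on_enter:
            self.on_enter()          # e.g. play_stinger("boss_room_sting")
        elif not inside and self.player_inside and self.on_exit:
            self.on_exit()
        self.player_inside = inside
```

In a real engine you would place this box in the level editor rather than construct it in code, but the edge-triggered logic is the same.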

The main difference between stingers and cine sounds is that because stingers are more musical in nature, it’s important that they play back in time with any other music that’s playing. Middleware like Wwise makes this a lot easier, as it has a custom trigger system which allows stingers to be synced with the music.
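As an illustration of that syncing idea (not Wwise’s actual implementation), here is a small sketch that quantises a stinger’s start time to the next beat or bar boundary, given the music’s tempo:

```python
import math

def next_sync_point(current_time, bpm, beats_per_bar=4, grid="bar"):
    """Return the next musically-aligned time (in seconds) to start a stinger.

    Instead of playing immediately, the stinger is delayed until the next
    beat or bar boundary, mimicking middleware sync options like
    'Next Beat' / 'Next Bar'.
    """
    seconds_per_beat = 60.0 / bpm
    interval = seconds_per_beat * (beats_per_bar if grid == "bar" else 1)
    return math.ceil(current_time / interval) * interval

# At 120 BPM in 4/4, a stinger requested at t=1.1s waits until the next bar:
start_time = next_sync_point(1.1, bpm=120)   # 2.0 seconds
```

A real system would also account for the music’s own start offset and tempo changes; this sketch assumes a fixed tempo starting at t=0.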

Check out this great tutorial showcasing this trigger system in action: https://youtu.be/v6zSzOiUhQM

I Want To Be A Composer, What Now?

Being a composer for video games is definitely a dream job, but just like working as a writer or producer in the music industry, it’s not an easy one to get into, and it will take a lot of hard work and dedication, especially early on. When you’re just starting out, you may find yourself taking on projects you’re not as passionate about, either composing for games that don’t particularly interest you, or working to a brief that feels limiting or otherwise isn’t what you’d choose to write.

This is not meant to discourage you, but it is important to acknowledge as the reality of most people’s early careers in music composition. If you’re prepared to walk down this path anyway and stick with it when it gets tough, you’ll eventually be able to take on more and more of the projects that interest you.

How Do I Make Music For Video Games?

As mentioned above, making music for interactive media such as games poses some unique challenges, but also new possibilities. Collaborating with technical sound designers, or developing your own programming and implementation skills, can enable you to be even more creative with your approach to composition and sound design and add new dimensions to your work.

If you have experience writing more conventional ‘linear’ music, you’ll likely find that the mindset you need to compose for games is a little different, requiring you to consider the adaptability of your music - how is the soundtrack going to change seamlessly from one part of the game to the next?

Starting is usually the hardest part, but once you find an angle from which to approach a problem, it immediately becomes a lot more doable. Try using the framework below when searching for the right angle to approach a composition task. With practice, you can refine this framework and mould it to fit your own needs.

How To Approach The Soundtrack - A Framework For Getting Started

When we approach a new project, we need to figure out a bunch of things before we start writing, like the scope (how much music will be required) and what genre our music should be.

Answering some of these fundamental questions before starting will help give you a general impression of what the project needs, from which you can determine an approach to start with instead of going in completely blind, as well as a ‘creative compass’ to guide your creative decisions further down the line. Pre-planning like this can also be useful as a way to estimate how long a project might take, and what you should charge, which is especially important when working as a freelance composer.

Of course, it is important to note that there isn’t a single ‘correct’ approach to creating a score, and everyone has their own way of doing it. This is merely a push in the right direction.

Try to answer the following questions on a sheet of paper or in a Word document:

  • What type of game are you composing for - RPG, FPS, MOBA, action-adventure, puzzler, etc?

  • What is the game’s art style like - where does it fall between realistic and stylized?

  • What aspect/s of the game are you scoring - where is the player; what is the player doing; what is the intended emotion at this point in the game?

  • Where is music needed - in menus/UI, combat, exploration, idling, etc?

  • Will there be music specific to quests, areas, levels, boss fights, day/night cycle, etc?

  • How is it going to play back and adapt - vertical layering, horizontal resequencing, or a hybrid?

  • Based on the above, how much music is needed (scope of the project)? This could be measured in minutes, or by number of tracks or stems.

  • Are there any references to draw from - soundtracks that match the style you are going for, or from similar games?

With these questions answered, you should have a better idea of the scope of the project, and what lies ahead.

Generating Ideas

The next step is to start generating ideas and writing some basic demos. Coming up with ideas can often be the hardest part of the whole creative process. There is no single method that works for everyone, but after asking the game audio community on Twitter and doing some research, here are some great techniques that will hopefully get your creative juices flowing:

  • Be inspired by the game

    Ask for concept art, play the game and immerse yourself in the world. What is the setting? What is the vibe? What do you want the players to feel? Ask yourself questions like these to inform your creative direction.

  • Look for references

    Find scores that inspire you, and identify elements from them that you like to use as a starting point for your own creations. They can be from similar projects to your own or completely different, whatever sparks your imagination.

  • Experiment

    Probably the most straightforward and simple idea, but sometimes just picking up an instrument, loading up your favourite VST, or simply finger-drumming on your desk can lead to some great happy accidents.

  • Impose creative limitations

    Somewhat counterintuitively, setting constraints for yourself can often be incredibly liberating. By reducing the number of small and insignificant choices you need to make, you can focus on the things that matter most. Try limiting yourself to a single instrument, or limiting the number of tracks you use, and get creative working around those limitations.

  • Watch a GDC talk

    Hearing others nerd out about music can be a great learning resource, and often plants the seeds of great ideas. I personally recommend Mick Gordon’s GDC talk about the music of Doom (2016).

  • Do something else

    Inspiration often strikes in strange or unexpected places, so if you’re struggling for inspiration, try stepping away from your computer and going for a walk, playing sports or taking a shower. Just make sure to keep your phone or a mic close by to record your ideas when they come to you.

Check out this Twitter thread for some more great suggestions from the game audio community: https://twitter.com/itsgreglester/status/1513233514551984129

Writing The Score

While it’s easy to suggest ways of coming up with new ideas, the process of expanding on those ideas and developing them into full pieces of music will naturally be a process of your own, depending on your tastes and what you want to achieve with your composition. As this is a huge topic that can’t simply be summed up in a couple of paragraphs, I will instead link some amazing resources below.

This is an excellent video where Shaun Chasin gives some insights into his writing workflow: https://youtu.be/xzRlvRBqwPc

I highly recommend reading this book if you want to learn more about writing music for video games: https://www.google.co.uk/books/edition/A_Composer_s_Guide_to_Game_Music/qebUAgAAQBAJ?hl=en&gbpv=0&kptab=getbook

Some good tips from composer Matthew Kilford: https://audient.com/tutorial/composing-music-video-games/

And of course, remember to refer back to your framework and pre-planning notes if you need to!

YouTube Channels Exploring Music Composition For Video Games

Ongaku Concept: https://www.youtube.com/channel/UCAa1ld_G99694y8a3N68opg

8-bit Music Theory: https://www.youtube.com/c/8bitMusicTheory

Game Score Fanfare: https://www.youtube.com/c/GameScoreFanfare/videos

Austin Wintory YouTube Channel (podcasts, behind the scenes, etc.): https://www.youtube.com/c/awintory

If you’re looking for some more in-depth talks then check out the “Additional Music Learning Resources” below.

Music Implementation (The Basics)

Why Should You Learn Music Implementation As A Composer?

Music implementation is a very technical topic usually reserved for technical sound designers and music editors/implementers. It can, however, be an incredibly valuable skill for composers as it allows you to:

  • Have more control over the score - by understanding the fundamentals of implementation you’ll be able to make sure your music is implemented the right way or even do it yourself.

  • Have more creative avenues - you’ll be able to build more interesting music systems through your knowledge and ability to implement them.

  • Set yourself apart from a lot of composers - especially on smaller indie projects, being able to implement your own music can be a game-changer, as you’re reducing the workload of other teams, who can focus their efforts elsewhere if needed.

Adaptive Music

The main difference between music for games (interactive media) and music for films (linear media) is that the music for games has to adapt depending on the actions of the player, whereas music for films will always play back in the same way. This is why we often use the term ‘adaptive music’ for game scores. When we write a soundtrack, we have to consider not just the pieces themselves, but also how they will be played back.

There are three established techniques for creating adaptive music:

Vertical Layering

As the name suggests, the score is created from different instrumental layers such as strings, piano, percussion, etc. These layers can be mixed in or out to create variations of a single track, depending on what is happening in the game. An example of this would be having a simple piano melody during an exploration section, and then adding in additional layers when the player enters combat to match the intensity of the gameplay.
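A minimal sketch of vertical layering logic might look like this; the layer names, thresholds and ramp width below are all hypothetical, and a real engine would smooth the resulting volume targets over time:

```python
# Hypothetical layers mapped to the intensity at which they are fully mixed in.
LAYER_THRESHOLDS = {
    "piano":      0.0,    # always playing
    "strings":    0.3,
    "percussion": 0.6,
    "brass":      0.85,
}

def layer_volumes(intensity, ramp=0.15):
    """Return a target volume (0.0-1.0) per layer for a game 'intensity' value.

    Each layer fades in over a ramp as intensity approaches its threshold,
    reaching full volume at the threshold itself, so changes aren't abrupt.
    """
    return {layer: max(0.0, min(1.0, (intensity - threshold) / ramp + 1.0))
            for layer, threshold in LAYER_THRESHOLDS.items()}
```

During calm exploration (`intensity` near 0) only the piano plays; as combat pushes intensity towards 1.0, the remaining layers fade in one by one.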

Horizontal Resequencing

With this technique, instead of creating a variation on an already-playing piece of music by adding or subtracting layers, a change in game state is accompanied by switching to an entirely different piece of music. For example, in an exploration section, the player gets an ‘exploration track’ that is gentle and upbeat, which changes into a percussive and intense ‘combat track’ when the player is in a combat encounter.

Hybrid

The hybrid technique is a mix of both vertical layering and horizontal resequencing, which gives us the most flexibility and allows us to tailor the score even more to various gameplay scenarios.

In practice, this might mean having dedicated ‘exploration’ or ‘combat’ tracks that are additionally broken into vertical layers. The appropriate track is selected depending on the game state, and layers of that track can be mixed in and out to create changes of mood and intensity within a specific section of gameplay. For example, at the start of a boss fight you could swap from the exploration track to the combat track using horizontal resequencing, but start the combat track with only a few instrumental layers, adding more as the boss’ health bar drains, or when the fight enters its second phase.
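Putting the two techniques together, a hybrid system could be sketched like this. The track names, layer lists and the intensity-to-layer mapping are all hypothetical, and a real implementation would also crossfade and quantise switches to bar boundaries:

```python
class HybridMusicSystem:
    """Sketch of a hybrid system: horizontal track switches + vertical layers."""

    TRACKS = {
        "exploration": ["pads", "melody"],
        "combat": ["drums", "bass", "strings", "brass"],
    }

    def __init__(self):
        self.current_track = None
        self.active_layers = []

    def set_state(self, track, intensity):
        if track != self.current_track:
            # Horizontal resequencing: switch to an entirely different piece.
            self.current_track = track
        # Vertical layering: enable more of the track's layers as the
        # intensity value (0.0-1.0) rises.
        layers = self.TRACKS[track]
        count = max(1, round(intensity * len(layers)))
        self.active_layers = layers[:count]

music = HybridMusicSystem()
music.set_state("exploration", 0.2)   # gentle exploration music
music.set_state("combat", 0.25)       # boss fight starts: only a few layers
music.set_state("combat", 1.0)        # phase two: the full arrangement
```

The three `set_state` calls mirror the boss-fight example above: a horizontal switch into the combat track, followed by vertical changes within it.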

Here is a brilliant video by Game Makers Toolkit showcasing some examples of great adaptive music: https://youtu.be/b0gvM4q2hdI

Additionally, I highly recommend watching this excellent GDC Talk on interactive music by the incredible Winifred Phillips: https://youtu.be/XPV258XyNy8

Additional Tools

As you can probably tell by our boss fight example above, we have a bunch of additional tools at our disposal in the form of in-game parameters and values, as well as things like event triggers and volumes, that we can use to change the soundtrack on the fly.

If you’re interested in learning more about them and how they can be used to bring a soundtrack to life then check out the ‘Advanced Music Implementation’ section below.

Advanced Music Implementation (Technical Sound Designer)

Music implementation is similar to sound implementation and works in three steps:

  1. Preparing music stems for implementation (chopping them up and exporting them in the right format and lengths).

  2. Implementing the music stems into your game engine or middleware.

  3. Setting up playback triggers in the game engine (when the music will play, stop, transition, etc.).
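As a toy illustration of step 3, playback triggers can be thought of as a mapping from game events to music actions. The event names and the `music_player` interface below are hypothetical stand-ins for whatever your engine or middleware exposes:

```python
# Hypothetical mapping of game events to music actions.
MUSIC_TRIGGERS = {
    "level_start":  ("play", "exploration_theme"),
    "combat_start": ("transition", "combat_theme"),
    "combat_end":   ("transition", "exploration_theme"),
    "player_death": ("stop", None),
}

def handle_game_event(event, music_player):
    """Dispatch a game event to the music system.

    music_player is assumed to expose play/transition_to/stop methods;
    real engines and middleware provide their own equivalents.
    """
    action, track = MUSIC_TRIGGERS.get(event, (None, None))
    if action == "play":
        music_player.play(track)
    elif action == "transition":
        music_player.transition_to(track)
    elif action == "stop":
        music_player.stop()
```

In middleware like Wwise or FMOD, this table would typically live in the tool itself (as states, parameters and transition rules) rather than in code, but the underlying event-to-action logic is the same.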

In order to get an overview of the topic I highly recommend reading this fantastic and short primer for music implementation by Ronny Mraz: https://splice.com/blog/interactive-music-system-video-games/

Now that we know about the basic process of creating a music hierarchy and planning out the various states and transitions, it’s time to dive into the technical side.

The specific ways in which music is implemented will naturally depend on the game engine and middleware you’re using. I recommend using the same strategy mentioned in the Game Engines chapter, determining what career path you are interested in (AAA or Indie) and focusing on learning a single engine/middleware exclusively at first.

With a solid understanding of one, learning others in the future will be much easier, but to begin with it’s best to remain focused and develop your skills with one engine/middleware, before branching out once you’re familiar and confident with that first one.

Below are resources that explain how to implement music into the most popular Game Engines and Middleware.

Music Implementation Resources

Enhancing The Player Experience with Dynamic Music, by Ryan Gehrlein, focusing on Unity: https://ryan-gehrlein.medium.com/enhancing-the-player-experience-with-dynamic-music-6e26246b119b

Epic Games has a great online portal for learning all aspects of Unreal Engine: https://dev.epicgames.com/community/?application=unreal_engine

A series of videos on adaptive music using FMOD: https://youtube.com/playlist?list=PLgp5bmNEz8VyoMmVt2wKvuMOUJLhF6dQJ

A basic overview of interactive music implementation in Wwise: https://www.audiokinetic.com/learn/videos/zvnt3tbl3ou/

And a more advanced deep dive into implementing music from scratch using Wwise: https://vstbuzz.com/blog/video-game-music-implementation-with-wwise-1/

Additional Music Learning Resources

An in-depth look at some of the music composed for Returnal: https://youtu.be/GRG05kqYN0E

The Brilliance of DOOM’s Soundtrack, by Raycevick: https://youtu.be/7X3LbZAxRPE

How to Write One Minute of Music, by Austin Wintory: https://youtu.be/JzVliLM1czM

A macro-scale breakdown of the music of THE PATHLESS: https://youtu.be/EkgpbQuRtIM

GDC - Lessons from The Last Of Us and More: https://youtu.be/UFLmVsyIQDA

GDC - Approaches to Musical Storytelling in Cyberpunk 2077: https://youtu.be/uX9rjMZARVg

GDC - The Gothic Horror Music of Bloodborne: https://youtu.be/5yncMReF8QA

GDC - DOOM: Behind the Music: https://youtu.be/U4FNBMZsqrY

GDC - Slavic Adaptation of Music in The Witcher 3: https://youtu.be/ChnSuLYKkWc

GDC - Composer Confessions (stories and lessons learned from their careers): https://youtu.be/GGAdKSNYnTk
