The Bureau: XCOM Declassified - 2013
Note: For a more in-depth look at the Wwise Interactive Music System, please check out "Making Music Interactive: Elaboration of the Feature Set in Wwise" by Louis-Xavier Buffoni over at gamesounddesign.com.
It was important to me that the music in The Bureau be a dynamic, evolving experience for the player. To achieve this, I experimented a lot with the interactive music system in our audio engine, Wwise. My goal was for the music to progress along with the player as they made their way through each combat encounter. In the end, I accomplished this in two ways: the main one was using Switches, triggered by scripted events in the game, to change the music, but I also played around with music segments and RTPCs.
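In spirit, the Switch-driven approach is a lookup from encounter progress to the mix that should be playing. The sketch below is only an illustration of that idea (all names are hypothetical; in the actual game this mapping lived in Wwise Switch Containers and Unreal scripting, not in code like this):

```python
# Hypothetical sketch: scripted encounter events set a "switch" value, and
# the music system maps that value to the stem mix that should play next.

class MusicSwitcher:
    """Maps encounter-progress switch values to music mixes."""

    def __init__(self, mixes):
        # mixes: dict of switch value -> mix name (e.g. combat phase -> stem mix)
        self.mixes = mixes
        self.current = None

    def set_switch(self, value):
        # Analogous in spirit to a game script setting a Wwise Switch:
        # the playing music transitions to the mix mapped to the new value.
        # Unknown values leave the current mix playing.
        self.current = self.mixes.get(value, self.current)
        return self.current

switcher = MusicSwitcher({
    "explore": "strings_only",
    "combat_start": "strings_brass",
    "combat_peak": "full_mix",
})
print(switcher.set_switch("explore"))      # strings_only
print(switcher.set_switch("combat_peak"))  # full_mix
```

The real system also handles transition timing (fades, sync points) that this sketch ignores.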
I learned a lot of best practices for working in this complex system. For one thing, using multiple State groups would have been a much better way to go. When I originally created my project, Wwise only allowed one State group, and I needed to keep the music triggers separate from the State groups we had already established for global mixing. In hindsight, when Wwise released an updated version supporting multiple State groups, I should have bitten the bullet and reworked everything using States.
I had our composer, Garry Schyman, deliver each music cue split out into tracks separated by instrument group (strings, brass, piano, percussion, and sweeteners). Then I took each track and made a series of mixes that combined those sets of tracks in every possible way. The general idea was that I would start with a single instrument group, usually the strings, and then transition into various mix combinations based on how the player was progressing through the encounter. This gave me a lot of flexibility in creating the "feel" for each encounter. It also allowed me to use music cues in more than one encounter over the course of the game without repeating the exact same musical experience.
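As a rough back-of-the-envelope on how many mixes the stem approach can yield (the actual mixes were bounced by hand in Pro Tools, and only a useful subset would ever be made), five stems give 2^5 - 1 = 31 non-empty combinations:

```python
from itertools import combinations

stems = ["strings", "brass", "piano", "percussion", "sweeteners"]

# Every non-empty combination of stems is a candidate mix; with five stems
# that is 2^5 - 1 = 31 combinations, from single-instrument mixes up to
# the full ensemble.
mixes = [combo for size in range(1, len(stems) + 1)
         for combo in combinations(stems, size)]

print(len(mixes))  # 31
print(mixes[0])    # ('strings',)
```

In practice the count matters less than the flexibility: even a handful of these combinations is enough to keep a cue from sounding identical across two encounters.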
Additionally, I had Garry split out the melody from the more rhythmic elements of each cue. This allowed me to create clean loop points with the rhythmic tracks. Then I took the melody tracks, usually strings and brass, and hooked them into the Trigger system in Wwise. I wanted to make sure that the emotionally impactful elements of the cue were played at appropriately dramatic moments and not just played randomly in a looping track.
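The idea of playing a melodic stinger at a musical boundary rather than the instant the trigger fires can be sketched with a little grid math (a simplified illustration only; the function and grid length here are invented for the example, while in Wwise the sync behavior is configured on the trigger and segment):

```python
import math

def next_sync_point(position_s, grid_s):
    """Return the time of the next grid boundary (e.g. the next bar) at or
    after the current playback position. A music-synced stinger scheduled
    at a dramatic moment would start here rather than mid-bar."""
    return math.ceil(position_s / grid_s) * grid_s

# A 4/4 bar at 120 BPM lasts 2 seconds; a trigger fired at 5.3 s into the
# loop schedules the stinger at the next bar boundary, 6.0 s.
print(next_sync_point(5.3, 2.0))  # 6.0
```

Keeping the melody out of the loop and scheduling it this way is what lets the emotional peaks land on dramatic moments instead of drifting randomly through the cue.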
The image below is a screenshot of my music Pro Tools session. The top section contains the tracks that I received from Garry, and the bottom sections contain the various mixed tracks I created.
It was fairly time consuming to get all of the aspects of the interactive music system set up properly, but once that was accomplished, Wwise made it fairly easy to test my setups and make sure everything was working as expected. Below is a narrated video demonstrating how I set up the music for one of the game's encounters. Watch full screen to see all the details.
Once everything was set up, the next step was to create the scripting in the Unreal game editor that would call the various events, triggers, and switches in Wwise. Below is a screenshot of the scripting I used for the music cue demonstrated in the above video.
I also experimented with layering tracks in music segments. This is where you have multiple tracks playing at once and crossfade between them using real-time parameter control (RTPC) values sent by the game. Unfortunately, this was not a good solution for most of the music cues in the game because the audio engine was restricted to eight streams at a time, and VO was taking up most of those. The only time I used it was when I was able to fit the loops into RAM. Thankfully, we were able to include a dynamic RAM loading/unloading solution in the game that allowed us to use more assets at higher quality throughout the levels.
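A crossfade driven by a single game-sent parameter, of the sort an RTPC feeds, is commonly implemented with an equal-power curve so the combined loudness stays roughly constant as one layer fades into another. A minimal sketch of the gain math (my actual curves were drawn in Wwise's RTPC editor, not computed in code):

```python
import math

def layer_gains(intensity):
    """Equal-power crossfade between two layers driven by one parameter
    in [0, 1]: 0.0 = only layer A audible, 1.0 = only layer B.
    cos^2 + sin^2 = 1, so total power stays constant across the fade."""
    angle = intensity * math.pi / 2
    return math.cos(angle), math.sin(angle)  # (gain_a, gain_b)

for x in (0.0, 0.5, 1.0):
    a, b = layer_gains(x)
    print(f"intensity={x}: layer_a={a:.3f}, layer_b={b:.3f}")
```

With four layers, as in the setup described below, each track simply gets its own RTPC-driven volume curve instead of sharing one crossfade parameter.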
I layered four tracks in a segment made up of strings, brass, piano, and percussion. I then hooked RTPCs up to each track's voice volume and controlled those values from scripts in Unreal. These scripts were triggered both randomly and based on player progression in the encounter. I was still using Switches to control when the segments would start and using Triggers to bring in the melodic elements of each cue, as in the previous example. Below is a video demonstrating this type of setup in my Wwise session.
This setup required a lot more work in Unreal. First, I created a complex set of Kismet scripting to handle the various combinations of RTPC values I wanted the encounter to play. Then came the scripting for the encounter itself, which, in the above example, was pretty crazy as well. The screenshots below show most of my Unreal scripting for this setup.
Overall, I'm fairly happy with the results in the game, though I could have spent another three months tinkering with the music setups to get them all exactly the way I wanted. There are a number of encounters that did not get as much love as I would have liked. I certainly learned a lot about the complexities of the Wwise Interactive Music system, but, as with sound design, everything comes down to the original material you are working with. If you are able to work with the composer and have them deliver a great score in the formats that you need, you are most of the way there. Experiment early and map out as much of your setups as possible before you hire a composer, and you'll save yourself a lot of time, effort, and money.