In this interview episode, summer podcast lead Michael Butler speaks with composer Tyson Cazier about adaptive music, emerging technologies in the field, and Tyson’s approach to storytelling in soundtrack composition. Cazier composes for video games and film and is a guitar professor at Utah State University.
[Musical intro, fades out]
Angela: Hello, AMT Lab listeners, and welcome to an interview episode of Tech in the Arts, the podcast for the Arts Management and Technology Lab. My name is Angela Johnson, and I'm the podcast producer. In this episode, our summer podcast lead Michael Butler interviewed composer Tyson Cazier about using music and storytelling in the art of soundtrack composition. We hope you enjoy their conversation.
[Musical intro, fades in]
Michael: Thank you for joining me today, Tyson. Would you mind giving our listeners a little more of an introduction to who you are and what you do?
Tyson: Hey, yeah, for sure. Thanks for having me on. My name is Tyson Cazier, and I am a composer. I've been composing for probably two decades, and I love it. I've done film in the past and a lot of live composition, but right now I'm focusing mostly on video games. I have a lot of video game projects going on. So, I compose, and my big passion is composing for different forms of visual media.
Michael: Awesome. I didn't realize you'd been at it for quite that long. But before we jump into the meat of today's interview, how are you conducting your lessons? Because I know we've talked before and you said you do some lessons for students. And if they're online, what kind of tools do you leverage for it?
Tyson: So right now, yeah, I do teach. I teach at Utah State University—guitar. And then I teach privately outside of that. And a lot of it's just really simple on Zoom. So I haven't gone too tech heavy on what I do for teaching, because I spend most of my time outside of teaching really focusing on the composition aspect of my business.
Michael: Is there anything that you found particularly difficult with the online teaching for music?
Tyson: Latency. Latency has been probably the most frustrating thing, because when I'm working with students, it's really helpful for me to be in the room with them, like, strumming along with them, or playing along with them and kind of pushing them to make switches faster or to do things more effectively. And it's hard. So with the latency issue, I can't really do that, and I've had to come up with workarounds and things like that.
Michael: Yeah. When I was trying to learn guitar—I never got too far with it—one of my big problems was with finger positioning, and it was really helpful that I had somebody there who could, like, physically reposition my hand and be like, “No, no, no, you want it a little bit more down this way,” or something like that, “More pressure.”
Tyson: Right, right. Yeah. Yeah, so those things have been a little bit challenging. But, all in all, it works pretty well, and it's pretty similar doing it online.
Michael: So, like you said, you've worked on music for at least a few types of mediums, including games and film. In particular, you provided some great music for the Steam game I released with the Souper Chef team in the spring of 2019.
Tyson: Oh, yeah.
Michael: Music in games can run the gamut of purposes, but could you go over the general process for creating a piece for a narrative form like film—a static, linear piece of music—and then what the differences are when creating something more adaptive for an interactive experience?
Tyson: Yeah, so to answer that, it might be good for me to also talk a bit about adaptive music and what that is. But before I do, I'll answer the first part of the question about coming up with, basically, a cue, or a linear piece of music, for a film or a game. And basically what that entails—for me anyway, I can't speak for other composers—is that I like to get as many assets from the game developer or the filmmaker as possible, which could be the film itself, or concept art, or photography, to help put myself in that world and understand the story. And then the next step is to understand the purpose of the music. Sometimes the purpose of the music is to heighten the emotion of the experience. Sometimes the purpose of the music is to give context to the player or to the viewer. So, for example, say a movie opens with an aerial shot coming in over a green field in a valley surrounded by mountains, and it's moving really fast—you could interpret a lot of things from that. But if you had, like, a [sings] “dun, duh-duh-duh-dun, duh-duh-duh-dun, duh-duh-duh-dun," then you think that you're on your way to battle or something intense is about to happen. But if you had this slow, sweet violin, it would create a different kind of context for you.
So a picture is worth a thousand words, but when you add music to it, it can write a whole story. So that's one of the first steps: think about what the purpose of the music is here. Am I creating context? Am I heightening the emotions? And so I'll think about that. And then from there, it's...I don't know. I don't have anything too profound to say. It's basically jamming. I'm singing melodies in my head or I'm noodling around on the guitar or the piano. I think I stole this example from a Stephen King book on creative writing, where he talks about how he's not really conjuring up a story and writing it so much as he's uncovering a story. So he'll start off with a "what if" statement—what if vampires invaded modern-day Salem, Massachusetts?—and then he'll uncover the story the way a paleontologist digs around until they see a bit of a bone and then uncovers it. That's kind of how I feel when I'm creating melodies and noodling around. I'm digging all around, and then I'll come up with something where I'm like, yes, this fits. And then I start to develop it, and I'm uncovering it more, and I'm kind of discovering the music that works for that medium, if that makes sense.
Michael: Yeah, it totally does. I've actually heard that creative writing bit, like, taking a prompt and just discovering the story from a couple of places before. And my writing is, as you know, very amateur, but like, I definitely get that feel because I do enjoy the process of writing fiction and, like, uncovering that story or, like, starting to develop a game and just, like, letting it figure out what it is for itself after you start with the core concepts.
Tyson: Yeah, yeah. And I totally agree with that. Like, I love stories. I have to limit myself. And usually by limit, I mean, like, not let myself read books for long periods of time. Because I'll just get, like, addicted to stories. Like when I started reading Fablehaven, I just stopped getting things done in my business and was, like, ignoring my family. And I'm like, I have to stop. So, I've stopped that series so many times, it took me years to finish it, because I'd be like, "Oh, it's consuming me too much." So, I love uncovering a story, whether it's through creating music, or reading a story, or writing a story—that process to me is just really exciting. And so that's kind of my initial process: what's the music for, what's the purpose of it, seeing some assets to get inspired, and then noodling around to find something that works and then developing that. And then I create a piece of music. And, traditionally, music is very linear: it begins, it does something, and it ends. Even with music that's loopable, it still basically has a beginning and an end, but the end is just made to start over again and repeat continuously. So, as far as I know, for the entire time humans have been on the planet, music has been a linear thing. But with interactive music—which is kind of like…we don't use that term anymore, because people don't actually interact with the music like they interact with the game—so people started using the term “adaptive music.” With that, it's kind of removed the linear piece from music.
So that's what's been really interesting to me and what's really hooked me on doing music for games, and why that's mostly what I do now. Because there's this new adaptive element where music is no longer linear. It's more three-dimensional, or it morphs and changes with the game. So with a traditional piece of music, you'll develop the idea and it'll have a beginning and a middle and an end. But with adaptive music, you have to try to think about it in layers, and where the music can go—like, what are the different paths the music can go in. So you're thinking in layers and in directions, rather than beginning, middle, and end. So you might have this world that you can explore within a game, and you might want to be able to add layers into the music as the person goes from exploring to questing to battling. But you also might want that music to morph and adapt as the player moves. So maybe they're in an open field and they move into a dark forest: you want the music to morph and change with the player. And obviously, one thing with a game that's different from a movie is that a movie has a set amount of time. So if you're filming a movie where a person is in a field, exploring, and then they go into a dark forest, they're going to be in that field for 13 minutes and 27 seconds. And then the next scene, they go into the forest, and they'll be in this forest for exactly seven minutes and 58 seconds, or whatever it is. But with a game, the player can choose how long they're in whichever areas. And so you've got to create the music in a way that adapts to any scenario. And that's why you have to think not only in layers, but also in paths: where could the player go? And how would the music adapt and change? And so it's cool. It's exciting. It's very interesting.
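To make the layers-and-paths idea concrete, here is a minimal, hypothetical sketch of "vertical" layering in C++—not Tyson's actual setup—where every layer plays in sync and the game state (exploring, questing, battling) only decides how loud each layer is, with volumes eased toward their targets so changes crossfade rather than cut. GameState, MusicLayer, and the layer names are invented for illustration.

```cpp
// A minimal, engine-agnostic sketch of "vertical" layering: all layers play
// in sync, and the game state only decides how loud each one is.
// GameState, MusicLayer, and the layer names are invented for illustration.
#include <algorithm>
#include <iostream>
#include <string>
#include <vector>

enum class GameState { Exploring, Questing, Battling };

struct MusicLayer {
    std::string name;
    float targetVolume = 0.0f;
    float currentVolume = 0.0f;
};

// Pick a target volume for each layer based on what the player is doing.
void setTargets(std::vector<MusicLayer>& layers, GameState state) {
    for (auto& layer : layers) {
        if (layer.name == "strings_pad") layer.targetVolume = 1.0f;  // always present
        if (layer.name == "percussion")  layer.targetVolume = (state != GameState::Exploring) ? 1.0f : 0.0f;
        if (layer.name == "brass_stabs") layer.targetVolume = (state == GameState::Battling) ? 1.0f : 0.0f;
    }
}

// Called every tick: ease each layer toward its target so state changes
// crossfade instead of cutting hard.
void update(std::vector<MusicLayer>& layers, float fadeStep) {
    for (auto& layer : layers) {
        float diff = layer.targetVolume - layer.currentVolume;
        layer.currentVolume += std::clamp(diff, -fadeStep, fadeStep);
    }
}

int main() {
    std::vector<MusicLayer> layers = {{"strings_pad"}, {"percussion"}, {"brass_stabs"}};

    setTargets(layers, GameState::Battling);       // e.g. combat just started
    for (int tick = 0; tick < 5; ++tick) {
        update(layers, 0.25f);                     // fade in over a few ticks
        for (const auto& l : layers)
            std::cout << l.name << ": " << l.currentVolume << "  ";
        std::cout << "\n";
    }
    return 0;
}
```

In practice a middleware like FMOD or Wwise handles this mixing; the point is only that the composer thinks in simultaneous layers and possible paths rather than a single timeline.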
Michael: Yeah, adaptive music is pretty amazing, and there are a lot of ways that I can see it being applied—like you mentioned, a lot on the gaming side—and I can absolutely see it being used outside of games as well. One example is Metal: Hellsinger, which is going to be coming out sometime next year. It's a Doom-style shooter where you get bonuses for shooting and reloading on the beat, and the music layers on based on your performance. And I'm super excited for that game, because I absolutely love the music in all the previews they have for it. I don't know if you've seen that one at all.
Tyson: I haven't yet. I gotta check it out.
Michael: But how do you think these forms of technologies have changed music in general? And what are some of the other interesting ways that you've seen it implemented?
Tyson: I think that it's such a new form that we've yet to see many of the ways that it's going to be implemented and affect music. But to me, it's pretty remarkable because it's an entirely new approach to music, rather than viewing music as something strictly linear. So it's so young that I think there's just so much more that can happen that I haven't seen yet. But I know that Austin Wintory—really awesome game composer—did a project, I believe it was called "Erica." It was more of a game / film kind of experience, so they're trying to push the boundaries of what a game actually is. And I know that he really worked a lot on making the music a lot more adaptive, and really trying to push the envelope of the musical experience, just like they were trying to push the envelope of the playing / viewing experience. So I know that there's a lot of experimentation. There's also a lot of coding getting into music, and patch-cord programming. Csound is one of these programming languages for music. There's another one called Max/MSP, and that's patch-cord-style programming. It's being used a lot with live musicians, and it allows music to change and adapt in live settings. So there's a lot of technology, experimentation, and innovation going on right now. Honestly, I'm not too adept in a lot of that stuff because I haven't used it yet in any of my projects. So the things that I'm most familiar with are things that I've had to use in projects, things like FMOD or Wwise—these middlewares that help you create more interactive or adaptive music.
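As a rough illustration of the live side Tyson mentions, the sketch below uses the Csound 6 C API from C++ to start a simple synth and push a control value to it while it plays—the kind of on-the-fly parameter change that lets music adapt in a live setting. The instrument, the "cutoff" channel name, and the way the value is driven are all made up for the example; this is not from any of Tyson's projects.

```cpp
// A rough sketch of live parameter control with the Csound 6 C API.
// The instrument and the "cutoff" channel are invented for illustration.
#include <csound/csound.h>

int main() {
    // A tiny orchestra: one synth voice whose filter cutoff is read from a
    // named control channel every k-cycle, so the host can change it live.
    const char* orc =
        "sr = 44100\n"
        "ksmps = 64\n"
        "nchnls = 2\n"
        "0dbfs = 1\n"
        "instr 1\n"
        "  kcf  chnget \"cutoff\"\n"
        "  kcf  port kcf, 0.05\n"            // smooth sudden jumps
        "  asig vco2 0.2, 110\n"
        "  alp  moogladder asig, kcf, 0.4\n"
        "  outs alp, alp\n"
        "endin\n";

    CSOUND* cs = csoundCreate(nullptr);
    csoundSetOption(cs, "-odac");            // real-time audio output
    csoundCompileOrc(cs, orc);
    csoundReadScore(cs, "i 1 0 30\n");       // play instrument 1 for 30 seconds
    csoundStart(cs);

    // Performance loop: nudge the cutoff upward as if some live input
    // (a sensor, a game value, a performer's fader) were rising.
    double tension = 0.0;
    while (csoundPerformKsmps(cs) == 0) {
        tension = tension < 1.0 ? tension + 0.0001 : 1.0;
        csoundSetControlChannel(cs, "cutoff", 300.0 + tension * 4000.0);
    }

    csoundDestroy(cs);
    return 0;
}
```

Max/MSP expresses a similar idea through visual patching rather than written code.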
Michael: Yeah, I'm looking forward to where gaming in particular is headed—right now there's a lot of experimentation going on, like you mentioned with that title that's kind of blending film narrative and video game. While it's good to have a distinction for video games, it's also good to, like, push outside of video games and create a new medium, if there is one to be found—that kind of thing. And to keep developing technology, especially music technology, because some game soundtracks are fantastic—some of what I listen to in my off time is just game soundtracks. Can you talk a little bit more about the technologies you use for your processes? And do those tools differ depending on the process you're using or the medium you're creating for?
Tyson: Good question. So the main tools that I use would be Logic, which is my DAW—my D-A-W, “digital audio workstation.” That's where the music is created. I will load up sampled instruments or I can record live instruments, and that's what I'll use to actually produce and create the track. And those have been around a long time. Then, when it comes to implementing sound into a game, some developers just have me send them the tracks and they implement it. But, ideally, what I enjoy the most is when a developer will use something like FMOD. So, right now, I'm working on a project and we're using FMOD as a middleware, and I'll use that to implement the sound. They can just tie that into Unity and then I'll go in. So instead of me describing to the developer, "Okay, I've created a track, but I've sliced it into two: the first 15 seconds are the intro, and then the rest of it is the part that you loop—put it in this way, and here's how it works, and here's where it should come in," with FMOD, I can just put the track into FMOD, and it looks kind of like a DAW. It looks kind of like Logic. Because I don't really know much about programming or coding or anything like that—I just know the music side of it—I can put it into FMOD and set parameters. So I can have the intro section, and then I can loop just the loopable section. If there are going to be layers added onto the music, I can set parameters and, like, little dials, so when things pass certain numbers and certain parameters, it brings in these new tracks. So instead of me trying to describe all that to the programmer and having them try to program it in, me as the musician who conceived the track and created the whole thing and knows exactly how it should work, I can get in and make it work exactly like that. And then FMOD is just tied into Unity or Unreal, or whatever the engine is, and the music is implemented the way it should be in the game. So those are basically the two programs I use most often: Logic to create the music, and then FMOD to implement the music inside the game and have it behave as it should.
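For readers curious what that hand-off can look like from the game's side, here is a minimal sketch using the FMOD Studio C++ API (2.x): the game loads the banks the composer exported, starts a music event, and drives a parameter that the composer has tied to extra layers. The event path "event:/Music/Exploration", the "Intensity" parameter, and the bank names are hypothetical stand-ins, not Tyson's project.

```cpp
// A hypothetical game-side sketch using the FMOD Studio C++ API (2.x).
// Bank names, the event path, and the "Intensity" parameter are stand-ins
// for whatever the composer authored in FMOD Studio.
#include <fmod_studio.hpp>
#include <algorithm>

int main() {
    FMOD::Studio::System* system = nullptr;
    FMOD::Studio::System::create(&system);
    system->initialize(512, FMOD_STUDIO_INIT_NORMAL, FMOD_INIT_NORMAL, nullptr);

    // Load the banks exported from the composer's FMOD Studio project.
    FMOD::Studio::Bank* master = nullptr;
    FMOD::Studio::Bank* strings = nullptr;
    system->loadBankFile("Master.bank", FMOD_STUDIO_LOAD_BANK_NORMAL, &master);
    system->loadBankFile("Master.strings.bank", FMOD_STUDIO_LOAD_BANK_NORMAL, &strings);

    // Start the music event; the intro, loop region, and layers all live
    // inside the event as the composer built it, not in game code.
    FMOD::Studio::EventDescription* description = nullptr;
    system->getEvent("event:/Music/Exploration", &description);
    FMOD::Studio::EventInstance* music = nullptr;
    description->createInstance(&music);
    music->start();

    // Stand-in game loop: as play gets tenser, push the parameter up and
    // FMOD brings in whatever layers the composer mapped to those values.
    float intensity = 0.0f;
    for (int frame = 0; frame < 600; ++frame) {
        intensity = std::min(1.0f, intensity + 0.002f);
        music->setParameterByName("Intensity", intensity);
        system->update();   // must be pumped regularly, typically once per frame
    }

    music->release();
    system->release();
    return 0;
}
```

The division of labor matches what Tyson describes: the composer authors the intro, loop, and layer thresholds inside FMOD Studio, and the game only feeds in parameter values.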
Michael: Awesome. I'm definitely gonna have to check out FMOD. I hadn't heard of that before, and like, it sounds super useful to be able to let the artist, you know, properly express their creative vision.
Tyson: Yeah, yeah, FMOD's great. The other one is Wwise—it's W-w-i-s-e. Those are both, like, the industry gold standards.
Michael: Cool. Onto our final question before we wrap up: are there any other tech innovations that you see coming down the pipe or that you would like to see being worked on, whether it be related to music, games, or any other field you're interested in?
Tyson: So like I mentioned earlier, I'm most familiar with the things that I've used on projects, and I'm not as up to speed on some of these newer things like Csound. So the tech innovations I'm most excited about are for me personally, rather than for the whole industry. Personally, I'm really excited to start exploring some of these newer technologies—I mean, they've been out for years, but they're a little bit less well known. Anyway, I'm excited to explore those and see how they can function within games and interactive experiences. And I think that over the next five, ten years, we're gonna see a lot of innovation with programming and music, pushing the limits of what we can do and what we can experience.
Michael: Yeah, we're living in exciting times when it comes to tech innovations.
Tyson: Definitely.
Michael: Finally, what are you working on next? And where can people check out your work?
Tyson: Currently, I'm working on an MMORPG—it's a fantasy game—that has this medieval fantasy kind of sound. I'm going to be working on a spaceship game pretty soon; we're hoping we can get that onto the Switch, which will be really exciting. That's going to be kind of a double-A game. I'm most active on Instagram, and my handle there is Tyson Cazier—C-A-Z-I-E-R is how you spell my last name—and then I think it's _videogame_composer (@tysoncazier_videogame_composer). That's where I'm most active. And then TysonCazierMusic.com is my website.
Michael: Awesome. Well, I'd like to thank you for joining us today. It's been a pleasure having you on.
Tyson: Yeah, thank you for having me on the show. It's been fun. It's been good to talk with you, to catch up with you, Michael.
Angela: Thanks for listening to the AMT Lab podcast. Don't forget to subscribe and to leave a comment. If you would like to learn more, go to amt-lab.org, that is A-M-T dash L-A-B .org. Or you can email us at amtlabcmu@gmail.com. You can also follow us on Twitter at Tech in the Arts, or on Instagram, Facebook, or LinkedIn at Arts Management and Technology Lab. You can find the resources that we referenced today in the show notes. Thanks for listening. See you next time.
[Musical outro]