Here is a new interactive music system: psai. Unfortunately there is not much detail about the engineering design or the content-authoring pipeline at the time of this posting, but the resulting game-reactive music output is well demonstrated.
You can freely try out how things are done from an authoring perspective in Wwise or FMOD Designer, and these tools can help with the initial composition of your music assets while you experiment with different interactive authoring approaches, before you commit to any particular design in your own engine.
There are many possible combinations of game actions and musical parameters that can be used to shape a composition interactively. Budgets of money, time, and target platforms are usually the limiting factors within which to scope an interactive music system.
From the audio standpoint you can produce the following types of music assets for different interactive music systems:
Finished mixes of songs or song sections, plus transitions, for end-to-end or cross-faded playback, with optional timing metadata for exiting and entering on the beat. This is the least interactive option from a musical standpoint, but it is still sufficient for a great number of games.
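For illustration, here is a minimal sketch of the beat-quantization arithmetic that such timing metadata enables. Everything here is hypothetical (the struct and function names are not from any particular middleware); it just shows how a transition could be snapped to the next beat or bar boundary of the currently playing clip.

```cpp
// Minimal sketch (hypothetical helpers): given a clip's tempo metadata,
// find the next beat- or bar-aligned time at which a crossfade into the
// next cue could begin.
#include <cmath>
#include <cstdio>

struct ClipTempoInfo {
    double bpm;          // tempo stored as authoring-time metadata
    double beatsPerBar;  // e.g. 4 for 4/4
};

// Next beat boundary at or after the current playback position (seconds).
double NextBeatTime(const ClipTempoInfo& info, double positionSec) {
    const double secPerBeat = 60.0 / info.bpm;
    return std::ceil(positionSec / secPerBeat) * secPerBeat;
}

// Next bar boundary, for coarser "exit on bar" transitions.
double NextBarTime(const ClipTempoInfo& info, double positionSec) {
    const double secPerBar = (60.0 / info.bpm) * info.beatsPerBar;
    return std::ceil(positionSec / secPerBar) * secPerBar;
}

int main() {
    ClipTempoInfo info{120.0, 4.0};  // 120 BPM, 4/4
    double now = 7.3;                // current playback position in seconds
    std::printf("next beat at %.3f s\n", NextBeatTime(info, now));  // 7.500
    std::printf("next bar  at %.3f s\n", NextBarTime(info, now));   // 8.000
}
```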
Finished mix stems of songs or song sections and transitions, packaged as a multi-channel file or played together in sync, to allow dynamic blending among the stems. These can also use timing metadata for on-beat switches, and applying different realtime effects to individual stems becomes possible.
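As a rough sketch of the blending side, the snippet below maps a single hypothetical game parameter (an intensity value in [0,1]) to per-stem gains so that stems fade in one after another as intensity rises. The stem names, thresholds, and linear fade curve are illustrative assumptions, not a prescription.

```cpp
// Minimal sketch, all names hypothetical: derive per-stem gains from one
// normalized game parameter so the mix thickens as intensity increases.
#include <algorithm>
#include <cstdio>
#include <vector>

struct Stem {
    const char* name;
    double fadeInStart;  // intensity at which the stem starts to fade in
    double fadeInEnd;    // intensity at which the stem reaches full level
};

double StemGain(const Stem& s, double intensity) {
    if (s.fadeInEnd <= s.fadeInStart) return intensity >= s.fadeInStart ? 1.0 : 0.0;
    double t = (intensity - s.fadeInStart) / (s.fadeInEnd - s.fadeInStart);
    return std::clamp(t, 0.0, 1.0);  // linear fade; could be equal-power instead
}

int main() {
    std::vector<Stem> stems = {
        {"drums",   0.00, 0.10},
        {"bass",    0.20, 0.40},
        {"strings", 0.50, 0.70},
        {"brass",   0.75, 0.95},
    };
    double intensity = 0.6;  // value coming from game logic each frame
    for (const auto& s : stems)
        std::printf("%-8s gain %.2f\n", s.name, StemGain(s, intensity));
}
```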
Audio clips ranging from mix stems down to individual instrument parts, which are then arranged on an interactive music editor timeline to reconstruct the song sections and transitions. Such an arrangement provides much finer granularity for bringing parts in and out of the mix, as well as randomization and other dynamic effects across the library of parts.
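One common dynamic effect at this granularity is clip randomization. The sketch below is a hypothetical example of picking a different recorded variation of a part each time a section repeats, while avoiding playing the same clip twice in a row; the structures and clip names are made up for illustration.

```cpp
// Minimal sketch: a timeline slot that selects one of several authored
// variations of an instrument part, never repeating the previous pick.
#include <cstdio>
#include <random>
#include <string>
#include <vector>

struct PartSlot {
    std::string partName;
    std::vector<std::string> variations;  // clip ids authored for this part
    int lastIndex;                        // last variation played, -1 = none yet
};

int PickVariation(PartSlot& slot, std::mt19937& rng) {
    const int n = static_cast<int>(slot.variations.size());
    if (n <= 1) return slot.lastIndex = 0;
    if (slot.lastIndex < 0) {  // first pick: any clip is fair game
        std::uniform_int_distribution<int> dist(0, n - 1);
        return slot.lastIndex = dist(rng);
    }
    std::uniform_int_distribution<int> dist(0, n - 2);
    int idx = dist(rng);
    if (idx >= slot.lastIndex) ++idx;  // skip the previously played clip
    return slot.lastIndex = idx;
}

int main() {
    std::mt19937 rng{std::random_device{}()};
    PartSlot perc{"percussion", {"perc_a.wav", "perc_b.wav", "perc_c.wav"}, -1};
    for (int section = 0; section < 4; ++section) {
        int idx = PickVariation(perc, rng);
        std::printf("section %d: %s plays %s\n",
                    section, perc.partName.c_str(), perc.variations[idx].c_str());
    }
}
```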
The approaches above are generally limited to the base tempo and key as recorded in the audio files, unless you allow dynamic bending of the playback rate or implement a pitch-shift / time-stretch effect. Alternatively, you could author a huge matrix of tempos and keys, but that would be quite wasteful in terms of both data size and authoring effort.
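The arithmetic behind simple rate bending is worth spelling out: resampling a clip to hit a new tempo also shifts its pitch, by 12·log2(rate) semitones, unless a time-stretch / pitch-shift effect is applied instead. A small sketch with arbitrarily chosen values:

```cpp
// Minimal sketch of the trade-off: changing tempo by playback rate alone
// drags the pitch along with it.
#include <cmath>
#include <cstdio>

int main() {
    double recordedBpm = 100.0;  // tempo baked into the audio file
    double targetBpm   = 112.0;  // tempo requested by game state

    double rate = targetBpm / recordedBpm;           // playback-rate multiplier
    double semitoneShift = 12.0 * std::log2(rate);   // pitch side effect of resampling

    std::printf("playback rate: %.3f\n", rate);                       // 1.120
    std::printf("pitch shift  : %+.2f semitones\n", semitoneShift);   // about +1.96
}
```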
If you require the most granular level of music interactivity, you can play back individual note sequences rendered from wave data or synthesis algorithms. Here you can freely change the arrangement, tempo, meter, key, instrumentation, synthesis parameters, mix, and realtime effects for each part of the composition, if you choose to build it all the way down to that level.
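To make the contrast concrete, here is a toy sketch of note-level sequence data where tempo and transposition are applied only at playback time, so the same sequence can be rendered at any tempo or in any key. The data layout and helper names are assumptions for illustration, not any real sequencer API.

```cpp
// Minimal sketch: notes stored in beats and untransposed pitches, with
// tempo and key decided by game state at the moment of playback.
#include <cmath>
#include <cstdio>
#include <vector>

struct Note {
    double startBeat;    // position in beats, independent of tempo
    double lengthBeats;  // duration in beats
    int midiPitch;       // untransposed pitch
};

double BeatsToSeconds(double beats, double bpm) { return beats * 60.0 / bpm; }
double MidiToHz(int midi) { return 440.0 * std::pow(2.0, (midi - 69) / 12.0); }

int main() {
    std::vector<Note> riff = {{0.0, 0.5, 60}, {0.5, 0.5, 63}, {1.0, 1.0, 67}};
    double bpm = 140.0;  // chosen by game state at playback time
    int transpose = 2;   // shift the whole sequence up a whole tone

    for (const Note& n : riff) {
        std::printf("t=%.3fs len=%.3fs pitch=%d (%.1f Hz)\n",
                    BeatsToSeconds(n.startBeat, bpm),
                    BeatsToSeconds(n.lengthBeats, bpm),
                    n.midiPitch + transpose,
                    MidiToHz(n.midiPitch + transpose));
    }
}
```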