Manhattan's powerful procedural music engine can be embedded in any game (or app) that supports audio processing and native C/C++ plugins, with dedicated support for Unity.
Using a simple API, games gain an unprecedented level of real-time control over the music - every note, channel, variable, and fragment of code can be accessed, manipulated or even generated, reacting to events and the changing context of the game to enable more dynamic and immersive soundtracks.
This tutorial introduces the basic concepts of how Manhattan integrates with a game or app, explores the API that is available in the game, and provides tools to simulate game events and states, to help you compose your own dynamic game soundtracks.
MANHATTAN-POWERED GAME MUSIC
Manhattan can integrate with a game in two modes: embedded in the game itself (as a native plugin, e.g. for release or distribution), communicating with and delivering audio directly to the game; or connected in a 'live editing' mode, where the running game (or game editor, such as Unity) communicates with the Manhattan app running separately, allowing you to play and edit both your game and its soundtrack interactively in realtime.
You develop your game music in the live editing mode, then export it to a file containing everything (music, code, samples) that the embedded Manhattan engine needs to play.
Both modes support bidirectional control messages to and from the game, allowing the music to be fully manipulated at runtime, plus the full sequencing and synthesis capabilities of Manhattan - though embedded mode also allows the game to further process the generated audio (e.g. to spatialise it in the game world).
In the next section (2, top) we'll explore a few of the music concepts and techniques for designing interactive game soundtracks.
DESIGNING LIVE SOUNDTRACKS
In contrast to conventional game music tools based on pre-recorded audio phrases, Manhattan sequences and performs the music live, and so can react to almost any detail or change in the game.
The game has access to every part of the score, from low-level detail (individual notes, chords, phrases) to high-level structure (arrangement, form, channels, instrumentation), plus control of synthesis, effects, and mixing.
Simple designs begin with a game altering variables or values in Manhattan in response to the changing game state, or triggering Manhattan code in response to specific events in the game. Games can also generate and send arbitrary code expressions, executed in realtime, to make bigger changes to the music.
More complex interaction might involve the music sending information back to the game - even controlling it. Manhattan's send() function allows the music to pass arbitrary messages to the game, at any point in the music. Manhattan also broadcasts every note it plays, which the game is free to handle, filter, or ignore as it sees fit.
GAME-MUSIC MECHANICS
The following list outlines specific mechanisms used to create dynamic game soundtracks, including both traditional and novel approaches:
- Layering - adding or removing musical parts to mirror changing game contexts (e.g. tension)
- Stingers - one-shot musical motifs triggered by game events, e.g. deaths, spawns, victory
- Proximity Triggers - musical elements that activate or change based on player location
- Granular Sequences - stochastic generation of ambient melodies or textures using probability
- Constraint / Rule-based - generative methods that filter random seeds using musical rules
- Sonification - direct mapping of game world data to multiple music elements
- Interactive Sequences - musical structures whose playback depends on the player
- Visualisation - interplay between graphics in the game and the music, e.g. tablatures
On the next page (3, top) we'll explore the API and functions available for games to access and control the music - or for the music to control the game - plus a simulator you can use to develop and test your own interactive soundtracks.
MANHATTAN API
The game communicates with the playing music via the code functions provided by Manhattan's API (or application programming interface), which supports loading files, managing playback, and sending and receiving messages to and from the performance. This page details the functions available, both in Unity C# syntax and the underlying C functions.
- Controlling the Music
These three functions provide the core functionality for most games or interactive audio applications:
- Set() - change a variable or other property
usage: Set(engine, variable, value); Changes a specific value in the music - a labelled variable, cell property, or other musical attribute.
The argument variable is a string identifying the value to change, and value is the number to set (in floating-point - though usually this will be a whole number between 0 and 255).
Set(engine, "MyVar", 23);
The above example sets the value (.param cell property) of the cell labelled "MyVar" to 23 ('17' in hex), similar to the Manhattan code:
@MyVar = 23
For named variables, this method is slightly faster than using Code() to perform the same operation.
- Code() - execute a Manhattan code expression
usage: Code(engine, expression); Sends the specified code expression (as a string argument) to Manhattan for immediate execution. This function supports the full functionality of the Manhattan language - changing values, controlling playback, adjusting synthesis, channel mixing, etc.
Similar to a code block in a cell, the expression can include multiple lines, separated by carriage returns or semi-colons (';') - it is not associated with a specific cell, but will default to the first cell in the pattern ([1:0]) if left unspecified.
Code(engine, "play(@Melody)");
The code is executed immediately, without waiting for musical boundaries like beats or bars.
To sync game events with the music, place your code in a labelled pattern cell (e.g. "EnemyDeath") and use Set() or Code() to set a flag (i.e. a variable), signalling that the code is waiting to run. Now use a macro (i.e. 0xx to 9xx) to check the flag regularly (e.g. at specific rows, every beat, etc.) and trigger the code when needed - e.g. @EnemyDeath.run().
- Run() - execute existing code in the pattern
usage: Run(engine, address); Runs the code block contained in the specified address or labelled cell. Unlike Code(), it runs in the cell hosting the code.
Run() shares many use cases with Code(), but is tailored for pre-planned actions handled primarily in Manhattan, and simply triggered by the game (or Unity). The following code executes the code block in the cell labelled "EnemySpawned":
Run(engine, "EnemySpawned");
This is functionally equivalent to the Manhattan code: @EnemySpawned.run() - but slightly faster than using Code() for the same operation.
The code is executed immediately, without waiting for musical boundaries like beats or bars.
- Simulating the Game
You can explore and experiment with these API functions using Manhattan's built-in simulator, which allows you to emulate the live interactions within a game and the messages it sends to your music.
Use the Game / API tab under Help > Live Control to send messages to any open project, or load the "Ars Arcus" example from Help > Examples > Generative, which contains the dynamic soundtrack and several preset events simulating the game. The original game is also available as free DLC for Manhattan (via Steam).
- Managing the Engine
In Unity, these functions are generally managed for you by the Manhattan C# script (Manhattan.cs), but may be useful in advanced applications or ports to other platforms (e.g. Unreal, JUCE, etc.):
- Create() - initialise the Manhattan engine
usage: IntPtr engine = Create(isLive); Initialises Manhattan's game music engine, either in live editing mode (isLive = true) or embedded mode (isLive = false), returning a pointer to the new instance, which should be stored for use with other functions.
Live mode simply sends and receives messages to/from a running instance of the Manhattan app, whereas embedded mode creates and initialises a new playback engine, which will need to have its audio settings configured and music file loaded - see Config() and Open().
In Unity, this process is handled for you by the Manhattan C# script (Manhattan.cs).
- Destroy() - shutdown and free the engine
usage: Destroy(engine); Shuts down the existing game music engine referenced by the pointer, engine.
This function will stop playback and free all resources (e.g. memory) used by the engine. It is generally called on termination of the game.
In Unity, this process is handled for you by the Manhattan C# script (Manhattan.cs).
- Config() - configure an engine's audio (embed only)
usage: Config(engine, samplerate, channels); Configures the sample rate and output channels for the engine referenced by the pointer, engine.
Only used in embedded mode. The argument samplerate can be any non-zero integer (e.g. 44100 or 48000), whereas channels is reserved for future use (the current version assumes stereo audio, i.e. 2 channels). Samples are 32-bit floats.
Return value: 1 on success, 0 for an invalid handle, and 3 for an error inside the engine.
In Unity, this process is handled for you by the Manhattan C# script (Manhattan.cs).
- Open() - load a Manhattan project (embed only)
usage: Open(engine, path); Loads a Manhattan Unity bundle (.zmu file) into the engine referenced by the pointer, engine.
Only used in embedded mode. The argument path is a string containing the path to a Unity bundle, exported by Manhattan (File > Export to Unity...).
Bundles contain all the music, code, and samples required by the embedded engine to perform the music. Unlike Manhattan's default (.zm) format, a bundle also packages any samples required from Manhattan's built-in instrument library, which the embedded version of Manhattan omits to minimise disk and memory usage, optimising game performance.
Return value: 2 on success, 0 for an invalid handle or load failure, and 1 to flag errors during load.
In Unity, this process is handled for you by the Manhattan C# script (Manhattan.cs), configured for a bundle named "manhattan.zmu", placed in your Unity Project's StreamingAssets sub-folder.
UNDER CONSTRUCTION
This tutorial will eventually include a simple game soundtrack with simulator and preset events, plus basic game music exercises.