When composing music for an interactive medium, the composer must overcome obstacles different from those found in more linear entertainment such as film. The non-linear nature of games, and the autonomy granted to the player, mean that new methods of working must be implemented.
This paper looks at a "generative" approach to musical composition in games as a possible solution to some of the challenges of composing for an interactive medium. Various forms of generative and algorithmic music are discussed, and a strategy for their use in a game situation is proposed.
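One common form such a generative strategy can take is a stochastic process whose parameters are driven by game state. As a minimal illustration (not a method from the paper; the scale degrees, transition table and "tension" mapping are all hypothetical), a first-order Markov chain over scale degrees can be biased toward larger melodic leaps as in-game tension rises:

```python
import random

# Hypothetical transition table: from each scale degree, the
# candidate next degrees (semitones above the tonic).
TRANSITIONS = {
    0: [0, 2, 4, 7],
    2: [0, 4, 5],
    4: [2, 5, 7],
    5: [4, 7],
    7: [0, 4, 9],
    9: [7, 0],
}

def next_degree(current, tension, rng):
    """Pick the next degree; higher tension favours larger leaps."""
    candidates = TRANSITIONS.get(current, [0])
    weights = [1 + tension * abs(c - current) for c in candidates]
    return rng.choices(candidates, weights=weights, k=1)[0]

def generate_phrase(length=8, tension=0.0, seed=None):
    """Generate a phrase of scale degrees starting on the tonic."""
    rng = random.Random(seed)
    phrase = [0]
    for _ in range(length - 1):
        phrase.append(next_degree(phrase[-1], tension, rng))
    return phrase
```

Because the generator is seeded and parameterized, the same musical material can be reproduced exactly or varied continuously as gameplay demands.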
Recent advances in crowd simulation techniques have led to realistic agent and group behavior through elaborate behavioral models, complex motion planning algorithms and impressive physics systems. Because many crowd simulation solutions typically target only specific types of environment and scenario, a variety of special-purpose methods and systems have emerged that are hard to re-configure and re-use in other contexts. Resolving this situation demands a higher-level approach that prioritizes the re-use and configuration of crowds, for adequate application across a broad variety of scenarios, virtual environments and interactions with the entities present in those environments. In this article we propose semantic crowds, a novel approach that allows one to re-use the same crowds in virtually any environment, and have them use the objects available in it in a meaningful manner, without any modification. To have the agents autonomously interact within any virtual world, we minimize in them ...
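The core idea of environment-independent crowds can be sketched as agents that query objects by semantic annotation rather than by concrete type. The sketch below is purely illustrative and not the article's actual data model; the class names and affordance tags are assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class SmartObject:
    """An environment object annotated with semantic affordances."""
    name: str
    affordances: set = field(default_factory=set)  # e.g. {"sit"}

@dataclass
class Agent:
    name: str
    goal: str  # the affordance the agent currently seeks

    def find_target(self, environment):
        """Return the first object offering the needed affordance,
        so the same agent logic works in any annotated environment."""
        for obj in environment:
            if self.goal in obj.affordances:
                return obj
        return None

# A tiny annotated environment: the agent never references "bench"
# or "kiosk" directly, only the capability it needs.
park = [SmartObject("bench", {"sit"}), SmartObject("kiosk", {"buy_food"})]
agent = Agent("pedestrian-1", goal="sit")
target = agent.find_target(park)
```

Swapping `park` for any other annotated environment requires no change to the agent, which is the re-use property the abstract describes.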
In recent years, increasing interest can be observed in game engine technologies as a versatile creative tool. In particular, the possibility of visualizing and simulating complex physical behaviors in real time facilitates the design and implementation of 3D virtual music instruments and the exploration of sound gesture as a result of their kinematic and spatial properties. This paper describes two case examples in the form of linear compositions based on non-conventional instrument designs, where audio is procedurally generated using custom-built APIs in the game engine's scripting languages (Unity3D JavaScript/C#). Sound events are also organized as a sequence of flexible code instructions, resulting in a quasi-fixed piece duration with subtle timbral variations over multiple playbacks. In both cases, the model presented shows inherent spatial characteristics, which are useful for building spatialization patterns in a multichannel loudspeaker configuration and for emphasizing the strong causal connection between the visual and sonic aspects of these works.
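The basic mechanism behind such procedural audio is a per-block synthesis callback whose parameters are driven by the simulation. The Python sketch below is a simplified stand-in for an engine audio callback such as Unity's `OnAudioFilterRead` (the `speed_to_freq` mapping is a hypothetical example, not taken from the paper):

```python
import math

SAMPLE_RATE = 44100

def fill_buffer(n_samples, freq_hz, amp, phase=0.0):
    """Generate one block of sine samples, as an engine audio
    callback would; returns the block and the carried-over phase
    so successive blocks join without clicks."""
    step = 2 * math.pi * freq_hz / SAMPLE_RATE
    samples = []
    for _ in range(n_samples):
        samples.append(amp * math.sin(phase))
        phase = (phase + step) % (2 * math.pi)
    return samples, phase

def speed_to_freq(speed, base_hz=110.0, scale=20.0):
    """Hypothetical mapping: a virtual object's speed modulates
    pitch, giving the causal visual-sonic link described above."""
    return base_hz + scale * speed
```

Driving `freq_hz` from an object's kinematics each block is what ties the sonic result to the physics simulation.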
It was a pleasure to welcome procedural audio pioneer, Leonard Paul, to February’s meetup! Leonard shared some of the highlights of his procedural audio career to date, and talked in depth about his early ambitions to develop tools for videogame integrations, using Pure Data to build prototypes that might one day emerge in a AAA title. He went on to discuss the current state of play, looking at the movers and shakers in the field and where they could be heading next, before leading us into a Q&A session and some open discussion.
During a demo of his music and SFX systems for the educational game, Sim Cell, he explained how he used oscillators, variable delay lines and granular synthesis patches in Pure Data to generate a dynamic soundtrack that could react to different game states, as well as produce the more routine sound effects used for GUI navigation and spacecraft propulsion.
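For readers unfamiliar with the building blocks mentioned, a variable delay line (Pd's `[vd~]` object) is a circular buffer read at a smoothly changing, possibly fractional, delay time. A minimal Python sketch of the idea (Leonard's actual patches are in Pure Data; this is just an illustration):

```python
class VariableDelay:
    """Circular-buffer delay line with linear interpolation,
    analogous to Pd's [delwrite~]/[vd~] pair."""

    def __init__(self, max_samples):
        self.buf = [0.0] * max_samples
        self.write = 0

    def process(self, x, delay):
        """Write one input sample, read one output sample at
        `delay` samples in the past (may be fractional)."""
        self.buf[self.write] = x
        n = len(self.buf)
        pos = (self.write - delay) % n
        i = int(pos)
        frac = pos - i
        # Linear interpolation between the two nearest samples
        # keeps modulated delays free of zipper noise.
        y = (1 - frac) * self.buf[i] + frac * self.buf[(i + 1) % n]
        self.write = (self.write + 1) % n
        return y
```

Modulating `delay` at audio rate is the basis of chorus, flanging and simple pitch effects, which is why the object turns up so often in procedural game audio.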
Videos of each section of the talk can be found via the link, where you can also find Leonard's Pd patches to try out for yourself.
Leonard Paul is a composer, sound designer and educator based in Vancouver, Canada. He runs the School of Video Game Audio.
This paper introduces the AIE-Studio (Audio Interfaces for Exploration), a modular dataflow patching library implemented with Pure Data. The AIE-Studio introduces new tools for procedural sound design through generative sonic and musical structures, with particular focus on aesthetic experience. The designed modules allow versatile dataflow mapping through a matrix routing system while also enabling the sound designer to influence the generative processes of music creation. In particular, the AIE-Studio was used to create generative sonic and musical material in an embodied game-like application. In this paper we present the key questions driving the research, the theoretical background, the research approach and the main development activities.
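The matrix routing idea mentioned above can be sketched in a few lines: a gain matrix maps every control source to every synthesis destination, so remapping the dataflow is just editing weights rather than rewiring patches. This Python sketch is illustrative only, not the AIE-Studio's actual Pd implementation; the source and destination names are assumptions:

```python
def route(sources, matrix):
    """Apply a routing matrix: sources is a list of control values,
    matrix[i][j] is the gain from source i to destination j.
    Returns the mixed destination values."""
    n_dest = len(matrix[0])
    dest = [0.0] * n_dest
    for i, value in enumerate(sources):
        for j in range(n_dest):
            dest[j] += value * matrix[i][j]
    return dest

# Two hypothetical sources (an LFO and a game parameter) routed to
# three synthesis destinations (pitch, amplitude, filter cutoff).
matrix = [
    [1.0, 0.0, 0.5],  # LFO -> pitch, plus half-strength to cutoff
    [0.0, 1.0, 0.5],  # game parameter -> amplitude and cutoff
]
pitch, amp, cutoff = route([0.8, 0.4], matrix)
```

Because any source can feed any destination at any gain, the same generative modules can be re-mapped for different aesthetic goals without changing the modules themselves.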