A performative interface for real-time single-source sound spatialization.

Playspace is a simple conceptual approach to real-time sound spatialization that, in its current state, provides a performative, pitch-based interface to distribute a single sound source among up to 12 discrete speakers.

Because its performative interface is pitch-dependent (or, more accurately, pitch-class dependent), Playspace also offers a compositional format that may be more immediate to composers whose backgrounds are primarily in acoustic music. Just as aleatory can be applied to pitch, for example, it can be applied to space through this process. Glissandi are also gesturally applicable, as are a range of other discrete-pitch gestures.

Performative input is given via MIDI, and while the imagined input instrument is a digital piano, it can work with any controller capable of providing MIDI note input.
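The core routing idea can be sketched outside Max. This is a hypothetical illustration, not the actual patch logic: an incoming MIDI note number reduces to one of 12 pitch classes, which is why any controller that sends note input can drive the system, and why every octave of a given pitch addresses the same speaker slot.

```python
def pitch_class(midi_note: int) -> int:
    """Reduce a MIDI note number (0-127) to its pitch class (0-11)."""
    return midi_note % 12

# Any octave of C (MIDI 48, 60, 72...) addresses the same speaker slot.
assert pitch_class(48) == pitch_class(60) == pitch_class(72) == 0
```

This octave-folding is what makes the interface pitch-class dependent rather than strictly pitch-dependent.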


The current prototype of Playspace is simple, intended as a proof-of-concept build that requires testing, refinement, and further feature additions. I chose Max/MSP to build this prototype for one very simple reason: few technological interfaces are as ubiquitous in compositional pedagogy as Max. Given that the system is intended for compositional and performative tasks, it seemed expedient not to require users to learn a new programming language or accomplish feats of virtual audio routing, but rather to build it so that it can be easily integrated into any other work.

It's straightforward to load Playspace as a bpatcher. The two inlets are the single-channel audio source and the MIDI note input, respectively.

A very simple usage example.

Pitch-classes are manually mapped to discrete speakers: the user selects a number of pitches in order on the on-screen keyboard, and can use the nodes object to get a sense of where these speakers sit in space, as a compositional aid.
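The mapping step can be expressed as a small sketch. The function name and data shape here are assumptions for illustration; the idea is simply that the order in which the user selects pitches on the keyboard determines which speaker each pitch class addresses.

```python
def build_speaker_map(selected_pitch_classes):
    """Assign each selected pitch class (0-11) to a speaker index,
    in the order the user chose them on the on-screen keyboard."""
    return {pc: speaker for speaker, pc in enumerate(selected_pitch_classes)}

# Selecting C, E, G-sharp (pitch classes 0, 4, 8) in that order
# assigns them to speakers 0, 1, and 2.
mapping = build_speaker_map([0, 4, 8])
# mapping == {0: 0, 4: 1, 8: 2}
```

Up to 12 speakers can be addressed this way, one per pitch class.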

Finally, an envelope controller is provided. The attack time, as well as each speaker's signal level, is directly impacted by velocity, and so it is possible to create fluid transitions, stark contrasts, and nearly anything in between. Of course, ultimately there are other tools and techniques, some ubiquitous, that might be useful in controlling this aspect of spatialization as well. For example, sustain pedals might easily enable fluid spatial transitions and motions.
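One plausible shape for the velocity behavior described above can be sketched as follows. The curve and parameter names are assumptions, not the patch's actual scaling: harder playing shortens the attack and raises the level, so soft playing yields fluid fades and hard playing yields stark contrasts.

```python
def velocity_to_envelope(velocity, max_attack_ms=500.0):
    """Derive an attack time (ms) and a signal level (0.0-1.0) from a
    MIDI note-on velocity (1-127). Hypothetical linear curve: higher
    velocity means a shorter attack and a louder speaker level."""
    v = max(1, min(velocity, 127)) / 127.0
    attack_ms = max_attack_ms * (1.0 - v)
    level = v
    return attack_ms, level

# Maximum velocity gives an instant attack at full level.
assert velocity_to_envelope(127) == (0.0, 1.0)
```

A held sustain pedal could then keep earlier speakers sounding while new ones fade in, which is one route to the fluid spatial motion mentioned above.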


The Playspace concept arises from an approach I've been interested in for a while: creating performative interfaces to traditionally compositional tasks. Spatial work has long been the domain of purely or primarily electronic music, and has often carried a considerable barrier to entry. While the cost barrier to access is perhaps still very much in place, I wondered how one might address the task of spatialization itself in a way that extends the skillset of composers and performers rather than requiring that they learn new skills.

Future Plans

As spatialized music and sound experiences become more common, and new tools are developed for creating them in game audio and other contexts, it is possible that the gap between concert composers and spatial technology is increasing.

Playspace is intended simply to introduce the idea of building systems and interfaces that increase the immediacy with which musicians, composers, and sound artists can take advantage of developments in this area, as an extension of their existing training and ability.