When learning a classical instrument, people often either take lessons in which an existing body of “technique” is delivered, evolved over generations of performers, or in som...
In this paper we study the potential and the challenges posed by multi-user instruments as tools that can facilitate interaction and responsiveness not only between performers an...
In this paper we present an example of the use of the singing voice as a controller for digital music synthesis. The analysis of the voice with spectral processing techniques, der...
We present a novel way of manipulating a spatial soundscape, one that encourages collaboration and exploration. Through a table-top display surrounded by speakers and lights, part...
In this presentation, we discuss and demonstrate a multiple touch sensitive (MTS) keyboard developed by Robert Moog for John Eaton. Each key of the keyboard is equipped with senso...
McBlare is a robotic bagpipe player developed by the Robotics Institute at Carnegie Mellon University. McBlare plays a standard set of bagpipes, using a custom air compressor to s...
Roger B. Dannenberg, Ben Brown, Garth Zeglin, Ron ...
Tangible Acoustic Interfaces (TAI) rely on various acoustic sensing technologies, such as sound source location and acoustic imaging, to detect the position of contact of users int...
The Sonictroller was originally conceived as a means of introducing competition into an improvisatory musical performance. By reverse-engineering a popular video game console, we ...
This paper takes the reader through various elements of the GoingPublik sound artwork for distributive ensemble and introduces the Realtime Score Synthesis tool (RSS) used as a co...
In the Expression Synthesis Project (ESP), we propose a driving interface for expression synthesis. ESP aims to provide a compelling metaphor for expressive performance so as to m...