NIME: New Interfaces for Musical Expression

NIME 2017

Paper and performance documentation coming soon!


NIME 2016

“Ricky Graham, a guitarist and computer musician, was interested in working with data from Tidmarsh’s history and developed an interface that allows a listener/improviser to select data ranges to iterate through, driving his own granular synthesis patch. This interaction encourages exploration of how data sets can drive timbral and temporal changes in electronic music. Using this interface, a piece was created based on the contour that barometric pressure, humidity, and illuminance take over the course of a day on the full moon. This piece was presented at the Fall 2015 Media Lab Member’s Event and appeared as a research presentation at SEAMUS 2016 [6]. The second version of this piece is presented within the virtual environment. Each device drives its own granular synthesis patch with real-time data, as well as ranges of recent historical data on multiple timescales.”

Sensor Chimes: Musical Mapping for Sensor Networks by Lynch and Paradiso 
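As a rough illustration of the kind of mapping described in the quoted passage, the sketch below iterates through a selected slice of historical sensor readings and forwards them as granular synthesis parameters over OSC. It assumes the python-osc package and a Pd granular patch listening on port 9000; the OSC addresses, parameter ranges, and sample data are hypothetical placeholders, not the actual Tidmarsh patch.

```python
# Minimal sketch: iterate a selected range of sensor readings and map them
# to granular synthesis parameters sent over OSC to a Pd patch.
# Assumes the python-osc package; OSC addresses, port, and parameter ranges
# below are hypothetical placeholders, not the authors' actual patch.
from pythonosc.udp_client import SimpleUDPClient
import time

client = SimpleUDPClient("127.0.0.1", 9000)  # Pd patch assumed to listen here

def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly rescale a sensor value into a synthesis parameter range."""
    if in_hi == in_lo:
        return out_lo
    t = (value - in_lo) / (in_hi - in_lo)
    return out_lo + t * (out_hi - out_lo)

# A listener/improviser-selected slice of historical readings (placeholder data).
pressure = [1013.2, 1013.0, 1012.7, 1012.9, 1013.4]   # hPa
humidity = [61.0, 63.5, 66.2, 64.8, 62.1]             # %
lux      = [120.0, 95.0, 40.0, 15.0, 5.0]             # illuminance

for p, h, l in zip(pressure, humidity, lux):
    client.send_message("/grain/size",    scale(p, 1010, 1016, 20, 200))   # ms
    client.send_message("/grain/density", scale(h, 0, 100, 5, 50))         # grains/s
    client.send_message("/grain/pitch",   scale(l, 0, 400, 0.5, 2.0))      # ratio
    time.sleep(0.25)  # iteration rate through the selected data range
```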

NIME 2015

[NIME 2015 poster image (Graham & Harding)]

Multichannel (or divided) audio pickups are becoming increasingly ubiquitous in electric guitar and computer music communities. These systems allow performers to access signals for each string of their instrument independently and concurrently in real-time creative practice. This paper presents an open-source audio breakout circuit that provides independent audio outputs per string of any chordophone (stringed instrument) that is fitted with a multichannel audio pickup system. The following sections include a brief historical contextualization and discussion of the significance of multichannel audio technology in instrumental guitar music, an overview of our proposed impedance-matching circuit for piezoelectric-based audio pickups, and a presentation of a new open-source PCB design (SEPTAR V2) that includes a mountable 13-pin DIN connection to improve compatibility with commercial multichannel pickup systems. The paper also includes a short summary of the potential creative applications and perceptual implications of this multichannel technology when used in creative practice.

NIME 2015 | Poster Paper written with John Harding (Ulster University – j.harding@ulster.ac.uk)
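To illustrate why impedance matching matters for piezoelectric pickups, the short calculation below treats the piezo element as a source capacitance feeding a buffer’s input resistance, which forms a first-order high-pass filter with corner frequency f_c = 1/(2πRC). The component values are assumptions chosen for illustration, not the SEPTAR V2 design values.

```python
# Back-of-envelope check of the low-frequency rolloff when a piezo pickup
# (modelled as a source capacitance) drives a buffer's input resistance.
# First-order high-pass corner: f_c = 1 / (2 * pi * R * C).
# Component values are illustrative assumptions, not the SEPTAR V2 values.
import math

def highpass_corner_hz(r_in_ohms: float, c_source_farads: float) -> float:
    return 1.0 / (2.0 * math.pi * r_in_ohms * c_source_farads)

c_piezo = 10e-9  # 10 nF: a plausible per-saddle piezo capacitance (assumed)
for r_in in (10e3, 100e3, 1e6, 10e6):
    fc = highpass_corner_hz(r_in, c_piezo)
    print(f"R_in = {r_in/1e3:>8.0f} kΩ -> f_c ≈ {fc:8.1f} Hz")

# With a low-E fundamental around 82 Hz, only megaohm-range input impedances
# keep the corner well below the string's fundamental, which is why each
# string needs a high-impedance buffer before standard line-level inputs.
```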

———————————

[NIME 2015 poster image (Graham & Bridges)]

This paper presents the ideas and mapping strategies behind a performance system that uses a combination of motion tracking and feature extraction tools to manage complex multichannel audio materials for real-time music composition. The use of embodied metaphors within these mappings is seen as a means of managing the complexity of a musical performance across multiple modalities. In particular, we will investigate how these mapping strategies may facilitate the creation of performance systems whose accessibility and richness are enhanced by common integrating bases. A key focus for this work is the investigation of the embodied image schema theories of Lakoff and Johnson alongside similarly embodied metaphorical models within Smalley’s influential theory of electroacoustic music (spectromorphology). These metaphors will be investigated for their use as grounding structural components and dynamics for creative practices and musical interaction design. We argue that pairing metaphorical models of forces with environmental forms may have particular significance for the design of complex mappings for digital music performance.

Watch the Supporting Video / Listen to the Supporting Examples

NIME 2015 | Poster Paper written with Dr Brian Bridges (Ulster University – bd.bridges@ulster.ac.uk)
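A toy sketch of the mapping layer described in the abstract above: schema-level readings of a performer’s motion (verticality, centre-periphery, effort) are translated into synthesis and spatialisation controls. The feature names, parameter targets, and ranges are illustrative assumptions rather than the paper’s actual mappings.

```python
# Toy sketch of an embodied mapping layer: image-schematic readings of a
# performer's motion are translated into multichannel audio controls.
# Feature names and parameter targets are illustrative assumptions, not the
# mappings described in the paper.
from dataclasses import dataclass

@dataclass
class MotionFeatures:
    hand_height: float   # 0.0 (low) .. 1.0 (high)        -> VERTICALITY schema
    reach: float         # 0.0 (body) .. 1.0 (extension)  -> CENTRE-PERIPHERY schema
    speed: float         # normalised hand velocity       -> effort/force metaphor

def embodied_mapping(m: MotionFeatures) -> dict:
    """Map schema-level readings onto synthesis/spatialisation parameters."""
    return {
        "filter_cutoff_hz": 200.0 + m.hand_height * 4800.0,  # higher = brighter
        "spatial_spread":   m.reach,                          # periphery = wider image
        "grain_density":    5.0 + m.speed * 45.0,             # more effort = denser texture
    }

print(embodied_mapping(MotionFeatures(hand_height=0.8, reach=0.3, speed=0.6)))
```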


NIME 2014

[NIME 2014 poster image]

This paper describes the design, theoretical underpinnings and development of a hyperinstrumental performance system driven by gestural data obtained from an electric guitar. The system combines a multichannel audio feed (parsed in Pure Data (Pd) for its pitch contour, spectral content and note inter-onset time data) with motion tracking of the performer’s larger-scale bodily movements using a Microsoft Xbox Kinect sensor. These gestural materials provide the basis for the system’s musical mapping strategies, informed by an integration of embodied cognitive models with electroacoustic/electronic music theory (specifically, Smalley’s spectromorphology). The system’s core design philosophy is that coherence and accessibility are facilitated through the use of broadly isomorphic embodied/gestural models at its various levels, from input event parsing to its output audio processing.

http://www.nime.org/wp-publications/rgraham2014/

NIME 2014 | Poster Paper written with Dr Brian Bridges (University of Ulster – bd.bridges@ulster.ac.uk)
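For a concrete sense of the audio-feature side of the system, the NumPy sketch below computes offline analogues of two of the features parsed in Pd: a per-frame spectral centroid (as a spectral-content proxy) and note inter-onset intervals from a crude RMS-threshold onset picker. The frame sizes, threshold, and synthetic test signal are simplifying assumptions, not the system’s actual analysis chain.

```python
# Minimal offline sketch of per-string features analogous to those the paper
# parses in Pd: spectral centroid and note inter-onset intervals.
# NumPy-only; the threshold-based onset picker and frame sizes are
# simplifying assumptions for illustration.
import numpy as np

SR = 44100
FRAME = 1024
HOP = 512

def spectral_centroid(frame: np.ndarray, sr: int = SR) -> float:
    """Amplitude-weighted mean frequency of one analysis frame."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), 1.0 / sr)
    return float(np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12))

def inter_onset_times(signal: np.ndarray, sr: int = SR, threshold: float = 0.2):
    """Crude onset picking: RMS envelope rising through a fixed threshold."""
    rms = np.array([np.sqrt(np.mean(signal[i:i + FRAME] ** 2))
                    for i in range(0, len(signal) - FRAME, HOP)])
    above = rms > threshold * rms.max()
    onsets = np.flatnonzero(above & ~np.roll(above, 1)) * HOP / sr
    return np.diff(onsets)  # seconds between successive note onsets

# Example with a synthetic single-string signal: two plucks half a second apart.
t = np.arange(SR) / SR
envelope = np.exp(-6 * t) + np.exp(-6 * np.clip(t - 0.5, 0, None)) * (t > 0.5)
string = np.sin(2 * np.pi * 196.0 * t) * envelope
print("centroid of first frame (Hz):", round(spectral_centroid(string[:FRAME]), 1))
print("inter-onset intervals (s):", inter_onset_times(string))
```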
