LMJ27: Collaborative Works in Sonification and Virtual Reality

I have an article on sonification and virtual reality in Leonardo Music Journal 27 

Environmental Histories and Personal Memory: Collaborative Works in Sonification and Virtual Reality


This short paper presents an overview of two historical data projects developed at the Sensory Computation/Experimental Narrative Environments Lab at Stevens Institute of Technology between 2015 and 2017. The first project focuses on the sonification of environmental data derived from a ubiquitous sensing network embedded in Tidmarsh Living Observatory in Plymouth, Massachusetts. The second explores a history of instrument gesture data as a basis for interactivity in a virtual scene. The paper closes with a discussion of the creative implications of both projects.

Early Access:


NIME 2017

Chris Manzione and I will present a paper and demo of our virtual-reality-based performance system at NIME 2017.

Paper Session 4: Platforms

Demo Session 3:

Additionally, we will perform a new piece, Disrupt/Construct, using the performance system. The performance will take place on Tuesday evening at Stengade 30 in Copenhagen: http://sched.co/ASgj 

Here’s the Facebook event page for the concert: https://www.facebook.com/events/235489450262255/ 

[sinedelay~] v1 – Waveform Generator and Variable Delay Line

[sinedelay~] v1 is a waveform generator and variable delay line external written in C for Pure Data (Pd). The external allows crossfading between waveforms and provides 4-point interpolation between tap-delay times in milliseconds. It also has built-in amplitude modulation. v1 applies a tanh-based soft digital distortion to the delayed output (rightmost outlet).

Please send feature requests / report bugs to ricky at rickygraham dot net



Upcoming Lectures and Recitals in 2017

February 16, 2017 – Sonic Arts Research Centre, Belfast – 1pm

Next Thursday, I am giving a lunchtime recital at the Sonic Arts Research Centre in Belfast.

Program Notes:

Axon (2011) – Composed by Richard Graham and Michael Andrews

“Axon” is a collaborative project that began life as a series of recorded guitar improvisations. These initial recordings provide the main source materials for the fixed media element of the piece. The guitar and tape are in a continual loop of mutual influence, with the line between composer and performer becoming continually blurred, resulting in a truly collaborative piece that follows a combined creative vision. This piece was composed at the Sonic Arts Research Centre, Belfast, in 2011.

Excerpts from Disrupt/Construct (2017) – Composed by Richard Graham

“Memory is malleable. Whether intentionally or otherwise, human beings have a tendency to misremember parts of their personal histories. Misremembering may provide the means to manage a troubled past—a method of self-preservation—an escape from prior traumatic events. Accepted personal narratives may not necessarily be representative of the truth, even though they may be based on factually accurate information or what an actor may presume to be the case.”

These excerpts are derived from a developing composition, “Disrupt/Construct” (2017), featuring electric guitar and live electronics. This piece is a point of reflection on two ostensibly divergent sound worlds that have emerged through a solo instrumental practice over a ten-year period. Certain excerpts present a more ‘accessible’ or parseable sonic world with defined pulses and tonal systems. Others are situated more within the realm of noise, ambient, and drone music, disrupting the continuity of the preceding pieces. While the two sound worlds exhibit divergent attributes, each borrows from the other.

Quiet Arcs (2012) – Composed by Richard Graham

“Quiet Arcs” is a live performance and fixed media piece for electric guitar and multichannel loudspeaker array. The piece explores the notion of a dynamic pitch space and the bodily metaphors which underpin it. The guitarist’s melodic and timbral choices are analyzed, scaled, and mapped in real-time to determine the spatial position and timbral shape of the multichannel live source relative to the accompanying drone-based tape part. The real-time instrumentation is largely improvised with the fixed media element providing a series of morphing pedal points as a basis for the improvisation.

I’ll then visit the Designing Technologically Mediated Performance students for a critique session at 2pm. Thanks to Chris Corrigan and Paul Stapleton for the invitation.



February 22, 2017 (moved from Feb 10) – University of Limerick, Ireland – 3pm 

I’m visiting Music Technology students at the Digital Media and Arts Research Centre, University of Limerick, for a guest lecture on my current live performance system. Thanks to Kerry Hagan for the invitation.




March 07, 2017 – MMT, Trinity College, Dublin – 2pm

On March 7, I will visit Music and Media Technologies students at Trinity College, Dublin, to talk about performance systems design and virtual reality. Thanks to Enda Bates for the invitation and for making some studio time available.