Sonifying Tidmarsh Living Observatory (2016)
A collaboration with researchers at the Responsive Environments (ResEnv) Group at the MIT Media Lab. The collaboration culminated in a series of parameter-mapping sonification patches and externals for Max/MSP that integrate ResEnv’s ChainAPI. It also resulted in a six-channel music composition for loudspeaker presentation (2015). A stereo render of the multichannel sonification installation is available below.
From Lynch and Paradiso’s paper (NIME 2016):
“Ricky Graham, a guitarist and computer musician, was interested in working with data from Tidmarsh’s history and developed an interface that allows a listener/improviser to select data ranges to iterate through, driving his own granular synthesis patch. This interaction encourages exploration of how data sets can drive timbral and temporal changes in electronic music. Using this interface, a piece was created based on the contour which barometric pressure, humidity, and illuminance take over the course of a day on the full moon. This piece was presented at the Fall 2015 Media Lab Member’s Event and appeared as a research presentation at SEAMUS 2016. The second version of this piece is presented within the virtual environment. Each device drives its own granular synthesis patch with real-time data, as well as ranges of recent historical data on multiple timescales.”
An Introduction to the Tidmarsh Living Observatory
Tidmarsh Farms is the combination of two cranberry farms (est. 1982) based in Manomet, Plymouth, Massachusetts. Currently, both locations are being restored to preserve the wetland corridor. Wetland restoration projects of this kind protect, improve, and expand wetlands, allowing their natural functions to be reclaimed. The Living Observatory aims to provide a better understanding of the relationship between “ecological processes, human lifestyle choices, and climate change adaptation.” Sensor nodes are installed throughout the marsh to capture “ecological vital signs.” The resulting ubiquitous sensing network provides a creative canvas for the development of interactive music systems.

This project uses the visual programming language Max/MSP to sonify historical sensor data from sensor nodes embedded within the Tidmarsh Living Observatory. Historical data for each device is accessible via the ChainFlow library, developed for Max/MSP by Evan Lynch at MIT. This iteration of the project focuses on historical data for 24-hour periods in which the whole disk of the moon is illuminated (the full moon phase), in an attempt to examine periods where illuminance data (and other related natural phenomena) may play a key role. The goal is to retrieve data for each of these periods, on each of these days, for each month throughout 2015. Initially, the author has chosen eight nodes located at the periphery of the sensor network, parsing data for barometric pressure, humidity, and illuminance from each location. The data is then scaled, manipulated, repeated, and mapped to DSP parameters to create environmentally informed music.
Mapping Strategies for the Sonification System
Historical data is received by an abstraction that immediately scales each data set to the range 0 to 1 and displays it. The user can select and loop portions of the data at different speeds and playback directions. First- and second-order derivatives of the selected sensor data are also computed for additional parametric control.
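The scaling and derivative steps above are implemented inside a Max/MSP abstraction; as a rough illustration, the same logic can be sketched in Python (the function names here are hypothetical, not part of the actual system or of ChainFlow):

```python
def normalize(data):
    """Scale a list of sensor readings into the range 0 to 1."""
    lo, hi = min(data), max(data)
    if hi == lo:
        return [0.0] * len(data)  # flat data: avoid division by zero
    return [(x - lo) / (hi - lo) for x in data]

def derivative(data):
    """First-order difference between successive samples;
    apply twice for the second-order derivative."""
    return [b - a for a, b in zip(data, data[1:])]

# Example: a short run of barometric pressure readings (hPa)
pressure = [1012.3, 1012.1, 1011.8, 1011.9, 1012.4]
scaled = normalize(pressure)
first = derivative(scaled)
second = derivative(first)
```

The derivatives give the rate and curvature of each environmental contour, which can be mapped to parameters independently of the raw values.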
Repeating cycles of data provide useful mechanisms for composition and may also provide useful insights into underlying environmental processes.
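The looping behavior described above can be sketched as a simple generator that cycles endlessly through a selected slice of the data, forward or in reverse (again a hypothetical Python sketch, not the Max/MSP implementation):

```python
from itertools import islice

def loop_selection(data, start, end, direction=1):
    """Endlessly cycle through data[start:end],
    forward (direction >= 0) or reversed (direction < 0)."""
    segment = data[start:end]
    if direction < 0:
        segment = segment[::-1]
    while True:
        for value in segment:
            yield value

readings = [0.1, 0.4, 0.9, 0.6, 0.2]
# Take six values from a forward loop over indices 1..3
cycle = list(islice(loop_selection(readings, 1, 4), 6))
# → [0.4, 0.9, 0.6, 0.4, 0.9, 0.6]
```

Playback speed would correspond to how quickly values are pulled from the loop, e.g. the metro rate driving the abstraction in Max.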
In this iteration of the system, historical data controls the filter bandwidth, playback speed and direction, grain size, amplitude, and onset of the granular synthesizer modules, as well as the azimuth of each audio stream in the physical performance space. The resulting musical output reflects the overlapping and shifting contours of environmental structures throughout a 24-hour period.
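Since the incoming data is normalized to 0 to 1, each mapping reduces to a linear scale into a target parameter range. A minimal sketch, with illustrative (not actual) parameter ranges:

```python
def map_range(x, out_lo, out_hi):
    """Map a normalized value (0..1) linearly into a target parameter range."""
    return out_lo + x * (out_hi - out_lo)

# Hypothetical target ranges, chosen only for illustration:
grain_size_ms = map_range(0.5, 10.0, 200.0)   # grain size in milliseconds
azimuth_deg = map_range(0.25, 0.0, 360.0)     # loudspeaker azimuth in degrees
# grain_size_ms → 105.0, azimuth_deg → 90.0
```

In Max/MSP this role is typically played by objects such as `scale`, with one mapping per sensor stream and DSP parameter.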
ChainFlow on Github
NIME Paper 2016
ResEnv’s Tidmarsh Page