[vimeo 10722455 w=400 h=300]
The objective of this patch is to reflect a balance between salient musical cues, such as pitch, timbre, space, and time, during real-time music performance. The patch is still a work in progress. Pitch tracking in PureData is used to obtain string class data, which is converted to MIDI note data (pitch and velocity) and then recorded into a [coll]. In direct MIDI mapping mode, the user simply associates MIDI notes with automation processes in Live. In random MIDI mapping mode, the recorded notes and their associated DSP are recalled at random, allowing an element of improvisation. The note sequence algorithm lets faster note sequences cease DSP in Live, foregrounding pitch. The interval algorithm lets certain interval relationships trigger CC messages; in this example, the CCs are mapped to variations of spatial gestures.
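The patch itself is built visually in PureData, but two of the steps described above, converting a tracked frequency into a MIDI note and checking melodic intervals against a CC mapping, can be sketched in ordinary code. This is only an illustration: the specific intervals, CC numbers, and values below are hypothetical, not taken from the patch.

```python
import math

def freq_to_midi(freq_hz):
    """Convert a tracked fundamental frequency (Hz) to a MIDI note number."""
    return round(69 + 12 * math.log2(freq_hz / 440.0))

def interval_to_cc(prev_note, curr_note, interval_cc_map):
    """Return a (cc_number, cc_value) pair if the melodic interval is mapped,
    otherwise None. Intervals are folded to within an octave."""
    interval = abs(curr_note - prev_note) % 12
    return interval_cc_map.get(interval)

# Hypothetical mapping: a perfect fifth (7 semitones) fires CC 20,
# a tritone (6 semitones) fires CC 21.
interval_cc_map = {7: (20, 127), 6: (21, 127)}

# A short tracked sequence: A3, E4, Bb4.
notes = [freq_to_midi(f) for f in (220.0, 329.63, 466.16)]  # [57, 64, 70]
events = [interval_to_cc(a, b, interval_cc_map)
          for a, b in zip(notes, notes[1:])]
# events -> [(20, 127), (21, 127)]
```

In the patch, CC pairs like these would be sent on to Live, where (as in the video) they are mapped to spatial gestures.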