At its most fundamental level, the system allows the user to design and implement a unique real-time signal process per string of the guitar, permitting the creation of dense polyphonic musical structures. The (spectromorphological) design of each string may correspond to real-time instrumental figuration executed by the performer.
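To illustrate the per-string idea, here is a minimal sketch in Python. It is purely illustrative: pd2live itself is a set of Pure Data patches, and every class and function name below is invented for this example. The sketch routes tracked note events from each string to that string's own processing function:

```python
# Hypothetical sketch: each guitar string gets its own processing chain,
# mirroring pd2live's one-unique-signal-process-per-string design.
# All names here are illustrative, not taken from the actual patches.

from dataclasses import dataclass, field
from typing import Callable, Dict, List, Tuple

NoteEvent = Tuple[float, float]  # (pitch in MIDI, amplitude 0..1)

@dataclass
class StringChannel:
    """Holds the processor and processed note-event stream for one string."""
    process: Callable[[float, float], NoteEvent]
    events: List[NoteEvent] = field(default_factory=list)

class PerStringRouter:
    def __init__(self, processes: Dict[int, Callable[[float, float], NoteEvent]]):
        self.channels = {s: StringChannel(p) for s, p in processes.items()}

    def note_event(self, string: int, pitch: float, amp: float) -> NoteEvent:
        """Dispatch a tracked note event to its string's unique process."""
        ch = self.channels[string]
        out = ch.process(pitch, amp)
        ch.events.append(out)
        return out

# Example: string 1 transposes up an octave; string 2 halves amplitude.
router = PerStringRouter({
    1: lambda p, a: (p + 12.0, a),
    2: lambda p, a: (p, a * 0.5),
})
print(router.note_event(1, 60.0, 0.8))  # (72.0, 0.8)
print(router.note_event(2, 64.0, 0.8))  # (64.0, 0.4)
```

In the real system this dispatch would happen at signal or control rate inside Pd; the sketch only shows the routing structure.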
The system currently operates on a note-event basis, storing pitch and amplitude data per string for use with a series of (predominantly monophonic) abstractions based on instrumental cognition, concerning melodic contour and improvisatory syntax (such as linear completion, melodic attraction, tension, asymmetry and extrapolation of sequential note events; cf. Lerdahl, 2001). The instrumental control-rate abstractions attend to melodic content only: they quantify (and extrapolate) tonal attraction relationships between tones and extract hierarchical melodic contour structures (per string, solo; and across all instrumental registers, global) over windows of 4 to 7 notes, in keeping with short-term memory constraints (cf. Snyder, 2000). The resulting cognition-based data may then drive timbral and spatial parametric control processes, reflecting cognition-based constructs (in memory) within the resultant sound structure and potentially establishing unique relationships between performance gesture and a physical (multi-channel) performance space.
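The contour and attraction ideas above can be sketched briefly. The following Python example is an assumption-laden illustration, not pd2live's actual code: the class names are invented, and the attraction measure is a toy inverse-distance weight only loosely inspired by Lerdahl (2001). It shows a short-term buffer capped at 7 note events (cf. Snyder, 2000), a contour reduction to up/down/same steps, and a simple attraction value for the most recent note:

```python
# Hedged sketch of the control-rate melodic abstractions described above.
# Names and the attraction formula are illustrative assumptions.

from collections import deque

STM_MAX = 7  # short-term memory constraint: work on 4-7 note events

class MelodicBuffer:
    def __init__(self):
        self.notes = deque(maxlen=STM_MAX)  # MIDI pitches, oldest first

    def add(self, pitch: int) -> None:
        self.notes.append(pitch)  # oldest event falls out beyond STM_MAX

    def contour(self):
        """Reduce the buffer to +1 (up), -1 (down), 0 (same) per step."""
        ns = list(self.notes)
        return [(b > a) - (b < a) for a, b in zip(ns, ns[1:])]

    def attraction(self, anchor: int = 60) -> float:
        """Toy attraction of the last note to an anchor tone:
        inverse of semitone distance (assumed formula, not Lerdahl's)."""
        if not self.notes:
            return 0.0
        d = abs(self.notes[-1] - anchor)
        return 1.0 / (1 + d)

buf = MelodicBuffer()
for p in (60, 64, 62, 62, 67):
    buf.add(p)
print(buf.contour())     # [1, -1, 0, 1]
print(buf.attraction())  # 0.125  (last note 67 is 7 semitones from 60)
```

Values like these could then be mapped to timbral or spatial parameters, as the paragraph above describes.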
My extended thanks to the Pure Data community for their guidance during the development of this project. Hopefully, this is only the beginning.
This is the first quasi-beta release. It was developed specifically to function within my own instrumental performance practice and remains a work in progress. I invite you to interrogate the concepts presented here, develop them, and incorporate them within your own practice. Please get in touch and share your ideas and suggestions for further development. Details are included below:
Please note that the system was developed on OS X; through collaborative testing we are working towards a fully functional Windows release. Windows users may still use the system without the Grid 0.9 update (using GEM instead to represent tones in space).
Required Externals and Libraries (Download Links)
Pd2Live utilises a series of additional DSP externals and libraries (included in the downloadable files):
A guide on how to configure externals and libraries may be found at:
1) Add “boids” to start-up binaries
2) * Copy augmented “pd2live” ambilib folder to Pd-extended directory and add “ambilib~” to start-up binaries
3) ** Replace grid.pd_darwin in unauthorized folder in Pd-extended directory
The following videos demonstrate the key components of the proposed live performance system:
Basic Mapping Control
Cognition-based Melodic Model
Future work may address dynamic pitch-class profile templates, phrase recognition, and an improved pitch tracking system capable of detecting continuous pitch events (attending to the continuant phase of each instrumental note event), which could in turn be extended to recognise common instrumental performance gestures.
GET INVOLVED! BUGS, COMMENTS, FEEDBACK?
I suspect that there will be a number of teething issues with this initial release of the patches, so please do report any problems via the email address provided below. In addition, please email your questions, comments and suggestions to: pd2live at rickygraham dot net
Additional demonstration videos are available at: http://youtube.com/rickygrahammusic
MUSIC PRODUCED WITH PD2LIVE
A series of original compositions was produced using pd2live and released under the alias “signalsundertests.”
For more information, please send an email to: info at rickygraham dot net