October 2013: My current research, in collaboration with Dr Brian Bridges (University of Ulster), focuses on developing ecological/musical relationships (e.g. melodic syntax models controlling environmental flocking models). Why is this useful? Input/output modalities are not always clear in electronic music performance. Based on notions presented by Johnson (2008), we argue that all musical structures are based on more primitive, ecologically informed structures (such as an innate, bodily understanding of physical forces). Notions of melodic attraction in music can be likened to attractional forces in our surrounding environment and in various ecological behaviours, such as the flocking steering behaviours in the flight patterns of birds (an exemplary ecological model of physical forces, if you will). Not only is this a useful educational model, but the application may also elucidate a primitive basis from which we as listeners have developed an understanding of musical structure. By mapping input and output modalities between the musical and the ecological, we may exploit the ecological basis of musical concepts for performance and compositional gain. Relationships between physical and figurative gestures may be reified in an otherwise densely textural, polyphonic musical practice. Such relationships lend themselves well to the technology at hand, such as multi-channel guitar pickups, DSP and multi-channel loudspeaker systems.
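The coupling described above might be sketched in code as follows. This is a minimal, hypothetical illustration rather than the model used in the research itself: flock agents steer toward a cohesion centre (a standard flocking behaviour) plus an attractor point, with the attractor's pull weighted by how close the current melodic pitch sits to an assumed tonal centre (MIDI note 60 here, a placeholder). The function names and parameter values are my own illustrative assumptions.

```python
import math


def steer_toward(pos, target, strength):
    """Return a velocity nudge pulling an agent toward a target point."""
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    dist = math.hypot(dx, dy) or 1e-9  # avoid division by zero at the target
    return (strength * dx / dist, strength * dy / dist)


def melodic_attraction(pitch, tonic=60, max_pull=1.0):
    """Map pitch distance from a tonal centre to an attraction strength.

    Pitches near the (hypothetical) tonic exert a strong pull on the
    flock; distant pitches exert a weak one -- a crude analogue of
    melodic attraction as a physical force.
    """
    semitones = abs(pitch - tonic)
    return max_pull / (1 + semitones)


def step_flock(positions, velocities, attractor, pitch, dt=1.0):
    """Advance a flock one step: cohesion toward the flock centre plus a
    pitch-weighted pull toward a melodically defined attractor point."""
    n = len(positions)
    cx = sum(p[0] for p in positions) / n
    cy = sum(p[1] for p in positions) / n
    pull = melodic_attraction(pitch)
    new_pos, new_vel = [], []
    for (x, y), (vx, vy) in zip(positions, velocities):
        cohx, cohy = steer_toward((x, y), (cx, cy), 0.05)
        attx, atty = steer_toward((x, y), attractor, pull)
        vx, vy = vx + cohx + attx, vy + cohy + atty
        new_pos.append((x + vx * dt, y + vy * dt))
        new_vel.append((vx, vy))
    return new_pos, new_vel
```

In this toy version, playing the tonic drags the flock strongly toward the attractor, while remote pitches leave it to drift under cohesion alone; a fuller system would of course derive the attractor itself from melodic syntax rather than fixing it in advance.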
Margaret Noble features my most recent approach to instrumental music performance on her website, “Sound is Art.” Performance Space Frames was informed by a series of acousmatic concepts (cf. Smalley, 1997, 2007) and by a clearly defined electronic music performance paradigm developed and presented by Simon Emmerson in his book, “Living Electronic Music” (2007).
This approach entails the real-time organisation of instrumental musical material into spatial frames, specified within a physical music performance space. Local and Field frames are determined by the application of digital signal processing (DSP) to an instrumental sound source, whilst considering audiovisual schemas associated with forms of (instrumental) musical gestures, both physical (effective, ancillary, real) and figurative (motivic, metaphorical, imaginary). For example, musical structures that are guitaristic (instrumental events adhering to typical instrumental expectancy schemata) and are localisable to the actions of the performer may be located within a stage space in close proximity to the physical location and actions of the performing musician. More heavily processed audio events that may not be easily localised to the performer may be spatialised within an arena space, conveying new gestural narratives based on interactive digital signal processes that respond to real-time instrumental performance data (such as pitch, amplitude, motives, riffs, chords, melodies, physical instrumental techniques, etc.). Interaction between space frames is encouraged throughout real-time instrumental music practice.
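The Local/Field routing rule described above can be sketched as a simple decision function. This is an illustrative assumption of how such routing might be coded, not the actual performance system: events carry a normalised `processing` value (0 = dry, performer-localisable material; 1 = heavily transformed audio), and the threshold and speaker-group names are hypothetical placeholders.

```python
def assign_space_frame(event, local_threshold=0.3):
    """Route an instrumental event to a spatial frame.

    `event` is a dict with a normalised 'processing' value in [0, 1].
    The threshold and speaker-group names below are hypothetical
    placeholders for illustration only.
    """
    if event["processing"] <= local_threshold:
        # Lightly processed, guitaristic material localisable to the
        # performer stays in the Local (stage) frame, near the musician.
        return {"frame": "local", "speakers": "stage_pair"}
    # Heavily processed material, no longer localisable to the performer,
    # is spatialised in the Field (arena) frame surrounding the audience.
    return {"frame": "field", "speakers": "surround_ring"}
```

A real implementation would presumably weigh several gesture descriptors (pitch, amplitude, recognised motives, physical technique) rather than a single scalar, but the frame assignment itself reduces to a mapping of this kind.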
This concept may also be developed to accommodate unconventional chord-voicing percepts that are highly dependent on spatial location (or perceived amplitude).
The presented concept is discussed at length in my doctoral dissertation, which is available from the author directly, upon request. More information concerning the presented instrumental (spatial) music performance concepts is available in this informal interview featured on the Sound is Art blog: http://margaretnoble.net/blog/signals-improv/