BLOCKS – Visualising Performance Data in GEM (Pure Data)


BLOCKS visualises the spatial location and amplitude of the multi-channel output of an electric guitar. This is the initial patch.

 

Initial Patch (Phase 1)

[youtube http://www.youtube.com/watch?v=Qdjw-O91vhU]

 

New (Spatial) Modes – Phase 2

[youtube http://www.youtube.com/watch?v=349FMIlaVsE]

A series of spatial modes is constructed in Pure Data, based on pitch-class distance, pitch height and flocking behaviours.

Mode 1: Each block corresponds to register and pitch-class distance within a particular time frame, relative to short-term memory constraints. Blocks are spatialised dynamically across a physical multi-channel, speaker-based performance space according to register, pitch-class distance and note onset.
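The Mode 1 mapping can be sketched in plain code. This is a minimal illustration, not the actual Pd patch: the function names, the reference-note parameter and the speaker/height normalisation are assumptions made for the example.

```python
# Illustrative sketch of a Mode 1-style mapping: pitch-class distance
# from a reference note chooses the speaker, register sets the height.
# All names and scalings here are hypothetical, not taken from the patch.

def pitch_class(midi_note):
    """Pitch class 0-11 (C = 0) of a MIDI note number."""
    return midi_note % 12

def register(midi_note):
    """Octave register of a MIDI note (integer MIDI octave)."""
    return midi_note // 12

def pc_distance(a, b):
    """Shortest distance between two pitch classes on the chromatic circle (0-6)."""
    d = abs(pitch_class(a) - pitch_class(b))
    return min(d, 12 - d)

def block_position(note, reference_note, n_speakers=8):
    """Map a note to (speaker index, height): pitch-class distance spreads
    blocks across the speaker array; register drives vertical placement."""
    speaker = round(pc_distance(note, reference_note) / 6 * (n_speakers - 1))
    height = register(note) / 10.0  # normalise MIDI registers 0-10 to 0.0-1.0
    return speaker, height
```

For example, a note a tritone away from the reference (the maximum pitch-class distance of 6) lands on the far speaker, while the reference note itself stays on speaker 0.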

Mode 2: Each block adheres to a series of flocking behaviours (boids) in Pure Data. A gestural narrative may be established by associating tonal tension and resolution directly with attraction- and inertia-based flocking behaviours.
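The tension-to-attraction idea in Mode 2 can be sketched as a minimal boids update. This is an assumption-laden illustration, not the patch itself: the inversion of tension into attraction strength, the inertia coefficient and the gain are all hypothetical choices.

```python
# Hypothetical sketch of tension-driven flocking: each block is a boid,
# and falling tonal tension (resolution) strengthens attraction to the
# flock centre, gathering the blocks together. Inertia smooths the motion.

import random

class Boid:
    def __init__(self):
        self.x, self.y = random.uniform(-1, 1), random.uniform(-1, 1)
        self.vx = self.vy = 0.0

def step(boids, tension, inertia=0.9, gain=0.1):
    """One update: attraction = 1 - tension, so full resolution (tension 0)
    pulls every block toward the flock centre; high tension lets them drift."""
    cx = sum(b.x for b in boids) / len(boids)
    cy = sum(b.y for b in boids) / len(boids)
    attraction = 1.0 - tension
    for b in boids:
        b.vx = inertia * b.vx + attraction * gain * (cx - b.x)
        b.vy = inertia * b.vy + attraction * gain * (cy - b.y)
        b.x += b.vx
        b.y += b.vy
```

Run repeatedly with low tension, the flock contracts toward its centre; with tension near 1 the blocks keep their scattered positions, which is one way to read "tension and resolution" as a spatial gesture.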

The user may then switch between modes during a real-time musical performance.

 

BLOCKS – Live Performance Demonstration

[youtube http://www.youtube.com/watch?v=OkH4q-dSOgQ]

Download the Patch
