We present a prototype DMI that eschews artefact multiplication in pursuit of enhanced real-time creative independence. We reconceptualise the universal click-drag interaction model via a single sensor type, which affords both binary and continuous performance control. Accessibility is optimised via a familiar interaction model and through customised ergonomics, but it is the mapping strategy that emphasises transparency and sophistication in the hierarchical correspondences between the available gesture dimensions and expressive musical cues. Through a participatory and progressive methodology, we identify an ostensibly simple targeting gesture rich in dynamic and reliable features: (1) contact location; (2) contact duration; (3) momentary force; (4) continuous force; and (5) dyad orientation. These features are mapped onto dynamic musical cues, most notably via new mappings for vibrato and arpeggio execution.
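To illustrate the kind of hierarchical feature-to-cue correspondence described above, the sketch below maps the five gesture features onto example musical cues. The feature names follow the abstract, but the `Gesture` container, the `map_gesture` function, and all thresholds and cue names are illustrative assumptions, not the authors' implementation.

```python
from dataclasses import dataclass


@dataclass
class Gesture:
    """The five features of the targeting gesture (names from the abstract).

    All numeric ranges below are illustrative assumptions.
    """
    contact_location: float   # normalised position on the sensor, 0..1
    contact_duration: float   # seconds since touch onset
    momentary_force: float    # peak force at onset, normalised 0..1
    continuous_force: float   # current sustained force, normalised 0..1
    dyad_orientation: float   # angle between two contact points, degrees


def map_gesture(g: Gesture) -> dict:
    """Map gesture features onto musical cues (hypothetical mapping)."""
    return {
        # Binary cue: a firm onset reads as a note trigger (the "click").
        "note_on": g.momentary_force > 0.5,
        # Pitch selected by where the sensor is touched.
        "pitch_index": round(g.contact_location * 12),
        # Sustained force beyond a hold threshold drives vibrato depth
        # (the "drag" of the click-drag model, repurposed continuously).
        "vibrato_depth": (max(0.0, g.continuous_force - 0.6) / 0.4
                          if g.contact_duration > 0.3 else 0.0),
        # Two-finger (dyad) orientation selects arpeggio direction.
        "arpeggio_up": g.dyad_orientation > 0.0,
    }
```

A mapping structured this way keeps the binary layer (note triggering) and the continuous layer (vibrato, arpeggio control) on the same sensor reading, which is the core of the single-sensor click-drag reconception.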