Monomyth is a piece in which I felt I could finally achieve large-scale gestural sound control, with the bonus of an ensemble performance. The piece transforms the Kinect into a MIDI controller, allowing notes to be triggered by touching specific points in three-dimensional space.
The interaction here between a performer and these hovering sounds is one of intrigue and mysticism. Though using the Kinect as a MIDI controller creates an effective illusion of interacting with spatialized sounds, this method of gestural sound control remains a stepping stone toward a larger goal of finding an even more interactive setting between sound and performer.
The patch provided needs two other components to function: a program that converts Kinect data into OSC data, and another that routes the OSC data to addresses which can in turn be designated within the Max patch. The patch then transmits MIDI data to a specified output. Each sound in the patch has an x, y, z coordinate, a size, and a MIDI pitch, velocity, and channel.
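The core of this mapping can be sketched outside of Max as well. The following is a minimal, hypothetical Python sketch of the trigger logic described above: each sound is a sphere defined by an x, y, z position and a size (here assumed to be a trigger radius), plus MIDI pitch, velocity, and channel; when a tracked hand position falls inside a sphere, a raw MIDI note-on message is produced. The `Sound`, `hits`, and `triggered` names are invented for illustration and do not come from the authors' patch, and the OSC receiving and MIDI output stages are omitted.

```python
from dataclasses import dataclass

@dataclass
class Sound:
    # One hovering sound: position, trigger radius, and MIDI data,
    # mirroring the per-sound fields described for the patch.
    x: float
    y: float
    z: float
    size: float      # assumed: radius of the trigger sphere around (x, y, z)
    pitch: int       # MIDI note number, 0-127
    velocity: int    # MIDI velocity, 0-127
    channel: int     # MIDI channel, 0-15

def note_on(sound: Sound) -> tuple[int, int, int]:
    """Build a raw MIDI note-on message as (status, pitch, velocity)."""
    return (0x90 | (sound.channel & 0x0F), sound.pitch, sound.velocity)

def hits(sound: Sound, hx: float, hy: float, hz: float) -> bool:
    """True when the hand position lies inside the sound's sphere."""
    d2 = (hx - sound.x) ** 2 + (hy - sound.y) ** 2 + (hz - sound.z) ** 2
    return d2 <= sound.size ** 2

def triggered(sounds: list[Sound], hand: tuple[float, float, float]) -> list[tuple[int, int, int]]:
    """Return note-on messages for every sound the hand currently touches."""
    hx, hy, hz = hand
    return [note_on(s) for s in sounds if hits(s, hx, hy, hz)]

# Example: one sound centered at (0.5, 1.2, 2.0) with a 0.25 trigger radius.
sounds = [Sound(0.5, 1.2, 2.0, 0.25, pitch=60, velocity=100, channel=0)]
print(triggered(sounds, (0.55, 1.25, 2.05)))  # hand inside the sphere
```

In practice the hand coordinates would arrive as OSC messages from the Kinect-conversion program, and a small amount of state would be needed to send each note-on only once per entry into the sphere rather than on every frame.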
Composed by Kevin Anthony and Austin Lopez; Performed by BYU Group for Computer Music