Nick Inzucchi – Looking Outwards 4

by nick @ 4:02 pm 17 March 2012

This week I looked at several projects involving physical motion and sound manipulation. I'm interested in how depth and motion sensors can be used to make listening a more active experience. Many have used this kind of input to play digital instruments, but fewer have explored it as a way to manipulate sound that is already playing.

Kinectar // Chris Vik

[youtube=https://www.youtube.com/watch?v=7DcENxBIKTE]

Kinectar is an OSC-based system for sending Kinect skeletal data to Ableton Live. Chris Vik's demo uses two hands to control 8 parameters of a granular synthesis system, including grain size, loop start, pitch, filter, LFO cutoff/depth, reverb, and panning. This project rides the border between synthesis and manipulation: the source material is a preloaded sample, but its musical qualities emerge from the artist's motion alone. I would like to see these same controls in a more inherently musical context.
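
To get a feel for how this kind of mapping works, here is a minimal Python sketch (using the python-osc package) of the general idea: receive hand positions over OSC and rescale them into synth control messages. The OSC addresses, ports, and parameter ranges below are my own assumptions for illustration, not Kinectar's actual schema.

```python
# Minimal sketch: map two hand positions (received over OSC) onto a few
# granular-synth controls and forward them as OSC control messages.
# Addresses, ports, and ranges are hypothetical, not Kinectar's real schema.
from pythonosc.dispatcher import Dispatcher
from pythonosc import osc_server
from pythonosc.udp_client import SimpleUDPClient

synth = SimpleUDPClient("127.0.0.1", 9001)  # hypothetical synth/Ableton bridge

def scale(v, lo, hi):
    """Map a normalized 0..1 value into the range [lo, hi]."""
    return lo + max(0.0, min(1.0, v)) * (hi - lo)

def on_left_hand(address, x, y, z):
    # Left hand: height -> pitch, depth -> grain size, x -> loop start.
    synth.send_message("/granular/pitch", scale(y, -12.0, 12.0))       # semitones
    synth.send_message("/granular/grain_size", scale(z, 10.0, 500.0))  # ms
    synth.send_message("/granular/loop_start", scale(x, 0.0, 1.0))     # phase

def on_right_hand(address, x, y, z):
    # Right hand: height -> filter cutoff, x -> pan, depth -> reverb mix.
    synth.send_message("/fx/filter_cutoff", scale(y, 100.0, 12000.0))  # Hz
    synth.send_message("/mix/pan", scale(x, -1.0, 1.0))
    synth.send_message("/fx/reverb_mix", scale(z, 0.0, 1.0))

dispatcher = Dispatcher()
dispatcher.map("/hand/left", on_left_hand)    # assumed skeleton-tracker address
dispatcher.map("/hand/right", on_right_hand)  # assumed skeleton-tracker address

server = osc_server.BlockingOSCUDPServer(("127.0.0.1", 9000), dispatcher)
server.serve_forever()  # blocks; Ctrl-C to stop
```

The interesting design question is less the plumbing than the scaling: which axis maps to which parameter, and over what range, decides whether the result feels like an instrument or like waving at a mixer.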

ScratchML // Kyle McDonald, Jamie Wilkinson, Quin Kennedy

[youtube=https://www.youtube.com/watch?v=bGh1wV-VXDM]

ScratchML is a file format for recording and replaying turntablism routines. It comes with an open-source tool for capturing crossfader and record movements and relaying that data over OSC. This dataflow is extremely valuable because it capitalizes on the strong link between a DJ's physical motion and the sonic result. TTM transcriptions could prove valuable for developing analogous sound-manipulation systems on the Kinect. How can we recreate this input in three dimensions, rather than just one?
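
As a thought experiment, here is a small Python sketch of that capture-and-serialize idea: timestamped record and crossfader readings written out to XML for later replay or analysis. The tag and attribute names are invented for illustration and are not the real ScratchML schema; a Kinect version would simply add more spatial columns to each sample.

```python
# Toy illustration of a scratch-transcription recorder: timestamped samples
# of record rotation and crossfader position, serialized to XML.
# Element/attribute names are made up; this is NOT the actual ScratchML schema.
import time
import xml.etree.ElementTree as ET

class ScratchRecorder:
    def __init__(self):
        self.samples = []              # (seconds, record_angle_deg, fader_0_to_1)
        self.start = time.monotonic()

    def sample(self, record_angle, fader_position):
        """Call at a fixed rate with the current sensor readings."""
        t = time.monotonic() - self.start
        self.samples.append((t, record_angle, fader_position))

    def to_xml(self, path):
        root = ET.Element("scratch_take")
        for t, angle, fader in self.samples:
            ET.SubElement(root, "sample",
                          time=f"{t:.4f}",
                          record=f"{angle:.2f}",
                          fader=f"{fader:.3f}")
        ET.ElementTree(root).write(path)

# Usage: poll a timecode-vinyl decoder and fader ADC a few hundred times per
# second, then dump the take to disk.
rec = ScratchRecorder()
for angle, fader in [(0.0, 0.0), (35.0, 0.2), (-20.0, 1.0)]:  # fake readings
    rec.sample(angle, fader)
rec.to_xml("take.xml")
```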

Kinect Looping // Chris Vik

[youtube=https://www.youtube.com/watch?v=xPcoM7BIDZ4]

In this video Chris Vik uses simple Kinect data (position + speed) to control a complete Ableton orchestra. Each movement is tied to a single control, and a looping pedal is used to add some complexity. The sonic results are rich, but the performance is visually a disaster. I appreciate the artist's enthusiasm and hard work, but his flamboyant gestures make it hard to take seriously. Future projects should use subtle and intuitive gestures to create more natural musical interfaces.
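
Worth noting: "speed" is not something the sensor reports directly; it has to be derived from successive joint positions. Here is a small sketch of that derivation, with exponential smoothing so jittery skeleton data does not make the control value flicker. The frame rate and smoothing constant are assumptions for illustration.

```python
# Minimal sketch: derive a smoothed speed signal from raw joint positions.
# FRAME_RATE and SMOOTHING are assumed values, not from the original project.
import math

FRAME_RATE = 30.0   # assumed Kinect skeleton frames per second
SMOOTHING = 0.8     # exponential smoothing: 0 = raw, closer to 1 = steadier

class SpeedTracker:
    def __init__(self):
        self.prev = None
        self.speed = 0.0   # meters per second, smoothed

    def update(self, x, y, z):
        if self.prev is not None:
            dist = math.dist((x, y, z), self.prev)      # frame-to-frame motion
            raw = dist * FRAME_RATE                      # convert to m/s
            self.speed = SMOOTHING * self.speed + (1.0 - SMOOTHING) * raw
        self.prev = (x, y, z)
        return self.speed

# Each joint's position and smoothed speed can then be rescaled to a single
# parameter (volume, filter cutoff, loop length), as in the hand-mapping
# sketch above.
```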

1 Comment

  1. Check out Johannes Kreidler’s “kinect studies”: https://www.youtube.com/watch?v=UAlcTnvbBS0

    Comment by dan — 20 March 2012 @ 9:09 am
