Nick Inzucchi – Project 4 – Kinectrak

by nick @ 6:58 am 29 March 2012

[vimeo https://vimeo.com/39389553 w=600&h=450]

My fourth project is Kinectrak, a natural user interface for enhancing Traktor DJ sets. Kinect allows DJs to use their body as a MIDI controller in a lightweight and unobtrusive way. The user can control effect parameters, apply low- and high-pass filters, crossfade, beatjump, or select track focus. This limited selection keeps the interface simple and easy to approach.

[slideshare id=12200721&doc=p4pres-120328211310-phpapp02]

I used Synapse to get skeleton data from the Kinect (after several failed bouts with ofxOpenNI), sent over OSC to a custom Max patch. Max parses the data and uses it to drive a simple state machine. If the user’s hands are touching, the system routes its data to the Traktor filter panel rather than the FX panel. This gesture lets DJs push and pull frequencies around the spectrum; it feels really cool. Max maps the data to an appropriate range and forwards it to Traktor as MIDI commands. The presentation above describes the system in greater detail.
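To make the routing idea concrete, here’s a minimal Python sketch of the same state-machine logic the Max patch implements: measure the distance between the hands, route to the filter panel when they touch (otherwise to FX), and scale a joint coordinate into the 0–127 MIDI CC range. The joint representation, the touch threshold, and the use of right-hand height as the control value are my assumptions for illustration, not details from the actual patch.

```python
# Hypothetical sketch of Kinectrak's routing logic (the real version is a Max patch).
# Joints are (x, y, z) tuples in metres; threshold and axis choices are assumptions.

def hands_touching(left, right, threshold=0.15):
    """True when the two hand joints are within `threshold` metres of each other."""
    dist = sum((a - b) ** 2 for a, b in zip(left, right)) ** 0.5
    return dist < threshold

def scale_to_midi(value, lo, hi):
    """Map a value in [lo, hi] onto the MIDI CC range 0-127, clamped at the edges."""
    t = (value - lo) / (hi - lo)
    return max(0, min(127, round(t * 127)))

def route(left, right):
    """Route to the filter panel when the hands touch, otherwise to FX.

    Returns (target, cc_value); here the right hand's height (y) drives the CC.
    """
    target = "filter" if hands_touching(left, right) else "fx"
    cc = scale_to_midi(right[1], 0.0, 2.0)
    return target, cc
```

With hands together at chest height, `route((0.0, 1.0, 1.0), (0.05, 1.0, 1.0))` returns `("filter", 64)`; with the hands apart it routes to `"fx"` instead. In the real system the resulting CC would be sent to Traktor as a MIDI Control Change message.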

I’d like to end with a video of Skrillex (I know) and his new live setup by Vello Virkhaus.

1 Comment

  1. LOL for “emotional knob-twisting” +1 +1
    Great presentation and process description! agree
    I like your presentation style. Your thought process is articulate and easy to follow. Smart idea of using movement to control sound. I like how you perform/mix music and perform body movement (like dancing) at the same time.
    I’m glad you’re thinking about non-speculative design solutions for a personal problem. This is real design; the criteria for success are quite clear.
    I like the fact that the system is controlling some non-crucial, subtle, ambient/peripheral aspects of the sound. The kinect/body app is well-suited to this aspect of the sound-performance.
    Good presentation, lays out your design and approach clearly, good demo video composition and overlays
    Your project is really cool and really useful. The hi-/lo-pass filters are particularly effective. From the first video, I thought it was a little hokey, but the second video (of the actual djing) shows how cool it is.
    The diagrams in the presentation are very clear and show your thought process and workflow cleanly. Awesome video! The concept seems almost complete, but it’s not clear if you can replicate the most important things a DJ would actually be doing during a performance. You jam out a lot with your body along to the beat, without any gestures – could you capitalize on that to pulse the volume, or make some other subtle effect?
    Sweet! You’re jamming out. Seems really useful, and it captures the natural interface capabilities you mentioned at the beginning of your prez. It would be good to have a display showing the current effect selected.
    I like the idea that you’re converging high tech DJ/VJ with classical/traditional conducting or orchestrating … I’d like to see if you could overlay digital music over a full orchestra (based on the conductor)
    Other than the high / low pass filter and the fades, it’s hard for me to tell exactly what is being manipulated by your gestures.
    +1 to Madeline’s IRL comment about conductors
    This project is a good example of one of the most useful applications for Max & PD: mapping
    Cool stuff, kinda hard to tell how it is working from the video though. Maybe make one with just you moving and really simple, noticeable tracks? Or make a video with just you moving and having the effect that is applied flash on the screen? Like make it REALLY obvious. I think you should try to demo this at the next VIA festival?
    Nice presentation, Nick. You did a great job showing your process, the motivation was clear, and you obviously gave this a lot of thought.
    It’s a good concept, well executed, with lots of potential depending on what effects you put in each hand. Very bloggable.

    Crazy how many functions you can perform with the skeleton tracking of the Kinect. Are there a lot of issues with accuracy of recognizing gestures? I really like your sensitivity to incorporating the natural movements of the DJ in the Kinect gestures.
    Great job making this whole process for using the kinect to dj much more elegant and user friendly. Looks like it could really be a useful (and fun) tool.
    Very nice. It looks like a lot went into this. It seems like you are able to use the tool you built very effectively. The high and low pass gesture is most dramatic for the uninitiated. I think for the video, for the sake of explaining the tool, try to come up with some more dramatic examples to show how the movements map to the effects on the music.
    Your arguments and logical presentation of why this is important were compelling. From your explanation it seems important and logical. But I am slightly left in the dark during the performance. Without a background in music I don’t fully know the importance.

    Comment by admin — 29 March 2012 @ 11:40 am

This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 Unported License.
(c) 2023 Interactive Art and Computational Design, Spring 2012 | powered by WordPress with Barecity