Too much fun at my job – Part 2


For the past few days, I’ve toyed with the Kinect sensor. In order to use the sensor, I downloaded the freenect library, as well as a freenect wrapper for both Processing (courtesy of Daniel Shiffman) and Max (courtesy of Jean-Marc Pelletier). After a bit of experimentation, I started working with the jit.freenect.grab object for Jitter.

By combining the jit.freenect.grab object with some computer vision processors (OpenCV – also thanks to JMP), I built a quick & dirty Kinect-controlled synthesizer in Max. The X/Y position of your hands (or head, or legs) determines pitch and amplitude, and the Z position triggers sound events on and off.

I’m thinking of this interface as a kind of virtual curtain. Put your hand through the curtain, and you make sounds. Move your hands around, and the sounds change accordingly. Pull them back behind the curtain, and the sounds stop. It’s a pretty simple mapping; I plan to incorporate more CV tracking to influence sound generation (direction, number of features, optical flow) and to build more complex sound objects. Right now it’s just 16 voices of sine waves, but I’m thinking of using granular textures, as well as virtual objects that make sounds when “hit.” (The working title of this project is “Pay No Attention,” as in “pay no attention to the man behind the curtain” from The Wizard of Oz.)
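To make the mapping concrete, here’s a minimal sketch of the curtain logic in plain Java. Everything in it is an illustrative assumption (the 640×480 depth frame, the 900 mm curtain threshold, the pitch and amplitude ranges, the noise cutoff); the actual patch does all of this with jit.freenect.grab and Jitter matrix operations, not Java.

    // Minimal sketch of the curtain mapping, in plain Java for readability.
    // NOTE: every constant here is an illustrative assumption; the real patch
    // uses jit.freenect.grab and Jitter matrix operations instead.
    public class VirtualCurtain {
        static final int W = 640, H = 480;   // Kinect depth-map resolution
        static final int CURTAIN_MM = 900;   // hypothetical curtain depth (mm)
        static boolean noteOn = false;       // current gate state

        // One depth frame in: find pixels in front of the curtain, take their
        // centroid, map x to pitch and y to amplitude, gate on the Z crossing.
        static void processFrame(int[] depthMM) {
            long sumX = 0, sumY = 0;
            int count = 0;
            for (int y = 0; y < H; y++) {
                for (int x = 0; x < W; x++) {
                    int d = depthMM[y * W + x];
                    if (d > 0 && d < CURTAIN_MM) {   // 0 means no reading
                        sumX += x; sumY += y; count++;
                    }
                }
            }
            if (count > 200) {                       // ignore specks of sensor noise
                double cx = (double) sumX / count, cy = (double) sumY / count;
                double pitchHz = 110.0 * Math.pow(2.0, 4.0 * cx / W); // 110-1760 Hz
                double amp = 1.0 - cy / H;                            // top = loud
                if (!noteOn) System.out.println("note on");
                noteOn = true;
                System.out.printf("pitch %.1f Hz, amp %.2f%n", pitchHz, amp);
            } else if (noteOn) {
                System.out.println("note off");      // hand pulled back out
                noteOn = false;
            }
        }

        public static void main(String[] args) {
            int[] frame = new int[W * H];
            java.util.Arrays.fill(frame, 2000);      // background 2 m away
            for (int y = 100; y < 140; y++)          // fake a hand poking through
                for (int x = 150; x < 190; x++)
                    frame[y * W + x] = 700;
            processFrame(frame);                     // -> note on + pitch/amp
            java.util.Arrays.fill(frame, 2000);      // hand withdrawn
            processFrame(frame);                     // -> note off
        }
    }

The single noteOn flag is what turns the Z crossing into discrete on/off events instead of continuous control; swap the printlns for messages to your synth voices and you have the whole mapping.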

I’ll post a video of what this looks like after the holiday break.  I need to spend some time thinking about how to give this interface a non-trivial mapping, and the kind of “virtual instrument” I want to compose for. And after talking with Brygg Ullmer, I’m now thinking that this may be something that could be adapted for our laptop orchestra, the LOLs.

The good news is that I’ve had the time to become totally engrossed in working on this. The bad news is that I lose all sense of time when that happens. What is certain is that I have way, way too much fun at my job.

Links:

  • Git repository for OpenKinect freenect library
  • Daniel Shiffman’s Processing library for freenect
  • Jean-Marc Pelletier’s Max/Jitter objects for freenect
  • Jean-Marc Pelletier’s OpenCV objects for Max/Jitter

5 thoughts on “Too much fun at my job – Part 2”

  1. Hi

    I’ve just started using MaxMSP after being introduced through university, and I’m working on a personal project where I use jit.freenect.grab to follow a red object and then it draws onto another LCD.

    It seems like I could use a Kinect to achieve the same thing, and it might even be easier. I guess that by following the location of a hand held forward at a different depth, it would hopefully be easy to get the x and y positions and feed them into the other LCD.

    However, I was wondering if you could offer any advice, because you seem to have achieved something similar, and I’m nervous about buying such an expensive device only to find that I can’t use it, or that it’s incredibly difficult.

  2. Are you using jit.freenect.grab with a webcam? Hmm. You can do what you’ve described (following a red ball) without using a Kinect or freenect at all (see the sketch at the end of this reply).

    Still, I recommend getting a Kinect sensor. It’s only ~$150, and the depth info is really (REALLY) useful and easy to work with. The best part is that the depth sensor doesn’t work in the visible light spectrum, so you can use it in any lighting condition (unless you’re flooding your room with infrared light for some unknown reason).

    But I would also recommend getting your Jitter skills up to speed, and learning the OpenCV implementation for Max. I think Jamoma also has some computer vision stuff in it.
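    In case it helps, here’s a rough sketch of the colour-tracking idea in plain Java: find the centroid of the “red enough” pixels in a frame, which is all a red-object brush needs. The thresholds are made up for illustration; in Jitter you could threshold the matrix or use jit.findbounds on the red plane instead.

        // Rough sketch in plain Java, assuming a packed-RGB webcam frame.
        // The "is it red?" thresholds are made up for illustration.
        public class RedTracker {
            // Returns {x, y} centroid of red pixels, or null if too few found.
            static double[] trackRed(int[] rgb, int w, int h) {
                long sumX = 0, sumY = 0;
                int count = 0;
                for (int i = 0; i < rgb.length; i++) {
                    int r = (rgb[i] >> 16) & 0xFF;
                    int g = (rgb[i] >> 8) & 0xFF;
                    int b = rgb[i] & 0xFF;
                    if (r > 150 && r > 2 * g && r > 2 * b) {  // crude redness test
                        sumX += i % w; sumY += i / w; count++;
                    }
                }
                if (count < 50) return null;                  // nothing red enough
                return new double[]{(double) sumX / count, (double) sumY / count};
            }

            public static void main(String[] args) {
                int w = 320, h = 240;
                int[] frame = new int[w * h];                 // all-black frame
                for (int y = 50; y < 60; y++)                 // paint a red square
                    for (int x = 100; x < 110; x++)
                        frame[y * w + x] = 0xFF2010;
                double[] c = trackRed(frame, w, h);
                System.out.println(c == null ? "no red object"
                                             : "red at " + c[0] + ", " + c[1]);
            }
        }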

  3. Thanks for the reply.

    Sorry I didn’t make what I was doing very clear; I’m currently using tracking as a means to an end, so that I can have a brush (in this case, a red object).

    But if I had a Kinect, I would (hopefully) make it so that once the hand is through the curtain you describe, it becomes the brush that draws onto another LCD, eliminating the need for any colour tracking.

    I’ll definitely check out OpenCV.

  4. Hey, I got a Kinect today and it’s really cool to mess around with.

    I know it’s a bit cheeky, but would you be able to put this patch up for download?

    Being able to track hands is what interests me the most, and I’ve been struggling to recreate what you’ve done because of the low-res image and the inner patches.

    Hope you can help.

  5. Hello!

    I am currently working on a project for my Interactive Environments class, and I want to use hand detection with various Jitter objects and a Kinect, just as you have in this patch you’ve created. Basically, I want to track the position of the hands so that users can paint a rainbow on a projected screen. I am pretty new to Jitter, and figuring out how to do the hand tracking has been a bit rough for me. I was wondering if you have put this patch up for people to use; it is the absolute ideal patch to begin my project with!

    Thanks so much! :)) Great work on your patch, I am very inspired!
