What seems like 100 years ago I set out to write a smallish book about Kinect hacking for artists.
As happens with many technical books (even those attempting to keep from getting terribly technical), there are times when, to talk about one thing, you first need to talk about another thing. That other thing was Open Sound Control. It ended up being spun off into Just the Best Parts: Open Sound Control for Artists.
Completing that felt great. But for whatever reason my enthusiasm for Kinect hacking had drifted. I had been working on some sample programs that seemed like terrific ideas. I tried out small Processing sketches to see how this or that thing worked. When I put a few things together, though, it didn’t work as I hoped. Too slow. And with that frustration I turned my attention to other things.
Recently I was contacted by an art student who had some questions about using the Kinect to control music. It got me to revisit my old code to find out where I had left off.
As I recalled, I had last been using a blob detection library, with blob size and location triggering MIDI and OSC messages. It was too slow; the latency was annoying. Because of that I took another approach. Sort of a “blob detection lite”.
As with my first attempt, I wanted to track only detected objects that fell within a defined bounding region in front of the Kinect. Things that fell into that select region had been used to generate blobs. When something fell into the bounding region it was rendered as a green pixel.
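The depth-gating idea can be sketched in plain Java. Everything here is illustrative, not code from the repo: the real sketch reads depth values from the Kinect's depth map, and the thresholds are made up.

```java
import java.util.Arrays;

// Sketch of the bounding-region filter: mark depth samples that fall inside
// a chosen near/far band. In the real sketch these become green pixels.
public class DepthFilter {

    // returns true for each depth sample inside [nearMm, farMm]
    static boolean[] inRegion(int[] depthMm, int nearMm, int farMm) {
        boolean[] hit = new boolean[depthMm.length];
        for (int i = 0; i < depthMm.length; i++) {
            hit[i] = depthMm[i] >= nearMm && depthMm[i] <= farMm;
        }
        return hit;
    }

    public static void main(String[] args) {
        // four fake depth readings, in millimeters
        int[] depth = {400, 900, 1500, 2600};
        boolean[] hit = inRegion(depth, 800, 2000);
        // only 900 and 1500 fall inside the 800-2000 mm band
        System.out.println(Arrays.toString(hit));
    }
}
```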
My new approach was to run through the image pixel array and simply count how often a pixel was green. Not the whole screen, but select divisions: basically, squares in each of the corners. This turned out to be faster than the blobs. (I think; I need to revisit the blob code to see what, exactly, made me think it was too slow.)
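The counting itself is nothing fancy. A rough sketch in plain Java (the region coordinates, image size, and green test here are my own illustrations, not the actual code from the repo):

```java
// "Blob detection lite": count green pixels inside a rectangular region
// of a packed ARGB pixel array, such as the one a Processing sketch exposes.
public class GreenCounter {

    // count pixels whose green channel dominates, within the given rectangle
    static int countGreen(int[] pixels, int imgW, int x0, int y0, int w, int h) {
        int count = 0;
        for (int y = y0; y < y0 + h; y++) {
            for (int x = x0; x < x0 + w; x++) {
                int c = pixels[y * imgW + x];
                int r = (c >> 16) & 0xFF;
                int g = (c >> 8) & 0xFF;
                int b = c & 0xFF;
                // an arbitrary "is it green?" test for illustration
                if (g > 200 && r < 100 && b < 100) count++;
            }
        }
        return count;
    }

    public static void main(String[] args) {
        int imgW = 8;
        int[] pixels = new int[imgW * 8];
        // paint the top-left 4x4 corner pure green
        for (int y = 0; y < 4; y++)
            for (int x = 0; x < 4; x++)
                pixels[y * imgW + x] = 0xFF00FF00;
        System.out.println(countGreen(pixels, imgW, 0, 0, 4, 4)); // top-left square: 16
        System.out.println(countGreen(pixels, imgW, 4, 4, 4, 4)); // bottom-right square: 0
    }
}
```

In the real sketch you would call something like this once per corner square on each frame and compare the counts against a trigger threshold.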
The problem was that reacting to events (e.g. we have a large enough area of green pixels) would bring the code to a near halt.
One of the nice things about revisiting code after some time is you can see things in a new way. In this case I had some kind of epiphany about using Java threads in Processing. It’s not hard. It’s actually snake simple. Why this was such a revelation I do not know. I imagine I had some preconceived ideas about how such things might work in the Processing world. Wrong ideas that stopped me from looking into it. Silly but it happens.
Things improved considerably once MIDI and OSC message sending happened in their own threads. I had to do some playing around to get good results. There were some initial fugly hacks just to get things to work sensibly (i.e. the code to set up and send messages).
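The shape of that arrangement looks something like this in plain Java. The queue and the println stand in for the actual OSC/MIDI sending, and the names are mine, not from the repo:

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Hand message-sending off to a worker thread so the draw loop never blocks.
public class MessageSender {
    final BlockingQueue<String> queue = new LinkedBlockingQueue<>();
    volatile int sent = 0; // messages the worker has handled so far

    MessageSender() {
        Thread worker = new Thread(() -> {
            try {
                while (true) {
                    String msg = queue.take(); // blocks until a message arrives
                    // the real sketch builds and sends an OSC or MIDI message here
                    System.out.println("sent: " + msg);
                    sent++;
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        worker.setDaemon(true); // don't keep the JVM alive just for the worker
        worker.start();
    }

    // called from the draw loop: queues the message and returns immediately
    void send(String msg) {
        queue.offer(msg);
    }

    public static void main(String[] args) throws InterruptedException {
        MessageSender sender = new MessageSender();
        sender.send("/corner/topLeft 42"); // e.g. a green-pixel count
        Thread.sleep(200); // demo only: give the worker a moment to drain the queue
    }
}
```

The draw loop only pays for a queue insert; the (comparatively slow) network or MIDI I/O happens on the worker's schedule.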
After assorted Googling I got something of a refresher course on Java threads and cleaned things up.
The code is now up on GitHub: Just the Best Parts: Win7 Kinect Area Hacking.
The README explains more about the code (and why it’s “Win7”).
It’s still a work-in-progress. It runs. It can be used to send OSC and MIDI. What remains is adding more helper methods to make things easier for still-learning Processing hackers.
And then get back to writing that Kinect book.