
I’ve been working a lot with OSC (Open Sound Control). I’ve even been writing a book about it (Just the Best Parts: OSC for Artists).

My main interest is in using assorted controllers, such as the Wii remote and the Xbox Kinect, to control music and graphics applications.

There are some existing programs, such as OSCeleton, that send OSC right out of the box. In other cases I needed to write my own code to read data from the controller and generate the OSC. Even with something like OSCeleton I found I needed an intermediary to convert the original OSC into a different set of address patterns and values.

While writing Just the Best Parts: OSC for Artists I ended up building assorted helper programs to work with OSC. Actually, I started writing OSC code before I even thought about this book, back when I first learned of Animata.

You can control Animata characters using an Xbox Kinect and OSCeleton (I gave a presentation about this to the Tiny Army artist group; you can see my notes here). Since Animata handles OSC you are not limited to any specific controller, or limited to just one OSC source. I used the Kinect to control a character, but also used my phone running TouchOSC to manipulate scene transitions and to animate background objects.

One thing I noticed right away was that having to use a particular controller (Kinect driving OSCeleton) made it hard to quickly test things. I needed a simple tool to send any OSC I wanted without having to set up any hardware, so I wrote a small Ruby script that left all the hard stuff to the osc-ruby gem.
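The hard stuff osc-ruby handles is mostly the OSC wire format. For a sense of what that involves, here's a minimal sketch of building a raw OSC packet by hand in plain Ruby; the address, value, host, and port are arbitrary examples, not anything from my script.

```ruby
require "socket"

# OSC strings are NUL-terminated and padded to a multiple of 4 bytes.
def osc_pad(str)
  str + "\x00" * (4 - (str.bytesize % 4))
end

# Build a minimal OSC message: address pattern, type tag string,
# then each float argument as a big-endian 32-bit float.
def osc_message(address, *floats)
  osc_pad(address) + osc_pad("," + "f" * floats.size) + floats.pack("g*")
end

packet = osc_message("/1/fader1", 0.5)

# Sending is a single UDP datagram (host and port are placeholders):
# UDPSocket.new.send(packet, 0, "localhost", 8000)
```

With osc-ruby you never see any of this; the gem builds and sends the packet for you.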

As happens, the script started simple and then acquired assorted features. First I added the option to load a configuration file, and then wrote code to auto-detect argument data-types. The script was a one-shot deal: you invoked the code at the command prompt, passing in the OSC message to send. It worked well, considering that on every call it would load the configuration and instantiate an OSC client before parsing the arguments and sending the message.
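The data-type auto-detection can be sketched along these lines (a reconstruction for illustration, not the script's actual code): arguments that look like integers or floats are converted, and everything else stays a string.

```ruby
# Guess an OSC argument type from its string form: integer, float,
# or fall through to string.
def coerce(arg)
  case arg
  when /\A-?\d+\z/      then Integer(arg)
  when /\A-?\d*\.\d+\z/ then Float(arg)
  else arg
  end
end

%w[440 0.5 hello].map { |a| coerce(a) }  # => [440, 0.5, "hello"]
```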

Once I started on OSC for Artists I realized that explaining the use of a command-line tool would be a distraction. I had switched to Processing for the book’s examples, and ported the Ruby script to a Processing GUI sketch. As happens, the sketch started simple and then acquired assorted features. (See my post about OSC Commander.)

While the sketch is in many ways an improvement over the initial Ruby script, I don’t always want to kick off Processing just to send some OSC messages. Recently I’ve been focusing on using OSC to drive music applications such as Renoise, Reaper, and Ableton Live. (Only the first two have native OSC support; Live offers only MIDI control, so a proxy of some sort is needed.)

I’m especially concerned with latency. I’ve written some Processing code that reads data from a Kinect and uses blob detection to trigger OSC and MIDI messages. When you’re working with the Kinect you have (broadly speaking) two ways to work with the data: skeleton tracking and depth data. Skeleton tracking is perhaps the more interesting, and usually the faster. The downsides are that you first need to have the Kinect locate and recognize your body parts, and there’s the chance that if you move the wrong way or step out of bounds the Kinect will lose you and stop tracking your movements. Depth data does not have these problems, but it requires you to provide the processing resources to do something useful with the data. Blob detection can be pokey depending on your machine. The upside, though, is that you do not need any setup process; you move in or out of defined ranges and it Just Works.

Latency-encumbered signals are OK for controlling broad behaviors, such as filters or other coloring effects, but I still plan on using a Kinect for direct triggering of notes and control changes that need to happen on time, so I revisited my Ruby OSC script and made some changes.

The first thing I wanted was to avoid having to reload the configuration data. I decided I wanted a REPL (read-eval-print loop) that would let me enter OSC messages. This was simple to do; a `while <user-has-not-quit> do` loop works well. Ask for user input, check it for some “quit” indicator, and if the user has not quit then assume the string is an OSC message. Very nice.
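The shape of that loop can be sketched like this; the prompt, the quit word, and the stubbed-out send are placeholders, and the IO arguments are just there to keep the sketch self-contained.

```ruby
require "stringio"

# Minimal REPL loop: prompt, read a line, quit on "q", otherwise
# treat the line as an OSC message string. Sending is stubbed out.
def osc_repl(input, output)
  loop do
    output.print "osc> "
    line = input.gets
    break if line.nil? || line.strip == "q"
    message = line.strip
    next if message.empty?
    # Real version: parse the message, coerce arg types, and hand
    # it to an OSC client here.
    output.puts "sending #{message}"
  end
end

osc_repl(StringIO.new("/1/fader1 0.5\nq\n"), $stdout)
```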

Almost right away I found myself trying to use the arrow keys to recall previous commands. That failed; time for Readline.

Adding Readline made it so much nicer. (Note that in the previous one-shot CLI version I had command recall built-in as part of the shell.) But I still had to type in commands before I could use and reuse them. Oh bother! So I added the option of passing in a list of commands to pre-populate the Readline history.
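Seeding the history is a one-liner per command with Ruby’s standard Readline module; the messages below are just examples.

```ruby
require "readline"

# Push some OSC messages into Readline history so they can be
# recalled with the arrow keys before anything has been typed.
seed = ["/1/fader1 0.5", "/1/toggle1 1"]
seed.each { |cmd| Readline::HISTORY.push(cmd) }

# Inside the REPL, passing true as the second argument tells
# Readline to add each entered line to the history as well:
# line = Readline.readline("osc> ", true)
```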

If you call the program with no arguments it assumes you have a configuration file named .osc-config.yaml in the current directory. If you pass one argument it treats that as the path to the configuration file. If you pass more than one argument then it treats all but the first as OSC messages to stuff into Readline history.
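That argument handling amounts to a three-way case; this is a simplified reconstruction (the default file name is the one given above), not the program's actual code.

```ruby
# Map command-line arguments to [config path, history seed list]:
# no args -> default config file; one arg -> that config path;
# more -> config path plus OSC messages for the Readline history.
def parse_args(argv)
  case argv.size
  when 0 then [".osc-config.yaml", []]
  when 1 then [argv.first, []]
  else        [argv.first, argv[1..-1]]
  end
end
```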

The code is up on Github as osc-repl. It’s available as a gem, too:

gem i osc-repl --source http://gems.neurogami.com

It’s pretty fast when running locally. I was able to trigger real-time notes in Renoise with no trouble. The ability to recall commands, edit them, and resend is super handy. Comments and suggestions about it are invited.
