Back in 2012 I gave a Tiny Army presentation on doing real-time animation with Animata, Kinect, and OSC.
I’ve had some issues with program stability. On my Ubuntu laptop Animata will inevitably segfault before rendering anything. On Windows 7 it is much more stable but will still, at times, crash.
I was pretty happy to learn of a Processing library designed to render Animata projects. I’ve had much better stability with Processing, across multiple platforms, and running Animata animations in Processing would allow for additional features.
When I tried the code in what appears to be the most recently maintained animatap5 repo I found that it did not work with the current version of Processing.
(Side note: The “official”, or stable, release version of Processing is 1.5.1. However, all work appears to be going into version 2, and that is what you are first offered for download on the Processing site. Despite the “beta” tag there’s probably little reason not to use version 2. I had to build from source in order to get the command-line tools to work correctly for me on Windows.)
I forked the zeni/animatap5 repo and have now updated it to work with Processing 2 (specifically Processing 0217, which may equate to version 2.0b9).
You can find my version of AnimataP5 at https://github.com/Neurogami/animatap5
I’ve so far only used it on Ubuntu 10.04 (yes, old, I know). I will need to make sure it plays nice on Windows 7 as well.
I have no reason to expect any problems with this so long as it all compiles.
I’ve so far only tried it with the single example that came from the original repo.
If you try this out and run into trouble, let me know and I’ll see what I can do.
I’ve published Just the Best Parts: OSC for Artists.
For $9.00 you get a fun, informative, and to-the-point intro to Open Sound Control.
There are terrific graphics and well-explained examples suitable for non-geeks.
If you are an artist or musician, or any kind of inquisitive creative spirit, and you are not familiar with the freaky magic powers of Open Sound Control, get this book.
You can purchase the mobi/epub/PDF bundle from Indie Aisle.
If you prefer to order from Amazon, you can get it here.
Please note that ordering from Amazon will only get you the mobi file.
In all cases there is no digital restriction management (aka DRM), and you can still read a rough version of it on the book’s Web site.
A few weeks ago I read about some critical security issues that affected Rails. I’m not a Rails user; I don’t really care for it (Ramaze rules). But it’s still of interest, because often what’s claimed to be a problem with some specific application can in fact be caused by something more fundamental, making it a problem for other Rubyists as well.
I learned of this issue from reading Hacker News, but that seems like a poor way to get security updates. I looked around for a better source. The main Ruby web site has a page for security issues but it seems to be out of date. Some other efforts have sprung up to make security issues more readily available but they all seem to require that people actively go look for the info.
Having to remember to go check for security issues is an unreliable way to stay informed. Better to have that information put in front of you as it occurs.
I run Ruby-doc.org, which serves up API documentation for multiple versions of Ruby and most (if not all) available gems, and it gets a fair amount of traffic. It occurred to me that it would be a really good place to display security alerts.
The NVD (National Vulnerability Database) search results page presents the data in well-formed HTML. This made it fairly easy (modulo some corner cases) to extract the specifics of current vulnerabilities, generate a short announcement blurb, and write it to a file.
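Boiled down, the scraping step looks something like this sketch using the nokogiri gem. The URL and CSS selectors here are placeholders I've made up for illustration; the real values depend on the actual NVD markup:

```ruby
require 'nokogiri'
require 'open-uri'

# Placeholder URL; the real script scraped the NVD search results page.
SEARCH_URL = 'https://nvd.nist.gov/vuln/search/results?query=ruby'

doc = Nokogiri::HTML(URI.open(SEARCH_URL))

# These selectors are invented for the sketch; the real ones depend on
# how the NVD results are actually marked up.
blurbs = doc.css('div.vuln-detail').map do |vuln|
  id      = vuln.css('a.cve-id').text.strip
  summary = vuln.css('p.summary').text.strip
  "#{id}: #{summary}"
end

# Write out the announcement blurbs for the site to pick up.
File.write('ruby-security-alerts.txt', blurbs.join("\n"))
```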
I tried to display the alerts in a way that is noticeable but not terribly in-your-face. I opted for 14 days as a balance between reaching a larger number of people and (I hope) not having a perpetual alert banner that people ignore.
That number of days was based on a guesstimate about how often new alerts come up. I may lower it, maybe to 10 or 7 days, to avoid alert fatigue if there are too many issues reported. Or alter the color of the banner based on the severity of the most recent alert, or color it using some aggregate severity based on what was found. Or skip listing low-severity alerts. I’ll have to see what kind of feedback I get.
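The windowing itself is little more than a date filter. Here’s a minimal sketch of that logic, with a stab at the severity-based coloring idea; the `Alert` struct and the cutoff values are only illustrative:

```ruby
require 'date'

ALERT_WINDOW_DAYS = 14

# A hypothetical alert record; real data would come from the NVD scrape.
Alert = Struct.new(:id, :severity, :published)

# Only alerts published within the window make it onto the banner.
def current_alerts(alerts, today = Date.today)
  alerts.select { |a| (today - a.published) <= ALERT_WINDOW_DAYS }
end

# One of the refinements mentioned above: color by the worst severity seen.
def banner_color(alerts)
  return nil if alerts.empty?
  alerts.map(&:severity).max >= 7.0 ? 'red' : 'yellow'
end
```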
The goal isn’t to reach everyone, but to reach enough people who will take notice and help spread the word when there are Ruby vulnerability issues.
Read it here: Socket to me.
About a month or so ago I received my Leap Motion controller, as part of being in their developer program.
It’s a very, very cool device, and I’ve been hacking around with it using JRuby and Processing.
Today I launched a Web site about this: Leap Hacking
I’ll be posting write-ups of my explorations as well as general information useful to people writing apps for the Leap.
The first post is up now. It’s derived from some presentations I gave over the last few weeks, where I showed off using the Leap with JRuby and Processing, and had it drive a browser-based game using Web sockets.
Read it here: Notes from a Leap Motion Presentation, part I: JRuby
I’m in the Leap Motion developer program; I received my device a few weeks ago and it is very slick.
I’ve been writing some Leap code using JRuby, exploring some ideas. I’m planning an EBook on Leap Motion hacking with JRuby as well.
On Thursday, January 31, I’ll be giving a demo at the Phoenix Ableton Live user group.
I’ve also been playing around with assorted polymer clay and plastics.
Sculpey is a brand of polymer clay. This is stuff that’s soft and pliable at room temperature, but when heated for a period of time it solidifies. It’s quite handy for making vinyl art toys as well as holders and cases for assorted electronics.
A Google+ post showing the first thing I made; I’ve since glued the missing hand back on.
I first got some “Super Sculpey III” which when baked becomes really quite solid. Then I learned of some of the other kinds of Sculpey and got some Sculpey Premo. This is the coolness because when baked it retains some flexibility. This works much better if you’re making, say, an enclosure for a smart phone (or the Leap Motion) so you can mount it someplace.
The other material I’ve been playing with is a polyester thermoplastic, kinda-sorta like Sculpey but kinda-sorta in reverse. It’s a hard (though somewhat flexible) plastic at room temperature but soft and pliable when heated. This is also very useful for making holders and cases (and art toys). Unlike Sculpey, though, it’s reusable. Drop it into some hot water and it gets all goopy again. The downside: molding hot plastic is awkward.
There are a few people in the Phoenix area doing interesting things with polymer clay and thermoplastic, and at the next Tiny Army meeting (Wednesday, February 6 at 6pm, Art Institute of Phoenix) I’ll be part of a show-and-tell presentation on uses and techniques. You should come if for no other reason than to learn about kit bashing.
My main interest is in using assorted controllers, such as the Wii remote and the Xbox Kinect, to control music and graphics applications.
There are some existing programs, such as OSCeleton, that send OSC right out of the box. In other cases I needed to write my own code to read data from the controller and generate the OSC. Even in the case of something like OSCeleton I found I needed an intermediary to convert the original OSC into a different set of address patterns and values.
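Such an intermediary can be quite small. Here’s a rough sketch using the osc-ruby gem; the `/joint` argument layout follows what OSCeleton sends (as I understand it), and the outgoing address pattern is an invented example, not anything Animata-specific:

```ruby
require 'osc-ruby'

# Listen on OSCeleton's default output port; resend to wherever the
# target application (e.g. Animata) is listening.
server = OSC::Server.new(7110)
client = OSC::Client.new('localhost', 7120)

server.add_method '/joint' do |message|
  # Assumed OSCeleton layout: joint name, user id, then x/y/z floats.
  name, _user, x, y, _z = message.to_a
  # Remap to an invented per-joint pattern with scaled 2D coordinates.
  client.send(OSC::Message.new("/animata/joint/#{name}", x * 640.0, y * 480.0))
end

server.run # blocks, handling messages as they arrive
```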
While writing Just the Best Parts: OSC for Artists I ended up building assorted helper programs to work with OSC. Actually, I started writing OSC code before I even thought about this book, back when I first learned of Animata.
You can control Animata characters using an Xbox Kinect and OSCeleton (I gave a presentation about this to the Tiny Army artist group; you can see my notes here). Since Animata handles OSC you are not limited to any specific controller, or limited to just one OSC source. I used the Kinect to control a character, but also used my phone running TouchOSC to manipulate scene transitions and to animate background objects.
One thing I noticed right away was that having to use a particular controller (Kinect driving OSCeleton) made it hard to quickly test things. I needed a simple tool to send any OSC I wanted without having to set up any hardware, so I wrote a small Ruby script that left all the hard stuff to the osc-ruby gem.
As happens, the script started simple and then acquired assorted features. First I added the option to load a configuration file, and then wrote code to auto-detect argument data-types. The script was a one-shot deal: you invoked the code at the command prompt, passing in the OSC message to send. It worked well, considering that on every call it would load the configuration and instantiate an OSC client before parsing the arguments and sending the message.
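Stripped of the configuration loading, the heart of that one-shot script was just the osc-ruby client plus some type-sniffing, along these lines (the host and port are hard-coded stand-ins here for what the real script read from its config file):

```ruby
require 'osc-ruby'

# Guess the OSC type of each command-line argument.
def coerce(arg)
  case arg
  when /\A-?\d+\z/      then arg.to_i # integer
  when /\A-?\d*\.\d+\z/ then arg.to_f # float
  else arg                            # leave it as a string
  end
end

address, *args = ARGV
client = OSC::Client.new('localhost', 8000)
client.send(OSC::Message.new(address, *args.map { |a| coerce(a) }))
```

So `ruby osc-send.rb /some/address 42 0.5 hello` would send an integer, a float, and a string to port 8000.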
Once I started on OSC for Artists I realized that explaining the use of a command-line tool would be a distraction. I had switched to Processing for the book’s examples, and ported the Ruby script to a Processing GUI sketch. As happens, the sketch started simple and then acquired assorted features. (See my post about OSC Commander.)
While it’s in many ways an improvement over the initial Ruby script, I don’t always want to kick off Processing just to send some OSC messages. Recently I’ve been focusing on using OSC to drive music applications such as Renoise, Reaper, and Ableton Live. (Only the first two have native OSC support; Live offers only MIDI control, so a proxy of some sort is needed.)
I’m especially concerned with latency. I’ve written some Processing code that reads data from a Kinect and uses blob detection to trigger OSC and MIDI messages. When you’re working with the Kinect you have (broadly speaking) two ways to work with the data: skeleton tracking and depth data. Skeleton tracking is perhaps the more interesting, and usually the fastest. The downsides are that you first need to have the Kinect locate and recognize your body parts, and there’s the chance that if you move the wrong way or step out of bounds the Kinect will lose you and stop tracking your movements. Depth data does not have these problems, but it requires you to provide the processing resources to do something useful with the data. Blob detection can be pokey depending on your machine. The upside, though, is that you do not need any setup process; you move in or out of defined ranges and it Just Works.
Latency-encumbered signals are OK for controlling broad behaviors, such as filters or other coloring effects, but I still plan on using a Kinect for direct triggering of notes and control changes that need to happen on time, so I revisited my Ruby OSC script and made some changes.
The first thing I wanted was to avoid having to reload the configuration data. I decided I wanted a REPL (read-eval-print loop) that let me enter OSC messages. This was simple to do; a `while <user-has-not-quit> do` loop works well. Ask for user input, check it for some “quit” indicator, and if it’s not “quit” then assume the string is an OSC message. Very nice.
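A bare-bones version of that loop might look like this (the type auto-detection from the one-shot script is omitted to keep it short):

```ruby
require 'osc-ruby'

client = OSC::Client.new('localhost', 8000) # arbitrary port for the sketch

loop do
  print 'osc> '
  line = gets
  break if line.nil? || line.strip == 'quit'
  # First token is the address pattern; the rest are the arguments.
  # (The real script also auto-detects argument types, as shown earlier.)
  address, *args = line.split
  client.send(OSC::Message.new(address, *args))
end
```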
Almost right away I found myself trying to use the arrow keys to recall previous commands. That failed; time for Readline.
Adding Readline made it so much nicer. (Note that in the previous one-shot CLI version I had command recall built-in as part of the shell.) But I still had to type in commands before I could use and reuse them. Oh bother! So I added the option of passing in a list of commands to pre-populate the Readline history.
If you call the program with no arguments it assumes you have a configuration file named .osc-config.yaml in the current directory. If you pass one argument it treats that as the path to the configuration file. If you pass more than one argument then it treats all but the first as OSC messages to stuff into the Readline history.
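Here’s roughly how those argument conventions plus the Readline seeding fit together; the config key names are guesses at what such a file might hold, and the message handling is the same as before:

```ruby
require 'osc-ruby'
require 'readline'
require 'yaml'

# No args: default config file. One arg: config path. Extras: history seeds.
config_file = ARGV.empty? ? '.osc-config.yaml' : ARGV.first
config = YAML.load_file(config_file)

# The key names are guesses at what such a config might hold.
client = OSC::Client.new(config['host'], config['port'])

# Everything after the first argument pre-populates the command history.
ARGV[1..-1].to_a.each { |cmd| Readline::HISTORY.push(cmd) }

while (line = Readline.readline('osc> ', true)) # true: record input in history
  break if line.strip == 'quit'
  address, *args = line.split
  client.send(OSC::Message.new(address, *args))
end
```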
The code is up on Github as osc-repl. It’s available as a gem, too:
`gem i osc-repl --source http://gems.neurogami.com`
It’s pretty fast when running locally. I was able to trigger real-time notes in Renoise with no trouble. The ability to recall commands, edit them, and resend is super handy. Comments and suggestions about it are invited.
Way back in 2007 I wrote an article for DevX.com that was part of a “special issue” of sorts about Ruby.
My contribution was “10 Minutes to Your First Ruby Application” and (I hope) it gave something of a whirlwind tour of some of Ruby’s cooler features by walking the reader through a reasonably practical, albeit slightly contrived, example.
The deal with DevX was that after some period of time the rights reverted back to me, so I put it up on Neurogami.
I also added a comments section, and lately some folks have been kind enough to point out some problems.
I went through the article and the code again, and managed to dig up some of the original source code. I made some changes to the code and article (some errors were fixed and code was changed to reflect ruby 1.9.3), and have now updated that article.
I also uploaded the source code and added links to the relevant files throughout the article.
Writing a demo app that is both practical and educational, while avoiding too many contrivances, is a challenge. I’m still pretty happy with what I did. The article really does introduce a good number of important Ruby topics, including some that many tutorials sadly claim are “advanced” (and hence perpetuate weird fears about learning proper Ruby).
The Phoenix Ableton Live User Group had, for its final meeting of the year, a half-day group project.
The goal was to create, from scratch, a complete song (or track or tune or whatever you like to call a standalone piece).
I went, I had a blast, I learned a lot. We didn’t reach our goal, though. We got maybe halfway there; it’s hard to tell with things like this.
When we broke up we had a number of tracks in a session project. Drums (from a sample, with some modifications), some pad synths played via a small MIDI keyboard, some guitar lines, and a bass riff. The guitar and bass we recorded live; I played bass.
We didn’t get a finished piece that day, but we decided each of us in attendance should try to finish up on our own.
I grabbed a copy of the Live project files and set to work. It’s been interesting.
For the impatient, you can listen to what I came up with so far over on Soundcloud.
I have my own quirky taste in music; I’ve found it often differs from most anything mainstream, and knew in advance of the meet-up that the music we would work on would almost certainly not be something I’d pick on my own. I also decided that this didn’t matter, because the goal was not Make Music That James Likes but to engage in a process and see what could be learned.
To that end I tried to set a few rules for myself before doing my revamping of the session tracks.
I’m a fan of the show Chopped, a cooking contest show on Food Network. They get four contestants and they each have to make a series of meals from a box of mystery ingredients.
The rules are that they have to use all of the mystery ingredients in some form, though they can also use stuff from the show’s pantry (milk, eggs, spices, whatever).
The judges always complain when they can’t identify all the mystery ingredients, or when the mystery ingredients are not given at least some focus in the dish. In other words, you can’t get away with making whatever the hell you want while hiding the (usually discordant) mystery ingredients.
So, with that as a general guideline, I decided to work towards something that, to anyone who heard the original session tracks, would at least sound familiar. I did slice-n-dice a few ingredients, though.
I avoided re-recording anything or tossing in extra instruments. I kept the BPM, and while I messed with the drums a bit I think they’re mostly the same (at least in general style, though with more tsk-tsk-tsk-tsk and a bit less thump).
I kinda-but-not-really cheated. I know that Live will let you edit and mash-up recorded samples but I’m much more comfortable using Reaper for that. I used Reaper to slice up the recorded samples, in one case reconstructing a more coherent bass phrase than what was recorded live. Editing flubbed notes is a Good Thing.
I also used Reaper to create a new drum pattern based on slices from the one used in the session files. All quite doable in Live, so I wasn’t doing anything un-Livable; I just decided to save some of the hassle.
Once I had some reconstituted samples I loaded them into Live and worked from there.
For what it’s worth, I’m really more of a Renoise fan. Depending on where this all ends up I may decide to do another mix-up using my hacked-up samples with Renoise and just mangling things as I please. My usual way of working is to generate some basic percussion tracks in Renoise, spawn some wav files, and transfer them to a Tascam digital recorder. Then I improvise over those tracks. If I think I have anything interesting then I export from the digital recorder, extract suitable loops, maybe clean them up, and load them into Renoise.
There’s a lot of overhead doing this, but using the Tascam means no latency worries, and far less chance of something crashing and destroying my work. We had some latency issues when recording the Live tracks, and when I was manipulating clips and samples I got the feeling that the timing was still ever-so-slightly off.
Some observations on my Live re-mix experiment:
Working in a group as we did means you end up with this or that musical part that you would never have picked yourself, and may not even like. That’s certainly the case for me, but I was very interested in seeing what I could do with the tracks that would make me reasonably happy while trying to stay true to the source.
Please note that I’m not suggesting that what anyone did was bad, just that, in varying degrees, it was not my personal taste. In fact everyone there showed themselves quite skilled and pretty astute about music, albeit within different realms and genres. The range of knowledge and experience was great.
I was able to get a main section arranged that I liked, and was listening to it in a loop, over and over. I was pretty much happy, and could possibly have called it a day right there.
A lot of the music I like doesn’t follow the usual ABABCCAB song format. Often it’s a lot of AAAAAAAA or maybe AAAABBBBBCCCCC.
But even most of those pieces have some sense of movement. Much as I liked my loop I had a feeling that I was hearing it through a cognitive filter: my mind, to some degree, was superimposing all sorts of what-ifs and additional context and possibilities, things no one else would bring to it.
Often a guitar riff or a bass pattern will pop into my head while I’m doing stuff around the house, and I try to record it in some way. I typically grab the bass or Strat, usually without any amp, and use a recorder app on my phone to capture the tinny buzzy sound of the guitar, then save it off to Dropbox.
Most of the time, at the time of recording, I’m convinced my riff is the most awesome epic catchy phrase ever. Then, some time later, when I listen to it again, I wonder what in the world I was thinking.
I suspect what happens is that when I hear music in my head there’s always some actual or implied backing or rhythm music, and this of course never gets recorded when I’m saving off a quick recording.
It’s not the sound quality of the recording that fails me, or that I can’t figure out how to play it again on an instrument, it’s that the riff plus the context is what worked so well in my head; the riff alone is often too weak.
So I left that main section loop alone for a bit, and it sort of danced around in my head, and that’s when some ideas for a change came in, as well as some thoughts on changing the mix of the main part. The nice thing in this case is that these ideas were playing off backing tracks that already existed.
I’m actually happier with the second part of my piece than the first. I like the slightly off-kilter aspect of it. Of course, there’s no way I would have come up with it had I not been working with a fixed set of source material not entirely within my usual preferences; I wouldn’t have that second part without having done the first.
I now want to add something of a lead melody line, where “melody line” means something glitchy and angular.
On Chopped the judges complain when they feel a dish is not cohesive, when they think they’ve been served a plate of three or four pleasant, but otherwise unrelated, items. I don’t actually *cook* (though I bake kick-ass bread), but the discussions about composing a dish and making disparate flavors all come together are surprisingly useful.
What I think I need to add is some sort of high-level motif that pulls the piece together, or do something in the main part that in some way foreshadows the break so they feel more connected.
Another option is to break ranks and create a new main part, based off the break section, and likely end up with something that pulls away from the original group project, but feels more cohesive and right to me.
Part of my book, Just the Best Parts: OSC for Artists, is done in a sort of graphic-novel kind of way.
Here are some panels: