Wednesday 3 August 2011

Dissertation Draft: BiLE, XYZ and gesture tracking

My colleague Shelly Knotts wrote a piece that uses the network features much more fully. Her piece, XYZ, uses gestural data and state-based OSC messages to create a game system in which players “fight” for control of sounds. Because BiLE does not follow the composer-programmer model of most laptop orchestras (LOrks), I ended up writing most of the non-sound-producing code for the SuperCollider implementation of Knotts's piece. This involved tracking who is “fighting” for a particular value, picking a winner from among them, and specifying the OSC messages used in the piece. I also created the GUI for SuperCollider users. All of this relied heavily on my BileTools classes, especially NetAPI and SharedResource.
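To give a flavour of the logic involved, here is a minimal sketch of the fight-resolution idea in SuperCollider. This is not the actual BileTools or XYZ code: the /xyz/fight address, its argument layout and the random winner selection are illustrative assumptions.

(
// Sketch only: players announce a claim on a parameter over OSC,
// and once per second a winner is picked at random from the claimants.
~contenders = Dictionary.new; // parameter name -> Set of player names

OSCdef(\fight, { |msg|
	var player = msg[1].asString;
	var param = msg[2].asString;
	var set = ~contenders[param] ?? { Set.new };
	set.add(player);
	~contenders[param] = set;
}, '/xyz/fight');

~resolver = Routine({
	loop {
		~contenders.keysValuesDo { |param, players|
			var winner = players.asArray.choose;
			"% wins control of %".format(winner, param).postln;
			// here the winner would be announced back over the network
		};
		~contenders.clear;
		1.wait;
	}
}).play;
)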

Her piece specifically stipulated the use of gestural controllers. Other players used Wiimotes and iPhones, but I was assigned the Microsoft Kinect. There is an existing OSC skeleton-tracking application (http://tohmjudson.com/?p=30), but it kept segfaulting for me. Moreover, full-body tracking sends many more OSC messages than I needed and requires the user to hold a calibration pose while it waits to recognise them. This seemed like overkill, so I decided it would be best to write an application that tracks one hand only.

Microsoft had not released official drivers yet when I started this and, as far as I know, their only drivers so far are Windows-only (http://research.microsoft.com/en-us/um/redmond/projects/kinectsdk/about.aspx). I therefore had to use several third-party, open-source driver and middleware layers (http://tohmjudson.com/?p=30). I did most of my coding at the level of the PrimeSense NITE middleware (http://www.primesense.com/?p=515). NITE includes a sample application called PointViewer that tracks one hand. I modified the programme's source code so that, in addition to tracking the user's hand, it sends an OSC message containing the hand's x, y and z coordinates. This allows a user to generate gestural data from hand position with a Kinect.
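On the SuperCollider side, picking these messages up is straightforward. Here is a minimal sketch, assuming the application sends a /hand message with three float arguments; the address name and the coordinate ranges are placeholders, not the definitive format:

(
// Sketch: map raw hand coordinates to 0..1 control values.
// The /hand address and the millimetre ranges are assumptions.
OSCdef(\handTracker, { |msg|
	var x = msg[1], y = msg[2], z = msg[3];
	~handX = x.linlin(-600, 600, 0, 1);
	~handY = y.linlin(-400, 400, 0, 1);
	~handZ = z.linlin(1000, 3000, 0, 1); // depth from the camera
}, '/hand');
)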

In future versions of my Kinect application, I would like to create a “Touch-lessOSC,” similar to the TouchOSC iPhone app, but with dual hand tracking. One hand would simply send its x, y and z coordinates, while the other would move within pre-defined regions: reporting its location within a particular square, moving a slider, or “pressing” a button. This will require a way for users to define shapes and actions, as well as recognition of button-pressing gestures. I expect to release an alpha version of this around January 2012.
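As a rough illustration of the region idea, the sketch below tests whether a hand falls inside one user-defined rectangle and reads its horizontal position as a slider value. The coordinates are made up, and hand positions are assumed to be normalised to 0..1:

(
// Sketch: one user-defined region acting as a horizontal slider.
~slider = Rect(0.1, 0.2, 0.4, 0.1); // left, top, width, height

~handleHand = { |x, y|
	if (~slider.contains(Point(x, y))) {
		var value = (x - ~slider.left) / ~slider.width;
		"slider value: %".format(value.round(0.01)).postln;
	};
};

~handleHand.value(0.3, 0.25); // inside the region: prints a value
~handleHand.value(0.8, 0.9);  // outside: ignored
)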

For the SuperCollider side of things, I wrote some classes, OSCHID and OscSlot (see attached), that mimic the Human Interface Device (HID) classes, but for devices that communicate over OSC via third-party applications such as the one I wrote. They also work with DarwiinOSC (http://code.google.com/p/darwiinosc/) and TouchOSC (http://hexler.net/software/touchosc) on the iPhone. As they have the same structure and methods as the regular HID classes, they should be relatively easy for programmers to adopt. The Wiimote subclass, WiiOSCClient (see attached), in particular, is drop-in-place compatible with the pre-existing SuperCollider Wiimote class, which, unfortunately, is currently not working.
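Conceptually, these classes just route incoming OSC messages to named slots that behave like HID slots. Below is a bare-bones version of that pattern written with a plain OSCdef rather than the classes themselves; the /1/fader1 address comes from TouchOSC's default layout, and the amplitude mapping is only an example:

(
// Sketch: an OSC fader feeding a control bus, the way an OscSlot
// would feed an action in the class-based version.
s.waitForBoot {
	~faderBus = Bus.control(s, 1);

	OSCdef(\fader1, { |msg|
		~faderBus.set(msg[1]); // TouchOSC faders send a single 0..1 float
	}, '/1/fader1');

	// the fader now controls the amplitude of a test tone
	{ SinOsc.ar(440, 0, In.kr(~faderBus).lag(0.1)) ! 2 }.play;
};
)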

All of my BiLE-related class libraries have been posted to SourceForge (http://sourceforge.net/projects/biletools/) and will be released as a SuperCollider quark. My Kinect code has been posted only to my blog (http://celesteh.blogspot.com/2011/05/xyz-with-kinect.html), but I have received emails indicating that at least a few people are using the programme.
