[Pd] OT: this is the kind of interface I want

Damian Stewart damian at frey.co.nz
Mon Nov 13 00:35:23 CET 2006


Hans-Christoph Steiner wrote:
> 
> Actually, the most difficult thing to do is make it work well in the
>  real world.  Making it work isn't too difficult, there are lots of 
> working variations, including the Pd-powered reacTable.  But video 
> tracking is really limited.  You have to have completely steady
> lighting conditions (notice the lights were turned off in that demo).
> 
i'm working for a small New Zealand company called Lumen Digital at the
moment. we actually have a *very* robust finger-tracking system based on
OpenCV that we currently use for digital map interfaces, and one of our
future research projects could involve some sort of tracking-based
interactive music table.

the lighting conditions do have to be controlled to some degree, but in
our case it's not a matter of needing no external light cast over the
interface so much as needing lighting that doesn't change too much over
the course of an interaction. i've just completed the install of our
table in a gallery where the vicinity of the table itself is lit by more
than 16 halogen bulbs, which together generate a considerable amount of
spill and shadow, and through all this our tracker was able to reliably
track hands to within a few pixels' accuracy on a 1536x1024, roughly
3m x 2m projection.
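
to give a rough idea of what i mean by coping with slowly-changing
light: if the background model adapts over time, gradual lighting drift
gets absorbed while hands (fast, transient changes) still stand out.
here's a minimal sketch of that idea, written against OpenCV's Python
API rather than anything we actually ship -- all the numbers are made
up, it's purely an illustration:

import cv2
import numpy as np

cap = cv2.VideoCapture(0)    # camera watching the table surface
ok, frame = cap.read()
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
background = gray.copy()     # running-average background model

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
    # adapt the background slowly: gradual lighting drift is absorbed,
    # fast transient changes (hands) still show up in the difference
    cv2.accumulateWeighted(gray, background, 0.02)
    diff = cv2.absdiff(gray, background)
    _, mask = cv2.threshold(diff.astype(np.uint8), 25, 255,
                            cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) > 500:    # ignore small speckle
            m = cv2.moments(c)
            cx, cy = int(m['m10'] / m['m00']), int(m['m01'] / m['m00'])
            # (cx, cy) is a candidate hand position for the tracker

a real tracker would also smooth positions across frames (a Kalman
filter or similar), so one dropped frame doesn't read as a lifted hand.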

(by the way, if anyone's looking for a job in digital museum
interactives and not averse to moving to beautiful New Zealand to do so,
we're looking for programmers at the moment, email me at
damian at lumendigital.co.nz or jared forbes at jared at lumendigital.co.nz
for details.)

> I used that exact table interface at NIME at IRCAM.  It is certainly
> nifty, but it needs work to work in the real world.  The problem
> with video tracking is that there is no way to track your finger,
> instead it just tracks shadows.

not so. we track fingers, not shadows.

> What happens is if the video tracking loses track of your finger for
> one instant, then it thinks you picked up your finger and put it back
> on the table. That can definitely screw up your actions.  And
> unfortunately with whichever video tracking system I have seen,
> that exact thing happens quite frequently.

we currently get around that one by not actually detecting whether
you're pressing on the table or not (although doing so would just
involve a relatively simple capacitance sensor on the table surface).
since our table allows for multi-hand tracking (we've tested with up to
eight individual users using one or two hands each around a 3m x 2m
table, and theoretically it could go higher) there's no reason why you
can't have the right hand doing the pointing and the left hand hovering
near an active area (or two, or three) that roughly corresponds to a
mouse button click. the trick in our industry is to make it intuitive,
so joe public can pick it up in literally five seconds.
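
to make the active-area idea a bit more concrete, here's a sketch of
the logic (the area names, coordinates and dwell time are invented, not
our production code): each tracked hand position is hit-tested against
a set of rectangles, and an area fires once a hand has hovered over it
for long enough.

import time

# hypothetical active areas in projection pixel coordinates: x, y, w, h
ACTIVE_AREAS = {
    'click': (1400, 900, 100, 100),
    'menu':  (1400, 780, 100, 100),
}
DWELL_SECONDS = 0.4     # how long a hand must hover before an area fires

hover_since = {}        # (hand_id, area_name) -> time the hover began

def update(hands):
    """hands: dict of hand_id -> (x, y) for the current frame.
    returns the (hand_id, area_name) pairs that fired this frame."""
    fired = []
    now = time.time()
    for hand_id, (x, y) in hands.items():
        for name, (ax, ay, w, h) in ACTIVE_AREAS.items():
            key = (hand_id, name)
            if ax <= x < ax + w and ay <= y < ay + h:
                hover_since.setdefault(key, now)
                if now - hover_since[key] >= DWELL_SECONDS:
                    fired.append(key)
                    hover_since[key] = now  # re-arm so it doesn't fire every frame
            else:
                hover_since.pop(key, None)  # hand left the area, reset its timer
    return fired

the dwell time is the usual trade-off: long enough that sweeping a hand
across an area doesn't trigger it, short enough that it still feels
immediate.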

> Sure, it's nifty to wiggle images around and zoom and navigate,
> but that's a really simple app.  Try making Photoshop with that,
> where the interface "just disappears".  I don't think humans could
> remember enough gestures to map all the functions in Photoshop, so a
> menu would probably be necessary.

gestures are another thing altogether. having done tonnes of research
before starting to code this project, i ended up concluding that
gestures were far too fragile a system for public use, and that having
a palette of active areas as described above might be better for
multiple actions. i remember finding a system that involved pointing at
a small list which, as you got closer to it, expanded in scope so that
you could zoom through a large tree of options via subtle variations in
the direction you moved the mouse. something like this would be ideal.
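
a toy model of that expanding-list idea, just to make the mapping
concrete (the option tree and all the numbers here are invented): the
closer the pointer gets to the list, the more levels of the tree are
revealed, and the pointer's vertical position picks a branch at each
level.

# hypothetical option tree -- a real one would come from the application
OPTIONS = {
    'file': {'open': {}, 'save': {}, 'export': {}},
    'edit': {'undo': {}, 'redo': {}, 'cut': {}},
    'view': {'zoom in': {}, 'zoom out': {}},
}

LIST_X = 1000        # horizontal screen position of the list
LEVEL_WIDTH = 150    # every 150px closer reveals one more level
MAX_DEPTH = 3

def zoom_path(tree, pointer_x, pointer_y, screen_height):
    """return the chain of options the pointer is currently zoomed into."""
    distance = max(0, LIST_X - pointer_x)   # pointer approaches from the left
    depth = max(0, MAX_DEPTH - int(distance // LEVEL_WIDTH))
    path, node = [], tree
    for _ in range(depth):
        if not node:
            break
        names = sorted(node)
        # vertical position selects one branch at the current level
        index = min(len(names) - 1,
                    max(0, int(pointer_y / screen_height * len(names))))
        path.append(names[index])
        node = node[names[index]]
    return path

# zoom_path(OPTIONS, pointer_x=980, pointer_y=120, screen_height=1024)
# returns a deeper path the closer pointer_x gets to LIST_X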

-- 
Damian Stewart
+64 27 305 4107

f r e y
live music with machines
http://www.frey.co.nz
http://www.myspace.com/freyed



