[PD] Live motion tracking using pd / gem / GridFlow / PiDiP
cgc at humboldtblvd.com
Sat Mar 13 04:03:01 CET 2004
On Mar 12, 2004, at 7:29 PM, Max Neupert wrote:
> Hi list,
> I am working on a project where I want to confront the observer with a
> projection that changes according to his/her position, thus
> eliminating the
> effects of perspective.
> So I tried that; the problem is that the object just understands the
> colorspace, and converting the YUV camera stream first seems quite a
> task for the computer.
Actually, pix_movement does work with YUV - get the CVS version of GEM.
I wrote an AltiVec version of it for PPC. The object is quite fast
and uses well under 10% CPU on a 1 GHz G4 running 720x480 video.
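For readers unfamiliar with what pix_movement computes, here is a rough sketch in Python (chosen only because a Pd patch can't be shown as text): per-pixel absolute difference between consecutive luma frames, thresholded to a binary motion mask. This is an illustration of the idea, not GEM's actual implementation; the real object works on YUV pixel data and has a vectorized AltiVec path.

```python
def motion_mask(prev, curr, threshold):
    """Return a binary mask: 1 where luma changed by more than threshold.

    prev, curr: 2-D lists of luma values in [0.0, 1.0] (two consecutive frames).
    threshold:  how much change counts as motion (cf. the threshold sent
                to pix_movement's right inlet, e.g. 0.1).
    """
    return [
        [1 if abs(c - p) > threshold else 0
         for p, c in zip(prow, crow)]
        for prow, crow in zip(prev, curr)
    ]

prev = [[0.2, 0.2, 0.2],
        [0.2, 0.2, 0.2]]
curr = [[0.2, 0.9, 0.2],
        [0.2, 0.2, 0.25]]
print(motion_mask(prev, curr, 0.1))
# -> [[0, 1, 0], [0, 0, 0]]
# only the pixel that jumped from 0.2 to 0.9 exceeds the 0.1 threshold;
# the 0.2 -> 0.25 change is below it and is treated as noise
```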
> I succeeded in creating a GEM patch according to the tutorial; the
> patch does something, but there is no working tracking
> (see attached patch)
The only problem I see is that you haven't given pix_movement a
threshold argument. Send something like 0.1 to the right inlet to make
it do its thing. The only way I've been able to get decent tracking
out of movement + blob is to use some sort of data-smoothing object,
like hyperspasm's smooth object or even a plain old GEM average object.
Without this the output is too erratic. Also, slow movement works a
whole lot better than fast movement for these objects.
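The smoothing step above can be sketched in Python as a simple one-pole lowpass (exponential moving average) over the jittery blob coordinates. This is similar in spirit to what a smooth/average object does, though the exact filters those objects implement may differ; the sample values and the alpha of 0.2 are illustrative, not from the patch.

```python
def smooth(samples, alpha=0.2):
    """One-pole lowpass: each output moves a fraction alpha toward the input.

    Smaller alpha = heavier smoothing (steadier but laggier tracking).
    """
    out = []
    y = samples[0]
    for x in samples:
        y = y + alpha * (x - y)  # slide part of the way toward the new sample
        out.append(y)
    return out

# Erratic blob x-coordinates, as movement + blob might emit them:
raw = [0.5, 0.9, 0.1, 0.8, 0.2]
print(smooth(raw))
# the smoothed values swing over a much narrower range than the raw ones
```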
I'm in the process of writing a new luma-based tracking object that
might finish beta testing at some point in the near future. It will
spit out a grid of 1s and 0s based on comparing the luma in each grid
coordinate to the luma value you are looking for. The output is a
generic Pd list for you to use in whatever way you see fit.
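Since the object isn't released yet, here is a guess at the algorithm it describes, sketched in Python: divide the frame into grid cells, average the luma in each cell, and emit 1 where that average is close to the target luma. The function name, the tolerance parameter, and the cell-averaging details are all assumptions for illustration; the actual object may work differently.

```python
def luma_grid(frame, rows, cols, target, tol):
    """Return a flat list of 1s and 0s, one per grid cell (like a Pd list).

    frame:  2-D list of luma values in [0.0, 1.0].
    target: the luma value being tracked.
    tol:    how close a cell's mean luma must be to target to emit a 1
            (hypothetical parameter, not from the original post).
    """
    h, w = len(frame), len(frame[0])
    out = []
    for gy in range(rows):
        for gx in range(cols):
            cell = [frame[y][x]
                    for y in range(gy * h // rows, (gy + 1) * h // rows)
                    for x in range(gx * w // cols, (gx + 1) * w // cols)]
            mean = sum(cell) / len(cell)
            out.append(1 if abs(mean - target) <= tol else 0)
    return out

# 4x4 frame whose top-left quadrant is bright (0.9), rest dark (0.1):
frame = [[0.9, 0.9, 0.1, 0.1],
         [0.9, 0.9, 0.1, 0.1],
         [0.1, 0.1, 0.1, 0.1],
         [0.1, 0.1, 0.1, 0.1]]
print(luma_grid(frame, 2, 2, target=0.9, tol=0.1))
# -> [1, 0, 0, 0]  (only the top-left cell matches the target luma)
```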
I've attached a simplified version of your patch.