[PD] video midi

Christian Klippel c_klippel at gmx.net
Fri Jun 15 03:25:29 CEST 2001


hi michael,

this sounds very interesting to me with regard to our video project for jmax.
maybe we can co-operate on this?

would be nice if you could contact me.

thanks,

chris

On Friday, 15 June 2001 at 02:17, Michael Droettboom wrote:
> I've been working on exactly this on and off for about three months.  I
> was hesitant to post it to the list until things got more finalised, but
> I'd also hate for people to take off and duplicate efforts at this point.
> I've been very pleased to read on the list that there is so much interest
> in such a thing.
>
> My primary concern with this project has been video tracking (i.e. taking
> a video stream in real time and outputting values that can be used to
> control audio or other media) though the system has also proven useful for
> basic 2D pixel-based graphics.
>
> TECHNICAL DETAIL:
> My overall architecture is to pass video data as if they were very large
> blocks of audio data between objects.  My video_in_rgb object, for
> instance, outputs three data streams, one for red, green and blue.  Since
> the "data rate" for video is much higher than for audio, all the video
> processing objects have to be in their own subpatch where a block~ object
> is created behind the scenes to fake everything.  Perhaps it's best to
> explain this by way of example:  a 320x240 @ 15 fps greyscale video stream
> where each pixel is represented by one float requires 320 * 240 * 15 =
> 1152000 floats per second, which is much higher than the sampling rate of
> most audio hardware.  So, to get video and audio running in the same Pd
> process harmoniously, I "fake" it with block overlapping.  For simplicity
> of writing video objects, each "block" of samples contains one frame of
> video data.  This becomes the first (block size) parameter for the block~
> object.  Then the overlap is set such that, at the current audio sampling
> rate, the right number of blocks (frames of video) are processed per
> second.  The equation is:
>
>   overlap = block_size * frames_per_second / sampling_rate
>
> Of course, both of these numbers have to be powers of two, so there's all
> kinds of round-off that makes this not always optimal.
> But anyway, that's the fundamental premise upon which everything else
> works.
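The block-size / overlap arithmetic described above can be sketched as follows. This is a hypothetical Python illustration of the numbers involved (the function names are my own, not part of any Pd external), using the 320x240 @ 15 fps example:

```python
# Hypothetical sketch of the block~ parameter arithmetic described above.
# One "block" holds one frame of video; the overlap multiplies the rate
# at which blocks are processed so that roughly fps blocks pass per second.

def next_pow2(n):
    """Smallest power of two >= n (block~ sizes must be powers of two)."""
    p = 1
    while p < n:
        p *= 2
    return p

def block_params(width, height, fps, sampling_rate=44100):
    pixels_per_frame = width * height
    # Block size: one frame, rounded up to a power of two.
    block_size = next_pow2(pixels_per_frame)
    # Without overlap, Pd would process sampling_rate / block_size blocks
    # per second; the overlap scales that up toward fps.
    exact_overlap = block_size * fps / sampling_rate
    overlap = next_pow2(int(round(exact_overlap)))  # must also be a power of two
    return block_size, overlap

# 320x240 greyscale at 15 fps, as in the example above:
bs, ov = block_params(320, 240, 15)
print(bs, ov)  # → 131072 64
```

Note how far the power-of-two rounding drifts: 320 * 240 = 76800 pixels becomes a 131072-sample block, and the ideal overlap of about 44.6 becomes 64, which is the "round-off that makes this not always optimal" mentioned above.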
>
> A quick status report:
>
> - Video4Linux input support (I don't have access to other platforms,
>   but maybe GEM's Windows and SGI video-in support (which I think it has)
>   would be a good starting point.)  I use a cheap Creative Labs USB
>   webcam for everything and it works great.
>
> - Video out support uses the SDL library, because it's very portable and I
>   didn't want to rely on something like OpenGL for output, which may be
>   a bit of overkill in this case.
>
> - Only very basic objects are currently implemented, though many are in
>   the works.
>
> - Basic thresholding kinds of things are there
>
> - A "blob" tracker, which finds the center of mass of an object, good for
>   tracking its location.
>
> - "Video delay" delays a video stream by a specified number of frames
>
> - snapshot, to store a single frame of video
>
> - image file i/o
>
> - colour conversion  (RGB <-> HSV)
>
> - It's also important to note that my approach, using Pd audio streams for
>   video data, means one can use any ~ object to process video data.  For
>   example *~ can be used to control the brightness of a stream.  +~ mixes
>   two streams.
>   Visualizing things like osc~ can be fun, though the use is more limited.
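To illustrate the frames-as-audio-blocks idea from the list above, here is a hypothetical numpy sketch (my own illustration, not the actual externals): a greyscale frame stored as one flat block of float "samples", the *~/+~-style arithmetic on it, and a center-of-mass "blob" tracker on a thresholded frame:

```python
# Hypothetical numpy sketch of the ideas above: one block = one frame of
# float pixels, so ordinary per-sample arithmetic works on whole frames.
import numpy as np

W, H = 320, 240
frame = np.zeros(H * W, dtype=np.float32)     # one block = one flat frame
frame.reshape(H, W)[100:120, 200:230] = 1.0   # paint a bright "blob"

brighter = frame * 0.5    # what [*~ 0.5] does: scale brightness
mixed = frame + frame     # what [+~] does: mix two streams

# Blob tracking: threshold, then take the center of mass of the bright pixels.
mask = frame.reshape(H, W) > 0.5
ys, xs = np.nonzero(mask)
cx, cy = xs.mean(), ys.mean()
print(cx, cy)   # center of the blob in pixel coordinates → 214.5 109.5
```

The same flat-block layout is what lets any existing ~ object process video without knowing it is video.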
>
> There are example patches for colour tracking, motion tracking, and
> amount-of-motion sensing.  All these work quite well on a P-III 700MHz
> laptop running Linux.
>
> I've been extremely busy lately, but since there seems to be so much
> momentum here, I'll try to post some extremely-alpha "only-works-for-me"
> code by the end of the weekend.  I'd love to hear comments and feedback on
> all this as well.
>
> Glad to see the interest!
>
> Michael Droettboom
> mdboom at peabody.jhu.edu
> 410.625.7596
>
> Computer Music Research
> Peabody Conservatory of Music
> Johns Hopkins University
>
> On Thu, 14 Jun 2001, Mark Danks wrote:
> >   I actually have a number of ideas for how to deal with this, but
> > haven't had any time to do it...it has been on my gem.todo.txt list for
> > ~3 years now
> >
> > :-)
> >
> >   Are you looking for prebuilt objects? code snippets? thoughts and
> > ideas? I did some primitive analysis back when I was working on SGIs, and
> > got some meaningful numbers back...
> >
> > Later, Mark
> >
> > ============================
> > = mdanks at stormfront.com
> > = Lead Programmer PS2
> > = http://www.danks.org/mark
> > ============================
> >
> > > -----Original Message-----
> > > From: Miller Puckette [mailto:mpuckett at man104-1.ucsd.edu]
> > > Sent: Thursday, June 14, 2001 2:03 PM
> > > To: greg paynter
> > > Cc: pure data
> > > Subject: Re: [PD] video midi
> > >
> > >
> > > Hi Greg,
> > >
> > > I don't think anyone's done that yet in Pd, but I'm hoping
> > > someone will...
> > >
> > > cheers
> > > Miller
> > >
> > > On Fri, Jun 15, 2001 at 02:36:28AM +1000, greg paynter wrote:
> > > >  does anyone have any answers as to how to trigger a midi input
> > > >  signal from a video source??
> > > >
> > > >  has anyone considered writing a library for pd to include
> > > >  processing midi values based on pixel movement in a video source???
> > > >
> > > >  is anyone using a kinetic input to inspire musical and/or gem
> > > >  visuals in their performance work?
> > > >
> > > >  please, any ideas gratefully received????
> > > >
> > > >  thanks

-- 
visit me at http://mamalala.de


