[PD] per-pixel time-based (with memory) effects, is it possible with fragment shaders?

vade doktorp at mac.com
Sun Jan 6 10:45:26 CET 2008


This is entirely possible; however, you would want to use a 3D texture,
something on the order of 320 by 240 by x (where x is how many frames
'back' in time you want to go). This will be relatively heavy on video
memory, I would imagine. However, I have a shader for you that does this.
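
To give a rough idea of the cost (my own back-of-the-envelope numbers,
assuming 4 bytes per RGBA pixel): one 320 x 240 slice is
320 * 240 * 4 = 307,200 bytes, so roughly 300 KB per frame of history,
and e.g. 64 frames of history would be on the order of 20 MB of texture
memory.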

I have absolutely no idea whatsoever if GEM can deal with 3D textures.
I'd assume that since it has low-level OpenGL support it should be able
to, but how to get 2D video frames into that 3D texture via pix_xxx I
have no idea.

Here is the shader, written by Andrew Benson from Cycling '74, IIRC.

This is sans vertex shader, but the vertex shader is basically a
passthrough, so it's very simple; a minimal sketch of one is below.
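
For reference, a passthrough vertex shader along these lines might look
like this (my sketch, not part of Andrew's original; it assumes old-style
GLSL with the fixed-function matrices still available):

varying vec2 texcoord0;
varying vec2 texcoord1;

void main( void )
{
	// hand the texture coordinates straight through to the fragment shader
	texcoord0 = vec2(gl_TextureMatrix[0] * gl_MultiTexCoord0);
	texcoord1 = vec2(gl_TextureMatrix[1] * gl_MultiTexCoord1);
	// standard fixed-function-equivalent vertex transform
	gl_Position = ftransform();
}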

texa is a 2D lookup table - greyscale, where the luminance of the pixel
at each point determines how far back in time to go. This shader assumes
a 512 x 512 map, but you could make that dynamic by passing the map's
dimensions in yourself (as a varying from the vertex shader, or as a
uniform from the host; see the variant after the shader below) instead
of hardcoding the 512.

HTH.

varying vec2 texcoord0;
varying vec2 texcoord1;		// passed through but unused here
uniform float slice;		// unused in this fragment shader
uniform sampler3D texo;		// 3D texture holding the frame history
uniform sampler2DRect texa;	// greyscale delay map (rectangle texture, pixel coords)
const vec4 coeff = vec4(0.299, 0.587, 0.114, 0.);	// Rec. 601 luma weights

void main( void )
{
	// luminance of the map pixel picks how far back in time (0..1) to look
	float v1 = dot(texture2DRect(texa,texcoord0*vec2(512.)),coeff);
	//assumes 512 x 512 slice map.  Pretty arbitrary...
	// use that luminance as the z (time) coordinate into the 3D texture
	vec4 v0 = texture3D(texo, vec3(texcoord0.xy,v1));
	gl_FragColor = v0;
}
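
If you want to lose the hardcoded 512, here is one possible variant (a
sketch of mine, not Andrew's code; instead of a varying it uses a made-up
uniform, mapdims, that you would have to set from the host):

varying vec2 texcoord0;
uniform sampler3D texo;
uniform sampler2DRect texa;
uniform vec2 mapdims;		// actual width/height of the delay map
const vec4 coeff = vec4(0.299, 0.587, 0.114, 0.);

void main( void )
{
	// scale the normalized coords by the real map size instead of 512
	float v1 = dot(texture2DRect(texa, texcoord0*mapdims), coeff);
	gl_FragColor = texture3D(texo, vec3(texcoord0.xy, v1));
}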


On Jan 6, 2008, at 4:21 AM, Batuhan Bozkurt wrote:

> Hello,
>
> Let me start off by saying that I don't have much experience with
> computer graphics, so my knowledge is very limited in this area; I'm
> more of an audio guy, but I have some ideas in mind that would apply
> well to graphics. I just want to experiment.
>
> I want to give a simple example to show my question. Suppose I have a
> video running and I want to delay each pixel separately, between 0 and
> a maximum time. This is the most basic time-based effect I can think
> of. So for a 320x200 video, for example, I would need 64000 delay
> units running separately, and to be able to do this in realtime I
> think I'd need to use the GPU for computation.
>
> So I'm thinking of using pixel shaders, but I'm not really sure
> whether this is possible with them. I grabbed the orange book to get
> an idea about the process; I did not have much time to look at it in
> detail, but I could not find any references to such an operation,
> which made me think that I'm on the wrong track. So before going any
> further with learning GLSL, I'd like to have your ideas on this.
>
> Is GLSL along with GEM a nice way to do such an operation? Is it even
> possible to do it in realtime?
> If GPU-powered realtime operation is not possible, is there any tool
> that you know of that is capable of doing such things (realtime or
> offline)?
> I have many ideas for processes that modify pixels depending on the
> states of the pixels before them (i.e. with memory), and I'm trying to
> find a way to implement them. Any help is appreciated.
>
> Thanks
> BB
>
> _______________________________________________
> PD-list at iem.at mailing list
> UNSUBSCRIBE and account-management -> http://lists.puredata.info/listinfo/pd-list




