[GEM-dev] noise problem with glsl
cyrille henry
cyrille.henry at la-kitchen.fr
Thu Sep 14 19:10:11 CEST 2006
chris clepper wrote:
>
>
> On 9/14/06, cyrille henry <cyrille.henry at la-kitchen.fr> wrote:
>
> hello chris,
> Thanks for your suggestion.
>
> chris clepper wrote:
> > I haven't used the noise function because it does not run in
> hardware on
> > ATI or Nvidia (only 3DLabs).
> I did not know that.
> What did you use then? A JPEG texture?
>
>
> Yes, a texture is what is normally used. In GEM you can use sig2pix to
> generate random values. The GLSL noise is not random but a Perlin noise
> generator.
OK, so I'm back to the multi-texture problem.
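If I understand the texture approach right, the vertex shader would sample
a noise texture instead of calling noise1(). Something like this rough
sketch (the sampler name "noiseTex" and the texcoords are just guesses on
my side):

uniform sampler2D noiseTex; // hypothetical sampler holding the random values

void main()
{
    vec4 pos = gl_Vertex;
    // there is no implicit LOD in the vertex unit, so use texture2DLod
    float n = texture2DLod(noiseTex, gl_MultiTexCoord0.st, 0.0).r;
    pos.x += 0.1 * n;
    gl_Position = gl_ModelViewProjectionMatrix * pos;
    gl_FrontColor = gl_Color;
}

Not sure the Quadro FX 1400 can do vertex texture fetch from any format,
though; I think the NV4x generation only fetches from 32 bit float
textures in the vertex unit.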
>
> >You might have to use temporary variables
> > or explicitly cast like:
> >
> > vx += (float) (0.1 * noise1())
> I tried this, but it does not change anything.
> The error ("<stdlib>(3998)") does not look like the error I get when
> I make this kind of mistake.
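By the way, as far as I can tell GLSL has no C-style cast at all;
conversions use constructor syntax. So I guess the line would have to be
written like this (the noise1 argument is just a guess, since noise1()
requires one):

// constructor syntax instead of a C cast; noise1() takes a genType argument
vx += float(0.1 * noise1(gl_Vertex.x));

But since 0.1 and noise1() are already floats, the conversion should not
be needed in the first place.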
>
>
> The first error doesn't make sense, because GLSL does not support
> integers of any type in hardware, according to the orange book.
Yes, that's why I thought the problem did not come from the shader. Is it
possible the problem comes from my graphics card, my OS, or the graphics
driver? (nvidia Quadro FX 1400: it's a good card, but not optimized for
video games)
>
> Can you post the fragment portion of the shader so I can test the whole
> thing?
There is nothing special in the frag shader:
void main()
{
    // just pass through the interpolated vertex color
    gl_FragColor = gl_Color;
}
thanks
cyrille