[GEM-dev] Gem vs ATi fglrx vs GLEW vs OpenGL 2.0

Claude Heiland-Allen claudiusmaximus at goto10.org
Mon Jan 7 11:30:47 CET 2008


Hi all,

I found this:

http://ati.cchtml.com/show_bug.cgi?id=193

I ran the test program attached to that bug report, and it reported success.

But Gem (unless configured with "--with-glversion=1.5") still fails to
load: glUniform2i is reported as undefined.
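
To see whether it's libGL itself that doesn't export the symbol (rather
than something in Gem's linking), a throwaway dlopen()/dlsym() check
should tell -- just a sketch, assuming the fglrx libGL.so.1 is on the
default library search path:

// checksym.cpp -- build with: g++ checksym.cpp -ldl -o checksym
#include <dlfcn.h>
#include <cstdio>

int main()
{
  // load the GL library the same way the dynamic linker would
  void *gl = dlopen("libGL.so.1", RTLD_LAZY);
  if (!gl) { std::printf("dlopen failed: %s\n", dlerror()); return 1; }
  void *sym = dlsym(gl, "glUniform2i");
  std::printf("glUniform2i is %s\n", sym ? "exported" : "NOT exported");
  dlclose(gl);
  return 0;
}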

Then I tried some semi-random hacking of code:

#include <GL/glew.h> // just before #include <GL/gl.h> in Base/GemGL.h

and adding -lGLEW to the Make.config (iirc).

Gem now compiles and loads without error, but segfaults in numerous
situations, including plain texturing (gdb shows a jump to a null
pointer).
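
My guess as to why: glew.h replaces everything newer than GL 1.1 with
function pointers that stay NULL until glewInit() is called, roughly
like this (a simplified paraphrase of the glew.h machinery, with
glActiveTexture as an arbitrary example -- not actual Gem code):

#include <GL/gl.h>

typedef void (*PFNGLACTIVETEXTUREPROC)(GLenum texture);
extern PFNGLACTIVETEXTUREPROC __glewActiveTexture; // NULL until glewInit()
#define glActiveTexture __glewActiveTexture

// so any post-1.1 call made before glewInit() jumps through a NULL
// pointer, which would match what gdb shows during plain texturing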

The key function in the OpenGL 2.0 tester seems to be:

typedef void (*__GLXextFuncPtr)(void);
extern "C" __GLXextFuncPtr glXGetProcAddressARB (const GLubyte *);

which is only referenced in:

pd-gem/Gem/src/Base/glxew.h
pd-gem/Gem/src/Base/glew.cpp
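
Presumably the tester succeeds because it never needs glUniform2i at
link time at all, but fetches the entry point at runtime through that
dispatcher -- my reconstruction of the idea, not the actual tester
source:

#include <GL/gl.h>

typedef void (*__GLXextFuncPtr)(void);
extern "C" __GLXextFuncPtr glXGetProcAddressARB (const GLubyte *);

typedef void (*PFNGLUNIFORM2IPROC)(GLint location, GLint v0, GLint v1);
static PFNGLUNIFORM2IPROC my_glUniform2i = 0;

void resolve_gl2(void)
{
  my_glUniform2i =
    (PFNGLUNIFORM2IPROC)glXGetProcAddressARB((const GLubyte*)"glUniform2i");
  // a NULL result here would mean even the driver's dispatcher
  // doesn't know the function
}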

I noticed glewInit() is only called on Windows:

pd-gem/Gem/src/Base/GemWinCreateNT.cpp:  GLenum err = glewInit();

So I hacked something in GemWinCreateXWin.cpp:

#ifdef USE_GLEW
   GLenum err = glewInit();
   if (GLEW_OK != err) error("failed to init GLEW");
   else post("GLEW version %s",glewGetString(GLEW_VERSION));
#endif

and added #include <GL/glew.h> to GemWinCreate.h. Now I get a Gem that 
compiles and loads without errors, with no segfaults so far either, but 
also no working shaders:

Direct Rendering enabled!
GLEW version 1.3.4
GEM: Start rendering
linking: link 1.07374e+09 0
linking: link 1.07374e+09 5.36871e+08
[glsl_program]: Info_log:
[glsl_program]:  Link successful. There are no attached shader objects.
[pix_image]: GEM: thread loaded image: 
/home/claude/src/pd-gem/Gem/examples/10.glsl/img3.jpg
GL: invalid value

With 02_primitive_distortion.pd I get an undistorted textured sphere, 
and similarly with the other examples (everything looks as if no 
shaders are running at all).

The large numbers for the shader ids look a bit suspicious to me, as 
does the "no attached shader objects" message.  It would also be useful 
if "GL: invalid value" could be made more verbose.

So, any clues?

Some more info from Pd:

[glsl_vertex]: Vertex_shader Hardware Info
[glsl_vertex]: ============================
[glsl_vertex]: MAX_VERTEX_ATTRIBS: 32
[glsl_vertex]: MAX_VERTEX_UNIFORM_COMPONENTS_ARB: 4096
[glsl_vertex]: MAX_VARYING_FLOATS: 44
[glsl_vertex]: MAX_COMBINED_TEXTURE_IMAGE_UNITS: 16
[glsl_vertex]: MAX_VERTEX_TEXTURE_IMAGE_UNITS: 0
[glsl_vertex]: MAX_TEXTURE_IMAGE_UNITS: 16
[glsl_vertex]: MAX_TEXTURE_COORDS: 8
[glsl_fragment]: glsl_fragment Hardware Info
[glsl_fragment]: ============================
[glsl_fragment]: MAX_FRAGMENT_UNIFORM_COMPONENTS: 4096
[glsl_fragment]: MAX_TEXTURE_COORDS: 8
[glsl_fragment]: MAX_TEXTURE_IMAGE_UNITS: 16
[glsl_program]: glsl_Program Hardware Info
[glsl_program]: ============================
[glsl_program]:

I'm running Debian Stable with the proprietary ATi fglrx driver for a 
Mobility Radeon 9700 card.


Thanks,


Claude
-- 
http://claudiusmaximus.goto10.org
