[GEM-dev] help with glsl abstractions

IOhannes m zmoelnig zmoelnig at iem.at
Mon Sep 2 18:34:35 CEST 2013


On 2013-09-02 12:41, Nicolas Montgermont wrote:
>
> On 01/09/13 11:31, Jack wrote:
>> The number of texture units available is returned by
>> GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS. Here, on an Intel HD 4000,
>> GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS is 32; on an NVidia GTX660M
>> it is 160.
> Mmmm, here creating [GLdefine GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS]
> and banging it gives: 35661... did I miss something obvious?

yes.

GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS is a label (a symbolic constant)
that can be used to query the maximum number of combined texture
image units; it is not the number itself. banging
[GLdefine GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS] merely outputs the
numeric value of that constant (35661 == 0x8B4D). you have to use it
with glGet() in order to query the actual value:

[gemhead]
|
[GEMglGetIntegerv GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS]
                                                     |
                                                     [ \
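
for comparison, here is the same query in plain C against the raw
OpenGL API. this is only a minimal sketch: it assumes a current GL
context has already been created and made current (e.g. by GLUT, GLFW
or SDL); glGetIntegerv() is the real entry point, everything around
it is illustration.

#include <GL/gl.h>
#include <GL/glext.h> /* GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS is
                         defined here on some systems */
#include <stdio.h>

/* query the maximum number of combined texture image units.
   NOTE: only meaningful while an OpenGL context is current;
   calling glGet*() without one yields nothing useful. */
void print_max_texunits(void)
{
    GLint maxCombined = 0;
    glGetIntegerv(GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS, &maxCombined);
    printf("GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS = %d\n",
           (int)maxCombined);
}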

btw, this gives 20 on my old and trusty GeForce 7800 GTX.

fgamsdr
IOhannes
