Apple GLSL driver bug - attribute aliasing

· by Steve · Read in about 4 min · (776 Words)

One of the unfortunate things about Mac OS X is that graphics driver support lags behind other platforms. Drivers are bundled with OS X system updates rather than being updated separately, and as a result there are occasionally bugs which take longer to get resolved on OS X than on other platforms.

We’ve had this problem before, and we appear to have got it again now. In our example media, we have some hardware skinning shaders written in GLSL; we also have Cg and HLSL versions, but the GLSL version is there to prove certain features such as passing arrays of uniforms - bone matrices - to GLSL. It works fine everywhere I’d tried it, but I did notice that on OS X our skeletal animation demo runs like a pig, yet it runs perfectly fine under GL on Vista on the same hardware.

I’ve got a lot of things on my plate so I hadn’t looked into it much yet, but hellcatv picked up on it in the forums recently. Luckily it appears he’s got an Apple support contract, or at least better contacts than me (Apple don’t answer tech support without a contract or a per-incident fee, and my posts so far to the public Apple developer forums have remained mostly unanswered), so we managed to get to the bottom of it.

The issue is to do with ‘aliasing’ of vertex attributes - that is, two different vertex components accidentally sharing a single attribute number. Skeletal animation requires the use of extended attributes, at least if you don’t want to burn texture coordinates pointlessly on it. Ogre recognises certain attributes in GLSL shaders such as ‘tangent’, ‘binormal’, ‘blendWeights’ and ‘blendIndices’ and automatically hooks them up to those semantics as passed through our geometry format (for those who don’t know, our geometry format is completely flexible and described by a declaration in which you link up buffer offsets and buffer IDs to a ‘semantic’ to indicate how vertex data should be interpreted). These attributes don’t have fixed IDs; GL lets you bind custom attributes to any number. But we don’t assume anything here: we compile and link the GLSL, then ask the GL runtime what it has assigned the custom attributes to. And herein lies the problem.
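To make that concrete, here’s a minimal sketch of the approach in GL 2.0-style C++. The shader body and the ‘boneMatrices’ uniform name are my own illustration rather than our actual example shader; the point is simply: declare custom attributes alongside the builtins, link, then ask the runtime which slots it chose.

```cpp
// Minimal sketch (GL 2.0-era API, Mac OpenGL headers).
#include <OpenGL/gl.h>
#include <cstdio>

// Cut-down skinning vertex shader: two custom attributes plus the builtin
// gl_Vertex/gl_Normal. Illustrative only - not Ogre's actual example shader.
static const char* skinningVS =
    "attribute vec4 blendWeights;\n"
    "attribute vec4 blendIndices;\n"
    "uniform mat4 boneMatrices[24];\n"
    "void main()\n"
    "{\n"
    "    vec4 p = vec4(0.0);\n"
    "    for (int i = 0; i < 4; ++i)\n"
    "        p += boneMatrices[int(blendIndices[i])] * gl_Vertex * blendWeights[i];\n"
    "    gl_Position = gl_ModelViewProjectionMatrix * p;\n"
    "    gl_FrontColor = vec4(max(dot(normalize(gl_Normal), vec3(0.0, 1.0, 0.0)), 0.0));\n"
    "}\n";

GLuint buildSkinningProgram()
{
    GLuint vs = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(vs, 1, &skinningVS, 0);
    glCompileShader(vs);

    GLuint prog = glCreateProgram();
    glAttachShader(prog, vs);
    glLinkProgram(prog);

    // No assumptions about slot numbers: ask the runtime what it picked,
    // then set up the vertex declaration to feed those slots.
    GLint weightsSlot = glGetAttribLocation(prog, "blendWeights");
    GLint indicesSlot = glGetAttribLocation(prog, "blendIndices");
    printf("blendWeights -> attrib %d, blendIndices -> attrib %d\n",
           weightsSlot, indicesSlot);
    return prog;
}
```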

In theory, attributes can be assigned any ID, even the builtin attributes like gl_Vertex and gl_Normal. In practice, however, many manufacturers make the builtins refer to fixed IDs all the time and don’t like you trying to change that or override them with custom attributes. For example, gl_Normal is usually attribute #2, so you’d better not use gl_Normal in your shader and also bind a custom attribute to #2, otherwise you get ‘attribute aliasing’, which drags performance into the gutter. We avoid making any assumptions here: after compiling and linking the GLSL, we ask GL what the attribute numbers are.

On Windows and Linux, if you’ve used gl_Vertex (attrib 0), gl_Normal (attrib 2) and two custom attributes (blend weights and blend indices), the custom attributes come back from the GLSL compiler as #1 and #3 respectively. Obviously it’s worked out that it can’t use #2 because it’s being used by gl_Normal. If you don’t refer to gl_Normal in your shader, #2 is used.
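To illustrate the collision, a check along these lines shows what goes wrong. This isn’t Ogre code, and the ‘conventional’ slot numbers it uses are just the de facto driver convention (0 for gl_Vertex, 2 for gl_Normal), not something the spec promises - which is exactly why we query rather than assume in the first place.

```cpp
// Sketch: compare the slots the linker handed back against the slots the
// builtins conventionally occupy, and flag any overlap.
#include <OpenGL/gl.h>
#include <cstdio>

void checkForAliasing(GLuint prog, bool shaderUsesGlNormal)
{
    const GLint CONVENTIONAL_VERTEX_SLOT = 0;  // gl_Vertex on most drivers
    const GLint CONVENTIONAL_NORMAL_SLOT = 2;  // gl_Normal on most drivers

    GLint weights = glGetAttribLocation(prog, "blendWeights");
    GLint indices = glGetAttribLocation(prog, "blendIndices");

    // Windows/Linux drivers return 1 and 3 here when gl_Normal is in use;
    // the Apple driver returns 1 and 2, so blendIndices aliases gl_Normal.
    if (weights == CONVENTIONAL_VERTEX_SLOT || indices == CONVENTIONAL_VERTEX_SLOT)
        fprintf(stderr, "custom attribute aliases gl_Vertex (slot 0)\n");
    if (shaderUsesGlNormal &&
        (weights == CONVENTIONAL_NORMAL_SLOT || indices == CONVENTIONAL_NORMAL_SLOT))
        fprintf(stderr, "custom attribute aliases gl_Normal (slot 2)\n");
}
```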

On Apple, however, it appears that the custom attributes are bound to #1 and #2 even if you’re referring to gl_Normal. This immediately causes attribute aliasing and hence the performance drop. Apple initially said they thought we were causing this, but after I clarified what we do and why I think it’s correct (and works on other drivers), Apple have now indicated that “We are working to resolve this bug in future updates.”

Hopefully this gets fixed soon. One possible workaround is to change our code so that we pick fixed, very high-numbered attributes that are always out of the way, instead of asking the GLSL compiler to tell us what to use (sketched below). That isn’t as elegant though, so I’d prefer to wait and see if Apple fix their driver in the short term. Interestingly, the same problem occurs on both ATI and NVIDIA hardware, so it might be a mistake in some shared part of the implementation.
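For illustration only, that workaround would look roughly like this. The slot choices and the relink are my own sketch, not a decision we’ve made; glBindAttribLocation only takes effect at link time, so the bindings have to go in before (re)linking.

```cpp
// Rough sketch of the possible workaround: pin the custom attributes to fixed
// slots instead of asking the linker afterwards. The slot numbers are
// placeholders - a real fix would have to pick slots that no builtin in use
// can occupy on any driver.
#include <OpenGL/gl.h>

void bindCustomAttribsToFixedSlots(GLuint prog)
{
    GLint maxAttribs = 16;
    glGetIntegerv(GL_MAX_VERTEX_ATTRIBS, &maxAttribs);

    glBindAttribLocation(prog, maxAttribs - 1, "blendWeights");
    glBindAttribLocation(prog, maxAttribs - 2, "blendIndices");
    glLinkProgram(prog);  // re-link so the fixed bindings apply
}
```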

It's a shame - games like BioShock get driver updates specifically for them, as did id's games before that. I remember how many GL driver issues we had when using new extensions id hadn't yet used in their games, stuff that I'm sure would have gotten fixed much quicker if id had been releasing a game that used them. The rest of us, having a much lower profile, just have to cross our fingers and hope it won't be too long.