Oops, I think I must not have explained myself correctly.
I just want lines like "multiply X by Y" to read "multiply X.rgb by Y.rgb" wherever the alpha input isn't needed.
What happens, if you have an alpha defined in anything but the diffuse, is that the alpha gets used, giving weird results. Before, X.a and Y.a weren't defined by the texture, so OpenGL treated them as 1.0; now that they are defined, they get pulled into the multiply steps in ways that screw up the final result.
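To sketch what I mean (variable and sampler names here are just illustrative, not the engine's actual shader code):

```glsl
// Hypothetical GLSL fragment-shader snippet.
vec4 base   = texture2D(diffuseMap, uv);
vec4 detail = texture2D(detailMap, uv);

// Bad: multiplying full vec4s drags the alpha channel into the math.
// detail.a is 1.0 when the texture has no alpha, but arbitrary when it
// does, so the result suddenly changes once someone adds an alpha.
vec4 bad = base * detail;

// Better: multiply only the color channels and take alpha from the diffuse.
vec4 good = vec4(base.rgb * detail.rgb, base.a);
```

That way a stray alpha in a secondary texture can't scale the color result.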
Try putting a cloudy alpha, just any old thing, into a normal map; you'll see it immediately, I think, unless this is another one of those fun ATi-only issues.
But exposing the GLSL would be better, yeah; then I could rewrite it to reflect the HLSL stuff fairly well and distribute that to help anybody wanting to mess with the new shaders. I don't know how much work that would be, though, so I hesitated to ask.