Bugs with Ogre and Cg->glsles using HLSL2GLSL
Posted: Mon Oct 29, 2012 4:04 pm
I found that none of my Cg shaders were working when using HLSL2GLSLFork. By saving the generated shaders as new GLSL ES versions and examining them, I think I've found a couple of bugs.
So I noticed these weird "xlu_" uniforms being created: they have the same names as Ogre's auto-params, but with an "xlu_" prefix. I changed the generated code to use the Ogre versions instead of the "xlu_" ones, and suddenly it was working!
So what's the deal here? Am I doing something wrong, or is there a problem between Ogre and HLSL2GLSL?
Here's a working Cg shader:
Code: Select all
struct vs_out {
    float4 position : POSITION;
};

vs_out test_vs(float4 position : POSITION, uniform float4x4 worldViewProj)
{
    vs_out ret;
    ret.position = mul(worldViewProj, position);
    return ret;
}
Here is what Ogre/HLSL2GLSL has created:
Code: Select all
#version 100
precision mediump int;
precision mediump float;

struct vs_out {
    vec4 position;
};

uniform mat4 worldViewProj;

vs_out test_vs( in vec4 position, mat4 worldViewProj ) {
    vs_out ret;
    ret.position = (worldViewProj * position);
    #line 33
    return ret;
}

uniform mat4 xlu_worldViewProj;
attribute vec4 vertex;

void main() {
    vs_out xl_retval;
    xl_retval = test_vs( vec4(vertex), xlu_worldViewProj);
    gl_Position = vec4( xl_retval.position);
}
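For reference, the workaround I described amounts to hand-editing the generated shader so main() reads the uniform that Ogre actually binds by name. A minimal sketch (names taken from the generated shader above, with test_vs inlined for brevity; this is how I patched it, not official output):

```glsl
#version 100
precision mediump float;

// Ogre's auto-param binding fills this uniform, matched by name...
uniform mat4 worldViewProj;
// ...while the xlu_-prefixed duplicate emitted by HLSL2GLSL is never set:
// uniform mat4 xlu_worldViewProj;

attribute vec4 vertex;

void main() {
    // Use the Ogre-bound uniform instead of xlu_worldViewProj.
    gl_Position = worldViewProj * vertex;
}
```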