HLSL shaders in Ogre from RenderMonkey

The place for artists, modellers, level designers et al to discuss their approaches for creating content for OGRE.
bloudon
Gnoblar
Posts: 5
Joined: Wed Sep 01, 2004 3:43 pm
Location: Glasgow

HLSL shaders in Ogre from RenderMonkey

Post by bloudon »

I'm a beginner and I'm trying to get some of the effects I've seen in RenderMonkey into Ogre. I'm using HLSL because it's what I have the most info on at the moment, with all the samples I have in RenderMonkey.

There are two things in particular I don't understand. First, the HLSL refers to textures by name (I noticed that the Cg code for your parallax bump mapping does the same); how do you set this up when the texture_units aren't named? Second, I was trying to convert the param_named variables the RenderMonkey example requires, but one of them is not a float4, it's a float4x4. How can I declare that in the material script?

I take it that, in the offset bump map Cg example, the way the rockwall texture is applied to the ogrehead could equally well work with the Athene normal map applied to the Athene mesh?


In general, apart from these questions, my aim with Ogre is to use my ATI Radeon 9700 to do some off-line rendering work for me. I spend a LOT of time watching progress bars doing software renders of consumer products I'm designing or of architectural visualisations. I'm just starting out with Ogre and I'm very impressed at how well put together it all is, and I'm excited about the capabilities available. As an artist though I am struggling a bit with things, and am just getting into C++ (I've only done Java, ActionScript and Visual Basic before). If I can just get off the ground with understanding shaders then I can get onto my next task, which is to try grabbing frames from the render target and writing them as bitmaps to be compiled into movies... and end sleepless nights watching computer(s) chuntering away!

PS, does anybody have any opinions on an approach to scripting such an animation as an input to an Ogre scene? Would RenderMan be the way to go? Or a POV-Ray animation script? I'm just starting to think about this, so any thoughts would be great.

anyway, before I go further off my own topic,

thanks, and cheers to you guys for writing Ogre!

Brian
nfz
OGRE Retired Team Member
Posts: 1263
Joined: Wed Sep 24, 2003 4:00 pm
Location: Halifax, Nova Scotia, Canada

Post by nfz »

Can you give a specific RM example that you are trying to bring into Ogre? I have converted most of the RM examples for use in Ogre, so I might be able to help you out if I know which example you are talking about.

You can use the rockwall offset mapping material on the Athene model, but you may want to change the UV mapping since it's rather chopped up.

I also use Ogre to render movies and it is a very fast way to produce CG movies. Dumping the screen to a file takes about 250 msec on my machine, so I can render about 4 frames a second to file. A one-minute movie at 24 fps takes about 6 minutes to render. Of course the rate depends on scene complexity and screen size, but on average that is what I get. The cool thing is being able to work with the scene in real time and see all the effects before rendering to file, so you only have to render to file once, when you are happy with the real-time renderings.
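
If it helps, the capture itself is just a frame listener that writes the window contents out every frame. This is only a rough sketch of the idea (not my exact code, and the names are placeholders), assuming the stock RenderWindow::writeContentsToFile call:

Code: Select all

// Sketch only: a frame listener that dumps every rendered frame to a
// numbered image file.
#include <cstdio>
#include "Ogre.h"

class MovieCaptureListener : public Ogre::FrameListener
{
public:
    MovieCaptureListener(Ogre::RenderWindow* window) : mWindow(window), mFrame(0) {}

    bool frameEnded(const Ogre::FrameEvent& evt)
    {
        char filename[64];
        std::sprintf(filename, "frame_%05d.png", mFrame++);
        // writeContentsToFile is synchronous, so the frame is on disk before
        // the next frame starts - this is where most of the ~250 msec goes.
        mWindow->writeContentsToFile(filename);
        return true; // keep rendering
    }

private:
    Ogre::RenderWindow* mWindow;
    int mFrame;
};

// usage: mRoot->addFrameListener(new MovieCaptureListener(mWindow));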
bloudon
Gnoblar
Posts: 5
Joined: Wed Sep 01, 2004 3:43 pm
Location: Glasgow

Post by bloudon »

Thanks for your reply; what you're saying is exactly what I'm aiming for! Sorry for not posting specific code, I appreciate that there's nothing more annoying than a newbie "this isn't working" post. I was trying to transfer across the anisotropic HLSL D3D example in RenderMonkey, which resulted in the following mess:

.material script:-
------------------------------------------------------------------------------------

// -------------------------------
// Anisotropic shader, based on RenderMonkey 1.5
// -------------------------------

vertex_program vAnisotropic hlsl
{
	source vAnisotropic.txt
	entry_point main
	target vs_1_1
}

fragment_program pAnisotropic hlsl
{
	source pAnisotropic.txt
	entry_point main
	target ps_2_0
}

material Examples/Anisotropic
{
	technique
	{
		pass
		{
			vertex_program_ref vAnisotropic
			{
				param_named light2Position float4 300 500 1 1
				param_named light1Position float4 -500 500 1 1
				param_named vRingScale float4 0.25 0.5 0.75 1
				param_named vRingCenter float4 -1.5 -20 26 1
				param_named fAmbient float 0.1
				param_named texture_space_matrix float4x4 1.0 0.0 0.0 0.0 0.0 -2.0 -4.0 0.0 0.0 -1.0 -2.0 0.0 0.0 0.0 0.0 1.0 // didn't know what to do with this param really...
				param_named_auto view_proj_matrix worldviewproj_matrix
				param_named ambient float4 0.1294 0.1615 0.1327 1.0000
				param_named_auto mWorld worldview_matrix
				param_named_auto inv_view_matrix inverse_world_matrix
			}
			fragment_program_ref pAnisotropic
			{
				param_named ringAmbientColor float4 0.2314 0.2534 0.2024 1.0000
				param_named ringColor float4 1.0000 1.0000 1.0000 1.0000
				param_named lightColor float4 1.0000 0.9385 0.8440 1.0000
			}
			texture_unit
			{
				texture RingGradient.bmp
				tex_address_mode clamp
				filtering none
			}
			texture_unit
			{
				texture AnisoStrand.tga
				tex_address_mode clamp
				filtering none
			}
		}
	}
}

vertex program:-
---------------------------------------------------------------------------------

float4 light2Position;
float4 light1Position;
float4 vRingScale;
float4 vRingCenter;
float fAmbient;
float4x4 texture_space_matrix;
float4x4 view_proj_matrix;
float4 ambient;
float4x4 mWorld;
float4x4 inv_view_matrix;
struct VS_OUTPUT
{
	float4 ProjPos : POSITION;
	float4 Normal : COLOR0; // Has ambient in W
	float3 View : TEXCOORD0;
	float3 DirAniso : COLOR1;
	float4 TexPos : TEXCOORD1; // Has light intensity in W
	float3 Light1 : TEXCOORD2;
	float3 Light2 : TEXCOORD3;
};

VS_OUTPUT main( float4 inPos: POSITION, float4 inNormal: NORMAL )
{
	VS_OUTPUT o = (VS_OUTPUT)0;

	// Texture space coordinates:
	float3 texPos = ((float3)inPos) - vRingCenter;
	float3 texSpacePos = texPos * vRingScale + float3(0, -40, -40);

	texSpacePos = mul(texSpacePos, (float3x3)texture_space_matrix );

	o.TexPos.xyz = texSpacePos;

	// Ambient lighting:
	o.TexPos.w = ambient.x * 5;

	// Compute direction of anisotropy
	float3 dirAniso = cross(inNormal, normalize(texPos));

	o.ProjPos = mul( view_proj_matrix, inPos );

	// Output the normal vector ( w component contains ambient factor )
	o.Normal = float4( inNormal * .5 + (float3).5, fAmbient );

	// Propagate direction of anisotropy:
	o.DirAniso = dirAniso * 0.5 +(float3)0.5;

	// Compute camera position:
	float4 vCameraPosition = mul( inv_view_matrix, float4(0,0,0,1) );

	// Calculate view vector:
	o.View = normalize( vCameraPosition - mul( mWorld, inPos ) );

	// Compute light direction vectors:
	light1Position = mul( inv_view_matrix, light1Position );
	light2Position = mul( inv_view_matrix, light2Position );

	o.Light1 = normalize(light1Position - inPos );
	o.Light2 = normalize(light2Position - inPos );

	return o;
}

fragment program:-
-------------------------------------------------------------------------------------

float4 ringAmbientColor;
float4 ringColor;
float4 lightColor;
sampler ring_gradient;
sampler strand;
struct StrandPair
{
	float Diffuse;
	float Specular;
};

//----------------------------------------------------------------------------------------------//
// Pixel shader input struct //
//----------------------------------------------------------------------------------------------//
struct PS_INPUT
{
	float4 ProjPos : POSITION;
	float4 Normal : COLOR0; // Normal vector, w component stores ambient contribution coefficient
	float3 View : TEXCOORD0; // View vector
	float3 DirAniso : COLOR1; // Direction of anisotropy
	float4 TexPos : TEXCOORD1; // Lookup texture coordinate, w component stores light intensity coefficient
	float3 Light1 : TEXCOORD2; // Direction vector for light 1
	float3 Light2 : TEXCOORD3; // Direction vector for light 2
};

//----------------------------------------------------------------------------------------------//
StrandPair StrandLight( float3 normal, float3 light, float3 view, float3 dirAniso )
{
	StrandPair o;

	float LdA = dot( light, dirAniso );
	float VdA = dot( view, dirAniso );
	float2 fnLookup = tex2D( strand, float2( LdA, VdA ) * 0.5 + (float2)0.5 );
	float spec = fnLookup.y * fnLookup.y;
	float diff = fnLookup.x;
	float selfShadow = saturate( dot( normal, light ) );

	o.Diffuse = diff * selfShadow;
	o.Specular = spec * selfShadow;
	return o;
}

//----------------------------------------------------------------------------------------------//
// Pixel shader //
//----------------------------------------------------------------------------------------------//
float4 main( PS_INPUT i ) : COLOR
{
	// Extract components from inputs //
	float3 dirAniso = i.DirAniso * 2 - (float3)1;
	float3 normal = i.Normal * 2 - (float3)1;
	float3 view = (float3) i.View;
	float3 light[2] = { (float3) i.Light1, (float3)i.Light2 };
	float shadow = i.TexPos.w;
	float ambient = i.Normal.w;

	// Strand Lighting //
	float3 color = 0;
	for (int l = 0; l < 2; l++)
	{
		StrandPair strand = StrandLight(normal, light[l], view, dirAniso);
		color += (strand.Diffuse + strand.Specular) * lightColor;
	}

	float3 ringNewColor = tex1D(ring_gradient, length(i.TexPos)) * ringColor + ringAmbientColor;

	color = (color*shadow + ambient) * ringNewColor;

	return float4( color, 1);
}

The programs above are just copied and pasted from RenderMonkey - yup, that'll be the artist in me then...

Are texture samplers just matched up by the order they're declared in the Ogre material script, or is there more to it? I.e. the texture_unit declared first is the first sampler declared in the HLSL?

The other point which I reckon I've tripped up on is the declaration of the float4x4 texture_space_matrix, which is not an auto declaration as far as I can tell.

I hope you don't mind me asking, but how have you gone about capturing frames yourself? At the moment I just feel like such an eejit for not realising sooner that my graphics card could have (properly used) reduced my render time to an insignificant fraction of what it was... I did a small 360-degree animation to be indexed in Flash, raytraced using Rhino3D and Flamingo, about a month ago, and it literally involved hours of computing time. Any way I can get round that is worth it, even if I'm spending a lot of time learning C++, HLSL and anything in between in the meantime.

I'm taken with the idea of submitting some artwork once I work out what I'm doing with the shaders... maybe a razor replacement... I've noticed that "artists" have got some bad press on these forums for being tight-fisted; well, I'm certainly not intending to live up to any national stereotypes anyway.

cheers

Brian.
User avatar
sinbad
OGRE Retired Team Member
Posts: 19269
Joined: Sun Oct 06, 2002 11:19 pm
Location: Guernsey, Channel Islands
x 66
Contact:

Post by sinbad »

Just thought I'd chime in - did you check the Ogre.log and look for any error messages? Two things I can think of:

float4x4 is not a supported type in the Ogre material definitions (yet); you have to specify floatN, i.e. float16. This will map to an HLSL float4x4.
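
So for your matrix you would write something like this (your values copied over, just to illustrate the syntax):

Code: Select all

param_named texture_space_matrix float16 1.0 0.0 0.0 0.0  0.0 -2.0 -4.0 0.0  0.0 -1.0 -2.0 0.0  0.0 0.0 0.0 1.0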

I don't use HLSL much, and I only tested scripts which took all the parameters as inputs to the shader, not ones where the parameters are declared as globals. I've noticed recently that globals are quite common in HLSL definitions and I'm not currently sure whether they show up in the parameter list as reported by the HLSL compiler - I assume they do, though.

Anyway, keep an eye on Ogre.log and let us know what the errors are.
bloudon
Gnoblar
Posts: 5
Joined: Wed Sep 01, 2004 3:43 pm
Location: Glasgow

ogre.log

Post by bloudon »

Ta much for the tips. I changed that dodgy float4x4 declaration to float16, but as you can see below from the end of this Ogre.log, it crashed out when it got to my hlsl.material.

14:48:52: ***************************************
14:48:52: *** D3D9 : Subsystem Initialised OK ***
14:48:52: ***************************************
14:48:52: Parsing material script: Examples.program
14:48:52: Parsing material script: Example-Water.material
14:48:52: Parsing material script: Example.material
14:48:52: Parsing material script: Examples-Advanced.material
14:48:53: Parsing material script: hlsl.material

I was thinking I could use my code to force the value in there, but it'll still parse the material first...
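
Something like this is what I had in mind, though I'm only guessing at the API from the docs, so treat it as a sketch:

Code: Select all

// Rough idea only (untested; method names are from my reading of the docs).
// Given the pass that uses vAnisotropic, poke the matrix value in directly.
#include "Ogre.h"

void setTextureSpaceMatrix(Ogre::Pass* pass)
{
    Ogre::GpuProgramParametersSharedPtr params = pass->getVertexProgramParameters();
    params->setNamedConstant("texture_space_matrix",
        Ogre::Matrix4(1.0, 0.0, 0.0, 0.0,
                      0.0, -2.0, -4.0, 0.0,
                      0.0, -1.0, -2.0, 0.0,
                      0.0, 0.0, 0.0, 1.0));
}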

It would be easier for me to use nfz's GLSL demo as a learning tool, but I noticed that there isn't any full screen anti-aliasing, and as I was saying, I want to produce pre-rendered anti-aliased movies/images; plus I thought that DX9 was faster?

To answer your post here, http://www.ogre3d.org/phpBB2/viewtopic. ... c&start=50 since it's me asking the same daft questions... :oops: I just noticed that the cel shading demo DOES work; the only change is that I recently downloaded DX9 SDK 9.0c as opposed to 9.0b. But it didn't work before, honest... From what you're saying though, would GLSL code be parsed by Ogre and then run fine in an Ogre DX9 window?
nfz
OGRE Retired Team Member
Posts: 1263
Joined: Wed Sep 24, 2003 4:00 pm
Location: Halifax, Nova Scotia, Canada

Post by nfz »

You can get full screen anti-aliasing by setting it in the ATI control panel. It works for both DX9 and OpenGL rendering modes. I use 6X, which gets rid of all the jagged edges when doing video renderings at 640x480.

GLSL is only available in OpenGL rendering mode. Ogre does not parse the GLSL source code but passes it to the OpenGL API to get it compiled to GPU micro code (not assembly).

I have done comparisons between Cg, HLSL, and GLSL for performance and it's a close match between Cg and HLSL on ATI hardware in DX9. Cg (ARBfp and ARBvp) wins when compared against GLSL in OpenGL, but the GLSL compiler is young and gets better with each driver update. GLSL should eventually be faster since the compiler can optimize directly from source to the gfx card's GPU micro code without having to go through assembly.

When doing renderings to a file it's the speed of writing from the frame buffer to disk that has more of an impact on production time, not which rendering mode or high-level shader format you use.
bloudon
Gnoblar
Posts: 5
Joined: Wed Sep 01, 2004 3:43 pm
Location: Glasgow

Post by bloudon »

Very good point about the write-to-disk time. I'll have another go at something else in HLSL, and if I fail miserably I'll try GLSL then. Sounds like GLSL could be the way of the future; I would prefer to use something that isn't Microsoft-specific.

Talking of the time to write the buffer to disk, is it possible to make the frameEnded() method pause and get confirmation that the file has been written?
nfz
OGRE Retired Team Member
Posts: 1263
Joined: Wed Sep 24, 2003 4:00 pm
Location: Halifax, Nova Scotia, Canada

Post by nfz »

The code you posted above works with a few small corrections in the Ogre material script section. Here is a pic with full screen anti-aliasing 6X turned on:

Image

I think I might know why you are having some problems with HLSL in Ogre. When I submitted the GLSL patch it included some modifications to the DX9 rendering system to enhance HLSL support, but I forgot to include the changes to OgreD3D9HLSLProgram.cpp in the patch. Ogre 0.14.1 does not support variables of type D3DXPT_SAMPLER, D3DXPT_SAMPLER1D, D3DXPT_SAMPLER2D, D3DXPT_SAMPLER3D, D3DXPT_SAMPLERCUBE, which are needed for using texture samplers. I'll get the patch submitted ASAP.

Don't let the fps in the pic fool you. The material editor allows you to set the refresh rate so I normally use 50 fps. Banging away at 400 fps is just silly and a waste.
nfz
OGRE Retired Team Member
Posts: 1263
Joined: Wed Sep 24, 2003 4:00 pm
Location: Halifax, Nova Scotia, Canada

Post by nfz »

Talking of the time to write the buffer to disk, is it possible to make the frameEnded() method pause and get confirmation that the file has been written?
It's possible, but why?
nfz
OGRE Retired Team Member
Posts: 1263
Joined: Wed Sep 24, 2003 4:00 pm
Location: Halifax, Nova Scotia, Canada

Post by nfz »

Now I remember why I didn't include OgreD3D9HLSLProgram.cpp in the patch: I discovered it wasn't necessary to query the HLSL compiler for the sampler stage indexes since they are predictable. The first sampler variable defined is s0, the second is s1, and so on. The only time it doesn't work is if the sampler variable is not used in the shader, or if you are combining fragmented HLSL shaders (similar to shader objects in GLSL), which Ogre doesn't support (yet).

To sum it up: the first sampler you define in your HLSL source corresponds to the first texture_unit in the pass of the technique in the material script, the second sampler defined refers to the second texture_unit, and so on.
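
For example, if your pixel shader declares 'sampler ring_gradient;' followed by 'sampler strand;', then in the pass:

Code: Select all

texture_unit
{
	// bound to the first sampler declared in the HLSL source (ring_gradient -> s0)
	texture RingGradient.bmp
}
texture_unit
{
	// bound to the second sampler declared (strand -> s1)
	texture AnisoStrand.tga
}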
nfz
OGRE Retired Team Member
Posts: 1263
Joined: Wed Sep 24, 2003 4:00 pm
Location: Halifax, Nova Scotia, Canada

Post by nfz »

Image

Here is the version of the anisotropic shader I used for the image. I tailored it to Ogre and it produces a better result by normalizing all the direction vectors in the pixel shader.


vAnisotropic.txt:

Code: Select all

float4 light2Position; 
float4 light1Position; 
float4 vRingScale; 
float4 vRingCenter; 
float4 eyePosition;
float fAmbient; 
float4x4 texture_space_matrix; 
float4x4 view_proj_matrix; 
float shadow; 
// nfz: dont need inv_view_matrix since all vectors already in object space
//float4x4 mWorld; 
//float4x4 inv_view_matrix; 

struct VS_OUTPUT 
{ 
	float4 ProjPos : POSITION; 
	float4 Normal : COLOR0; // Has ambient in W 
	float3 View : TEXCOORD0; 
	float3 DirAniso : COLOR1; 
	float4 TexPos : TEXCOORD1; // Has light intensity in W 
	float3 Light1 : TEXCOORD2; 
	float3 Light2 : TEXCOORD3; 
}; 

VS_OUTPUT main( float4 inPos: POSITION, float4 inNormal: NORMAL ) 
{ 
	VS_OUTPUT o = (VS_OUTPUT)0; 

	// Texture space coordinates: 
	float3 texPos = ((float3)inPos) - vRingCenter; 
	float3 texSpacePos = texPos * vRingScale + float3(0, -40, -40); 

	texSpacePos = mul(texSpacePos, (float3x3)texture_space_matrix ); 

	o.TexPos.xyz = texSpacePos; 

	// Ambient lighting: 
	o.TexPos.w = shadow; 

	// Compute direction of anisotropy 
	float3 dirAniso = cross(inNormal, normalize(texPos)); 

	o.ProjPos = mul( view_proj_matrix, inPos ); 

	// Output the normal vector ( w component contains ambient factor ) 
	o.Normal = float4( inNormal.xyz * 0.5 + 0.5, fAmbient ); 

	// Propagate direction of anisotropy: 
	o.DirAniso = dirAniso * 0.5 +(float3)0.5; 

	// Compute camera position:
	// nfz: ogre can give us the eye position in object space
	//float4 vCameraPosition = mul( inv_view_matrix, float4(0,0,0,1) ); 

	// Calculate view vector: 
	//o.View = normalize( vCameraPosition - mul( mWorld, inPos ) ); 
	o.View = normalize( eyePosition - inPos ); 

	// Compute light direction vectors: 
	// nfz: ogre can give us light positions in object space
	//light1Position = mul( inv_view_matrix, light1Position ); 
	//light2Position = mul( inv_view_matrix, light2Position ); 

	o.Light1 = normalize(light1Position - inPos ); 
	o.Light2 = normalize(light2Position - inPos ); 

	return o; 
} 

pAnisotropic.txt:

Code: Select all

float4 ringAmbientColor; 
float4 ringColor; 
float4 lightColor; 
sampler ring_gradient; 
sampler strand; 

struct StrandPair 
{ 
	float Diffuse; 
	float Specular; 
}; 

//----------------------------------------------------------------------------------------------// 
// Pixel shader input struct // 
//----------------------------------------------------------------------------------------------// 
struct PS_INPUT 
{ 
	float4 ProjPos : POSITION; 
	float4 Normal : COLOR0; // Normal vector, w component stores ambient contribution coefficient 
	float3 View : TEXCOORD0; // View vector 
	float3 DirAniso : COLOR1; // Direction of anisotropy 
	float4 TexPos : TEXCOORD1; // Lookup texture coordinate, w component stores light intensity coefficient 
	float3 Light1 : TEXCOORD2; // Direction vector for light 1 
	float3 Light2 : TEXCOORD3; // Direction vector for light 2 
}; 

//----------------------------------------------------------------------------------------------// 
StrandPair StrandLight( float3 normal, float3 light, float3 view, float3 dirAniso ) 
{ 
	StrandPair o; 

	float LdA = dot( light, dirAniso ); 
	float VdA = dot( view, dirAniso ); 
	float2 fnLookup = tex2D( strand, float2( LdA, VdA ) * 0.5 + (float2)0.5 ); 
	float spec = fnLookup.y * fnLookup.y; 
	float diff = fnLookup.x; 
	float selfShadow = saturate( dot( normal, light ) ); 

	o.Diffuse = diff * selfShadow; 
	o.Specular = spec * selfShadow; 
	return o; 
} 


//----------------------------------------------------------------------------------------------// 
// Pixel shader // 
//----------------------------------------------------------------------------------------------// 
float4 main( PS_INPUT i ) : COLOR 
{ 
	// Extract components from inputs // 
	// nfz: normalize all the direction vectors fed from vertex program
	float3 dirAniso = normalize(i.DirAniso.xyz * 2 - 1); 
	float3 normal = normalize(i.Normal.xyz * 2 - 1); 
	float3 view = normalize(i.View).xyz; 
	float3 light[2] = { normalize(i.Light1.xyz), normalize(i.Light2.xyz) }; 
	float shadow = i.TexPos.w; 
	float ambient = i.Normal.w; 

	// Strand Lighting // 
	float3 color = 0; 
	for (int l = 0; l < 2; l++) 
	{ 
		StrandPair strand = StrandLight(normal, light[l], view, dirAniso); 
		color += (strand.Diffuse + strand.Specular) * lightColor; 
	} 

	float3 ringNewColor = tex1D(ring_gradient, length(i.TexPos)) * ringColor + ringAmbientColor; 

	color = (color*shadow + ambient) * ringNewColor; 

	return float4( color, 1); 

} 

Ogre Material Script:

Code: Select all

vertex_program vAnisotropic hlsl 
{ 
	source vAnisotropic.txt 
	entry_point main 
	target vs_1_1 
} 

fragment_program pAnisotropic hlsl 
{ 
	source pAnisotropic.txt 
	entry_point main 
	target ps_2_0 
} 

material HLSL/Anisotropic 
{ 
	technique 
	{ 
		pass 
		{ 
			vertex_program_ref vAnisotropic 
			{ 
				param_named_auto light1Position light_position_object_space 0
				param_named_auto light2Position light_position_object_space 1
				param_named_auto eyePosition camera_position_object_space
				param_named_auto view_proj_matrix worldviewproj_matrix 
				//param_named_auto mWorld worldview_matrix 
				//param_named_auto inv_view_matrix inverse_world_matrix 

				param_named vRingScale float4 0.25 0.5 0.75 1 
				param_named vRingCenter float4 -1.5 -20 26 1 
				param_named fAmbient float 0.1 
				param_named texture_space_matrix float16 1.0 0.0 0.0 0.0  0.0 -2.0 -4.0 0.0  0.0 -1.0 -2.0 0.0 0.0 0.0 0.0 1.0  
				param_named shadow float 0.5  
			} 

			fragment_program_ref pAnisotropic 
			{ 
				param_named ringAmbientColor float4 0.2314 0.2534 0.2024 1.0000 
				param_named ringColor float4 1.0000 1.0000 1.0000 1.0000 
				param_named lightColor float4 1.0000 0.9385 0.8440 1.0000
				 
			} 

			texture_unit 
			{ 
				texture RingGradient.bmp 
				tex_address_mode clamp 
				filtering none 
			} 

			texture_unit 
			{ 
				texture AnisoStrand.tga 
				tex_address_mode clamp 
				filtering none 
			} 


		} 
	} 

} 

The above shader code and material script work in Ogre 0.14.1 and in the latest CVS.
Last edited by nfz on Mon Sep 06, 2004 5:44 pm, edited 1 time in total.
Crashy
Google Summer of Code Student
Posts: 1005
Joined: Wed Jan 08, 2003 9:15 pm
Location: Lyon, France
x 49
Contact:

Post by Crashy »

Are you sure it works with Ogre 0.14.1? (It works with the latest CVS, I've just tested.)

It works perfectly in OpenGL (meaning the sample launches; there are errors in the Ogre.log, but that's normal since OpenGL doesn't support HLSL), but in DirectX9 (and Ogre 0.14.1) I get this:

Code: Select all

Runtime Error!

Program: blablablablabla.exe
abnormal program termination
:shock:
Follow la Moustache on Twitter or on Facebook
Image
nfz
OGRE Retired Team Member
Posts: 1263
Joined: Wed Sep 24, 2003 4:00 pm
Location: Halifax, Nova Scotia, Canada

Post by nfz »

Crashy: not sure what you are referring to. Your own app using the above material with Ogre 0.14.1?
Crashy
Google Summer of Code Student
Posts: 1005
Joined: Wed Jan 08, 2003 9:15 pm
Location: Lyon, France
x 49
Contact:

Post by Crashy »

I didn't create a specific sample to use this material.

I get this error whenever I include the .material file given in this post in the media directory, and it happens with every sample.

So it isn't a programming error on my part.

And as I said, it works perfectly with the latest CVS.
Follow la Moustache on Twitter or on Facebook
Image
User avatar
sinbad
OGRE Retired Team Member
Posts: 19269
Joined: Sun Oct 06, 2002 11:19 pm
Location: Guernsey, Channel Islands
x 66
Contact:

Post by sinbad »

It might be the use of 'float' in the parameter type. Previously, multiples of 4 were required, so you would need to use float4 and initialise the other members to 0 (this is how it's bound in the assembler anyway).
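
i.e. instead of 'param_named shadow float 0.5' you would write something like:

Code: Select all

param_named shadow float4 0.5 0 0 0

The HLSL 'float shadow' should then just pick up the x component of the register it is bound to.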
nfz
OGRE Retired Team Member
Posts: 1263
Joined: Wed Sep 24, 2003 4:00 pm
Location: Halifax, Nova Scotia, Canada

Post by nfz »

Oops, excellent point, Sinbad. I thought the 0.14.1 version I tried it with was unaltered; I should have checked before I opened my yap. Sorry about that.
Crashy
Google Summer of Code Student
Posts: 1005
Joined: Wed Jan 08, 2003 9:15 pm
Location: Lyon, France
x 49
Contact:

Post by Crashy »

Hello, I'll take advantage of this topic to post something about a shader I'm trying to import into OGRE.

My main question is about this error message, which I don't understand:
-----------------------------------
Error #: -2147467259
Function: D3D9HLSLProgram::loadFromSource
Description: Cannot assemble D3D9 high-level shader FurVPerror X3503: 'Transform_FID4': function return value missing semantics
.
Here is the shader code

Code: Select all

//*********************************************************************************
//*
//* Description: HLSL Shader Code
//*
//* Note: This shader was built using Mad Software's ShaderWorks 1.0 
//*
//*********************************************************************************




//---------------------------------------------------------------------------------
//Dynamic constants
//---------------------------------------------------------------------------------

float4x4 View_Projection_ID3 ;// SEM_MATRIX_VIEWPROJECTION;  //no comments
float4   Camera_Position_ID3 ;// SEM_VECTOR_CAMERA_POSITION; //no comments


//---------------------------------------------------------------------------------
//Functions
//---------------------------------------------------------------------------------

//Function (Block FID4)

struct OUTPUT_ID4
{
	float4 TPosition:POSITION;
	float4 Diffuse;
};

OUTPUT_ID4 Transform_FID4(float4 Vertex_Position:POSITION, float4 Vertex_Normal:NORMAL,uniform float4x4 View_Projection,uniform float4 Camera_Position)
{
	OUTPUT_ID4 Out = (OUTPUT_ID4)0;
	Out.TPosition = mul(Vertex_Position, View_Projection);
	
	float3 E = normalize( Vertex_Position.xyz - Camera_Position.xyz );
	
	float  I = saturate( dot( E, -Vertex_Normal.xyz ) );
	
	Out.Diffuse   = 0;
	    
	return Out;
}


//---------------------------------------------------------------------------------
//Input
//---------------------------------------------------------------------------------

struct INPUT
{
	float3 Vertex_Position : POSITION; //no comments
	float3 Vertex_Normal   : NORMAL;   //no comments
};


//---------------------------------------------------------------------------------
//Output
//---------------------------------------------------------------------------------

struct OUTPUT
{
	float4 TPosition : POSITION; //no comments
	float4 Color0    : COLOR0;   //no comments
};


//---------------------------------------------------------------------------------
//Shader entry
//---------------------------------------------------------------------------------

OUTPUT Shader(INPUT In)
{
	OUTPUT Out = (OUTPUT)0;


	//Block level

	float4 Vertex_Position_ID4 = float4(In.Vertex_Position, 1);
	float4 Vertex_Normal_ID4 = float4(In.Vertex_Normal, 1);

	OUTPUT_ID4 OUT_Transform_FID4 = Transform_FID4(Vertex_Position_ID4, Vertex_Normal_ID4, View_Projection_ID3, Camera_Position_ID3);

	//Block level

	Out.TPosition = OUT_Transform_FID4.TPosition;
	Out.Color0 = OUT_Transform_FID4.Diffuse;

	return Out;
}

And here is the material file. I'm not really sure about some things, especially the fragment program part.

Code: Select all

// -------------------------------
// Fur shader, from ShaderWorks
// -------------------------------

vertex_program FurVP hlsl
{
	source Fur.fx
	entry_point Transform_FID4
	target vs_2_0
}

fragment_program FurFP hlsl
{
	source Fur.fx
	entry_point Transform_FID4
	target ps_2_0
}

material Examples/Fur
{
	technique
	{
		pass
		{
			vertex_program_ref FurVP
			{
				param_named_auto View_Projection_ID3 worldviewproj_matrix
				param_named_auto Camera_Position_ID3 camera_position_object_space
			}
			fragment_program_ref FurFP
			{
			}
			texture_unit
			{
				texture texture.jpg
				tex_address_mode clamp
			}
			texture_unit
			{
				texture texture.jpg
				tex_address_mode clamp
			}
		}
	}
}
Follow la Moustache on Twitter or on Facebook
Image
User avatar
sinbad
OGRE Retired Team Member
Posts: 19269
Joined: Sun Oct 06, 2002 11:19 pm
Location: Guernsey, Channel Islands
x 66
Contact:

Post by sinbad »

The error is pretty definitive - you made your entry point 'Transform_FID4', which returns the following struct:

Code: Select all

struct OUTPUT_ID4
{
   float4 TPosition:POSITION;
   float4 Diffuse;
};
... however, the 'Diffuse' attribute does not have a semantic, which is invalid for the output of a program. The reason is that you used the wrong entry point. The entry point is actually 'Shader'.
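
So your vertex program definition should just point at 'Shader' instead, something like:

Code: Select all

vertex_program FurVP hlsl
{
	source Fur.fx
	entry_point Shader
	target vs_2_0
}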
Crashy
Google Summer of Code Student
Posts: 1005
Joined: Wed Jan 08, 2003 9:15 pm
Location: Lyon, France
x 49
Contact:

Post by Crashy »

Ok, but what type of semantic would I give to my Diffuse parameter?

Apart from that, there is still an error when changing the entry point to Shader.
I'll look into it this evening.

Thx for answer :)
Follow la Moustache on Twitter or on Facebook
Image
User avatar
epopov
Halfling
Posts: 85
Joined: Tue Jun 10, 2003 2:57 pm
Contact:

Post by epopov »

No need to give a semantic to diffuse, as Sinbad said, because the structure is not used by the entry point function but by a 'sub-function'.
Semantics only have to be defined for the in/out parameters of the entry point program.
nfz
OGRE Retired Team Member
Posts: 1263
Joined: Wed Sep 24, 2003 4:00 pm
Location: Halifax, Nova Scotia, Canada

Post by nfz »

epopov: good point. The reason it gave the error was that the Ogre material script had Transform_FID4 as the entry point, so the compiler had good reason to complain about Diffuse missing a semantic. The entry point should be "Shader" as Sinbad stated. The other problem is that a fragment_program is defined in the script with the entry point "Transform_FID4", which is incorrect. There is no fragment program listed above, so I don't know what the entry point would be. If it's a standalone vertex program (it looks like it) then don't define a fragment program in the material script.
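
In other words, the pass would just reference the vertex program, something like this (sketch only, based on the script you posted):

Code: Select all

pass
{
	vertex_program_ref FurVP
	{
		param_named_auto View_Projection_ID3 worldviewproj_matrix
		param_named_auto Camera_Position_ID3 camera_position_object_space
	}
}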
nfz
OGRE Retired Team Member
Posts: 1263
Joined: Wed Sep 24, 2003 4:00 pm
Location: Halifax, Nova Scotia, Canada

Post by nfz »

I checked the fur examples in ShaderWorks and I think I found the example fur shader you are trying to import into Ogre. It doesn't have a fragment program for the first pass; it just has a vertex program. The passes after the first one use a different vertex program and also use a fragment program.
Crashy
Google Summer of Code Student
Posts: 1005
Joined: Wed Jan 08, 2003 9:15 pm
Location: Lyon, France
x 49
Contact:

Post by Crashy »

Okay, I understand. I wasn't sure whether or not I needed to define a fragment program in the material when there is no fragment program.
For the moment the sample launches, but the object is invisible and I have strange artifacts, like black surfaces on the screen.
See this screenshot:
http://www.antharia.org/include/screens ... shot_1.jpg

And in the .log, this message:
-----------------------------------
Details:
-----------------------------------
Error #: 7
Function: GpuProgramParameters::getParamIndex
Description: Cannot find a parameter named Camera_Position_ID3.
File: F:\Ogre14\Ogre14\ogrenew\OgreMain\src\OgreGpuProgram.cpp
Line: 393
Stack unwinding: <<beginning of stack>>
18:49:05: Error in material Examples/Fur at line 22 of fur.material: Invalid param_named_auto attribute - An exception has been thrown!

-----------------------------------
Details:
-----------------------------------
Error #: 7
Function: GpuProgramParameters::getParamIndex
Description: Cannot find a parameter named Camera_Position_ID3.
Note that I reset the shader program to how it was in ShaderWorks, because the modifications I made had no effect (good or bad).

Here is a pic of the shader taken in ShaderWorks.
http://www.antharia.org/include/screens/furpic.jpg

Note that I did the same as for bump mapping, building the tangents for the mesh.
Follow la Moustache on Twitter or on Facebook
Image
Captain Nemo
Greenskin
Posts: 134
Joined: Sun May 02, 2004 5:06 pm
Location: Kassel, Germany
Contact:

Post by Captain Nemo »

nfz wrote: To sum it up: the first sampler you define in your HLSL source corresponds to the first texture_unit in the pass of the technique in the material script, the second sampler defined refers to the second texture_unit, and so on.
I would like to resurrect this thread, because I am having a problem assigning texture units to my HLSL shader. I have a simple fragment program (just for testing the feature):

Code: Select all

struct VS_OUTPUT
{
    float2 vTex0  : TEXCOORD0;
};

sampler sampl0;
sampler sampl1;

float4 main(VS_OUTPUT In) : COLOR
{  
	float4 texColor0 = tex2D(sampl0, In.vTex0);
	float4 texColor1 = tex2D(sampl1, In.vTex0);
	return texColor0 * texColor1;
}
and a corresponding material with two texture units in the same pass. According to the post above, the first texture unit should be assigned to sampl0 and the second to sampl1, but texColor1 is always sampled from the first texture. What is wrong here? Am I misunderstanding something very simple? (Using 0.15)