[2.3] Combining the Tutorials OpenVR and Sky_Postprocess

A place for users of OGRE to discuss ideas and experiences of utilising OGRE in their games / demos / applications.
Slamy
Gnoblar
Posts: 6
Joined: Sat Mar 27, 2021 10:49 pm

[2.3] Combining the Tutorials OpenVR and Sky_Postprocess

Post by Slamy »

Hello all. As I'm new to this board, I think I shall introduce myself first.
My name is Slamy and I like to program stuff in my free time for fun. In January I got my hands on a VR headset, as I thought it was time to try one out. After playing some games, I thought it would be cool to develop something for it. I tried Unity and Unreal, but it was all too much, and since I'm already familiar with SDL and a little bit of OpenGL, I searched for 3D game engines; ogre-next seemed to be a good choice, as it already comes with a working VR example.
After tinkering with it for quite some time, I wanted to get rid of the blue sky and replace it with a skybox. Luckily there was already a working example of a skybox using the compositor. I tried to combine the two, as I thought that adding a render_quad to main_stereo_render might do the trick.


abstract target rt_renderwindow
{
	//Render opaque stuff
	pass render_scene
	{
		load
		{
			all				clear
			clear_colour	0.2 0.4 0.6 1
		}
		store
		{
			depth	dont_care
			stencil	dont_care
		}

		overlays	off

		shadows		ShadowMapDebuggingShadowNode
	}
	
	//Render sky after opaque stuff (performance optimization)
	pass render_quad
	{
		quad_normals	camera_direction
		material SkyPostprocess
	}
}

abstract target main_stereo_render
{
	//Eye render
	pass render_scene
	{
		load
		{
			all				clear
			clear_colour	0.2 0.4 0.6 1
		}
		store
		{
			depth	dont_care
			stencil	dont_care
		}

		//0x01234567
		identifier 19088743

		overlays	off

		cull_camera VrCullCamera

		shadows		ShadowMapDebuggingShadowNode

		instanced_stereo true
		viewport 0 0.0 0.0 0.5 1.0
		viewport 1 0.5 0.0 0.5 1.0
		
	}

	pass render_quad
	{
		quad_normals	camera_direction
		material SkyPostprocess
		viewport 0.0 0.0 0.5 1.0
	}
	
	pass render_quad
	{
		quad_normals	camera_direction
		material SkyPostprocess
		viewport 0 0.5 0.0 0.5 1.0
	}
}



compositor_node Tutorial_OpenVRNodeNoRDM
{
	in 0 stereo_output

	target stereo_output : main_stereo_render {}
}

compositor_node Tutorial_OpenVRMirrorWindowNode
{
	in 0 rt_renderwindow

	target rt_renderwindow : rt_renderwindow {}

}


workspace Tutorial_OpenVRWorkspaceNoRDM
{
	connect_output Tutorial_OpenVRNodeNoRDM 0
}

workspace Tutorial_OpenVRMirrorWindowWorkspace
{
	//connect_output Tutorial_OpenVRMirrorWindowNode 0
	connect_output Tutorial_OpenVRNodeNoRDM 0
}
Sadly though, it only looks as expected if the define USE_OPEN_VR is inactive. It then goes for the NullCompositorListener and I can see both eyes; if I rotate, the scene and the skybox match up. But if I activate USE_OPEN_VR, it seems to alter the projection, and if I rotate the view it looks weird. It almost seems that the FOV of the rendered scene doesn't match the skybox.
I had some thoughts, but I can't put my finger on it. Maybe it's because the eye positions are equal on the NullCompositorListener but not on the OpenVRCompositorListener?
Is "pass render_quad" even considered a good option for something which also uses "instanced_stereo true" in the rendered scene?
To my shame I'm new to Ogre and also new to OpenVR. So maybe educating myself in both topics at the same time was a bad way to learn. :?

I hope I haven't broken any forum rules by asking this. Maybe someone has an idea for me. :wink:

Slamy

dark_sylinc
OGRE Team Member
Posts: 4654
Joined: Sat Jul 21, 2007 4:55 pm
Location: Buenos Aires, Argentina

Re: [2.3] Combining the Tutorials OpenVR and Sky_Postprocess

Post by dark_sylinc »

Hi!

Like you found out, running a render_quad in VR is not trivial: you need to figure out the math of the reprojections and make sure each half of the image gets treated correctly.

We tried to get the sky-as-a-postprocess sample working in VR, but there was an issue with aspect ratio and tilting that caused a slight distortion in the sky (and that makes you really dizzy in VR). Thus it's better and easier to use SceneManager::setSky, which works in VR.

Just call that function and you're set.

Cheers
Matias

Slamy

Re: [2.3] Combining the Tutorials OpenVR and Sky_Postprocess

Post by Slamy »

Hi!
First of all, thank you for your fast answer.
I didn't know there was a function like setSky. An example is surely lacking there. :D
But it seems there is some bug in it.
I've added this to Tutorial_OpenVRGameState::createScene01(void) to be sure I'm basing this on the example and that my own project isn't at fault:


sceneManager->setSky(true, Ogre::SceneManager::SkyCubemap, "SaintPetersBasilica.dds",
						 Ogre::ResourceGroupManager::AUTODETECT_RESOURCE_GROUP_NAME);
But it seems that only the left eye gets the sky rendered, while the right eye is cleared to plain blue. Maybe I'm still missing something?

dark_sylinc

Re: [2.3] Combining the Tutorials OpenVR and Sky_Postprocess

Post by dark_sylinc »

Mmmm that's weird. Unlit isn't using instanced stereo.

Arrghh!!! It uses low level materials. I forgot!

QuadCameraDirNoUV_vs would have to be edited to use gl_ViewportIndex and gl_InstanceID & 0x02 to determine which eye you're rendering to.

This was quite the oversight.

Something like:


gl_Position.xy = (worldViewProj * vec4( vertex.xy, 0, 1.0f )).xy;
if( (gl_InstanceID & 0x02) != 0 )
    gl_Position.x += 0.5f;
gl_ViewportIndex = (gl_InstanceID & 0x02) >> 1; // shift so the index is 0 or 1, not 0 or 2
May just work, but needs testing

Slamy

Re: [2.3] Combining the Tutorials OpenVR and Sky_Postprocess

Post by Slamy »

I'm feeling like I just did a math test.
Something like this at least got the sky rendered on the right eye as well.


#version ogre_glsl_ver_330
#extension GL_ARB_shader_viewport_layer_array : require

vulkan_layout( OGRE_POSITION )	in vec2 vertex;
vulkan_layout( OGRE_NORMAL )	in vec3 normal;

in uint drawId;

vulkan( layout( ogre_P0 ) uniform Params { )
	uniform vec2 rsDepthRange;
	uniform mat4 worldViewProj;
vulkan( }; )

out gl_PerVertex
{
	vec4 gl_Position;
};

vulkan_layout( location = 0 )
out block
{
	vec3 cameraDir;
} outVs;

void main()
{
	gl_Position.xy = (worldViewProj * vec4( vertex.xy, 0, 1.0f )).xy;

	gl_Position.z = rsDepthRange.y;
	gl_Position.w = 1.0f;
	
	outVs.cameraDir.xyz	= normal.xyz;
	gl_ViewportIndex	= int( drawId & 0x01u );
}
But something is still wrong, and it seems to be the FOV.

When I remove


//mRenderSystem->_convertOpenVrProjectionMatrix(projectionMatrix[i], projectionMatrixRS[i]);
and provide a fixed matrix like the one below, I of course get the wrong transformation for my VR headset, but at least the FOV of the rendered scene and of QuadCameraDirNoUV_vs match up: when I rotate the camera, the scene stays aligned with my skybox.


		Ogre::Matrix4 projectionMatrixRS[2] = {mCamera->getProjectionMatrixWithRSDepth(),
											   mCamera->getProjectionMatrixWithRSDepth()};
So there is something missing. It's almost as if the FOV is taken not from the OpenVR projection matrix but from a standard one instead.
These photos show this.

[two screenshots]

The skybox looks like a direct copy with no dependence on the VR projection matrix.
In my defense, I don't know anything about GLSL. This is all very new to me.

Slamy

Re: [2.3] Combining the Tutorials OpenVR and Sky_Postprocess

Post by Slamy »

dark_sylinc wrote:
Mon Mar 29, 2021 7:05 am
Arrghh!!! It uses low level materials. I forgot!
Ok, this is something I now understand. In OgreHlmsPbs.cpp there are these lines:


        if( !isInstancedStereo )
        {
            // some non vr stuff :-D
        }
        else
        {
            //float4x4 viewProj[2];
            Matrix4 vrViewMat[2];
            for( size_t eyeIdx=0u; eyeIdx<2u; ++eyeIdx )
            {
                vrViewMat[eyeIdx] = cameras.renderingCamera->getVrViewMatrix( eyeIdx );
                Matrix4 vrProjMat = cameras.renderingCamera->getVrProjectionMatrix( eyeIdx );
                if( renderPassDesc->requiresTextureFlipping() )
                {
                    vrProjMat[1][0] = -vrProjMat[1][0];
                    vrProjMat[1][1] = -vrProjMat[1][1];
                    vrProjMat[1][2] = -vrProjMat[1][2];
                    vrProjMat[1][3] = -vrProjMat[1][3];
                }
                Matrix4 viewProjMatrix = vrProjMat * vrViewMat[eyeIdx];
                for( size_t i=0; i<16; ++i )
                    *passBufferPtr++ = (float)viewProjMatrix[0][i];
            }

            //float4x4 leftEyeViewSpaceToCullCamClipSpace
            if( forwardPlus )
            {
                Matrix4 cullViewMat = cameras.cullingCamera->getViewMatrix( true );
                Matrix4 cullProjMat = cameras.cullingCamera->getProjectionMatrix();
                if( renderPassDesc->requiresTextureFlipping() )
                {
                    cullProjMat[1][0] = -cullProjMat[1][0];
                    cullProjMat[1][1] = -cullProjMat[1][1];
                    cullProjMat[1][2] = -cullProjMat[1][2];
                    cullProjMat[1][3] = -cullProjMat[1][3];
                }
                Matrix4 leftEyeViewSpaceToCullCamClipSpace;
                leftEyeViewSpaceToCullCamClipSpace = cullProjMat * cullViewMat *
                                                     vrViewMat[0].inverseAffine();
                for( size_t i=0u; i<16u; ++i )
                    *passBufferPtr++ = (float)leftEyeViewSpaceToCullCamClipSpace[0][i];
            }
It fills the PassBuffer of the Pbs material, which contains:


layout( std140, binding = 0 ) uniform PassBuffer
{
	mat4 viewProj[2];
	vec4 leftToRightView;
Now the QuadCameraDirNoUV_vs.glsl has only


vulkan( layout( ogre_P0 ) uniform Params { )
	uniform vec2 rsDepthRange;
	uniform mat4 worldViewProj;
vulkan( }; )
So this means that for low-level materials, the dependency on the isInstancedStereo mode is missing; it only exists for Pbs and Unlit.
I think I would be close to a solution if I knew the place where the worldViewProj matrix is provided to the vertex shader. I'm currently unable to find it.
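For comparison, a stereo-aware version of that uniform block might look like the sketch below. This is only an illustration: the `worldViewProj[2]` array is hypothetical, and nothing on the C++ side fills a second matrix today, which is exactly the missing dependency.

```glsl
vulkan( layout( ogre_P0 ) uniform Params { )
	uniform vec2 rsDepthRange;
	// Hypothetical: one matrix per eye, mirroring the viewProj[2]
	// member of the Hlms PassBuffer quoted above.
	uniform mat4 worldViewProj[2];
vulkan( }; )
```

The vertex shader could then pick its matrix with `worldViewProj[drawId & 0x01u]`, the same eye-selection trick already used for `gl_ViewportIndex`.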

dark_sylinc

Re: [2.3] Combining the Tutorials OpenVR and Sky_Postprocess

Post by dark_sylinc »

Yes! The code responsible for handling this for low-level materials is in AutoParamDataSource (see AutoParamDataSource::getWorldViewProjMatrix).

IMHO this would be better handled by adding a new auto param for VR matrices. See ACT_WORLDVIEWPROJ_MATRIX.
You'd basically have to add a new ACT_*_MATRIX and a new function to AutoParamDataSource for handling the stereo matrices.

Since you need to send two matrices, you can look at ACT_WORLD_MATRIX_ARRAY_3x4 / ACT_WORLD_MATRIX_ARRAY for reference, which send more than one matrix to the shader.
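For reference, the place where a low-level material wires an auto constant to a shader uniform is the program declaration in the material script. A sketch (the program name here is made up, and the actual declaration for Quad.program may look different):

```
vertex_program MySkyQuad_vs glsl
{
	source QuadCameraDirNoUV_vs.glsl

	default_params
	{
		// This line binds ACT_WORLDVIEWPROJ_MATRIX to the shader's
		// worldViewProj uniform; a new VR auto param would be hooked
		// up the same way under its own keyword.
		param_named_auto worldViewProj worldviewproj_matrix
	}
}
```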

Slamy

Re: [2.3] Combining the Tutorials OpenVR and Sky_Postprocess

Post by Slamy »

It seems I'm not up to this task.
I've added an ACT_VR_WORLDVIEWPROJ_MATRIX to the auto params and also a
const Matrix4& AutoParamDataSource::getVrWorldViewProjMatrix(size_t eye) const
to accompany it.
I've integrated getVrProjectionMatrix() into it and first thought that getVrViewMatrix would be needed as well. But this wasn't the case: the cube seems to use the identity view, so I guess the view matrix should be the identity even for VR.

But even after applying all this and making the shader use the newly added matrix pair (by adding it to Quad.program), there is still something off.
The view of the sky is squashed horizontally, and I need to apply a 2× scale on X to undo it. Then there was also a mysterious 0.43 scaling factor that had to be applied to all axes to get the FOV right; I got this number empirically through visual experimentation.
Then, as you also mentioned, there is an offset applied in the shader for both eyes, depending on the IPD, I guess?

The result is a small window in the view which allows a peek into the skybox without much eye damage. But this can't be the solution and it still feels wrong.
[screenshot]

I've forked the main repo and put everything I did into it. It should compile out of the box, as only small things were changed.
https://github.com/OGRECave/ogre-next/c ... re/skyTest

I think I'll go just with 6 planes and Hlms Unlit to avoid any further headaches.
