Oculus Rift in Ogre

A place for users of OGRE to discuss ideas and experiences of utilising OGRE in their games / demos / applications.
Kojack
OGRE Moderator
Posts: 7157
Joined: Sun Jan 25, 2004 7:35 am
Location: Brisbane, Australia
x 534

Re: Oculus Rift in Ogre

Post by Kojack »

It's not that I don't trust them, it's just that I don't like the idea of the post-process shaders being hidden inside a library (they compile the shaders then embed them into the lib) with no way to change them. There might be things we want to change, or special effects we can do there that will be much more efficient than doing them in the render textures (in theory the render textures have 3 times the pixels of the native display, and the distortion mesh only renders to 75% of the native display). So any post processing that can be done in the distortion shader will have a dramatically lower hit than doing it to the undistorted render textures.

The approach I'm looking at will get the distortion mesh from the SDK (basically a 64 x 64 grid for each eye, so around 15876 triangles in total), then send that to the screen using a manual render call. That should be a tiny bit of code, except manual render is poorly documented, and people who asked for info on the forum were told not to use it if they didn't understand it. I think I've got it partly working.
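
Roughly, that would look something like this (just a sketch, not the actual code: it uses Ogre::ManualObject instead of a raw RenderSystem::_render() call, it assumes SDK 0.3.x vertex field names, and buildDistortionMesh plus the material name are only placeholders):

Code:

#include <OgreSceneManager.h>
#include <OgreManualObject.h>
#include <OVR_CAPI.h>

// Build one eye's distortion mesh as an Ogre::ManualObject.
// Assumes Oculus SDK 0.3.x field names (Pos, TexR/TexG/TexB); 0.4.x renamed them.
Ogre::ManualObject* buildDistortionMesh(Ogre::SceneManager* sceneMgr,
                                        ovrHmd hmd, ovrEyeType eye,
                                        ovrFovPort fov,
                                        const Ogre::String& materialName)
{
    ovrDistortionMesh meshData;
    ovrHmd_CreateDistortionMesh(hmd, eye, fov,
                                ovrDistortionCap_Chromatic | ovrDistortionCap_Vignette,
                                &meshData);

    Ogre::ManualObject* mo = sceneMgr->createManualObject();
    mo->setUseIdentityProjection(true);   // vertex positions are already in NDC
    mo->setUseIdentityView(true);
    mo->begin(materialName, Ogre::RenderOperation::OT_TRIANGLE_LIST);

    // One position plus three UV sets per vertex (one per colour channel),
    // matching the inputs of the SDK's distortion pixel shader.
    for (unsigned int i = 0; i < meshData.VertexCount; ++i)
    {
        const ovrDistortionVertex& v = meshData.pVertexData[i];
        mo->position(v.Pos.x, v.Pos.y, 0.0f);
        mo->textureCoord(v.TexR.x, v.TexR.y);
        mo->textureCoord(v.TexG.x, v.TexG.y);
        mo->textureCoord(v.TexB.x, v.TexB.y);
    }
    for (unsigned int i = 0; i < meshData.IndexCount; ++i)
        mo->index(meshData.pIndexData[i]);

    mo->end();
    ovrHmd_DestroyDistortionMesh(&meshData);
    return mo;
}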
cybereality
Hobgoblin
Posts: 563
Joined: Wed Jul 12, 2006 5:40 pm
x 12

Re: Oculus Rift in Ogre

Post by cybereality »

@Kojack: Yes, I see what you are saying. On one hand, it's nice to have a pre-built "plug and play" solution for the distortion (since almost everyone was getting it wrong rolling their own). On the other hand, you do give up some basic control over the rendering when you do this. For people using something like Unity, it makes perfect sense that they don't want to mess with things. People using DirectX/OpenGL directly are probably doing more custom stuff and want full control.
amartin
Halfling
Posts: 87
Joined: Wed Aug 14, 2013 6:55 am
Location: Norway
x 13

Re: Oculus Rift in Ogre

Post by amartin »

I think one of the big problems with rolling your own and getting it wrong, at least in my experience, is that it was very hard to tell whether you in fact had it right. There seem to be two types of distortion I've seen. In the first, the distorted areas just touch in the centre but have a gap at the top, bottom and sides. In the second, the distorted areas touch the top, bottom and sides but get cut off a bit in the centre. It's really hard to tell if one is more correct than the other; a quick Google image search showed that both had been done, and both of them seem to look OK when viewed through the goggles.

At least with the SDK distortion there is now something to test the hand-rolled approach against, which is probably priceless.
Kojack
OGRE Moderator
Posts: 7157
Joined: Sun Jan 25, 2004 7:35 am
Location: Brisbane, Australia
x 534

Re: Oculus Rift in Ogre

Post by Kojack »

One concern is that Ogre wouldn't know that something else is fiddling with DirectX/OpenGL, and the Oculus SDK wouldn't know it's being called by Ogre. That sounds potentially troublesome. It could be fine; I've never tried mixing Ogre with native DX/GL API calls.

With the new mesh system for distortion (clever solution), it should be harder for people to get things wrong, since most of the distortion math is now done in the SDK. The pixel shader does very little now.
Here's the pixel shader from the SDK docs:

Code:

Texture2D Texture : register(t0);
SamplerState Linear : register(s0);
float4 main(in float4 oPosition : SV_Position, in float4 oColor : COLOR, in float2 oTexCoord0 : TEXCOORD0, in float2 oTexCoord1 : TEXCOORD1, in float2 oTexCoord2 : TEXCOORD2) : SV_Target
{
   // 3 samples for fixing chromatic aberrations
   float ResultR = Texture.Sample(Linear, oTexCoord0.xy).r;
   float ResultG = Texture.Sample(Linear, oTexCoord1.xy).g;
   float ResultB = Texture.Sample(Linear, oTexCoord2.xy).b;
   return float4(ResultR * oColor.r, ResultG * oColor.g, ResultB * oColor.b, 1.0);
}
Far simpler than before; the previous one needed 6 extra parameters. This should be faster too, thanks to the shape of the meshes (not filling the screen).
amartin
Halfling
Posts: 87
Joined: Wed Aug 14, 2013 6:55 am
Location: Norway
x 13

Re: Oculus Rift in Ogre

Post by amartin »

I'm not sure if you saw it, but this was posted earlier. If you are looking at mesh-based distortion it's worth the read, just for the comparison and background they give.
bullale wrote: While this thread has the attention of a few people that are quite capable with Ogre, I would like to point you to a thread on mtbs3d.
http://www.mtbs3d.com/phpBB/viewtopic.p ... 659ee8e9e5 which links to a blog which links to this (very short) research paper: http://www.qwrt.de/pdf/Improved-Pre-War ... splays.pdf

Their goal is to decrease the blurriness caused by the barrel distortion and chromatic aberration correction. They compared the following techniques:
  • Bicubic texture interpolation in the pixel shader (instead of bilinear texture lookup).
  • Barrel-warping the scene (vertex-space).
  • Over-sampling.
They also mentioned some chromatic aberration correction techniques that I don't understand.

Barrel-warping the scene seems to work the best, but it requires the ability to specify how the scene is sampled (i.e., not using a typical rectilinear grid). They describe this as warping the rays that are cast from the camera to the scene. Is this something that could be implemented in Ogre? One thing that wasn't clear to me was whether or not this warping had to be done separately for each color channel. If so, it may be too slow.

*Edit* The mtbs3d thread has a link to the following which is definitely worth a look and has some sample code for barrel warping in the vertex shader: http://github.prideout.net/barrel-distortion/
Kojack
OGRE Moderator
Posts: 7157
Joined: Sun Jan 25, 2004 7:35 am
Location: Brisbane, Australia
x 534

Re: Oculus Rift in Ogre

Post by Kojack »

The improved technique they mention is ray tracing, which ogre can't natively do. As soon as I saw Saarland University mentioned at the top of the paper I knew ray tracing would be in there somewhere (they made the first raytracing chip).

It's odd that in the performance part they measure normal rendering using ms and ray tracing with fps. 2ms vs 20fps doesn't sound as bad as 500fps vs 20fps. :)

Ray tracing is definitely the way to go, especially if we get wider fov (normal gpu rendering can't handle very wide fov; the frustum approaches infinitely wide as the fov approaches 180 degrees. Ray tracing can do any fov, even greater than 360). But we still don't have hardware that can handle it at the required speeds. The PowerVR ray tracing chip for mobiles sounds interesting, but only if they dramatically boost its power and put it on PC. :)
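
To put a number on the wide-fov problem (a standalone toy snippet, not from any project): for a planar projection the image plane half-width grows as tan(fov/2), which blows up as the fov approaches 180 degrees.

Code:

#include <cmath>
#include <cstdio>

int main()
{
    const double pi = 3.14159265358979323846;
    const double fovs[] = { 90.0, 120.0, 150.0, 170.0, 179.0 };
    for (unsigned i = 0; i < sizeof(fovs) / sizeof(fovs[0]); ++i)
    {
        // Half-width of the image plane at unit distance from the camera.
        double halfWidth = std::tan(fovs[i] * pi / 360.0);
        std::printf("fov %5.1f deg -> image plane half-width %7.2f\n", fovs[i], halfWidth);
    }
    return 0;
}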
Gdlk
Halfling
Posts: 53
Joined: Mon Dec 05, 2011 9:43 pm
x 1

Re: Oculus Rift in Ogre

Post by Gdlk »

Hi there!

I am trying to use the Oculus 0.3.2 SDK distortion rendering approach with Ogre 1.9 (Linux/OpenGL), but I can't get it to work. I replicated the TaaTT4 solution ( http://www.ogre3d.org/forums/viewtopic. ... 00#p505652 ), but I only got a black screen =( ... (When I deactivate the Oculus rendering, i.e. don't call ovrHmd_ConfigureRendering, the whole scene and the RTT textures show fine.)

Has anyone had the same problem, or does anyone have an idea what it could be?

Thanks!!

PS1: Just out of curiosity: if I let Ogre call swapBuffers and the Oculus SDK call it too, the app would still render to the screen anyway, just with a performance penalty, right?

PS2: Just out of curiosity (again =P ), Kojack: as far as you have developed it, has it been very difficult to integrate the other approach with Ogre? (It would be nice to have both ways working.)
TaaTT4
OGRE Contributor
Posts: 267
Joined: Wed Apr 23, 2014 3:49 pm
Location: Bologna, Italy
x 75

Re: Oculus Rift in Ogre

Post by TaaTT4 »

Gdlk wrote: I am trying to use the Oculus 0.3.2 SDK distortion rendering approach with Ogre 1.9 (Linux/OpenGL), but I can't get it to work. I replicated the TaaTT4 solution ( http://www.ogre3d.org/forums/viewtopic. ... 00#p505652 ), but I only got a black screen =( ... (When I deactivate the Oculus rendering, i.e. don't call ovrHmd_ConfigureRendering, the whole scene and the RTT textures show fine.)

Has anyone had the same problem, or does anyone have an idea what it could be?
You have to set the render target where the Oculus SDK will render and swap to.
Do it before calling the Oculus SDK EndFrame function.

Code:

Root::getSingleton().getRenderSystem()->_setRenderTarget(m_window);

ovrHmd_EndFrame(m_hmd);
Gdlk
Halfling
Posts: 53
Joined: Mon Dec 05, 2011 9:43 pm
x 1

Re: Oculus Rift in Ogre

Post by Gdlk »

TaaTT4 wrote:
Gdlk wrote: I am trying to use the Oculus 0.3.2 SDK distortion rendering approach with Ogre 1.9 (Linux/OpenGL), but I can't get it to work. I replicated the TaaTT4 solution ( http://www.ogre3d.org/forums/viewtopic. ... 00#p505652 ), but I only got a black screen =( ... (When I deactivate the Oculus rendering, i.e. don't call ovrHmd_ConfigureRendering, the whole scene and the RTT textures show fine.)

Has anyone had the same problem, or does anyone have an idea what it could be?
You have to set the render target where the Oculus SDK will render and swap to.
Do it before calling the Oculus SDK EndFrame function.

Code:

Root::getSingleton().getRenderSystem()->_setRenderTarget(m_window);

ovrHmd_EndFrame(m_hmd);

Thanks TaaTT4!!

I got the viewport background color! (Actually it was my fault, because I had set it to black -.-'.) However, I can't show objects yet =( (even setting the render target like you said).

The strange thing is, if I set up a normal Ogre application and only call up to ovrHmd_ConfigureRendering (i.e., never call ovrHmd_BeginFrame, ovrHmd_BeginEyeRender or ovrHmd_EndFrame, and let Ogre do the swapBuffers), the window only shows the viewport background color, although _getNumRenderedFaces on the viewport looks OK (if I move the camera, the number of faces changes according to the meshes that theoretically should be shown there).
Gdlk
Halfling
Posts: 53
Joined: Mon Dec 05, 2011 9:43 pm
x 1

Re: Oculus Rift in Ogre

Post by Gdlk »

Well, the problem turned out to be the same one you had, TaaTT4, and the solution works the same way as the one you described in your post =)

Edited: a cleaner way

After calling ovrHmd_ConfigureRendering, add the following line:

- glUseProgram(0);
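
In context, the fix looks roughly like this (a sketch assuming the SDK 0.3.2 ovrHmd_ConfigureRendering signature; m_hmd, glConfig, eyeFov and m_eyeRenderDesc stand in for your own variables):

Code:

// Sketch: the SDK seems to leave its own GL program bound after configuring
// rendering, which clashes with Ogre's GL state tracking; resetting the binding
// cures the background-colour-only output described above.
ovrHmd_ConfigureRendering(m_hmd, &glConfig.Config,
                          ovrDistortionCap_Chromatic | ovrDistortionCap_Vignette,
                          eyeFov, m_eyeRenderDesc);

glUseProgram(0);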

Regards!!
Gdlk
Halfling
Posts: 53
Joined: Mon Dec 05, 2011 9:43 pm
x 1

Re: Oculus Rift in Ogre

Post by Gdlk »

Hi there!

I have another strange problem with Oculus 0.3.2 (Linux/OpenGL)... With the Oculus rendering active the whole time, when I look at the intermediate RTT texture (saving it to a file, for example), it looks fine. However, when it is shown on screen (i.e., the Oculus SDK takes the texture and does its stuff), the image is flipped (like in a mirror).

Has anyone had the same problem, or does anyone have an idea what it could be?

Thanks!!
Gdlk
Halfling
Posts: 53
Joined: Mon Dec 05, 2011 9:43 pm
x 1

Re: Oculus Rift in Ogre

Post by Gdlk »

I don't know why I had that problem, but I was able to solve it...

If someone has this problem:

In the file CAPI_GL_DistortionRenderer.cpp, in the method renderDistortion, before the call to DistortionShader->setUniform2f(...), add:

- eachEye[eyeNum].UVScaleOffset[0].y = -eachEye[eyeNum].UVScaleOffset[0].y;


Regards!!
cullam
Kobold
Posts: 30
Joined: Wed Aug 15, 2012 7:19 pm

Re: Oculus Rift in Ogre

Post by cullam »

I just got my hands on the new Oculus DK2. I'm about to try hooking it up for my Ogre project, replacing the previous version, and just see what happens. But has any work been done on setting up Ogre to use Oculus DK2? No sense in me re-doing what's already been done.
cybereality
Hobgoblin
Posts: 563
Joined: Wed Jul 12, 2006 5:40 pm
x 12

Re: Oculus Rift in Ogre

Post by cybereality »

Kojack is working on DK2 support in Ogre 2.0. I don't believe it was finished or released. In any case, the Oculus SDK has changed significantly with DK2, and the old stuff will not work anymore. Head-tracking support is very simple, but the distortion is a little more complex now. The suggested method is to let the Oculus SDK render the distortion for you. So you allocate and render to a stereo buffer (either 2 buffers or 1 combined buffer) and you pass this to the OVR SDK and it handles the distortion and the present. However, this can be a big change to the rendering code, and I haven't dug deep enough to understand how to do this in Ogre yet.
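
For reference, the per-frame flow that implies looks roughly like this (a sketch only, assuming a later 0.4.x SDK; m_hmd, m_eyeRenderDesc and m_eyeTextures are assumed to be set up beforehand, and wrapping the Ogre render textures as ovrTexture structs is exactly the Ogre-specific part that still needs digging):

Code:

// Sketch of the SDK-distortion frame loop (later 0.4.x SDKs); m_eyeTextures[2]
// must already describe the Ogre render textures in the SDK's ovrTexture format.
ovrHmd_BeginFrame(m_hmd, 0);

ovrVector3f eyeOffsets[2] = { m_eyeRenderDesc[0].HmdToEyeViewOffset,
                              m_eyeRenderDesc[1].HmdToEyeViewOffset };
ovrPosef eyePoses[2];
ovrTrackingState trackingState;
ovrHmd_GetEyePoses(m_hmd, 0, eyeOffsets, eyePoses, &trackingState);

// ... position the two eye cameras from eyePoses and render the scene
//     into the stereo render texture(s) here ...

// Hand the undistorted eye textures back to the SDK; it applies the
// distortion and presents to the Rift.
ovrHmd_EndFrame(m_hmd, eyePoses, m_eyeTextures);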
Kojack
OGRE Moderator
Posts: 7157
Joined: Sun Jan 25, 2004 7:35 am
Location: Brisbane, Australia
x 534

Re: Oculus Rift in Ogre

Post by Kojack »

Hopefully I'll get something going over the weekend.
I'm on teacher crunch time at the moment (end of trimester, got to get all my marking finished by tonight. Wait, why the hell am I on here? ARGH!), but I should be able to code for a few days soon.
(Although I'm getting robots tomorrow too, which might distract me. But one has a streaming webcam and a Windows SDK, so maybe I can get the DK2 to show its feed...)

Head tracking of the DK2 is very easy to handle in Ogre. That won't be a problem (I've already got it working in a test). Even things like handling a change of orientation when the user is lying down are simple.
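
For example, something along these lines (a sketch, not the actual test code; it assumes SDK 0.4.x, with m_hmd already initialised and m_headNode being the scene node both eye cameras are attached to):

Code:

// Sketch (SDK 0.4.x): copy the DK2 head pose onto the node the eye cameras hang off.
ovrTrackingState ts = ovrHmd_GetTrackingState(m_hmd, ovr_GetTimeInSeconds());
if (ts.StatusFlags & (ovrStatus_OrientationTracked | ovrStatus_PositionTracked))
{
    const ovrPosef& pose = ts.HeadPose.ThePose;
    m_headNode->setOrientation(Ogre::Quaternion(pose.Orientation.w, pose.Orientation.x,
                                                pose.Orientation.y, pose.Orientation.z));
    m_headNode->setPosition(Ogre::Vector3(pose.Position.x, pose.Position.y, pose.Position.z));
}
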
Distortion rendering is a little trickier. The mesh part is easy; it's getting the Ogre compositor to handle non-quad compositor passes that's harder. The old 1.9 compositor had support for it, but no documentation on that aspect. I also don't like the way the old compositor handled materials (it cloned them then hid them from the material manager, making real-time shader changes very annoying). The 2.0 compositor seems better, but is less documented so far.
Direct mode works fine with Ogre, as long as it's in DirectX 11. I haven't tested the new 0.4.2 SDK to see if it fixes DX9 or OpenGL Direct mode support.
Ybalrid
Halfling
Posts: 89
Joined: Thu Jul 10, 2014 6:52 pm
Location: France
x 31

Re: Oculus Rift in Ogre

Post by Ybalrid »

Apparently Direct mode still doesn't work with OpenGL in 0.4.2, as far as I've tested. I'm trying to avoid DirectX since I want my code to be easily portable to Linux (even if the Oculus SDK 0.4 branch doesn't have Linux support yet).
ChristophLGDV
Gnoblar
Posts: 18
Joined: Mon Jun 01, 2015 3:09 pm

Re: Oculus Rift in Ogre

Post by ChristophLGDV »

First off, thanks to all who have contributed to the samples regarding Ogre and Oculus.
Specifically, the OgreOculusSample has been very educational.

However, I was wondering whether it is possible to add SDK distortion to the Ogre rendering loop and replace the client-side distortion.

So I have removed the final camera/viewport, attached the HMD to the Ogre window, and configured the rendering (somehow).

I have also replaced the frame timing with beginFrame and endFrame.

Unfortunately, I get a segfault when calling endFrame, with XQueryExtension complaining about something.


I would be very grateful for a hint as to what I am doing wrong.
kokonut
Gnoblar
Posts: 7
Joined: Wed Feb 27, 2013 8:07 pm
Location: USA

Re: Oculus Rift in Ogre

Post by kokonut »

Hello all,
It has been quite a while, and Oculus has completely changed their SDK, to the point that I'm not even sure it is possible to integrate it into Ogre anymore. Also, you might already know this, but for now the Oculus SDK is Windows only. Nevertheless, I wanted to try to get it working with Ogre.

Here are my issues:
  1. The Oculus SDK gives you render textures and expects you to use them. How do you use third-party-supplied textures (an int handle for OpenGL and a pointer for DirectX) in Ogre? Am I correct in thinking there is no way to do this right now? I have a hideous solution involving "#define private public" and a static_cast to a dummy subclass to access protected methods, but that obviously isn't stable or portable between OpenGL and DirectX. Is it possible to do this without patching Ogre?
    https://github.com/raymond-w-ko/dk2test ... t.cpp#L179
  2. I am experiencing judder and random frame stalls, even with an empty scene. I suspect it is because I used the older OpenGL 2.0 render system, but I'm not sure if this is Ogre's fault. It is probably a combination of Oculus's and Nvidia's drivers. To be investigated later.
  3. Looking at potential fixes, I was trying out Ogre 2.0, but I'm currently figuring out how to do RTT with the new compositors. Has there been any new documentation on this besides code samples? I basically have 2 cameras (one for each eye) and 4 render textures (left_eye_swap0, left_eye_swap1, right_eye_swap_0, right_eye_swap_1). I want to increment a texture index variable, manually RTT to left_eye_swap1 and right_eye_swap_1, call the Oculus submit-frame function, swap buffers (wglSwapBuffersEXT), increment that variable again mod 2, RTT to left_eye_swap0 and right_eye_swap_0, call the Oculus submit-frame function again, and so on in a loop. Can the new compositor system do this? Especially: control the order of rendering, disable rendering of RTTs whose turn hasn't come yet, call back into a third-party function (ovr_SubmitFrame), and then internally swapBuffers()?

    I had this all figured out in 1.9/1.10, but from what I understand in 2.0 this is no longer the way to go.
kokonut
Gnoblar
Posts: 7
Joined: Wed Feb 27, 2013 8:07 pm
Location: USA

Re: Oculus Rift in Ogre

Post by kokonut »

After some reading and experimentation, I was able to figure out the answers to my own questions.
  1. The answer is still probably not. I will rely on my hackish shim function for now, as I wanted to avoid maintaining my own fork of Ogre which supports these features. It would probably be too controversial to propose accepting foreign textures. Like, why not let Ogre, the rendering engine, manage and do all the rendering? Could it be because they are actually DirectX textures masquerading as OpenGL textures via DX_interop ( https://www.opengl.org/registry/specs/NV/DX_interop.txt )? :wink:
  2. After moving to Ogre 2.0 and the GL3+ render system, all the crazy stuttering issues are gone, except for the occasional frame drop. Apparently other people on the Oculus forum experience this with OpenGL too, so it is probably an Oculus / graphics driver bug and definitely not Ogre specific. I think it was caused by all the fixed-function-pipeline OpenGL calls in the older OpenGL render system.
  3. Yes! I was able to set up the compositors to render the way I wanted; it was not too bad. I generated 4 workspaces (2 eyes x 2 render targets) and just toggled the necessary ones on and off via setEnabled each frame (a rough sketch of this is below). RenderWindow also has a manual swapBuffers() method, which seemed to make everything smoother, even though I wasn't rendering anything to the render window.
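
The per-frame toggling looks roughly like this (a sketch, not the repo code; mLeftEyeWs/mRightEyeWs are assumed Ogre 2.0 CompositorWorkspace pointers created against the two swap textures per eye, mRoot and mWindow are the usual root and render window, and submitToOculus() is a hypothetical helper that fills an ovrLayerEyeFov and calls ovr_SubmitFrame):

Code:

// Sketch of the double-buffered workspace toggle from point 3 above.
static int swapIndex = 0;

for (int i = 0; i < 2; ++i)
{
    // Only the workspaces targeting this frame's swap textures get to render.
    mLeftEyeWs[i]->setEnabled(i == swapIndex);
    mRightEyeWs[i]->setEnabled(i == swapIndex);
}

mRoot->renderOneFrame();     // renders the two enabled eye workspaces

submitToOculus(swapIndex);   // hypothetical helper: fills an ovrLayerEyeFov with this
                             // frame's eye textures and calls ovr_SubmitFrame
mWindow->swapBuffers();      // manual swap; seemed to smooth out frame pacing

swapIndex = (swapIndex + 1) % 2;
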
I think at this point I have something others can use as a reference:
https://github.com/raymond-w-ko/dk2test

Now you too can experience dancing Sinbad up close and in 3D, and see jaiqua walk around :D

EDIT: Too bad the preset materials don't work right now since they are fixed function; I just used the first shaders that worked, so all the colors are wonky.
amartin
Halfling
Posts: 87
Joined: Wed Aug 14, 2013 6:55 am
Location: Norway
x 13

Re: Oculus Rift in Ogre

Post by amartin »

kokonut wrote: The answer is still probably not. I will rely on my hackish shim function for now, as I wanted to avoid maintaining my own fork of Ogre which supports these features. It would probably be too controversial to propose accepting foreign textures. Like, why not let Ogre, the rendering engine, manage and do all the rendering? Could it be because they are actually DirectX textures masquerading as OpenGL textures via DX_interop ( https://www.opengl.org/registry/specs/NV/DX_interop.txt )? :wink:
There is a custom resource loader which can be used to change how a resource is loaded, and since textures are resources they can be custom loaded this way. I did have to make a small change to the code when I set up an Ogre texture to use a shared DirectX 11 texture created by something else, but it's relatively minor. I have no idea if what you need is publicly exposed so that it can be loaded in the custom loader, but the mechanism you need is there. I haven't been keeping track of how the Rift SDK functions since I stopped working on VR projects, so I can't really help with details on what you'd need to do, but a custom resource loader is where you should probably do it.
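
Something along these lines (a rough sketch only, using the Ogre 1.x API; ExternalTextureLoader, createEyeTexture and the texture name are placeholder names, and the render-system-specific part is deliberately left as a comment):

Code:

#include <OgreTextureManager.h>
#include <OgreResourceGroupManager.h>

// The loader is called whenever Ogre needs to (re)load the texture's contents.
class ExternalTextureLoader : public Ogre::ManualResourceLoader
{
public:
    virtual void loadResource(Ogre::Resource* resource)
    {
        Ogre::Texture* tex = static_cast<Ogre::Texture*>(resource);
        // Render-system-specific part goes here: attach the externally created
        // texture (GL id / D3D pointer) to 'tex'. This is the small code change
        // mentioned above; it is not portable between render systems as-is.
        (void)tex;
    }
};

// Usage: register the loader when creating the texture so Ogre calls it on load.
Ogre::TexturePtr createEyeTexture(Ogre::ManualResourceLoader* loader,
                                  Ogre::uint width, Ogre::uint height)
{
    return Ogre::TextureManager::getSingleton().createManual(
        "OculusEyeTexture",                                     // placeholder name
        Ogre::ResourceGroupManager::DEFAULT_RESOURCE_GROUP_NAME,
        Ogre::TEX_TYPE_2D, width, height,                       // SDK-recommended per-eye size
        0, Ogre::PF_R8G8B8A8, Ogre::TU_RENDERTARGET, loader);
}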