[2.2] Oculus SDK

Discussion area about developing with Ogre-Next (2.1, 2.2 and beyond)


xrgo
OGRE Expert User
Posts: 1148
Joined: Sat Jul 06, 2013 10:59 pm
Location: Chile
x 169

[2.2] Oculus SDK

Post by xrgo »

Hello! I am trying to integrate the Oculus SDK with D3D11 (OpenGL later; I already have OpenVR working), and I have this code (a mix from here: https://github.com/Ybalrid/Ogre21_VR/bl ... nderer.cpp and here: viewtopic.php?t=94627#p543300)

at init:

Code:

    // Initialise LibOVR and create the HMD session
    ovr_Initialize( nullptr );

    if( ovr_Create( &session, &luid ) != ovrSuccess )
    {
        // No HMD / runtime available, give up cleanly
        ovr_Shutdown();
    }
    else
    {
        hmdDesc = ovr_GetHmdDesc( session );
    }
then also at init:

Code:

void yGraphicsOculusVR::initTextures(){

    const auto texSizeL = ovr_GetFovTextureSize(session, ovrEye_Left, hmdDesc.DefaultEyeFov[0], 1);
    const auto texSizeR = ovr_GetFovTextureSize(session, ovrEye_Right, hmdDesc.DefaultEyeFov[1], 1);

    mRenderWidth = texSizeL.w + texSizeR.w;
    mRenderHeight = std::max(texSizeL.h, texSizeR.h);

    ovrTextureSwapChainDesc textureSwapChainDesc = {};
    textureSwapChainDesc.Type = ovrTexture_2D;
    textureSwapChainDesc.ArraySize = 1;
    textureSwapChainDesc.Format = OVR_FORMAT_R8G8B8A8_UNORM_SRGB;
    textureSwapChainDesc.Width = mRenderWidth;
    textureSwapChainDesc.Height = mRenderHeight;
    textureSwapChainDesc.MipLevels = 1;
    textureSwapChainDesc.SampleCount = 1;
    textureSwapChainDesc.BindFlags = ovrTextureBind_DX_RenderTarget;
    textureSwapChainDesc.StaticImage = ovrFalse;

    Ogre::D3D11RenderSystem* renderSystem = dynamic_cast<Ogre::D3D11RenderSystem*>( Ogre::Root::getSingletonPtr()->getRenderSystem() );

    if (ovr_CreateTextureSwapChainDX(session, renderSystem->_getDevice().get(), &textureSwapChainDesc, &textureSwapchain) == ovrSuccess)
    {
        int textureCount = 0;
        ovr_GetTextureSwapChainLength(session, textureSwapchain, &textureCount);
        mRenderTargetViews.resize(textureCount);
        for (int i = 0; i < textureCount; ++i)
        {
            ID3D11Texture2D* texture = nullptr;
            ID3D11RenderTargetView* rtv = nullptr;
            ovr_GetTextureSwapChainBufferDX(session, textureSwapchain, i, IID_PPV_ARGS(&texture));

            D3D11_RENDER_TARGET_VIEW_DESC rtvd = {};
            rtvd.Format = DXGI_FORMAT_R8G8B8A8_UNORM_SRGB;
            rtvd.ViewDimension = D3D11_RTV_DIMENSION_TEXTURE2D;

            HRESULT hr = renderSystem->_getDevice().get()->CreateRenderTargetView(texture, &rtvd, &mRenderTargetViews[i]);
            if( FAILED( hr ) ){
                std::cout<<"Oculus: Error creating render target view "<<i<<std::endl;
            }
            texture->Release();
        }
    }


    Ogre::TextureGpuManager* textureManager = Ogre::Root::getSingletonPtr()->getRenderSystem()->getTextureGpuManager();
    mEyesRenderTexture = textureManager->createTexture( "OculusVRRenderTexture",
                                                              Ogre::GpuPageOutStrategy::Discard,
                                                              Ogre::TextureFlags::RenderToTexture | Ogre::TextureFlags::Reinterpretable |
                                                                Ogre::TextureFlags::Uav,
                                                              Ogre::TextureTypes::Type2D );

    mEyesRenderTexture->setResolution( mRenderWidth, mRenderHeight );
    mEyesRenderTexture->setNumMipmaps( 1u );
    mEyesRenderTexture->setPixelFormat( Ogre::PixelFormatGpu::PFG_RGBA8_UNORM_SRGB );
    mEyesRenderTexture->setMsaa( 1u );
    mEyesRenderTexture->_transitionTo( Ogre::GpuResidency::Resident, (uint8*)0 );
    mEyesRenderTexture->notifyDataIsReady();


    //Create a layer with our single swaptexture on it. Each side is an eye.
    layer.Header.Type = ovrLayerType_EyeFov;
    layer.Header.Flags = 0;
    layer.ColorTexture[0] = textureSwapchain;
    layer.ColorTexture[1] = textureSwapchain;
    layer.Fov[0] = EyeRenderDesc[0].Fov;
    layer.Fov[1] = EyeRenderDesc[1].Fov;
    ovrRecti leftRect, rightRect;
    leftRect.Size.w = mRenderWidth / 2;
    leftRect.Size.h = mRenderHeight;
    rightRect = leftRect;
    leftRect.Pos.x = 0;
    leftRect.Pos.y = 0;
    rightRect.Pos.x = mRenderWidth / 2;
    rightRect.Pos.y = 0;
    layer.Viewport[0] = leftRect;
    layer.Viewport[1] = rightRect;

}
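
(For context: the layer setup above also needs EyeRenderDesc and, later, offset, which are filled at init. A minimal sketch of that step, assuming SDK 1.17+ where ovrEyeRenderDesc exposes HmdToEyePose, and assuming members ovrEyeRenderDesc EyeRenderDesc[2] and std::array<ovrPosef, 2> offset:)

Code:

    //Fill the per-eye render descriptions and head-to-eye offsets referenced above.
    //Note: older SDKs expose HmdToEyeOffset (an ovrVector3f) instead of HmdToEyePose,
    //and with ovrPosef offsets the matching pose helper is ovr_CalcEyePoses2 rather
    //than ovr_CalcEyePoses (which takes ovrVector3f offsets).
    for( int eye = 0; eye < 2; ++eye )
    {
        EyeRenderDesc[eye] = ovr_GetRenderDesc( session, static_cast<ovrEyeType>( eye ),
                                                hmdDesc.DefaultEyeFov[eye] );
        offset[eye] = EyeRenderDesc[eye].HmdToEyePose;
    }
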
then on frameRenderingQueued (every frame):

Code:

void yGraphicsOculusVR::update(){

    if( mSubmit ){
        
        // Query the predicted head pose for this frame
        currentFrameDisplayTime = ovr_GetPredictedDisplayTime( session, 0 );
        ts = ovr_GetTrackingState( session, currentFrameDisplayTime, ovrTrue );

        pose = ts.HeadPose.ThePose;

        ovr_CalcEyePoses( pose, offset.data(), layer.RenderPose );
        mCameraNode->setOrientation( oculusToOgreQuat(pose.Orientation) );
        mCameraNode->setPosition( oculusToOgreVect3(pose.Position) );


        //copy ogre textures to oculus swap chain
        int currentIndex = 0;
        ovr_GetTextureSwapChainCurrentIndex(session, textureSwapchain, &currentIndex);

        ID3D11Resource* oculusTexture = nullptr;
        mRenderTargetViews[currentIndex]->GetResource( &oculusTexture );

        ID3D11DeviceContext* context = nullptr;
        Ogre::D3D11RenderSystem* renderSystem = dynamic_cast<Ogre::D3D11RenderSystem*>( Ogre::Root::getSingletonPtr()->getRenderSystem() );
        renderSystem->_getDevice()->GetImmediateContext( &context );
        context->CopyResource( oculusTexture, static_cast<Ogre::D3D11TextureGpu*>(mEyesRenderTexture)->getFinalTextureName() );

        //GetResource/GetImmediateContext AddRef their outputs, so release them to avoid leaking every frame
        oculusTexture->Release();
        context->Release();

        ovrResult result = ovr_CommitTextureSwapChain( session, textureSwapchain );
        if( OVR_FAILURE( result ) )
        {
            ovrErrorInfo errorInfo;
            ovr_GetLastErrorInfo( &errorInfo );
            std::cout<<"Oculus: CommitTextureSwapChain failed: "<<errorInfo.ErrorString<<std::endl;
        }

        ovrLayerHeader* layers = &layer.Header;
        result = ovr_SubmitFrame( session, 0, nullptr, &layers, 1 );
        if( OVR_FAILURE( result ) )
        {
            ovrErrorInfo errorInfo;
            ovr_GetLastErrorInfo( &errorInfo );
            std::cout<<"Oculus: SubmitFrame failed: "<<errorInfo.ErrorString<<std::endl;
        }
    }

}
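
(The oculusToOgreQuat/oculusToOgreVect3 helpers above are presumably just member-wise conversions; a minimal sketch:)

Code:

//Both LibOVR and Ogre use a right-handed, Y-up coordinate system,
//so a member-wise copy is enough.
inline Ogre::Quaternion oculusToOgreQuat( const ovrQuatf &q )
{
    return Ogre::Quaternion( q.w, q.x, q.y, q.z );
}
inline Ogre::Vector3 oculusToOgreVect3( const ovrVector3f &v )
{
    return Ogre::Vector3( v.x, v.y, v.z );
}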
the camera setup is irrelevant at this point. The eye textures are being rendered correctly, since I can see them in the mirror window, and the sensors are also working (I can see the camera moving in the mirror window as I move the headset)...

the problem is that I get no image in the headset =(

I summon you al2950! =D

thank you so much in advance!

Edit: in a debug build I can see something, like half of each eye filled with a solid color taken from the image
dark_sylinc
OGRE Team Member
Posts: 5511
Joined: Sat Jul 21, 2007 4:55 pm
Location: Buenos Aires, Argentina
x 1379

Re: [2.2] Oculus SDK

Post by dark_sylinc »

I have no idea and am making random guesses. The link you posted has code regarding ovr_EndFrame, which looks important and which you don't seem to have (it is probably related to ovr_WaitToBeginFrame & ovr_BeginFrame).

The only difference between what is being done here: https://developer.oculus.com/documentat ... dg-render/ and what you're doing is that you don't need to set viewports, and instead of issuing rendering commands, you issue a copy.

PS. If you modify the D3D11 rendersystem so that you implement your own 'D3D11TextureGpuCustomVendorResource', base it off D3D11TextureGpuWindow (probably), which derives from D3D11TextureGpu, and have your D3D11TextureGpuCustomVendorResource fill mFinalTextureName & mDisplayTextureName with what Oculus provides you; then add a new flag TextureFlags::CustomVendorResource; finally you can modify D3D11RenderPassDescriptor::updateColourRtv so that, when TextureFlags::CustomVendorResource is set, it calls a listener into your code to put the ID3D11RenderTargetView that Oculus provided you into mColourRtv[i].

That way you would be able to use a TextureGpu that you can render to directly. Sounds complicated but it's not. First get your Oculus to render with the copy, then read what I wrote a couple of times, and you'll realize the changes are rather easy.
You just have to copy-paste our class into a new one that fills its variables with Oculus' pointers, and add a new flag to mark your TextureGpu as special, so that you can also introduce custom code into D3D11RenderPassDescriptor::updateColourRtv to use Oculus' pointers.
xrgo
OGRE Expert User
Posts: 1148
Joined: Sat Jul 06, 2013 10:59 pm
Location: Chile
x 169

Re: [2.2] Oculus SDK

Post by xrgo »

Thanks, I tried adding ovr_EndFrame/ovr_BeginFrame/ovr_WaitToBeginFrame and I get a "loading" window in a white room, but there seems to be no need for that, since even the basic sample in the SDK doesn't use them: https://github.com/jherico/OculusSDK/bl ... )/main.cpp
dark_sylinc wrote: Fri Sep 20, 2019 6:44 am The only difference between what is being done here: https://developer.oculus.com/documentat ... dg-render/ and what you're doing is that you don't need to set viewports, and instead of issuing rendering commands, you issue a copy.
Thanks, I am going to look at that link!
dark_sylinc wrote: Fri Sep 20, 2019 6:44 am PS. If you modify the D3D11 rendersystem so that you implement your own......
Thanks! I would like to see how it works without modifying Ogre first =)
xrgo
OGRE Expert User
Posts: 1148
Joined: Sat Jul 06, 2013 10:59 pm
Location: Chile
x 169

Re: [2.2] Oculus SDK

Post by xrgo »

this code worked! viewtopic.php?t=81627&start=100#p525592
thanks Kojack!! <3

now I have to get the correct camera projection matrix for each eye to fill in the vrData for instanced stereo rendering =)
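
(A minimal sketch of that step using the SDK's ovrMatrix4f_Projection helper; the near/far values are placeholders, and wiring the result into the instanced-stereo vrData is not shown:)

Code:

//Build an Ogre projection matrix from the FOV port the SDK reported for one eye
//(EyeRenderDesc[eye].Fov). Both ovrMatrix4f and Ogre::Matrix4 store rows first,
//so the copy is element-wise.
Ogre::Matrix4 toOgreProjection( const ovrFovPort &fov, float nearZ, float farZ )
{
    const ovrMatrix4f p = ovrMatrix4f_Projection( fov, nearZ, farZ, ovrProjection_None );
    return Ogre::Matrix4( p.M[0][0], p.M[0][1], p.M[0][2], p.M[0][3],
                          p.M[1][0], p.M[1][1], p.M[1][2], p.M[1][3],
                          p.M[2][0], p.M[2][1], p.M[2][2], p.M[2][3],
                          p.M[3][0], p.M[3][1], p.M[3][2], p.M[3][3] );
}

//e.g. per-eye camera setup (mEyeCameras is a hypothetical per-eye camera array, clip planes are placeholders):
//mEyeCameras[eye]->setCustomProjectionMatrix( true, toOgreProjection( EyeRenderDesc[eye].Fov, 0.1f, 1000.0f ) );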
Kojack
OGRE Moderator
Posts: 7157
Joined: Sun Jan 25, 2004 7:35 am
Location: Brisbane, Australia
x 535

Re: [2.2] Oculus SDK

Post by Kojack »

dark_sylinc wrote: Fri Sep 20, 2019 6:44 am I have no idea and am making random guesses. The link you posted has code regarding ovr_EndFrame, which looks important and which you don't seem to have (it is probably related to ovr_WaitToBeginFrame & ovr_BeginFrame).
xrgo wrote: Fri Sep 20, 2019 3:12 pm Thanks, I tried adding ovr_EndFrame/ovr_BeginFrame/ovr_WaitToBeginFrame and I get a "loading" window in a white room, but there seems to be no need for that, since even the basic sample in the SDK doesn't use them: https://github.com/jherico/OculusSDK/bl ... )/main.cpp
That repo is a very old clone of the 1.10 SDK from 2016. We're up to 1.40 now.

ovr_SubmitFrame was the original way to submit frames to the SDK. It was deprecated in SDK 1.18 and replaced with ovr_WaitToBeginFrame, ovr_BeginFrame and ovr_EndFrame.
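
For reference, the newer loop looks roughly like this (frameIndex is a counter the application maintains, and the layer is the same ovrLayerEyeFov as before; error checking omitted):

Code:

//Sketch of the SDK 1.18+ frame loop that replaces ovr_SubmitFrame.
long long frameIndex = 0;

void submitOculusFrame( ovrSession session, ovrLayerEyeFov &layer )
{
    ovr_WaitToBeginFrame( session, frameIndex );   //block until the compositor wants this frame
    ovr_BeginFrame( session, frameIndex );

    //... render into the swapchain and ovr_CommitTextureSwapChain() here ...

    const ovrLayerHeader *layers = &layer.Header;
    ovr_EndFrame( session, frameIndex, nullptr, &layers, 1 );
    ++frameIndex;
}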
Lax
Gnoll
Posts: 683
Joined: Mon Aug 06, 2007 12:53 pm
Location: Saarland, Germany
x 65

Re: [2.2] Oculus SDK

Post by Lax »

Hi, is there example code or a project somewhere that works with the newest repo?
I'm also interested in getting the Oculus Rift working in my engine.
I know there is a whole topic that began in 2014, but it's hard to pick out the right information.

Best regards
Lax

http://www.lukas-kalinowski.com/Homepage/?page_id=1631
Please support Second Earth Technic Base built of Lego bricks for Lego ideas: https://ideas.lego.com/projects/81b9bd1 ... b97b79be62

xrgo
OGRE Expert User
Posts: 1148
Joined: Sat Jul 06, 2013 10:59 pm
Location: Chile
x 169

Re: [2.2] Oculus SDK

Post by xrgo »

Kojack wrote: Sun Sep 22, 2019 3:05 pm That repo is a very old clone of the 1.10 SDK from 2016. We're up to 1.40 now.

ovr_SubmitFrame was the original way to submit frames to the SDK. It was deprecated in SDK 1.18 and replaced with ovr_WaitToBeginFrame, ovr_BeginFrame and ovr_EndFrame.
The tinyroom sample that comes with 1.40 is the same as that link, and it's using just ovr_SubmitFrame :S
Lax wrote: Sun Sep 22, 2019 3:56 pm Hi, is there example code or a project somewhere that works with the newest repo?
I'm also interested in getting the Oculus Rift working in my engine.
When I have everything working I'll share my code, but be aware that it's going to use Ogre 2.2 and the new VR features.
Lax
Gnoll
Posts: 683
Joined: Mon Aug 06, 2007 12:53 pm
Location: Saarland, Germany
x 65

Re: [2.2] Oculus SDK

Post by Lax »

Sounds great!
I also use the Ogre 2.2 features.
