Render target cubemap mipmap bug on Direct3D11

rpgplayerrobin
Orc Shaman
Posts: 736
Joined: Wed Mar 18, 2009 3:03 am
x 411

Render target cubemap mipmap bug on Direct3D11

Post by rpgplayerrobin »

Ogre Version: Any
Operating System: Windows
Render System: Direct3D11

Hey!

I am implementing IBL in my game as a test, and I ran into an issue with D3D11 while doing so.
I think I have solved it by changing the Ogre source, but I am not sure if it is the best way to solve it.

Basically, I want to render into a specific mipmap of a cubemap texture, so that each mip level gets a unique appearance from specific shaders.
To do that, I simply create a render target from the texture buffer of the face and mip level I want to target:

Code: Select all

RenderTarget* tmpRenderTarget = m_cubemapTexture->getBuffer(i, (size_t)mipmapTarget)->getRenderTarget();
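
For context, this is roughly how I drive these render targets, as a simplified sketch (names like m_camera, numMipmaps and the material selection are placeholders here, not my exact code):

Code: Select all

// Loop over every mip level and every cube face, fetch the render target of
// that (face, mip) buffer, and render into it with a mip-specific shader.
for (size_t mip = 0; mip < numMipmaps; ++mip)
{
	for (size_t face = 0; face < 6; ++face)
	{
		Ogre::RenderTarget* rt =
			m_cubemapTexture->getBuffer(face, mip)->getRenderTarget();

		rt->removeAllViewports();
		Ogre::Viewport* vp = rt->addViewport(m_camera); // 90 degree camera, oriented for this face
		vp->setOverlaysEnabled(false);

		// ...pick the material/shader for this mip level here...

		rt->update(); // render this face of this mip level
	}
}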

This works great in Direct3D9, but it does not work in Direct3D11.

In Direct3D11, I get this error:

ID3D11DeviceContext::OMSetRenderTargets:

The RenderTargetView at slot 0 is not compatible with the DepthStencilView.

DepthStencilViews may only be used with RenderTargetViews if the effective dimensions of the Views are equal, as well as the Resource types, multisample count, and multisample quality.

The RenderTargetView at slot 0 has (w:512,h:512,as:1), while the Resource is a Texture2D with (mc:1,mq:0).
The DepthStencilView has (w:256,h:256,as:1), while the Resource is a Texture2D with (mc:1,mq:0).

It seems that the RTV never sets its mip slice for Direct3D11, so the view always targets mip 0 (the full 512x512 level) while the depth buffer matches the 256x256 size of the mip level I requested. So I just added the mip slice:

D3D11RenderTexture::rebind:

Code: Select all

case D3D11_SRV_DIMENSION_TEXTURECUBE:
	RTVDesc.ViewDimension = D3D11_RTV_DIMENSION_TEXTURE2DARRAY;
	RTVDesc.Texture2DArray.FirstArraySlice = buffer->getFace();
	RTVDesc.Texture2DArray.MipSlice = buffer->getMipLevel(); // <- my new code
	RTVDesc.Texture2DArray.ArraySize = 1;
	break;
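
For reference, this is the raw D3D11 view the fixed code ends up describing, as a standalone sketch outside of Ogre (device, cubemapTex, face, mip and the format are placeholders):

Code: Select all

// A cubemap in D3D11 is a Texture2D array with 6 slices, so the RTV uses the
// TEXTURE2DARRAY dimension: FirstArraySlice picks the cube face and MipSlice
// picks the mip level. With MipSlice left at 0, the view always covers the
// full 512x512 level, which is exactly the mismatch from the error above.
D3D11_RENDER_TARGET_VIEW_DESC RTVDesc = {};
RTVDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;            // format of the cubemap
RTVDesc.ViewDimension = D3D11_RTV_DIMENSION_TEXTURE2DARRAY;
RTVDesc.Texture2DArray.FirstArraySlice = face;          // cube face to render into
RTVDesc.Texture2DArray.MipSlice = mip;                  // mip level, view size becomes (512 >> mip)
RTVDesc.Texture2DArray.ArraySize = 1;                   // one face per view

ID3D11RenderTargetView* rtv = NULL;
HRESULT hr = device->CreateRenderTargetView(cubemapTex, &RTVDesc, &rtv);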

To get access to the mip level, I added the following to D3D11HardwarePixelBuffer:
.cpp:

Code: Select all

UINT D3D11HardwarePixelBuffer::getMipLevel() const
{
	return mMipLevel;
}

.h:

Code: Select all

UINT getMipLevel() const;
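
The declaration ends up in D3D11HardwarePixelBuffer roughly like this (a simplified sketch, not the full class; mMipLevel already exists as a member, the .cpp body above just returns it):

Code: Select all

class D3D11HardwarePixelBuffer : public HardwarePixelBuffer
{
public:
	UINT getFace() const;      // existing accessor, used by the rebind code above
	UINT getMipLevel() const;  // the new accessor
	// ...
protected:
	UINT mMipLevel;
	// ...
};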

That makes it work in Direct3D11 as well. I have debugged all of its mip levels in a shader, and it now behaves the same as in Direct3D9.

I am just wondering: is this the best way to fix it?
And if so, I guess this should go into the next version of Ogre?

paroj
OGRE Team Member
Posts: 2157
Joined: Sun Mar 30, 2014 2:51 pm
x 1158

Re: Render target cubemap mipmap bug on Direct3D11

Post by paroj »

Looks good to me. Please create a PR on GitHub.

rpgplayerrobin
Orc Shaman
Posts: 736
Joined: Wed Mar 18, 2009 3:03 am
x 411

Re: Render target cubemap mipmap bug on Direct3D11

Post by rpgplayerrobin »

I did it now, but I could not figure out how to put all the files into one commit using just the GitHub website, so it ended up as 3 pull requests. :lol:

rpgplayerrobin
Orc Shaman
Posts: 736
Joined: Wed Mar 18, 2009 3:03 am
x 411

Re: Render target cubemap mipmap bug on Direct3D11

Post by rpgplayerrobin »

I have no idea what happened to the commits, whether they were all added, only some of them, or none at all.
I am really not familiar with Git.

At least I posted the exact code changes above.