D3D11RenderTexture::getCustomAttribute missing? Topic is solved

Arthurfernandes
Kobold
Posts: 30
Joined: Mon Oct 28, 2024 10:50 pm
x 2

D3D11RenderTexture::getCustomAttribute missing?

Post by Arthurfernandes »

Ogre Version: 14.3.1
Operating System: Windows
Render System: DirectX 11

Hey everyone!

I have a post here discussing the integration of Ogre with WPF, specifically using D3DImage. After successfully initializing Ogre inside D3DImage with DirectX 9, I am now attempting to do the same with DirectX 11.

There are some challenges: the first is that D3DImage does not work directly with DirectX 11. I am currently researching potential solutions. Another issue I encountered is that the DirectX 11 render system in Ogre does not have a getCustomAttribute method for RenderTexture. Every time I tried to use a custom attribute, it would fail with the same generic error:

Code: Select all

OGRE_EXCEPT(Exception::ERR_INVALIDPARAMS, "Attribute not found. " + name, " RenderTarget::getCustomAttribute");

To address this, I made manual changes to the source code, specifically in OgreD3D11Texture.cpp:

Code: Select all

void D3D11RenderTexture::getCustomAttribute(const String& name, void* pData)
{
    if (name == "D3DDEVICE")
    {
        *(ID3D11DeviceN**)pData = mDevice.get();
    }
    else
    {
        // fall back to the base class; calling D3D11RenderTexture::getCustomAttribute
        // here again would recurse forever
        RenderTexture::getCustomAttribute(name, pData);
    }
}

And in OgreD3D11Texture.h:

Code: Select all

void getCustomAttribute(const String& name, void* pData);

I'm not sure if this is the correct approach, but now, when I execute the following code:

Code: Select all

renderTarget.getCustomAttribute("D3DDEVICE", out surface);

I can successfully obtain a pointer, unlike before.

paroj
OGRE Team Member
Posts: 2128
Joined: Sun Mar 30, 2014 2:51 pm
x 1141

Re: D3D11RenderTexture::getCustomAttribute missing?

Post by paroj »

yes, this is the correct approach

Arthurfernandes
Kobold
Posts: 30
Joined: Mon Oct 28, 2024 10:50 pm
x 2

Re: D3D11RenderTexture::getCustomAttribute missing?

Post by Arthurfernandes »

So, theoretically, I created a method in OgreD3D11Texture.cpp that gives me a shared handle which I can use in Microsoft.Wpf.Interop.Directx.D3D11Image.

Code: Select all

void D3D11Texture::_createShared2DTex()
{
    assert(mSrcWidth > 0 || mSrcHeight > 0);

    UINT numMips = (mNumMipmaps == MIP_UNLIMITED || (1U << mNumMipmaps) > std::max(mSrcWidth, mSrcHeight))
                       ? 0
                       : mNumMipmaps + 1;
    if (D3D11Mappings::_isBinaryCompressedFormat(mD3DFormat) && numMips > 1)
        numMips = std::max(1U, numMips - 2);

    D3D11_TEXTURE2D_DESC desc;
    desc.Width = static_cast<UINT>(mSrcWidth);
    desc.Height = static_cast<UINT>(mSrcHeight);
    desc.MipLevels = numMips;
    desc.ArraySize = mDepth == 0 ? 1 : mDepth;
    desc.Format = mD3DFormat;
    desc.SampleDesc = mFSAAType;
    desc.Usage = D3D11Mappings::_getUsage(_getTextureUsage());
    desc.BindFlags = D3D11Mappings::_getTextureBindFlags(mD3DFormat, _getTextureUsage());
    desc.CPUAccessFlags = D3D11Mappings::_getAccessFlags(_getTextureUsage());
    desc.MiscFlags = D3D11Mappings::_getTextureMiscFlags(desc.BindFlags, getTextureType(), _getTextureUsage());

    // Ensure that the texture is shared
    desc.MiscFlags |= D3D11_RESOURCE_MISC_SHARED;

    if (PixelUtil::isDepth(mFormat))
    {
        desc.BindFlags = D3D11_BIND_SHADER_RESOURCE | D3D11_BIND_DEPTH_STENCIL;
        desc.Usage = D3D11_USAGE_DEFAULT;
        desc.CPUAccessFlags = 0;
        desc.MiscFlags = 0;
    }

    if (this->getTextureType() == TEX_TYPE_CUBE_MAP)
    {
        desc.ArraySize = 6;
    }

    D3D11RenderSystem* rs = static_cast<D3D11RenderSystem*>(Root::getSingleton().getRenderSystem());
    if (rs->_getFeatureLevel() < D3D_FEATURE_LEVEL_10_0)
    {
        if (!IsPowerOfTwo(desc.Width) || !IsPowerOfTwo(desc.Height))
        {
            desc.MipLevels = 1;
        }
    }

    HRESULT hr = mDevice->CreateTexture2D(&desc, NULL, mp2DTex.ReleaseAndGetAddressOf());

    if (FAILED(hr) || mDevice.isError())
    {
        this->unloadImpl();
        String errorDescription = mDevice.getErrorDescription(hr);
        OGRE_EXCEPT_EX(Exception::ERR_RENDERINGAPI_ERROR, hr,
                       "Error creating texture\nError Description:" + errorDescription,
                       "D3D11Texture::_create2DTex");
    }

    // Getting the shared handle
    ComPtr<IDXGIResource> dxgiResource;
    mp2DTex.As(&dxgiResource); // convert to IDXGIResource
    HANDLE sharedHandle;
    hr = dxgiResource->GetSharedHandle(&sharedHandle);
    if (FAILED(hr) || !sharedHandle)
    {
        throw std::runtime_error("Failed to retrieve shared handle for the texture");
    }

    // Here, I would pass the shared handle to the D3D11Image.
    _queryInterface<ID3D11Texture2D, ID3D11Resource>(mp2DTex, &mpTex);

    _create2DResourceView();
}

The problem is that I don't know how to access this method after building Ogre. Can I call it directly, or is it specific to the plugin implementation? (I only know basic things about C++ :( )

paroj
OGRE Team Member
Posts: 2128
Joined: Sun Mar 30, 2014 2:51 pm
x 1141

Re: D3D11RenderTexture::getCustomAttribute missing?

Post by paroj »

the changes are these lines, right?

Code: Select all

desc.MiscFlags |= D3D11_RESOURCE_MISC_SHARED;
...
ComPtr<IDXGIResource> dxgiResource;
mp2DTex.As(&dxgiResource); // convert to IDXGIResource
HANDLE sharedHandle;
hr = dxgiResource->GetSharedHandle(&sharedHandle);

then you will need to go through the Ogre public API to get what you want:

  1. add TU_SHARED_RESOURCE=0x100 to TextureUsage enum and specify it when creating that resource
  2. convert it to D3D11 in D3D11Mappings::_getTextureMiscFlags
  3. check it in _create2DTex and create sharedHandle if it is passed
  4. return sharedHandle via getCustomAttribute

that would be needed if you want to contribute that code back to Ogre.
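
For illustration, steps 1 and 2 might look roughly like this (a sketch only: TU_SHARED_RESOURCE is the proposed new flag, not existing Ogre API, and the exact signature of _getTextureMiscFlags can differ between Ogre versions):

Code: Select all

// OgreTexture.h -- proposed addition to the TextureUsage enum (step 1)
TU_SHARED_RESOURCE = 0x100, // request a shareable D3D11 texture

// OgreD3D11Mappings.cpp -- sketch of the usage-to-MiscFlags conversion (step 2)
UINT D3D11Mappings::_getTextureMiscFlags(UINT bindflags, TextureType textype, int usage)
{
    UINT flags = 0;
    // ... existing flag handling ...
    if (usage & TU_SHARED_RESOURCE)
        flags |= D3D11_RESOURCE_MISC_SHARED;
    return flags;
}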

If you just want a quick & dirty test, just add virtual void* _createShared2DTex() {} to the Texture class. This way it gets exposed in the bindings. However, you will probably still need TU_SHARED_RESOURCE, as you don't want both _createShared2DTex and _create2DTex to be called.
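
As a sketch, that quick & dirty hook could read as follows (the body needs a return value to be well-formed; D3D11Texture then overrides it with the real implementation):

Code: Select all

// OgreTexture.h -- quick & dirty test only, as described above
virtual void* _createShared2DTex() { return nullptr; }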

Arthurfernandes
Kobold
Posts: 30
Joined: Mon Oct 28, 2024 10:50 pm
x 2

Re: D3D11RenderTexture::getCustomAttribute missing?

Post by Arthurfernandes »

Quick question while I'm trying to change everything necessary. Is creating a texture like this correct for the situation?

Code: Select all

texturePtr = TextureManager.getSingleton().createManual(
            "Ogre Render",
            ResourceGroupManager.DEFAULT_RESOURCE_GROUP_NAME,
            TextureType.TEX_TYPE_SHARED_2D,
            (uint)ViewportSize.Width,
            (uint)ViewportSize.Height,
            32,
            0,
            PixelFormat.PF_R8G8B8A8,
            (int)TextureUsage.TU_SHARED_RESOURCE);

I was stuck on this for a while, but now it seems to be okay since in debug mode, it's no longer getting stuck here.

Another thing: I can't list every change I've made to the code, but now I'm getting a "System.ApplicationException: 'invalid vector subscript'" in getRenderTarget() from HardwarePixelBufferPtr.cs. This error happens right after creating the texture. Any tips?

Code: Select all

renderTarget = texturePtr.getBuffer().getRenderTarget();
Arthurfernandes
Kobold
Posts: 30
Joined: Mon Oct 28, 2024 10:50 pm
x 2

Re: D3D11RenderTexture::getCustomAttribute missing?

Post by Arthurfernandes »

So, if I use (int)TextureUsage.TU_RENDERTARGET, I can get the renderTarget. However, if I choose (int)TextureUsage.TU_SHARED_RESOURCE, that error appears. I think it's now something more specific to investigate, but I still don't know exactly what to do. :?

paroj
OGRE Team Member
Posts: 2128
Joined: Sun Mar 30, 2014 2:51 pm
x 1141

Re: D3D11RenderTexture::getCustomAttribute missing?

Post by paroj »

you should be able to specify both flags as TextureUsage.TU_RENDERTARGET|TextureUsage.TU_SHARED_RESOURCE
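
In C++ terms (the C# binding call mirrors it), that amounts to OR-ing the usage flags when creating the texture. TU_SHARED_RESOURCE is still the flag proposed earlier in this thread, not stock Ogre API, and width/height stand in for the viewport size:

Code: Select all

// sketch: request both usages at creation time
TexturePtr tex = TextureManager::getSingleton().createManual(
    "Ogre Render", ResourceGroupManager::DEFAULT_RESOURCE_GROUP_NAME,
    TEX_TYPE_2D, width, height,
    0, // no mipmaps
    PF_R8G8B8A8,
    TU_RENDERTARGET | TU_SHARED_RESOURCE);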

Arthurfernandes
Kobold
Posts: 30
Joined: Mon Oct 28, 2024 10:50 pm
x 2

Re: D3D11RenderTexture::getCustomAttribute missing?

Post by Arthurfernandes »

So, I'm facing a "System.AccessViolationException" again, this time when trying to initialize the shader with ShaderGenerator. I'm following the sample instructions by initializing it before setting up anything related to the scene, but the error still occurs. Could this be related to the SWIG bindings again?

Code: Select all

scnMgr = root.createSceneManager();
    
ShaderGenerator shadergen = ShaderGenerator.getSingleton();
shadergen.addSceneManager(scnMgr);

Theoretically, the shadergen variable is created, but when I try to use it for anything, the Access Violation error appears. At this point in the code, I already have the root initialized with default settings, resources.cfg loaded, the render system set, and the render window created.

paroj
OGRE Team Member
Posts: 2128
Joined: Sun Mar 30, 2014 2:51 pm
x 1141

Re: D3D11RenderTexture::getCustomAttribute missing?

Post by paroj »

if you don't use OgreBites, you have to initialise shadergen like this:
https://github.com/OGRECave/ogre/blob/0 ... ple.py#L40
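
The linked sample boils down to an initialisation order like this (C++ sketch; the C# calls mirror it one-to-one, and a MaterialManager::Listener such as the resolver used later in this thread must also be registered):

Code: Select all

// RTSS must be initialised before the singleton is used
Ogre::RTShader::ShaderGenerator::initialize();
auto* shadergen = Ogre::RTShader::ShaderGenerator::getSingletonPtr();
shadergen->addSceneManager(scnMgr);
// then register a material listener so fixed-function materials
// receive RTSS-generated techniques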

Arthurfernandes
Kobold
Posts: 30
Joined: Mon Oct 28, 2024 10:50 pm
x 2

Re: D3D11RenderTexture::getCustomAttribute missing?

Post by Arthurfernandes »

I'm trying a different approach to the DX11 and DX9 'translation'. Now, I'm attempting to handle everything in OgreD3D11Texture.cpp:

Code: Select all

void D3D11Texture::_createShared2DTex()
{
    assert(mSrcWidth > 0 || mSrcHeight > 0);

    UINT numMips =
        (mNumMipmaps == MIP_UNLIMITED || (1U << mNumMipmaps) > std::max(mSrcWidth, mSrcHeight)) ? 0 : mNumMipmaps + 1;

    if (D3D11Mappings::_isBinaryCompressedFormat(mD3DFormat) && numMips > 1)
        numMips = std::max(1U, numMips - 2);

    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width = static_cast<UINT>(mSrcWidth);
    desc.Height = static_cast<UINT>(mSrcHeight);
    desc.MipLevels = numMips;
    desc.ArraySize = mDepth == 0 ? 1 : mDepth;
    desc.Format = mD3DFormat;
    desc.SampleDesc = mFSAAType;
    desc.Usage = D3D11Mappings::_getUsage(_getTextureUsage());
    desc.BindFlags = D3D11Mappings::_getTextureBindFlags(mD3DFormat, _getTextureUsage());
    desc.CPUAccessFlags = D3D11Mappings::_getAccessFlags(_getTextureUsage());
    desc.MiscFlags = D3D11Mappings::_getTextureMiscFlags(desc.BindFlags, getTextureType(), _getTextureUsage());

    desc.MiscFlags |= D3D11_RESOURCE_MISC_SHARED;

    HRESULT hr = mDevice->CreateTexture2D(&desc,
                                          NULL, // no initial data
                                          mp2DTex.ReleaseAndGetAddressOf());

    if (FAILED(hr) || mDevice.isError())
    {
        this->unloadImpl();
        String errorDescription = mDevice.getErrorDescription(hr);
        OGRE_EXCEPT_EX(Exception::ERR_RENDERINGAPI_ERROR, hr,
                       "Error creating DirectX 11 texture\nError Description:" + errorDescription,
                       "D3D11Texture::_createShared2DTex");
    }

    IDXGIResource* dxgiResource = nullptr;
    HANDLE sharedHandle = nullptr;

    hr = mp2DTex->QueryInterface(__uuidof(IDXGIResource), (void**)&dxgiResource);
    if (FAILED(hr) || !dxgiResource)
    {
        OGRE_EXCEPT(Exception::ERR_RENDERINGAPI_ERROR, "Failed to query IDXGIResource for shared handle",
                    "D3D11Texture::_createShared2DTex");
    }

    dxgiResource->GetSharedHandle(&sharedHandle);
    dxgiResource->Release();

    if (!sharedHandle)
    {
        OGRE_EXCEPT(Exception::ERR_RENDERINGAPI_ERROR, "Failed to obtain shared handle for DirectX 11 texture",
                    "D3D11Texture::_createShared2DTex");
    }

    IDirect3D9Ex* d3d9Ex = nullptr;
    hr = Direct3DCreate9Ex(D3D_SDK_VERSION, &d3d9Ex);
    if (FAILED(hr) || !d3d9Ex)
    {
        OGRE_EXCEPT(Exception::ERR_RENDERINGAPI_ERROR, "Failed to create Direct3D9Ex object",
                    "D3D11Texture::_createShared2DTex");
    }

    D3DPRESENT_PARAMETERS d3dpp = {};
    d3dpp.Windowed = TRUE;
    d3dpp.SwapEffect = D3DSWAPEFFECT_DISCARD;
    d3dpp.hDeviceWindow = nullptr;
    d3dpp.PresentationInterval = D3DPRESENT_INTERVAL_IMMEDIATE;

    IDirect3DDevice9Ex* d3d9Device = nullptr;
    hr = d3d9Ex->CreateDeviceEx(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, NULL,
                                D3DCREATE_HARDWARE_VERTEXPROCESSING | D3DCREATE_MULTITHREADED | D3DCREATE_FPU_PRESERVE,
                                &d3dpp, NULL, &d3d9Device);

    d3d9Ex->Release();

    if (FAILED(hr) || !d3d9Device)
    {
        OGRE_EXCEPT(Exception::ERR_RENDERINGAPI_ERROR, "Failed to create Direct3D9Ex device",
                    "D3D11Texture::_createShared2DTex");
    }

    IDirect3DTexture9* d3d9Texture = nullptr;
    hr = d3d9Device->CreateTexture(desc.Width, desc.Height, desc.MipLevels, D3DUSAGE_RENDERTARGET, D3DFMT_A8R8G8B8,
                                   D3DPOOL_DEFAULT, &d3d9Texture, &sharedHandle);

    if (FAILED(hr))
    {
        d3d9Device->Release();
        OGRE_EXCEPT(Exception::ERR_RENDERINGAPI_ERROR, "Failed to create Direct3D 9 texture from shared handle",
                    "D3D11Texture::_createShared2DTex");
    }

    mD3D9Texture = d3d9Texture;

    _queryInterface<ID3D11Texture2D, ID3D11Resource>(mp2DTex, &mpTex);
    _create2DResourceView();
}

This is based on https://stackoverflow.com/a/70987056/22795139.

I'm not able to test this method because I don't really know how to expose it the way getCustomAttribute is exposed. My understanding was that since the method in OgreD3D9Texture.cpp works, I could just copy and reuse it, but that method uses

Code: Select all

auto device = D3D9RenderSystem::getActiveD3D9Device();
auto d3dBuffer = static_cast<D3D9HardwarePixelBuffer*>(mBuffer);

and I'm not able to use D3D9RenderSystem and D3D9HardwarePixelBuffer in OgreD3D11Texture. What can I do to work around this?

rpgplayerrobin
Orc Shaman
Posts: 710
Joined: Wed Mar 18, 2009 3:03 am
x 391

Re: D3D11RenderTexture::getCustomAttribute missing?

Post by rpgplayerrobin »

The problem is that D3D9 and D3D11 are plugins, so they can never borrow code from each other.

Therefore, I would not do it there, I would instead do it in your user code.

For example, in your user code you can easily include what you want, like:
#include "RenderSystems/Direct3D11/OgreD3D11Texture.h"
#include "RenderSystems/Direct3D9/OgreD3D9Texture.h"

Then you will most likely be able to include everything you need, and use that to recreate the D3D11 texture as a D3D9 texture, as in your example.
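
A user-code sketch of that idea (include paths assumed relative to the Ogre SDK include directory; the accessor for the native texture depends on your Ogre version):

Code: Select all

// downcast the generic Ogre texture to the D3D11 implementation in user code
#include "RenderSystems/Direct3D11/OgreD3D11Texture.h"

Ogre::TexturePtr tex = Ogre::TextureManager::getSingleton().getByName("Ogre Render");
auto* d3d11Tex = static_cast<Ogre::D3D11Texture*>(tex.get());
// from here, the D3D9Ex interop shown above can run outside the plugins,
// using whatever accessor your Ogre version exposes for the ID3D11Texture2D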

Arthurfernandes
Kobold
Posts: 30
Joined: Mon Oct 28, 2024 10:50 pm
x 2

Re: D3D11RenderTexture::getCustomAttribute missing?

Post by Arthurfernandes »

I discovered how D3D11Image works. Basically, the component provides a surface onto which you need to 'paint' your 2D texture. With that said, I'm now trying to create a method in Ogre that uses the given surface to build an Ogre texture. As far as I know, Ogre needs a RenderTexture so the scene can be rendered into it, so I made this method return my render texture for use as a render target.

Code: Select all

RenderTexture* D3D11Texture::_create2DTexWithSurface(void* surface)
{
    if (surface == nullptr)
    {
        OGRE_EXCEPT(Exception::ERR_INVALIDPARAMS, "Surface pointer is null.",
                    "D3D11Texture::_create2DTexWithSurface");
        return nullptr;
    }

    HRESULT hr;

    IUnknown* pUnk = static_cast<IUnknown*>(surface);
    if (pUnk == nullptr)
    {
        OGRE_EXCEPT(Exception::ERR_INVALIDPARAMS, "IUnknown pointer is null.",
                    "D3D11Texture::_create2DTexWithSurface");
        return nullptr;
    }

    IDXGIResource* pDXGIResource = nullptr;
    hr = pUnk->QueryInterface(__uuidof(IDXGIResource), reinterpret_cast<void**>(&pDXGIResource));
    if (FAILED(hr))
    {
        OGRE_EXCEPT_EX(Exception::ERR_RENDERINGAPI_ERROR, hr, "Failed to query IDXGIResource interface.",
                       "D3D11Texture::_create2DTexWithSurface");
        return nullptr;
    }
    if (pDXGIResource == nullptr)
    {
        OGRE_EXCEPT(Exception::ERR_INVALIDPARAMS, "IDXGIResource interface is null.",
                    "D3D11Texture::_create2DTexWithSurface");
        return nullptr;
    }

    HANDLE sharedHandle = nullptr;
    hr = pDXGIResource->GetSharedHandle(&sharedHandle);
    if (FAILED(hr) || sharedHandle == nullptr)
    {
        pDXGIResource->Release();
        OGRE_EXCEPT_EX(Exception::ERR_RENDERINGAPI_ERROR, hr, "Failed to get shared handle.",
                       "D3D11Texture::_create2DTexWithSurface");
        return nullptr;
    }

    pDXGIResource->Release();

    ComPtr<ID3D11Resource> tempResource;
    hr = mDevice->OpenSharedResource(sharedHandle, __uuidof(ID3D11Resource),
                                     reinterpret_cast<void**>(tempResource.GetAddressOf()));
    if (FAILED(hr) || !tempResource)
    {
        OGRE_EXCEPT_EX(Exception::ERR_RENDERINGAPI_ERROR, hr, "Failed to open shared resource.",
                       "D3D11Texture::_create2DTexWithSurface");
        return nullptr;
    }

    ComPtr<ID3D11Texture2D> pOutputResource;
    hr = tempResource->QueryInterface(__uuidof(ID3D11Texture2D),
                                      reinterpret_cast<void**>(pOutputResource.GetAddressOf()));
    if (FAILED(hr) || !pOutputResource)
    {
        OGRE_EXCEPT_EX(Exception::ERR_RENDERINGAPI_ERROR, hr, "Failed to query ID3D11Texture2D interface.",
                       "D3D11Texture::_create2DTexWithSurface");
        return nullptr;
    }

    D3D11_TEXTURE2D_DESC outputResourceDesc;
    pOutputResource->GetDesc(&outputResourceDesc);
    if (outputResourceDesc.Width == 0 || outputResourceDesc.Height == 0)
    {
        OGRE_EXCEPT(Exception::ERR_INVALIDPARAMS, "Texture has invalid dimensions.",
                    "D3D11Texture::_create2DTexWithSurface");
        return nullptr;
    }

    mSrcWidth = outputResourceDesc.Width;
    mSrcHeight = outputResourceDesc.Height;

    if (mDevice.isNull())
    {
        OGRE_EXCEPT(Exception::ERR_RENDERINGAPI_ERROR, "mDevice is null.", "D3D11Texture::_create2DTexWithSurface");
        return nullptr;
    }

    UINT mipLevel = 0;
    size_t depth = 1;
    UINT face = 0;
    PixelFormat ogreFormat = Ogre::PF_B8G8R8A8;

    auto pixelBuffer = OGRE_NEW D3D11HardwarePixelBuffer(this,
                                                         mDevice,
                                                         mipLevel,
                                                         outputResourceDesc.Width,
                                                         outputResourceDesc.Height,
                                                         depth,
                                                         face,
                                                         ogreFormat,
                                                         Ogre::HardwareBuffer::HBU_STATIC);

    if (pixelBuffer == nullptr)
    {
        OGRE_EXCEPT(Exception::ERR_INTERNAL_ERROR, "Failed to create D3D11HardwarePixelBuffer.",
                    "D3D11Texture::_create2DTexWithSurface");
        return nullptr;
    }

    RenderTexture* ogreRenderTexture = pixelBuffer->getRenderTarget();
    if (ogreRenderTexture == nullptr)
    {
        OGRE_EXCEPT(Exception::ERR_RENDERINGAPI_ERROR, "Failed to get RenderTarget from PixelBuffer.",
                    "D3D11Texture::_create2DTexWithSurface");
        return nullptr;
    }

    return ogreRenderTexture;
}

This is based on how the sample in D3D11Image works.

Code: Select all

HRESULT CCube::InitRenderTarget(void* pResource)
{
    HRESULT hr = S_OK;

    IUnknown* pUnk = (IUnknown*)pResource;

    IDXGIResource* pDXGIResource;
    hr = pUnk->QueryInterface(__uuidof(IDXGIResource), (void**)&pDXGIResource);
    if (FAILED(hr))
    {
        return hr;
    }

    HANDLE sharedHandle;
    hr = pDXGIResource->GetSharedHandle(&sharedHandle);
    if (FAILED(hr))
    {
        return hr;
    }

    pDXGIResource->Release();

    IUnknown* tempResource11;
    hr = m_pd3dDevice->OpenSharedResource(sharedHandle, __uuidof(ID3D11Resource), (void**)(&tempResource11));
    if (FAILED(hr))
    {
        return hr;
    }

    ID3D11Texture2D* pOutputResource;
    hr = tempResource11->QueryInterface(__uuidof(ID3D11Texture2D), (void**)(&pOutputResource));
    if (FAILED(hr))
    {
        return hr;
    }
    tempResource11->Release();

    D3D11_RENDER_TARGET_VIEW_DESC rtDesc;
    rtDesc.Format = DXGI_FORMAT_B8G8R8A8_UNORM;
    rtDesc.ViewDimension = D3D11_RTV_DIMENSION_TEXTURE2D;
    rtDesc.Texture2D.MipSlice = 0;

    hr = m_pd3dDevice->CreateRenderTargetView(pOutputResource, &rtDesc, &m_pRenderTargetView);
    if (FAILED(hr))
    {
        return hr;
    }

    D3D11_TEXTURE2D_DESC outputResourceDesc;
    pOutputResource->GetDesc(&outputResourceDesc);
    if (outputResourceDesc.Width != m_Width || outputResourceDesc.Height != m_Height)
    {
        m_Width = outputResourceDesc.Width;
        m_Height = outputResourceDesc.Height;

        SetUpViewport();
    }

    m_pImmediateContext->OMSetRenderTargets(1, &m_pRenderTargetView, NULL);

    if (NULL != pOutputResource)
    {
        pOutputResource->Release();
    }

    return hr;
}

The problem is that my method in Ogre is not working. I exposed this method in Texture.h and overrode it in OgreD3D11Texture.h.
This is how I'm trying to use it in my code:

Code: Select all

texturePtr = TextureManager.getSingleton().create("MyText", ResourceGroupManager.DEFAULT_RESOURCE_GROUP_NAME);

renderTarget = texturePtr._create2DTexWithSurface(surface);
Arthurfernandes
Kobold
Posts: 30
Joined: Mon Oct 28, 2024 10:50 pm
x 2

Re: D3D11RenderTexture::getCustomAttribute missing?

Post by Arthurfernandes »

After debugging the source code, I can be more specific. The error appears in this line:

Code: Select all

RenderTexture* ogreRenderTexture = pixelBuffer->getRenderTarget();

As I showed before, this is how the pixelBuffer is created:

Code: Select all

D3D11HardwarePixelBuffer* pixelBuffer = new D3D11HardwarePixelBuffer(this,        // parent texture
                                                                     mDevice,
                                                                     0,           // mip level
                                                                     mSrcWidth,
                                                                     mSrcHeight,
                                                                     32,          // depth
                                                                     0,           // face
                                                                     PF_B8G8R8A8, // pixel format
                                                                     HardwareBuffer::HBU_STATIC);

The Width and Height are set correctly, but I manually specified the MipLevel, Depth, Face, PixelFormat, and HBU_STATIC. These values might be incorrect.

The error:

[screenshot of the error]

Arthurfernandes
Kobold
Posts: 30
Joined: Mon Oct 28, 2024 10:50 pm
x 2

Re: D3D11RenderTexture::getCustomAttribute missing?

Post by Arthurfernandes »

Maybe everything is working now, but the shaders aren't letting me test it. A new problem appeared after initializing the shader system as in the Python sample:

Code: Select all

ShaderGenerator.initialize();
ShaderGenerator shadergen = ShaderGenerator.getSingleton();

SGResolver sgres = new SGResolver(shadergen);
MaterialManager.getSingleton().addListener(sgres);

resourceGroupManager.initialiseAllResourceGroups();

RenderState renderState = shadergen.getRenderState(ShaderGenerator.DEFAULT_SCHEME_NAME);
renderState.addTemplateSubRenderState(shadergen.createSubRenderState("SGX_PerPixelLighting"));

scnMgr = root.createSceneManager();
shadergen.addSceneManager(scnMgr);

shadergen.validateMaterial(ShaderGenerator.DEFAULT_SCHEME_NAME, "Sinbad.material", "General");

I'm getting this error now:

[screenshot of the error]

paroj
OGRE Team Member
Posts: 2128
Joined: Sun Mar 30, 2014 2:51 pm
x 1141

Re: D3D11RenderTexture::getCustomAttribute missing?

Post by paroj »

try this change: https://github.com/paroj/ogre/commit/6d ... 0bc5e8f62d

it should give you a SharedHandle for all applicable textures

Arthurfernandes
Kobold
Posts: 30
Joined: Mon Oct 28, 2024 10:50 pm
x 2

Re: D3D11RenderTexture::getCustomAttribute missing?

Post by Arthurfernandes »

Maybe a little bit of background is necessary to explain the error more clearly.

I created a whole new method to use in Ogre. Basically, the D3D11Image component provides a DX11 texture as a generic pointer for rendering. So, I created a method that receives this pointer and uses it to create an Ogre texture. Perhaps the code will explain it better:

Code: Select all

void D3D11Texture::_create2DTexSurface()
{   
    if (!mSurface)
        throw std::runtime_error("Invalid resource provided to _create2DTex.");

    HRESULT hr = S_OK;

    // Convert void* to IUnknown* and then to ID3D11Texture2D*
    IUnknown* pUnk = (IUnknown*)mSurface;

    IDXGIResource* pDXGIResource;
    hr = pUnk->QueryInterface(__uuidof(IDXGIResource), (void**)&pDXGIResource);
    if (FAILED(hr))
    {
        throw std::runtime_error("Failed to query IDXGIResource from the provided surface.");
    }

    HANDLE sharedHandle;
    hr = pDXGIResource->GetSharedHandle(&sharedHandle);
    if (FAILED(hr))
    {
        throw std::runtime_error("Failed to get shared handle from the surface.");
    }

    pDXGIResource->Release();

    IUnknown* tempResource11;
    hr = mDevice->OpenSharedResource(sharedHandle, __uuidof(ID3D11Resource), (void**)(&tempResource11));
    if (FAILED(hr))
    {
        throw std::runtime_error("Failed to open shared resource on the D3D11 device.");
    }

    ID3D11Texture2D* pOutputResource;
    hr = tempResource11->QueryInterface(__uuidof(ID3D11Texture2D), (void**)(&pOutputResource));
    if (FAILED(hr))
    {
        throw std::runtime_error("Failed to query ID3D11Texture2D from the shared resource.");
    }
    tempResource11->Release();

    mp2DTex = pOutputResource;

    D3D11_TEXTURE2D_DESC texDesc;
    mp2DTex->GetDesc(&texDesc);

    ComPtr<ID3D11RenderTargetView> renderTargetView;
    hr = mDevice->CreateRenderTargetView(mp2DTex.Get(), nullptr, renderTargetView.GetAddressOf());
    if (FAILED(hr))
        throw std::runtime_error("Failed to create RenderTargetView for D3D11 texture.");

    // set the base texture we'll use in the render system
    _queryInterface<ID3D11Texture2D, ID3D11Resource>(mp2DTex, &mpTex);

    _create2DResourceView();
}

I'm using the enum TEX_TYPE_2D_SHARED with this method.

Code: Select all

case TEX_TYPE_2D_SHARED:
    this->_create2DTexSurface();
    break;

I believe I made the necessary changes to use it correctly. In the TextureManager, I created a new method to manually create a texture using a surface pointer. It's similar to the existing method, but the difference is the new parameter: surface.

Code: Select all

TexturePtr createManualWithSurface(const String& name, const String& group, TextureType texType, uint width, uint height,
                        int numMipmaps, PixelFormat format, int usage = TU_DEFAULT, void* surface = nullptr,
                        ManualResourceLoader* loader = 0, bool hwGammaCorrection = false, uint fsaa = 0,
                        const String& fsaaHint = BLANKSTRING)
{
    return createManualWithSurface(name, group, texType, width, height, 1, numMipmaps, format, usage, surface, loader,
                        hwGammaCorrection, fsaa, fsaaHint);
}

In my C# code, I use it like this:

Code: Select all

texturePtr = TextureManager.getSingleton().createManualWithSurface(
                "Ogre Render",
                ResourceGroupManager.DEFAULT_RESOURCE_GROUP_NAME,
                TextureType.TEX_TYPE_2D_SHARED,
                (uint)ViewportSize.Width,
                (uint)ViewportSize.Height,
                32,
                0,
                PixelFormat.PF_B8G8R8A8,
                0x20,
                surface);

renderTarget = texturePtr.getBuffer().getRenderTarget();

The thing is, I no longer use SetBackBuffer. Instead, I receive a texture from the component and need to use it in Ogre.

Code: Select all

private void DoRender(IntPtr surface, bool isNewSurface)
{
    if (ogre.DX == OgreImage.Engine.DX11)
    {
        if (isNewSurface)
        {
            ogre.SetSurface(surface);
        }

        ogre.RenderOneFrame();
    }
}

This is how the D3D11Image sample uses the surface attribute:

Code: Select all

HRESULT CCube::InitRenderTarget(void* pResource)
{
    HRESULT hr = S_OK;

    IUnknown* pUnk = (IUnknown*)pResource;

    IDXGIResource* pDXGIResource;
    hr = pUnk->QueryInterface(__uuidof(IDXGIResource), (void**)&pDXGIResource);
    if (FAILED(hr))
    {
        return hr;
    }

    HANDLE sharedHandle;
    hr = pDXGIResource->GetSharedHandle(&sharedHandle);
    if (FAILED(hr))
    {
        return hr;
    }

    pDXGIResource->Release();

    IUnknown* tempResource11;
    hr = m_pd3dDevice->OpenSharedResource(sharedHandle, __uuidof(ID3D11Resource), (void**)(&tempResource11));
    if (FAILED(hr))
    {
        return hr;
    }

    ID3D11Texture2D* pOutputResource;
    hr = tempResource11->QueryInterface(__uuidof(ID3D11Texture2D), (void**)(&pOutputResource));
    if (FAILED(hr))
    {
        return hr;
    }
    tempResource11->Release();

    D3D11_RENDER_TARGET_VIEW_DESC rtDesc;
    rtDesc.Format = DXGI_FORMAT_B8G8R8A8_UNORM;
    rtDesc.ViewDimension = D3D11_RTV_DIMENSION_TEXTURE2D;
    rtDesc.Texture2D.MipSlice = 0;

    hr = m_pd3dDevice->CreateRenderTargetView(pOutputResource, &rtDesc, &m_pRenderTargetView);
    if (FAILED(hr))
    {
        return hr;
    }

    D3D11_TEXTURE2D_DESC outputResourceDesc;
    pOutputResource->GetDesc(&outputResourceDesc);
    if (outputResourceDesc.Width != m_Width || outputResourceDesc.Height != m_Height)
    {
        m_Width = outputResourceDesc.Width;
        m_Height = outputResourceDesc.Height;

        SetUpViewport();
    }

    m_pImmediateContext->OMSetRenderTargets(1, &m_pRenderTargetView, NULL);

    if (NULL != pOutputResource)
    {
        pOutputResource->Release();
    }

    return hr;
}

This is the texture description provided by the component.

**************TEXTURE DESCRIPTION**************
Width: 884
Height: 501
MipLevel: 1
ArraySize: 1
Format: 87
Usage: 0
BindFlags: 40
CPUAccessFlags: 0
MiscFlags: 2
SampleDesc Count: 1
SampleDesc Quality: 0
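
(For reference, decoding those numeric fields against the D3D11/DXGI headers: Format 87 is DXGI_FORMAT_B8G8R8A8_UNORM, BindFlags 40 is D3D11_BIND_SHADER_RESOURCE (0x08) | D3D11_BIND_RENDER_TARGET (0x20), and MiscFlags 2 is D3D11_RESOURCE_MISC_SHARED, so the component really does hand over a shared render-target surface.)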


Arthurfernandes
Kobold
Posts: 30
Joined: Mon Oct 28, 2024 10:50 pm
x 2

Re: D3D11RenderTexture::getCustomAttribute missing?

Post by Arthurfernandes »

Okay, so everything was indeed working, but I was initializing all resource groups in the wrong place, which caused an error. :cry:

Anyway, the code works now. The only issue I need to fix is how the rendering is happening because it flickers when I run the code. Sometimes only the head appears, then the legs, and sometimes the whole body. However, when I move the scene rendering out of the component, Sinbad appears correctly. I think this might be related to the Flush function in DirectX. In the example from the D3D11Image repository, where they render a cube, there's an explicit call to Flush at the end of the code. If I remove it, nothing shows in the window. Half of the task is done, but there are still things to do.
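
For context, the call in question at the end of the cube sample's render path is simply a flush on the immediate context:

Code: Select all

m_pImmediateContext->Flush(); // without this, D3DImage never picks up the finished frame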
[screenshot of the partially rendered scene]

Arthurfernandes
Kobold
Posts: 30
Joined: Mon Oct 28, 2024 10:50 pm
x 2

Re: D3D11RenderTexture::getCustomAttribute missing?

Post by Arthurfernandes »

All done. I don't know if it would be interesting for the Ogre repository to have this, but basically, there are some methods necessary to run Ogre in a D3D11Image.

A modified createManual method in TextureManager. The only difference is a new parameter that receives a generic pointer provided by the component. This pointer represents a DirectX 11 texture generated by the D3D11Image.

Code: Select all

TexturePtr TextureManager::createManualWithSurface(const String& name, const String& group, TextureType texType,
                                                   uint width, uint height, uint depth, int numMipmaps,
                                                   PixelFormat format, int usage, void* surface,
                                                   ManualResourceLoader* loader, bool hwGamma, uint fsaa,
                                                   const String& fsaaHint)
{
    TexturePtr ret;

    OgreAssert(width && height && depth, "total size of texture must not be zero");

    // Check for texture support
    const auto caps = Root::getSingleton().getRenderSystem()->getCapabilities();
    if (((texType == TEX_TYPE_3D) && !caps->hasCapability(RSC_TEXTURE_3D)) ||
        ((texType == TEX_TYPE_2D_ARRAY) && !caps->hasCapability(RSC_TEXTURE_2D_ARRAY)))
        return ret;

    ret = create(name, group, true, loader);

    if (!ret)
        return ret;

    ret->setSurface(surface);
    ret->setTextureType(texType);
    ret->setWidth(width);
    ret->setHeight(height);
    ret->setDepth(depth);
    ret->setNumMipmaps((numMipmaps == MIP_DEFAULT) ? mDefaultNumMipmaps : static_cast<uint32>(numMipmaps));
    ret->setFormat(format);
    ret->setUsage(usage);
    ret->setHardwareGammaEnabled(hwGamma);
    ret->setFSAA(fsaa, fsaaHint);
    ret->createInternalResources();
    return ret;
}

What I actually use is the surface; the other attributes come from the surface, but I pass them anyway since they might be useful for something else.

In the OgreD3D11Texture class, I also created a new method that essentially converts the generic pointer to my DirectX 11 texture and then performs the necessary Ogre operations:

Code: Select all

void D3D11Texture::_create2DTexSurface()
{
    LogManager::getSingleton().logMessage("Entered the _create2DTexSurface method");

    if (!mSurface)
        throw std::runtime_error("Invalid resource provided to _create2DTex.");

    HRESULT hr = S_OK;

    IUnknown* pUnk = (IUnknown*)mSurface;

    IDXGIResource* pDXGIResource;
    hr = pUnk->QueryInterface(__uuidof(IDXGIResource), (void**)&pDXGIResource);
    if (FAILED(hr))
    {
        throw std::runtime_error("Failed to query IDXGIResource from the provided surface.");
    }

    HANDLE sharedHandle;
    hr = pDXGIResource->GetSharedHandle(&sharedHandle);
    if (FAILED(hr))
    {
        throw std::runtime_error("Failed to get shared handle from the surface.");
    }

    pDXGIResource->Release();

    IUnknown* tempResource11;
    hr = mDevice->OpenSharedResource(sharedHandle, __uuidof(ID3D11Resource), (void**)(&tempResource11));
    if (FAILED(hr))
    {
        throw std::runtime_error("Failed to open shared resource on the D3D11 device.");
    }

    ID3D11Texture2D* pOutputResource;
    hr = tempResource11->QueryInterface(__uuidof(ID3D11Texture2D), (void**)(&pOutputResource));
    if (FAILED(hr))
    {
        throw std::runtime_error("Failed to query ID3D11Texture2D from the shared resource.");
    }
    tempResource11->Release();

    mp2DTex = pOutputResource;

    D3D11_TEXTURE2D_DESC desc;
    mp2DTex->GetDesc(&desc);

    D3D11_RENDER_TARGET_VIEW_DESC rtDesc;
    rtDesc.Format = DXGI_FORMAT_B8G8R8A8_UNORM;
    rtDesc.ViewDimension = D3D11_RTV_DIMENSION_TEXTURE2D;
    rtDesc.Texture2D.MipSlice = 0;

    ComPtr<ID3D11RenderTargetView> renderTargetView;
    hr = mDevice->CreateRenderTargetView(mp2DTex.Get(), nullptr, renderTargetView.GetAddressOf());
    if (FAILED(hr))
        throw std::runtime_error("Failed to create RenderTargetView for D3D11 texture.");

    _queryInterface<ID3D11Texture2D, ID3D11Resource>(mp2DTex, &mpTex);

    _create2DResourceView();
}

And finally, the flush. I had to add a method directly to the D3D11Device class:

Code: Select all

void D3D11Device::Flush()
{
    if (mImmediateContext)
    {
        mImmediateContext->Flush();
    }
}
And then I use it in D3D11RenderTexture:

Code: Select all

void D3D11RenderTexture::doFlush() { mDevice.Flush(); }
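
A possible per-frame flow in the host application then looks like this (a sketch; doFlush is the helper added above, and the cast assumes the render target really is a D3D11RenderTexture):

Code: Select all

// render, then flush so the shared surface is complete before WPF samples it
root->renderOneFrame();
static_cast<Ogre::D3D11RenderTexture*>(renderTarget)->doFlush();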

I think that’s everything I’ve done. It’s a mess, and I plan to refactor everything, but it’s working! :D

paroj
OGRE Team Member
Posts: 2128
Joined: Sun Mar 30, 2014 2:51 pm
x 1141

Re: D3D11RenderTexture::getCustomAttribute missing?

Post by paroj »

Arthurfernandes wrote: Mon Dec 30, 2024 3:36 pm

I don't know if it would be interesting for the Ogre repository to have this,

sure. It's likely that somebody else would want to use this as well. I guess your GitHub is not yet updated with the working version?

Could you try to refactor this to work as I drafted above? I.e. let Ogre create a shareable D3D11 texture and pass that to your D3D11Image as you did for D3D9? This allows us to use the existing API without the need for a new create function.

Arthurfernandes
Kobold
Posts: 30
Joined: Mon Oct 28, 2024 10:50 pm
x 2

Re: D3D11RenderTexture::getCustomAttribute missing?

Post by Arthurfernandes »

paroj wrote: Thu Jan 02, 2025 2:49 pm
Arthurfernandes wrote: Mon Dec 30, 2024 3:36 pm

I don't know if it would be interesting for the Ogre repository to have this,

sure. It's likely that somebody else would want to use this as well. I guess your GitHub is not yet updated with the working version?

Could you try to refactor this to work as I drafted above? I.e. let Ogre create a shareable D3D11 texture and pass that to your D3D11Image as you did for D3D9? This allows us to use the existing API without the need for a new create function.

I could implement it as you suggested, but the problem is that this component inherits from D3DImage, so the "old" method SetBackBuffer remains the same. I can't use DirectX 11 directly because, to do so, I need the new methods created in D3D11Image (OnRender() and RequestRender()). The workflow is as follows: the component generates a texture, which I need to use and then pass to Ogre for manipulation.

What I could do to avoid creating a whole new method is to simply add a new parameter to the createManual method to accept this texture—optional, of course. If the parameter is provided, it will be handled within the texture creation method. Would that work for you?

Regarding GitHub, I’ve already committed my changes to a fork. I cleaned up everything unnecessary and kept only the working components.
(https://github.com/Arthurfernades/ogre/ ... 16f92f0be1)

paroj
OGRE Team Member
Posts: 2128
Joined: Sun Mar 30, 2014 2:51 pm
x 1141

Re: D3D11RenderTexture::getCustomAttribute missing?

Post by paroj »

can you push your D3D11Image implementation so I can check how you do the interop?

also TEX_TYPE_2D_WITH_SURFACE is not really needed. You can check whether mSurface != NULL instead.

Arthurfernandes
Kobold
Posts: 30
Joined: Mon Oct 28, 2024 10:50 pm
x 2

Re: D3D11RenderTexture::getCustomAttribute missing?

Post by Arthurfernandes »

paroj wrote: Sun Jan 05, 2025 3:00 pm

can you push your D3D11Image implementation so I can check how you do the interop?

also TEX_TYPE_2D_WITH_SURFACE is not really needed. You can check whether mSurface != NULL instead.

Sure. Here's the link: https://github.com/Arthurfernades/Ogre-D3D11Image. You can download the ZIP file, open the solution, and build the project. Some code in MainWindow.xaml.cs is from the original D3D11Image sample (https://github.com/microsoft/WPFDXInterop). I left it there in case you need to understand how the original sample works.

paroj
OGRE Team Member
Posts: 2128
Joined: Sun Mar 30, 2014 2:51 pm
x 1141

Re: D3D11RenderTexture::getCustomAttribute missing?

Post by paroj »

finally had the time to look into this. Turns out things are more complicated than anticipated.

My idea was to keep surface ownership with ogre and give a shared handle to WPF - just as it is done on D3D9.

However, currently the surface is owned by the SurfaceQueue in WPFDXInterop. Interestingly, they also use the queue on D3D9, which works just fine for us.
I guess they use the queue for some kind of streaming to improve performance. I also assume that the flickering you observe comes from that, as you never switch the target surface.

Anyway, I assume you do not feel like diving into this and fixing it properly. So the best solution is to start with what you've got. For that you should:

  • drop TEX_TYPE_2D_WITH_SURFACE
  • export virtual void _setSurface via Texture

The API would then be

Code: Select all

auto tex = TextureManager::create(...)
tex->setUsage(TU_RENDERTARGET);
tex->_setSurface(handle);
tex->load();

createInternalResourcesImpl will only be called at load, so you can check for a shared surface there and do what you do now.
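
A sketch of that check inside D3D11Texture::createInternalResourcesImpl (mSurface being the member set by the _setSurface call above):

Code: Select all

if (mSurface)
{
    // open the shared WPF surface instead of creating a fresh texture
    _create2DTexSurface();
    return;
}
// ... normal creation path ...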

Arthurfernandes
Kobold
Posts: 30
Joined: Mon Oct 28, 2024 10:50 pm
x 2

Re: D3D11RenderTexture::getCustomAttribute missing?

Post by Arthurfernandes »

I've implemented the changes (https://github.com/OGRECave/ogre/commit ... a9dde08343). Everything is working as expected, exactly as you described.

However, I have a question regarding the flush method. In the D3D11Image sample code, after initializing the surface, there's an explicit call to flush at the end of the cube code. When I create the texture and call renderOneFrame, nothing appears on the screen unless I also call flush. To address this, I exposed the flush method in Ogre. Is this the correct approach, or is there an alternative way to achieve this?

paroj
OGRE Team Member
Posts: 2128
Joined: Sun Mar 30, 2014 2:51 pm
x 1141

Re: D3D11RenderTexture::getCustomAttribute missing?

Post by paroj »

feel free to create a pull-request for your changes, so it is easier to discuss the code and so it is checked by the CI tests.

However, I have a question regarding the flush method

first, this method should be added to RenderSystem and not Texture, as it is not texture-specific.

You could also try using texture->getBuffer()->blitToMemory and see whether it forces a flush.
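
A sketch of that probe (blitToMemory, PixelBox, and Box are stock Ogre API; reading back even a single pixel forces the GPU to finish outstanding work on the texture):

Code: Select all

// read one pixel back to CPU memory, which may act as an implicit flush
Ogre::uint32 pixel;
Ogre::PixelBox box(1, 1, 1, Ogre::PF_B8G8R8A8, &pixel);
texture->getBuffer()->blitToMemory(Ogre::Box(0, 0, 1, 1), box);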

However, I still think that the underlying issue is that WPFDXInterop creates a SurfaceQueue with multiple textures, but you only render to one of them.