possible bug when destroying render texture and compositor workspace?

Design / architecture / roadmap discussions related to future of Ogre3D (version 2.0 and above)
igamenovoer
Gnoblar
Posts: 1
Joined: Thu Dec 13, 2018 7:22 am


Post by igamenovoer » Thu Dec 13, 2018 7:57 am

Ogre Version: v2.1, pulled from git master on Dec. 10
Operating System: Windows 10, build 17134
Render System: D3D11 and OpenGL 3+

I am working on an application that needs to create and destroy render textures of different sizes frequently.
Originally I created a single compositor workspace and several render textures, intending to reuse that one workspace to render to the different textures.
However, going through the Ogre tutorials and examples, I could not find any clue about how to change the output target of a workspace.
So instead, each time I create an RT, I also add a workspace that renders to it. During destruction, the RT and the workspace are destroyed together.
At this point I ran into a memory leak: even though the textures and workspaces are destroyed, memory usage climbs quickly and never goes down.

To check whether those textures and workspaces were actually being destroyed, I added some printf calls to their destructors.
They are destroyed; the destructors are called.

I was using D3D11; when I switched to OpenGL 3+, the memory leak was GONE! So this looks like a D3D11-specific issue.

Searching the forum, I learned that D3D11 caches resources and does not release them until ID3D11DeviceContext::Flush() is called.
So I added a call to D3D11RenderSystem::_clearStateAndFlushCommandBuffer(). That works, but only partially: if the textures all have a fixed size
there is no leak, but with varying sizes the leak still occurs.

Then I looked through the forum and found that I may also have to call RenderSystem::_cleanupDepthBuffers(false) to get rid of Ogre's depth-buffer caching.
Adding this call works: no memory leak, even with changing texture sizes.

However, surprisingly, OpenGL 3+ now leaks memory! Under OpenGL 3+, calling RenderSystem::_cleanupDepthBuffers(false) leads to a memory leak,
while without the call it works fine.

So I wonder: is this a bug, or am I doing something wrong?
Btw, calling D3D11RenderSystem::_clearStateAndFlushCommandBuffer() directly doesn't feel right. I think this kind of issue should be handled at the RenderSystem level, without casting down to a concrete type like D3D11RenderSystem.

The test code is attached below. With both cleanup calls in place, it leaks under OpenGL 3+ but works under D3D11.

Code:

#include <iostream>
#include <random>
#include "Ogre/Ogre.h"
#include "Ogre/Compositor/OgreCompositorManager2.h"
#include "Ogre/RenderSystems/Direct3D11/OgreD3D11RenderSystem.h"

#ifdef NDEBUG
const char* plugin_d3d = "G:/ogre-2-1/build/sdk/bin/Release/RenderSystem_Direct3D11.dll";
const char* plugin_gl = "G:/ogre-2-1/build/sdk/bin/Release/RenderSystem_GL3Plus.dll";
#else
const char* plugin_d3d = "G:/ogre-2-1/build/sdk/bin/Debug/RenderSystem_Direct3D11_d.dll";
const char* plugin_gl = "G:/ogre-2-1/build/sdk/bin/Debug/RenderSystem_GL3Plus_d.dll";
#endif

enum class RenderSystemType {
	D3D11, OPENGL
};

const char* render_system_name[] = {
	"Direct3D11 Rendering Subsystem",
	"OpenGL 3+ Rendering Subsystem"
};

int main() {
	Ogre::Root root("", "", "");
	
	root.loadPlugin(plugin_d3d);
	root.loadPlugin(plugin_gl);

	{
		//select rendersystem
		//RenderSystemType rdtype = RenderSystemType::D3D11;
		RenderSystemType rdtype = RenderSystemType::OPENGL;
		auto rdsys = root.getRenderSystemByName(render_system_name[(int)rdtype]);
		root.setRenderSystem(rdsys);
	}

	root.initialise(false);
	auto rdwin = root.createRenderWindow("render window", 1000, 1000, false);

	//create scene and camera, the compositor needs these
	auto scene = root.createSceneManager(Ogre::SceneType::ST_GENERIC, 1, Ogre::INSTANCING_CULLING_SINGLETHREAD);
	auto camera = scene->createCamera("camera");

	//create workspace definition
	const char* workspace_def_name = "dummy_wkdef";
	auto comp_manager = root.getCompositorManager2();
	comp_manager->createBasicWorkspaceDef(workspace_def_name, Ogre::ColourValue::ZERO);

	std::default_random_engine rand_engine;
	std::uniform_int_distribution<int> int_dist(400, 1000);

	//repeatedly create render textures and workspaces, and destroy them
	for (int i = 0; i < 10000; i++) {

		auto texmgr = root.getTextureManager();
		
		//random width and height
		//auto width = int_dist(rand_engine);
		//auto height = width;
		auto width = 1000;
		auto height = 1000;

		printf("iteration %d, texture size = %d x %d\n", i, width, height);
		std::string name = "rtt_" + std::to_string(i);

		//create texture
		auto texture = texmgr->createManual(name, Ogre::ResourceGroupManager::DEFAULT_RESOURCE_GROUP_NAME,
			Ogre::TEX_TYPE_2D, width, height, 1, Ogre::PF_R8G8B8A8, Ogre::TU_RENDERTARGET);

		//create workspace
		auto workspace = root.getCompositorManager2()->addWorkspace(
			scene, texture->getBuffer()->getRenderTarget(), camera, workspace_def_name, true
		);

		//destroy the workspace and texture
		root.getCompositorManager2()->removeWorkspace(workspace);
		texmgr->remove(texture);
		texture.reset();

		//Calling this in d3d11 prevents memory leak, but in opengl it causes memory leak
		//Is this a bug?
		root.getRenderSystem()->_cleanupDepthBuffers(false);

		//in d3d11, we need to call Flush() to get the textures deleted
		auto rdsys_d3d = dynamic_cast<Ogre::D3D11RenderSystem*>(root.getRenderSystem());
		if (rdsys_d3d) {
			rdsys_d3d->_clearStateAndFlushCommandBuffer();
		}
	}

	return 0;
}
