I have a crash in Ogre 2.1.2 (and 2.1.0) when using the OpenGL3Plus renderer.
This is an existing Qt application that has been running with Ogre-next 2.1.0 on Windows using DX11 for many years. Recently it was decided to port it to Linux, and due to time constraints, upgrading to the latest Ogre is not a practical option.
The port itself went fine: Ogre starts, assets are loaded, we get a window, etc., but during the first renderOneFrame it crashes in HlmsPbsDatablock::uploadToConstBuffer, in the memcpy at (for me) line 438.
The function in question, for reference:
Code: Select all
//-----------------------------------------------------------------------------------
void HlmsPbsDatablock::uploadToConstBuffer( char *dstPtr )
{
    _padding0 = mAlphaTestThreshold;

    float oldFresnelR = mFresnelR;
    float oldFresnelG = mFresnelG;
    float oldFresnelB = mFresnelB;
    float oldkDr = mkDr;
    float oldkDg = mkDg;
    float oldkDb = mkDb;

    if( mTransparencyMode == Transparent )
    {
        //Precompute the transparency CPU-side.
        if( mWorkflow != MetallicWorkflow )
        {
            mFresnelR *= mTransparencyValue;
            mFresnelG *= mTransparencyValue;
            mFresnelB *= mTransparencyValue;
        }

        mkDr *= mTransparencyValue * mTransparencyValue;
        mkDg *= mTransparencyValue * mTransparencyValue;
        mkDb *= mTransparencyValue * mTransparencyValue;
    }

    memcpy( dstPtr, &mBgDiffuse[0], MaterialSizeInGpu ); /* here I get a segmentation fault */

    mkDr = oldkDr;
    mkDg = oldkDg;
    mkDb = oldkDb;
    mFresnelR = oldFresnelR;
    mFresnelG = oldFresnelG;
    mFresnelB = oldFresnelB;
}
The crash seems similar to viewtopic.php?p=516097&sid=3349cb708aa2 ... e6#p516097 but I don't know whether it is the same issue.
On Windows with DX11 everything works fine, so I suspect something OpenGL3+ specific.
The scene is not very complex, but we do have multiple lights; some are non-shadow-casting and are used to light the backsides of objects as well as to provide ambient lighting.
Since the same issue occurs on both KUbuntu 25.10 (Ogre 2.1.2) and Windows 11 (Ogre 2.1.0, OpenGL3Plus), but not on Windows 11 (Ogre 2.1.0 DX11), I suspect it is not a driver issue.
Unfortunately the scene is not easily shareable, so I cannot provide a minimal test case.
I did enable ENABLE_GL_CHECK in OgreGL3PlusPrerequisites.h (at line 112):
Code: Select all
#define ENABLE_GL_CHECK 1
This resulted in an exception being thrown in GL3PlusVaoManager::waitFor around line 1260:
Code: Select all
if( waitRet == GL_WAIT_FAILED )
{
    OGRE_EXCEPT( Exception::ERR_RENDERINGAPI_ERROR,
                 "Failure while waiting for a GL Fence. Could be out of GPU memory. "
                 "Update your video card drivers. If that doesn't help, "
                 "contact the developers.",
                 "GL3PlusVaoManager::getDynamicBufferCurrentFrame" );

    return fenceName;
}
The GL_WAIT_FAILED result seems to indicate a GPU memory issue, but as I said, everything works fine with DX11, and the scene is not very complex; it is just built up through a lot of dynamic code. It is basically this scene: https://www.movella.com/hs-fs/hubfs/Ima ... 3)-1-1.png
When I'm not actively rendering but the Ogre Root window exists, I can see the dynamic buffer cycling through three temporary buffers (as expected, since mDynamicBufferMultiplier is 3), with the fences being created and deleted as expected. But as soon as rendering starts, the crash happens almost immediately.
Any ideas on how to further trace or fix this issue are very welcome.
