[2.1] OpenGL & Intel HD 2500

Discussion area about developing with Ogre-Next (2.1, 2.2 and beyond)


Kenshido
Gremlin
Posts: 153
Joined: Tue Jun 09, 2009 9:31 am
Location: Russia
x 12

[2.1] OpenGL & Intel HD 2500

Post by Kenshido »

Hello, there!

I read here that Ogre 2.1 cannot work with Intel HD 3000 cards or below (except for the HD 2500):
Intel HD 3000 or below (except HD 2500): The hardware ought to be supported. Unfortunately Intel has decided to not update their OpenGL drivers for these cards, and as such don’t have the necessary features to support Ogre 2.1. Support for these cards should be restored once D3D11 support catches up though, and Linux users may hope for the Open Source Mesa implementation to also catch up eventually. Until then, these cards will not be able to run our bleeding edge code.
I have an Intel HD 2500 card, and I see the following render in Sample_PbsMaterials:
ogre21.jpg
ogre21_2.jpg
dark_sylinc
OGRE Team Member
Posts: 5299
Joined: Sat Jul 21, 2007 4:55 pm
Location: Buenos Aires, Argentina
x 1279

Re: [2.1] OpenGL & Intel HD 2500

Post by dark_sylinc »

Unfortunately that's a driver error, which has been corrected in the very latest drivers (tested on Intel HD 4400).

However, I'm worried that Intel may have dropped driver support/updates for their "barely old" devices. Their behavior seems to be that they only focus on the latest hardware they've released :(

That's a shame, because the HW capabilities of the HD 2500 are in theory enough to run 2.1.
uelkfr
Gnoblar
Posts: 12
Joined: Wed Jun 05, 2013 6:19 am

Re: [2.1] OpenGL & Intel HD 2500

Post by uelkfr »

Intel HD 4000 here. Updating to the latest drivers helped, but I still have a font bug.
font_bug.jpg
dark_sylinc
OGRE Team Member
Posts: 5299
Joined: Sat Jul 21, 2007 4:55 pm
Location: Buenos Aires, Argentina
x 1279

Re: [2.1] OpenGL & Intel HD 2500

Post by dark_sylinc »

Darn! It's still a driver bug. The driver is ignoring the fixed function texture swizzles.

Fortunately I know a workaround, and I noticed this "workaround" is needed for DX11 too in order to consume less memory.
I just pushed a fix. Pull the latest changes and try again.
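For context, this is roughly what a fixed function texture swizzle looks like on the API side (just a minimal sketch of the concept, not the actual fix I pushed; the texture name is a placeholder):

Code: Select all

// Minimal sketch (requires GL 3.3 / ARB_texture_swizzle headers): a
// single-channel texture (e.g. a font's alpha map) gets remapped via the
// fixed function swizzle state so the shader sees white + alpha.
// A driver that ignores this state forces the remap to be done elsewhere,
// e.g. in the shader or by expanding the texture data to RGBA.
GLint swizzle[4] = { GL_ONE, GL_ONE, GL_ONE, GL_RED };
glBindTexture( GL_TEXTURE_2D, fontTexture ); // 'fontTexture' is a placeholder name
glTexParameteriv( GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_RGBA, swizzle );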
uelkfr
Gnoblar
Posts: 12
Joined: Wed Jun 05, 2013 6:19 am

Re: [2.1] OpenGL & Intel HD 2500

Post by uelkfr »

Thank you, it works perfectly now!
uelkfr
Gnoblar
Posts: 12
Joined: Wed Jun 05, 2013 6:19 am

Re: [2.1] OpenGL & Intel HD 2500

Post by uelkfr »

Another, perhaps non-Ogre-related, issue is that on OpenGL I'm getting exactly Ogre::ColourValue backgroundColour = Ogre::ColourValue( 0.2f, 0.4f, 0.6f ) as the workspace clear colour, but on Direct3D I'm getting a much lighter one, about (0.48f, 0.66f, 0.79f). The overall scene also seems a bit lighter (maybe gamma, maybe brightness). I haven't modified any settings related to my video card.
dark_sylinc
OGRE Team Member
Posts: 5299
Joined: Sat Jul 21, 2007 4:55 pm
Location: Buenos Aires, Argentina
x 1279

Re: [2.1] OpenGL & Intel HD 2500

Post by dark_sylinc »

One of them is missing gamma correction. Is this happening in the default samples?
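To make it concrete (a quick back-of-the-envelope sketch using the approximate gamma 2.2 curve rather than the exact sRGB piecewise formula): sRGB-encoding your linear clear colour lands almost exactly on the values you measured:

Code: Select all

// Quick sketch: approximate sRGB encode of the sample's linear clear colour.
#include <cstdio>
#include <cmath>

int main()
{
    const float linear[3] = { 0.2f, 0.4f, 0.6f };
    for( int i = 0; i < 3; ++i )
        printf( "%.2f -> %.2f\n", linear[i], std::pow( linear[i], 1.0f / 2.2f ) );
    return 0; // Prints approximately 0.48, 0.66 and 0.79
}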
uelkfr
Gnoblar
Posts: 12
Joined: Wed Jun 05, 2013 6:19 am

Re: [2.1] OpenGL & Intel HD 2500

Post by uelkfr »

Unfortunately, "sRGB Gamma Conversation" has no effect on my system. My OpenGL screenshot been posted above and here is Direct3D screenshot:
direct3d.jpg
dark_sylinc
OGRE Team Member
Posts: 5299
Joined: Sat Jul 21, 2007 4:55 pm
Location: Buenos Aires, Argentina
x 1279

Re: [2.1] OpenGL & Intel HD 2500

Post by dark_sylinc »

The Direct3D version is the correct one. I didn't notice your first picture was actually incorrect. Yet another OpenGL bug in Intel drivers. **sigh**
The samples are overriding the sRGB gamma setting to be always on; that's why the option isn't working for you.
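If you want to see where, the override usually happens through one of the standard mechanisms (a sketch only; the option/param names below are the usual Ogre ones, not necessarily the exact line in the sample code, and 'root' stands in for your Ogre::Root instance):

Code: Select all

// Sketch: the two usual ways gamma gets forced on.
Ogre::RenderSystem *renderSystem = root->getRenderSystem();
renderSystem->setConfigOption( "sRGB Gamma Conversion", "Yes" );

// ...or per window, at creation time:
Ogre::NameValuePairList miscParams;
miscParams["gamma"] = "true";
Ogre::RenderWindow *window = root->createRenderWindow( "Main", 1280, 720,
                                                       false, &miscParams );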

Is this Windows XP/Vista/7/8? Can you upload your Ogre.log file somewhere? (after running using OpenGL)
uelkfr
Gnoblar
Posts: 12
Joined: Wed Jun 05, 2013 6:19 am

Re: [2.1] OpenGL & Intel HD 2500

Post by uelkfr »

Of course,
http://pastebin.com/M9iSBs1p

It is Windows 7.
dark_sylinc
OGRE Team Member
Posts: 5299
Joined: Sat Jul 21, 2007 4:55 pm
Location: Buenos Aires, Argentina
x 1279

Re: [2.1] OpenGL & Intel HD 2500

Post by dark_sylinc »

Thanks! That was useful.
I pushed a fix; it should work now.
KammutierSpule
Gnoblar
Posts: 13
Joined: Tue Oct 10, 2017 12:22 pm
x 1

Re: [2.1] OpenGL & Intel HD 2500

Post by KammutierSpule »

I am running Sample_PbsMaterials on Linux on an Intel GPU, and from what I read in this thread, it is not rendering sRGB correctly.
I am able to run Unity on my machine, and it does not seem to suffer from this problem.

Is this still an issue with Ogre?

EDIT: GPU information
Graphics: Card: Intel Haswell-ULT Integrated Graphics Controller bus-ID: 00:02.0
Display Server: X.Org 1.18.4 drivers: intel (unloaded: fbdev,vesa) Resolution: 1920x1080@60.00hz
GLX Renderer: Mesa DRI Intel Haswell GLX Version: 3.0 Mesa 17.0.7 Direct Rendering: Yes
dark_sylinc
OGRE Team Member
Posts: 5299
Joined: Sat Jul 21, 2007 4:55 pm
Location: Buenos Aires, Argentina
x 1279

Re: [2.1] OpenGL & Intel HD 2500

Post by dark_sylinc »

This one will be difficult to track.

It appears to be a driver or HW issue, which Unity is likely either working around or emulating.
You can try to debug further what's going on by opening GL3Plus/src/windowing/GLX/OgreGLXWindow.cpp, where you'll find GLXWindow::create:

Code: Select all

int minAttribs[] = {
    GLX_DRAWABLE_TYPE,  GLX_WINDOW_BIT,
    GLX_RENDER_TYPE,    GLX_RGBA_BIT,
    GLX_RED_SIZE,       1,
    GLX_BLUE_SIZE,      1,
    GLX_GREEN_SIZE,     1,
#if OGRE_NO_QUAD_BUFFER_STEREO == 0
    GLX_STEREO, mStereoEnabled ? True : False,
#endif
    None
};

int maxAttribs[] = {
    GLX_SAMPLES,        static_cast<int>(samples),
    GLX_DOUBLEBUFFER,   1,
    GLX_STENCIL_SIZE,   INT_MAX,
    GLX_FRAMEBUFFER_SRGB_CAPABLE_EXT, 1,
    None
};
Try changing it to:

Code: Select all

int minAttribs[] = {
    GLX_DRAWABLE_TYPE,  GLX_WINDOW_BIT,
    GLX_RENDER_TYPE,    GLX_RGBA_BIT,
    GLX_RED_SIZE,       1,
    GLX_BLUE_SIZE,      1,
    GLX_GREEN_SIZE,     1,
#if OGRE_NO_QUAD_BUFFER_STEREO == 0
    GLX_STEREO, mStereoEnabled ? True : False,
#endif
    GLX_FRAMEBUFFER_SRGB_CAPABLE_EXT, 1, // This is new
    None
};

int maxAttribs[] = {
    GLX_SAMPLES,        static_cast<int>(samples),
    GLX_DOUBLEBUFFER,   1,
    GLX_STENCIL_SIZE,   INT_MAX,
    GLX_FRAMEBUFFER_SRGB_CAPABLE_EXT, 1,
    None
};
A few lines later you'll see:

Code: Select all

fbConfig = mGLSupport->selectFBConfig(minAttribs, maxAttribs);
If fbConfig is null after that call, then your GPU doesn't actually support sRGB (a few lines later it will throw, saying "Unexpected failure to determine a GLXFBConfig"). If it does return a valid config, then we can patch Ogre.
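If it helps, a throwaway check right after that call would look something like this (just a sketch; a breakpoint works equally well):

Code: Select all

// Throwaway debug check (sketch): a null config here means no
// sRGB-capable GLXFBConfig could be found for these attributes.
if (!fbConfig)
    printf("selectFBConfig: no sRGB-capable config found\n");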

Also check out this:

Code: Select all

if (gamma != 0)
{
    mGLSupport->getFBConfigAttrib(fbConfig, GL_FRAMEBUFFER_SRGB_CAPABLE_EXT, &gamma);
}
gamma should always be non-zero, both before and after the getFBConfigAttrib call. If not, then tell me when it happens to be 0.
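A quick way to see both values (again, just a sketch):

Code: Select all

// Sketch: log gamma before and after the query.
printf("gamma before query: %d\n", gamma);
if (gamma != 0)
{
    mGLSupport->getFBConfigAttrib(fbConfig, GL_FRAMEBUFFER_SRGB_CAPABLE_EXT, &gamma);
}
printf("gamma after query:  %d\n", gamma);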

Cheers
Matias
KammutierSpule
Gnoblar
Posts: 13
Joined: Tue Oct 10, 2017 12:22 pm
x 1

Re: [2.1] OpenGL & Intel HD 2500

Post by KammutierSpule »

I am new to using Ogre3D, so I am not sure, but I believe I built Ogre with SDL support. So while looking at your windowing/GLX suggestion I wondered: could SDL be the issue? If the window is created using SDL, then it is not running the source code you suggested I try... right?

Edit:
I can see in the log that it is using the GLX subsystem, but it was still built with SDL support, I believe...
******************************
*** Starting GLX Subsystem ***
******************************
KammutierSpule
Gnoblar
Posts: 13
Joined: Tue Oct 10, 2017 12:22 pm
x 1

Re: [2.1] OpenGL & Intel HD 2500

Post by KammutierSpule »

I tested your suggestion and made sure it was built with and running the changes, and both checks are OK: fbConfig is created, and gamma is still 1 after the test.

I also debug-traced the OpenGL calls of the application, and I can see it is enabling/disabling sRGB for each different map.
KammutierSpule
Gnoblar
Posts: 13
Joined: Tue Oct 10, 2017 12:22 pm
x 1

Re: [2.1] OpenGL & Intel HD 2500

Post by KammutierSpule »

I made some more tests.
I was searching for the documentation of "GLX_FRAMEBUFFER_SRGB_CAPABLE_EXT",
and it suggests that it is used with the glXChooseVisual function, which Ogre is not using.
I tried to use that function, but there is no change.
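For completeness, here is a rough standalone sketch (not the code I actually ran) of passing the same attribute to glXChooseFBConfig, which I believe is what Ogre's selectFBConfig path ultimately uses:

Code: Select all

// Rough standalone sketch: ask GLX directly for an sRGB-capable FBConfig.
// Build with e.g.: g++ srgb_check.cpp -lX11 -lGL
#include <cstdio>
#include <X11/Xlib.h>
#include <GL/glx.h>

#ifndef GLX_FRAMEBUFFER_SRGB_CAPABLE_EXT
    #define GLX_FRAMEBUFFER_SRGB_CAPABLE_EXT 0x20B2
#endif

int main()
{
    Display *dpy = XOpenDisplay( 0 );
    if( !dpy )
        return 1;

    const int attribs[] = {
        GLX_DRAWABLE_TYPE, GLX_WINDOW_BIT,
        GLX_RENDER_TYPE,   GLX_RGBA_BIT,
        GLX_FRAMEBUFFER_SRGB_CAPABLE_EXT, 1,
        None
    };

    int numConfigs = 0;
    GLXFBConfig *configs = glXChooseFBConfig( dpy, DefaultScreen( dpy ),
                                              attribs, &numConfigs );
    printf( "sRGB-capable FBConfigs found: %d\n", numConfigs );

    if( configs )
        XFree( configs );
    XCloseDisplay( dpy );
    return 0;
}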
dark_sylinc
OGRE Team Member
Posts: 5299
Joined: Sat Jul 21, 2007 4:55 pm
Location: Buenos Aires, Argentina
x 1279

Re: [2.1] OpenGL & Intel HD 2500

Post by dark_sylinc »

Hi KammutierSpule!

I just spotted there was a typo: "GL_FRAMEBUFFER_SRGB_CAPABLE_EXT" should've been "GLX_FRAMEBUFFER_SRGB_CAPABLE_EXT".

This could explain your gamma correction bug in Linux. Could you check if this fixes your issue?
KammutierSpule
Gnoblar
Posts: 13
Joined: Tue Oct 10, 2017 12:22 pm
x 1

Re: [2.1] OpenGL & Intel HD 2500

Post by KammutierSpule »

Sorry, I only just noticed this post.
I came back to this problem while installing my work on a new Linux machine.
I believe I am experiencing the same issue. Even using your suggestion to set "export LIBGL_ALWAYS_SOFTWARE=1", it is not working on this machine for me.
Maybe I am doing something wrong, or it could be an issue with the drivers.

Maybe you could add an option in the future to use SW (shader-based) gamma correction only? That way it would work for sure in all cases if we want it.
At the moment, as is, I cannot do it without changing the Ogre source code.

Also, I noticed that in the shaders you have code using a gamma of 2 (x*x and sqrt(x)).
It would be nice for me if I could set up the gamma or set my desired pow exponent.
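To illustrate what I mean (a plain C++ sketch of the idea, not actual Ogre shader code):

Code: Select all

#include <cmath>

// Current cheap approximation in the shaders: gamma of 2.
float toLinear_gamma2( float x )          { return x * x; }
float toGamma_gamma2( float x )           { return std::sqrt( x ); }

// What I am asking for: a configurable exponent.
float toLinear_custom( float x, float g ) { return std::pow( x, g ); }
float toGamma_custom( float x, float g )  { return std::pow( x, 1.0f / g ); }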