In GL, hardware gamma correction on textures doesn't work

Discussion area about developing or extending OGRE, adding plugins for it or building applications on it. No newbie questions please, use the Help forum for that.
User avatar
sparkprime
Ogre Magi
Posts: 1137
Joined: Mon May 07, 2007 3:43 am
Location: Ossining, New York
Contact:

In GL, hardware gamma correction on textures doesn't work

Post by sparkprime » Sun Aug 08, 2010 8:25 pm

I have enabled hardware gamma correction on my diffuse maps. I have verified that the setting is reaching the GL render system by inserting the following code:

Code:

        // Allocate internal buffer so that glTexSubImageXD can be used
        // Internal format
        GLenum format = GLPixelUtil::getClosestGLInternalFormat(mFormat, mHwGamma);
        std::cout << "gamma: " << (mHwGamma?"true":"false") << "  format: "<< format << std::endl;
Which prints things like

gamma: true format: 35907
gamma: true format: 35917

These I have found to be:

#define GL_SRGB8_ALPHA8_EXT 0x8C43
#define GL_COMPRESSED_SRGB_ALPHA_S3TC_DXT1_EXT 0x8C4D

So that's all good.

However, I don't see any change. The textures ought to appear darker due to the gamma conversion; I have been doing gamma correction manually in the shader for some time, so I know what to expect.

The sRGB framebuffer is working for me, though: I can disable the manual gamma conversion I was previously doing on output. However, I still have to do the manual conversion on input, because the hardware is not doing it for me.
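(For reference, the manual conversion in question amounts to something like the following standalone sketch; it uses the pure pow-2.2 approximation rather than the exact piecewise sRGB curve.)

```cpp
#include <cmath>

// Gamma -> linear decode, normally done when sampling a diffuse map.
// Approximates sRGB with a pure 2.2 power curve.
float gammaToLinear(float c) { return std::pow(c, 2.2f); }

// Linear -> gamma encode, normally done when writing to the framebuffer.
float linearToGamma(float c) { return std::pow(c, 1.0f / 2.2f); }
```

Decoding darkens mid-tones (pow(0.5, 2.2) is roughly 0.22), which is why textures should visibly darken once the hardware decode actually kicks in.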

Is there anything I have to do in the shader (Cg) to make this work?

Edit: it works in D3D9 but doesn't work in GL on either Windows or Linux, so it's probably not a driver problem; more likely a bug in the GL render system?

User avatar
dark_sylinc
OGRE Team Member
OGRE Team Member
Posts: 4146
Joined: Sat Jul 21, 2007 4:55 pm
Location: Buenos Aires, Argentina
x 268
Contact:

Re: In GL, hardware gamma correction on textures doesn't work

Post by dark_sylinc » Mon Aug 23, 2010 4:24 pm

If the texture was already loaded as non-gamma and is only now set as HW gamma, it won't work.
This usually happens when you have two materials using the same texture, one with HW gamma on, the other with HW gamma off.
Also, ParticleUniverse has the horrible idea of loading textures from its own scripts, ignoring HW gamma.

Both the texture unit and the texture itself have to be opted in to gamma.

I usually do this to guarantee it's HW gamma:

Code:

texUnit->setHardwareGammaEnabled( hwGamma );

TexturePtr texPtr = TextureManager::getSingleton().getByName( texUnit->getTextureName() );
if( !texPtr.isNull() )
{
    // Unload so the texture is re-created with the sRGB internal format
    texPtr->unload();
    texPtr->setHardwareGammaEnabled( true );
}
Oh, and GL HW gamma does work in my application, as does D3D9.

Cheers
Dark Sylinc

Edit: I usually give texture units in my materials names containing the word "Gamma" to mark which ones should be HW gamma enabled. Then, once I've loaded all my resources, I iterate through all materials, all their techniques and all their passes, and finally, where "texUnit->getName().find( "Gamma" ) != Ogre::String::npos", I unload the texture and reload it with HW gamma on.
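The name check itself is just a substring match; stripped of the OGRE iteration boilerplate, it reduces to this standalone sketch:

```cpp
#include <string>

// True when a texture unit's name opts it in to HW gamma, following the
// "put 'Gamma' in the name" convention described above.
bool wantsHwGamma(const std::string& texUnitName)
{
    return texUnitName.find("Gamma") != std::string::npos;
}
```

Note the match is case-sensitive, so "gamma" in a name would not be picked up.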

User avatar
sparkprime
Ogre Magi
Posts: 1137
Joined: Mon May 07, 2007 3:43 am
Location: Ossining, New York
Contact:

Re: In GL, hardware gamma correction on textures doesn't work

Post by sparkprime » Mon Aug 23, 2010 4:35 pm

I ensured that wasn't happening by using the print statement, as explained in the original post.

User avatar
dark_sylinc
OGRE Team Member
OGRE Team Member
Posts: 4146
Joined: Sat Jul 21, 2007 4:55 pm
Location: Buenos Aires, Argentina
x 268
Contact:

Re: In GL, hardware gamma correction on textures doesn't work

Post by dark_sylinc » Mon Aug 23, 2010 4:48 pm

Oh, I see! You put the print inside the GL RenderSystem itself.
Hmm, weird.

OK, I'll try my engine again now that I have the latest Mercurial version; it may have been broken.

User avatar
dark_sylinc
OGRE Team Member
OGRE Team Member
Posts: 4146
Joined: Sat Jul 21, 2007 4:55 pm
Location: Buenos Aires, Argentina
x 268
Contact:

Re: In GL, hardware gamma correction on textures doesn't work

Post by dark_sylinc » Mon Aug 23, 2010 4:54 pm

OK, just tried the latest Mercurial, and you're right: it's broken.
My textures all look washed out.

It got broken somewhere between 1.6.0 and the latest build (d'oh... that's a lot of revisions), since it used to work.

User avatar
sparkprime
Ogre Magi
Posts: 1137
Joined: Mon May 07, 2007 3:43 am
Location: Ossining, New York
Contact:

Re: In GL, hardware gamma correction on textures doesn't work

Post by sparkprime » Tue Aug 24, 2010 4:55 am

Ah that's good news, at least it's not just me :)

thanks

User avatar
Wolfmanfx
OGRE Team Member
OGRE Team Member
Posts: 1525
Joined: Fri Feb 03, 2006 10:37 pm
Location: Austria - Leoben
x 1
Contact:

Re: In GL, hardware gamma correction on textures doesn't work

Post by Wolfmanfx » Tue Aug 24, 2010 7:13 am

It's a little bit off-topic, but it seems that Ogre needs better unit tests, because since Steve left, many commits have broken old features. I know the latest hg default is unstable, but you know...

User avatar
jacmoe
OGRE Retired Moderator
OGRE Retired Moderator
Posts: 20570
Joined: Thu Jan 22, 2004 10:13 am
Location: Denmark
Contact:

Re: In GL, hardware gamma correction on textures doesn't work

Post by jacmoe » Tue Aug 24, 2010 10:44 am

Did you file a bug?
Otherwise it will remain one... :)
/* Less noise. More signal. */
Ogitor Scenebuilder - powered by Ogre, presented by Qt, fueled by Passion.
OgreAddons - the Ogre code suppository.

User avatar
sparkprime
Ogre Magi
Posts: 1137
Joined: Mon May 07, 2007 3:43 am
Location: Ossining, New York
Contact:

Re: In GL, hardware gamma correction on textures doesn't work

Post by sparkprime » Tue Aug 24, 2010 4:13 pm

I'll do that now that it's been confirmed.

User avatar
sparkprime
Ogre Magi
Posts: 1137
Joined: Mon May 07, 2007 3:43 am
Location: Ossining, New York
Contact:

Re: In GL, hardware gamma correction on textures doesn't work

Post by sparkprime » Sun Sep 12, 2010 4:32 pm

I had a look trying to work out what is going wrong, and as far as I can tell, everything is being done correctly. It should just be a case of loading a texture with the right format, and I can see in the debugger that the right format is being used.

This problem is particularly irritating with fixed function compositors, as each compositor will darken the scene slightly.

edit: actually the darkening doesn't have anything to do with the broken gamma, which actually makes things lighter and compensates slightly for whatever is causing the darkening...
edit 2: the darkening was because I forgot to turn off lighting in the compositor quad material :)

User avatar
sparkprime
Ogre Magi
Posts: 1137
Joined: Mon May 07, 2007 3:43 am
Location: Ossining, New York
Contact:

Re: In GL, hardware gamma correction on textures doesn't work

Post by sparkprime » Thu Jan 27, 2011 3:42 pm

I'm reawakening this because I need it more now :)

User avatar
Xavyiy
OGRE Expert User
OGRE Expert User
Posts: 847
Joined: Tue Apr 12, 2005 2:35 pm
Location: Albacete - Spain

Re: In GL, hardware gamma correction on textures doesn't work

Post by Xavyiy » Mon Sep 03, 2012 2:07 pm

Here is another guy fighting with the same issue.

I'm having trouble getting hardware gamma correction to work under the OGL RS; it works fine under DX9. In my case, I ensure that textures are reloaded if needed with:

Code:

currentTUS->setHardwareGammaEnabled(true);

Ogre::TexturePtr texPtr = Ogre::TextureManager::getSingleton().getByName(mSourceFileName);
if (!texPtr.isNull())
{
    texPtr->unload();
    texPtr->setHardwareGammaEnabled(true);
}
(Although I'm using a custom material system which ensures that the texture is loaded with HW gamma enabled from the start, I added that piece of code just for testing purposes.)

Under DX9 everything works as expected: textures are being gamma corrected (pow 2.2).

I'm using Ogre 1.7.4. I've checked the Ogre 1.8.0/1.8.1 changelogs, but I haven't found any reference to the problem, so I assume it's still present.

Is anyone having the same problem?

Xavier

User avatar
Xavyiy
OGRE Expert User
OGRE Expert User
Posts: 847
Joined: Tue Apr 12, 2005 2:35 pm
Location: Albacete - Spain

Re: In GL, hardware gamma correction on textures doesn't work

Post by Xavyiy » Mon Sep 03, 2012 4:03 pm

Btw, here is a comparison between OGL and DX9:
[two screenshots: the same scene rendered under OGL and DX9]

Any help is really appreciated.

User avatar
sparkprime
Ogre Magi
Posts: 1137
Joined: Mon May 07, 2007 3:43 am
Location: Ossining, New York
Contact:

Re: In GL, hardware gamma correction on textures doesn't work

Post by sparkprime » Tue Sep 04, 2012 4:33 am

I gave up and do all the gamma correction manually now. Not ideal, but good enough (so far). As far as I know, my original issue still stands.

User avatar
masterfalcon
OGRE Team Member
OGRE Team Member
Posts: 4270
Joined: Sun Feb 25, 2007 4:56 am
Location: Bloomington, MN
Contact:

Re: In GL, hardware gamma correction on textures doesn't work

Post by masterfalcon » Tue Sep 04, 2012 5:16 am

Yeah, it's still there in 1.8.1. I looked into it a couple of months ago, but I didn't see anything wrong, so I was stumped.

User avatar
Kojack
OGRE Moderator
OGRE Moderator
Posts: 7152
Joined: Sun Jan 25, 2004 7:35 am
Location: Brisbane, Australia
x 19

Re: In GL, hardware gamma correction on textures doesn't work

Post by Kojack » Tue Sep 04, 2012 11:30 am

Xavyiy wrote:Here is another guy fighting with the same issue.

I've problems to get hardware gamma correction working under the OGL RS. It works fine under DX9.
Are you having problems with the texture loading gamma correction or the framebuffer gamma correction? The original post was about the texture loading not doing gamma while the framebuffer part worked, which would make OpenGL look brighter, like in your pic. So I guess that's the trouble you're having.

I have the opposite. In OpenGL my texture loading gamma correction works correctly, but the framebuffer gamma (setting sRGB mode in the config dialog) does nothing, so textures are too dark in GL. DirectX 9 works fine.

Whenever I see things like this my first assumption is "ati vs nvidia". I'm on a radeon 5870, how about the rest of you with the problem?

User avatar
Kojack
OGRE Moderator
OGRE Moderator
Posts: 7152
Joined: Sun Jan 25, 2004 7:35 am
Location: Brisbane, Australia
x 19

Re: In GL, hardware gamma correction on textures doesn't work

Post by Kojack » Tue Sep 04, 2012 12:35 pm

I did a little debugging.
Support for the sRGB framebuffer extension is being checked before the GL extensions are loaded, so the list is empty.

Here's some code from GLRenderSystem::_createRenderWindow

Code:

// Create the window
RenderWindow* win = mGLSupport->newWindow(name, width, height,
    fullScreen, miscParams);

attachRenderTarget( *win );

if (!mGLInitialised)
{
    // set up glew and GLSupport
    initialiseContext(win);
The GL extension check for sRGB framebuffers happens in newWindow (well, inside Win32GLSupport::selectPixelFormat, which is called by Win32Window::create, which is called by newWindow).
But the list of GL extensions isn't populated until initialiseContext, at the bottom.
So, at least on Windows, the sRGB framebuffer can never be enabled.
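Reduced to a standalone sketch (the class and function names here are hypothetical stand-ins, not Ogre's actual ones), the ordering bug looks like this: a query made before the extension list is populated always reports false, regardless of what the driver supports.

```cpp
#include <set>
#include <string>

// Hypothetical stand-in for the GLSupport extension bookkeeping.
struct ExtensionList
{
    std::set<std::string> extensions;  // stays empty until initialisation

    bool checkExtension(const std::string& name) const
    {
        return extensions.count(name) != 0;
    }
};

// The order of events in _createRenderWindow, compressed into one function.
bool srgbEnabledAtWindowCreation()
{
    ExtensionList gl;

    // newWindow -> selectPixelFormat queries the extension first...
    bool hwGamma = gl.checkExtension("WGL_EXT_framebuffer_sRGB");

    // ...but the list is only populated afterwards, in initialiseContext.
    gl.extensions.insert("WGL_EXT_framebuffer_sRGB");

    return hwGamma;  // always false, whatever the driver supports
}
```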

User avatar
Xavyiy
OGRE Expert User
OGRE Expert User
Posts: 847
Joined: Tue Apr 12, 2005 2:35 pm
Location: Albacete - Spain

Re: In GL, hardware gamma correction on textures doesn't work

Post by Xavyiy » Tue Sep 04, 2012 1:10 pm

Kojack wrote:Are you having problems with the texture loading gamma correction or the framebuffer gamma correction? The original post was about the texture loading not doing gamma but the framebuffer working, which would make opengl look brighter, like in your pic. So I guess that's the trouble you have.
My problem is related to the texture loading gamma correction; for the framebuffer I don't use any gamma correction, I do it in the first step of the compositor chain, with a compositor.

The problem is that, in OpenGL, the texture is not being gamma corrected, so we don't get the effect of pow(texCol, 2.2). Then, when my final compositor applies pow(col, 1.0/2.2), things look much brighter, as expected.

pow(col, 2.2) makes the image darker, with more contrast <--- that's what's missing, since in OGL it doesn't work
pow(col, 1/2.2) makes the image brighter, washed out towards white

So I'm getting the result of applying pow(col, 1/2.2) twice: once from the original texture (it's stored with 1/2.2 applied on disk) and one more time from the gamma correction compositor.
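The double-encode is easy to check numerically; here is a standalone sketch (pure pow-2.2 approximation, illustrative values only):

```cpp
#include <cmath>

// One gamma encode, pow(c, 1/2.2).
static float encode(float c) { return std::pow(c, 1.0f / 2.2f); }

// Broken GL path: the encode baked into the texture on disk is never undone
// by the sampler, and the compositor encodes again on top of it.
float brokenPipeline(float linearIn) { return encode(encode(linearIn)); }

// Working DX9 path: the sRGB sampler decodes the texture back to linear,
// so only the compositor's single encode remains.
float correctPipeline(float linearIn) { return encode(linearIn); }
```

For a linear value of 0.2 the correct path gives about 0.48 while the broken path gives about 0.72, which matches the washed-out OpenGL screenshots.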
Kojack wrote:I have the opposite. In opengl my texture loading gamma correction works correctly, but the framebuffer gamma (setting srgb mode in the config dialog) does nothing. So textures are too dark in gl. Directx 9 works fine.
Well, in my case the sRGB framebuffer seems to work for render windows (at least in DX9; I don't remember whether I've tested it under OGL too), but not for RTTs (in either OGL or DX9). In the editor I'm using an RTT to render the scene + compositors; in the render window I just render this texture on a fullscreen quad and then render the gizmos and overlays on top, so that geometry is not affected by compositors.

This is why I'm using a compositor to do the gamma correction: sRGB on RTTs doesn't work.

Looks like the gamma correction machinery is a little broken in Ogre. Let's hope we'll be able to figure out what's happening.
Kojack wrote:Whenever I see things like this my first assumption is "ati vs nvidia". I'm on a radeon 5870, how about the rest of you with the problem?
I'm on an NVIDIA 310M.

Xavier

User avatar
Kojack
OGRE Moderator
OGRE Moderator
Posts: 7152
Joined: Sun Jan 25, 2004 7:35 am
Location: Brisbane, Australia
x 19

Re: In GL, hardware gamma correction on textures doesn't work

Post by Kojack » Tue Sep 04, 2012 2:24 pm

I modified Ogre's source to ignore the extension (which hasn't been set yet) and just set hardware sRGB regardless, and it works fine: both texture gamma and framebuffer gamma in OpenGL.
In Win32GLSupport::selectPixelFormat (inside OgreWin32GLSupport.cpp) I changed:

Code:

if (useHwGamma && checkExtension("WGL_EXT_framebuffer_sRGB"))
to:

Code:

if (useHwGamma)
It's probably safe to remove the extension check, because it's only used to set up the data sent to wglChoosePixelFormatARB, which will already fail to find a valid format if hardware gamma isn't available; Ogre's response is then to clear the hardware gamma request flag and try again.
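That fallback can be sketched like this (formatExists is a hypothetical stand-in for the wglChoosePixelFormatARB query, not real WGL code):

```cpp
// Hypothetical driver query: a matching format exists unless sRGB is
// requested on a driver that lacks it.
static bool formatExists(bool wantSrgb, bool driverHasSrgb)
{
    return !wantSrgb || driverHasSrgb;
}

// Sketch of the retry logic: request HW gamma first; if no format matches,
// clear the request flag and try once more without it.
bool choosePixelFormat(bool driverHasSrgb, bool& hwGamma)
{
    if (formatExists(hwGamma, driverHasSrgb))
        return true;
    hwGamma = false;
    return formatExists(hwGamma, driverHasSrgb);
}
```

On a driver without sRGB support the request flag ends up cleared and format selection still succeeds, which is why the up-front extension check is redundant.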

But I'm not a GL coder, so somebody who knows what they're doing should check it out while I go back to using DirectX 9. :)


There's another gamma-related problem in Ogre: the material script parser accepts the gamma keyword for textures, but anim_texture and cubic_texture don't accept it. So the only way to load gamma-corrected cubic or animated textures is via C++.

What I'd really like to see is a flag in the texture manager that can force gamma on or off for all textures regardless of the material settings. That way the workaround of enabling gamma and then unloading, to make sure the texture wasn't already loaded without gamma, isn't needed, and animated and cubic textures can get it too (although that's just a workaround as well; the script parser needs to be taught about gamma for anim and cubic).

User avatar
Xavyiy
OGRE Expert User
OGRE Expert User
Posts: 847
Joined: Tue Apr 12, 2005 2:35 pm
Location: Albacete - Spain

Re: In GL, hardware gamma correction on textures doesn't work

Post by Xavyiy » Tue Sep 04, 2012 2:31 pm

Kojack wrote:I modified ogre's source to ignore the extension (which hasn't been set yet) and just set hardware srgb regardless, and it works fine.
Great! I'll give it a try! Let's see if I can get sRGB on RTTs working too!
Kojack wrote:What I'd really like to see is a flag in the texture manager that can force gamma on or off for all textures regardless of the material settings. This way the workarounds like enabling gamma then unloading to make sure the texture wasn't already loaded without gamma isn't needed, and animated and cubic textures can get it (although that's just a workaround too, the script parser needs to be fixed to know about gamma for anim and cubic).
Well, that's a little risky, since only diffuse textures should be gamma corrected, not normal maps or similar.

Xavier

User avatar
Kojack
OGRE Moderator
OGRE Moderator
Posts: 7152
Joined: Sun Jan 25, 2004 7:35 am
Location: Brisbane, Australia
x 19

Re: In GL, hardware gamma correction on textures doesn't work

Post by Kojack » Tue Sep 04, 2012 2:33 pm

Damn, that's a good point, I forgot about normal maps and stuff like that. :(

User avatar
Xavyiy
OGRE Expert User
OGRE Expert User
Posts: 847
Joined: Tue Apr 12, 2005 2:35 pm
Location: Albacete - Spain

Re: In GL, hardware gamma correction on textures doesn't work

Post by Xavyiy » Tue Sep 04, 2012 4:38 pm

Kojack wrote:Damn, that's a good point, I forgot about normal maps and stuff like that. :(
Indeed, it's very annoying. In my engine I'll let the user flag texture units as "diffuse" (I use a custom material script file format), and then, if the current scene (a project can have multiple scenes) is in "Linear space" (similar to Unity3D: http://docs.unity3d.com/Documentation/M ... hting.html), all these diffuse textures get gamma corrected in hardware (and use sRGB frame buffers, if I figure out how to use them with RTTs).

Also, I use that diffuse flag info to render the scene in "Lighting only" mode: diffuse textures are switched to a grey one. It's very useful for artists; it helps a lot in getting good lighting in the scene.
sparkprime wrote:I gave up and do all gamma correction manually now. Not ideal but good enough (so far). As far as I know my original issue still stands.
Well, it's more than enough for a game or similar, but in a game engine it's a little limiting (you can't use the same shaders, etc.). Also, when using sRGB frame buffers with blending enabled, the previously stored value is converted back to linear before blending, and the result of the blend is gamma corrected. More info here: http://http.developer.nvidia.com/GPUGem ... _ch24.html
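That blending behaviour can be checked numerically; a standalone sketch (pure pow-2.2 approximation of sRGB):

```cpp
#include <cmath>

static float srgbDecode(float c) { return std::pow(c, 2.2f); }
static float srgbEncode(float c) { return std::pow(c, 1.0f / 2.2f); }

// Naive blend performed directly on gamma-encoded values.
float blendGammaSpace(float dst, float src, float a)
{
    return dst * (1.0f - a) + src * a;
}

// What an sRGB framebuffer does: decode both operands to linear, blend
// there, then re-encode the result.
float blendLinearSpace(float dst, float src, float a)
{
    float lin = srgbDecode(dst) * (1.0f - a) + srgbDecode(src) * a;
    return srgbEncode(lin);
}
```

A 50% blend of black over white comes out at 0.5 in gamma space but around 0.73 with sRGB blending, noticeably brighter.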

---------

I'll be back on this topic in a few weeks, since right now I have other priorities on my todo list. I'm not a GL or DX coder (I haven't used them directly for 7 or 8 years, back when I was learning C++), so it's very probable I'll annoy you with more gamma-related issues :P

Xavier

User avatar
Xavyiy
OGRE Expert User
OGRE Expert User
Posts: 847
Joined: Tue Apr 12, 2005 2:35 pm
Location: Albacete - Spain

Re: In GL, hardware gamma correction on textures doesn't work

Post by Xavyiy » Sun Sep 16, 2012 8:08 pm

Well, here we are again...
I've been doing some testing, without any luck.

OgreGLHardwarePixelBuffer.cpp --- GLTextureBuffer::GLTextureBuffer (called by GLTexture::_createSurfaceList)

Code:

// Get format
glGetTexLevelParameteriv(mFaceTarget, level, GL_TEXTURE_INTERNAL_FORMAT, &value);
mGLInternalFormat = value;
mFormat = GLPixelUtil::getClosestOGREFormat(value);

std::cout << "GLTextureBuffer: " << writeGamma << " " << mGLInternalFormat << std::endl;

// Default
mRowPitch = mWidth;
Result:
writeGamma == true ---> Good
mGLInternalFormat == GL_COMPRESSED_SRGB_ALPHA_S3TC_DXT1_EXT ---> Good

--------

I've been trying to manually find any gamma-related change between 1.7.4 and 1.6.0 (which worked for dark_sylinc), but without success. Maybe it's some kind of GL extension we're not setting? I have zero experience with OGL, so I'm not very helpful here :/

Looks like the internal pixel format is correct, so... any thoughts about the root of the problem?

Xavier

User avatar
Xavyiy
OGRE Expert User
OGRE Expert User
Posts: 847
Joined: Tue Apr 12, 2005 2:35 pm
Location: Albacete - Spain

Re: In GL, hardware gamma correction on textures doesn't work

Post by Xavyiy » Tue Sep 18, 2012 11:10 pm

Following this https://twitter.com/Xavyiy/status/248177358643810304, here is a little how-to for reproducing the texture gamma correction bug under OpenGL:

Basically, you'll notice that if you add the keyword 'gamma' to a texture in a .material file like this:

Code:

texture_unit
{
    texture Source.jpg gamma
}
Under DX9 you'll get a darker (gamma corrected) texture, while in OpenGL you'll get the same colours as if you hadn't set the gamma keyword.

--------------------

Using the sample browser:

1. Edit facial.material:

Code:

material drbunsen_head
...
    texture Dr_Bunsen_Head.jpg gamma
...
2. Run it under DX9:
You should get something like the following; as you can see, the texture is gamma corrected.
[screenshot: the sample under DX9, with the texture darkened]

3. Run it under OGL:
You should get something like the following; as you can see, the texture is NOT gamma corrected: you get exactly the same result as without the gamma keyword.
[screenshot: the sample under OGL, with the texture unchanged]

Xavier

User avatar
masterfalcon
OGRE Team Member
OGRE Team Member
Posts: 4270
Joined: Sun Feb 25, 2007 4:56 am
Location: Bloomington, MN
Contact:

Re: In GL, hardware gamma correction on textures doesn't work

Post by masterfalcon » Wed Sep 19, 2012 1:30 am

This must be something driver specific.

Here it is on OS X 10.8 on NVIDIA.
[screenshot attachment: Screen Shot 2012-09-18 at 7.29.48 PM.png]
