In GL, hardware gamma correction on textures doesn't work

Post by Xavyiy »

Mmm great, dunno why, but I was already suspecting that was the cause of the problem :/

I'll update my Nvidia drivers and see if that solves the problem; otherwise I'll have to live with it! Btw, if someone on Windows + Nvidia is able to test it, that would be very helpful for me.

Xavier
Post by Xavyiy »

After updating my drivers, I still get the same result. I guess it's something related to the Nvidia + Win7 drivers, at least with a Nvidia 310M and the 306.23 driver version (and some previous versions).

It's funny because if I use the integrated Intel GPU, the gamma correction works: so yes, it's definitely driver related!

Xavier
Post by TheSHEEEP »

How important is it for you to have hardware gamma correction?

I'd imagine doing it yourself in a shader would be easier than going through all the hassle you're currently dealing with :)
Post by Xavyiy »

TheSHEEEP wrote:How important is it for you to have hardware gamma correction?

I'd imagine doing it yourself in a shader would be easier than going through all the hassle you're currently dealing with :)
Well, I'm working on a "full-featured" game engine (far from AAA engines, of course): that's the main problem. Imagine something similar to Unity3D (an all-in-one editor), but using Ogre as the render engine.

Regarding the gamma correction part: in the editor, the user can select between gamma (traditional, no gamma correction) and linear (gamma-corrected) lighting pipelines. Both pipelines must work with the same materials, so the engine automatically reloads all diffuse textures with gamma correction enabled, and all diffuse colors are gamma corrected before being sent to the shader. (I'm using a custom material system + custom scripting system which, in the end, just creates/updates/removes Ogre materials through the C++ API, so I can easily let the user define which colors/textures must be gamma corrected in material scripts.) That way the user can use the same materials/shaders with both lighting pipelines without any changes.
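
For illustration, here's a minimal sketch of the color side of that idea, assuming a plain pow-based curve (the exact sRGB transfer curve is piecewise, but 2.2 is the usual approximation); this is not my engine's actual code, just the shape of it:

Code: Select all

#include <cmath>
#include <OgreColourValue.h>

// Sketch: convert a diffuse color authored in gamma space to linear space
// before sending it to the shader. Alpha is left untouched because it is
// not gamma encoded.
Ogre::ColourValue gammaToLinear(const Ogre::ColourValue& c, float gamma = 2.2f)
{
    return Ogre::ColourValue(std::pow(c.r, gamma),
                             std::pow(c.g, gamma),
                             std::pow(c.b, gamma),
                             c.a);
}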

For the final gamma correction, pow(1/2.2), I use a compositor so the end user can change the gamma value with a classic slider in the config menu. Alternatively, the developer can fold the gamma correction pass into the last compositor, saving some milliseconds. That's why I'm not using sRGB frame buffers for the final correction.
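
The slider hookup is then just a matter of updating a uniform on the compositor pass, something like this (sketch only: "Engine/GammaCorrection" and "invGamma" are illustrative names, not my real identifiers):

Code: Select all

#include <OgreMaterialManager.h>
#include <OgreTechnique.h>
#include <OgrePass.h>

// Sketch: push the config-menu slider value into the final gamma pass.
void setDisplayGamma(float gamma)
{
    Ogre::MaterialPtr mat =
        Ogre::MaterialManager::getSingleton().getByName("Engine/GammaCorrection");
    mat->getTechnique(0)->getPass(0)
       ->getFragmentProgramParameters()
       ->setNamedConstant("invGamma", 1.0f / gamma);
}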

So, if it were just a game or similar, it would be enough to gamma correct textures/colors in the shader itself, but unfortunately I can't do that in a serious engine.

Anyway, knowing that it's a driver-related issue, I'm much more relaxed. It works pretty well under DX9, and it looks like Windows is the only platform affected by this driver issue, so it's not really a big problem right now. Let's hope the Nvidia guys solve it before my engine goes commercial. Otherwise I guess I'll dig deeper into it.

Xavier
Post by dark_sylinc »

NVIDIA isn't going to fix it if no one files a bug report (I didn't).

BTW, your game engine problem should be solvable if you design a system where you control part or all of the shader generation (like RTSS does).
The BIG MAIN problem with doing gamma correction in the shader is that you don't get correct bilinear filtering (gamma correction must be done before filtering).
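
A quick standalone illustration of why the order matters (using the 2.2 approximation): averaging two gamma-encoded texels and then linearizing gives a very different result from linearizing each texel first and filtering in linear space, which is what the hardware sRGB path does:

Code: Select all

#include <cmath>
#include <cstdio>

int main()
{
    // Two neighbouring texel values as stored in a gamma-encoded texture.
    const float a = 0.0f, b = 1.0f;

    // Shader-side decode: the texture unit has already filtered in gamma space.
    float filterThenDecode = std::pow(0.5f * (a + b), 2.2f);                  // ~0.218

    // Hardware sRGB read: each texel is decoded, then filtered linearly.
    float decodeThenFilter = 0.5f * (std::pow(a, 2.2f) + std::pow(b, 2.2f)); // 0.5

    std::printf("filter then decode: %f\ndecode then filter: %f\n",
                filterThenDecode, decodeThenFilter);
    return 0;
}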

In DX9, the device can decide whether to do the conversion before or after the filtering, but almost all of them do the conversion before, as they should, because that's what DX10 forces them to do anyway.
Post by Xavyiy »

dark_sylinc wrote:NVIDIA isn't going to fix it if no one files a bug report (I didn't).
Indeed. I was thinking of filing a bug report, but in the end there are some reasons which prevent me from doing it:
  • Texture hw gamma correction is something extensively used in new titles, and I'm having this problem with drivers whose release dates are more than one year apart, so I very much doubt that Nvidia is unaware of it, or even that it's a driver problem and not an Ogre bug
  • Before filing a bug, I would like to reproduce it in a simple OpenGL app, but due to my lack of time and inexperience with raw OpenGL (I haven't used it since 2004), that will have to wait
  • Btw, I've seen some Nvidia-related hacks in the OpenGL render system. Maybe that's the root of the problem after all. As I've said before, I really doubt Nvidia has had hw gamma correction broken for more than a year...
About the shader generation: yes, that would be a solution, but apart from a custom shading language (which I'll implement in the medium term, after the first public releases; similar to RTSS, since in the end I'll translate it to glsl/hlsl instructions at run time) I want to allow the user to directly use glsl/hlsl. Also, I prefer the idea of fixing the OGL "bug" rather than working around it.

Btw, as far as I know Unity3D and UDK have working hw gamma correction under OpenGL + Nvidia + Win, so... I'll have to dig deeper before complaining to Nvidia.
Post by Xavyiy »

Well, today I finally dug deeper into this problem, and now it's fixed.

What was happening? Two things:

1: OgreGLTexture.cpp
Here there was the following Nvidia hack for non-Apple platforms:

Code: Select all

#if OGRE_PLATFORM != OGRE_PLATFORM_APPLE
	/*	if (Root::getSingleton().getRenderSystem()->getCapabilities()->getVendor() == GPU_NVIDIA
			&& !PixelUtil::isCompressed(mFormat))
		{
			mMipmapsHardwareGenerated = false;
		}*/
#endif
That hack broke hw gamma correction for non-compressed textures on Windows with an Nvidia card. To fix it, just remove the code above.

2: OgreGLHardwarePixelBuffer.cpp in GLTextureBuffer::upload(...)
In the compressed branch, we were doing the following:

Code: Select all

GLenum format = GLPixelUtil::getClosestGLInternalFormat(mFormat);
which left the second parameter of getClosestGLInternalFormat(PixelFormat mFormat, bool hwGamma = false) at its implicit false value, so the returned format was not sRGB.

Quick fix: just replace the previous line with:

Code: Select all

GLenum format = mGLInternalFormat;
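// Note: mGLInternalFormat was chosen when the texture was created, so it
// already reflects the hwGamma flag (i.e. it is the sRGB variant when requested).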
Actually, this bug is repeated many times inside GLTextureBuffer::upload; I see we use getGLOriginFormat(...) a lot, and it doesn't take the gamma correction flag into account.

Since masterfalcon is giving some love to the OGL3+ render system, I think the best option is to wait until he fixes it in the upcoming OGL3+ RS and then apply the changes to the regular OGL render system.

... And here we are, it works like a charm!

Xavier
Post by dark_sylinc »

Cool!!!

Just as a history lesson, the reason that hack was there may have been because of this (the forum link there is broken, so here it is)

Edit: We should probably search the source for other usages of getClosestGLInternalFormat to make sure this problem doesn't appear somewhere else.
Post by Xavyiy »

dark_sylinc wrote:Edit: We should probably search the source for other usages of getClosestGLInternalFormat to make sure this problem doesn't appear somewhere else.
Indeed. Or even remove the implicit 'hwGamma = false' value, making sure we fill in both parameters every time we call it.
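
Something like this, as a sketch of the signature change:

Code: Select all

// Before: call sites can silently fall through to the non-sRGB format.
static GLenum getClosestGLInternalFormat(PixelFormat fmt, bool hwGamma = false);

// After: every call site must state explicitly whether it wants the
// sRGB variant of the internal format.
static GLenum getClosestGLInternalFormat(PixelFormat fmt, bool hwGamma);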
Post by masterfalcon »

How about this for a start?
Attachment: hwgamma.diff.zip (1.53 KiB)
Post by Xavyiy »

Looks good to me. But textures with software-generated mipmaps might not be gamma corrected:
Look for GLPixelUtil::getGLOriginFormat(data.format) in ::upload(...); it does not take gamma correction into account. Dunno if that's intended or not.
Post by masterfalcon »

Yeah, I actually tried that. It caused all sorts of ugliness. I haven't thought it through yet to decide whether we should try to get that case working too or not.
Post by Xavyiy »

Well, I don't think anyone is going to use hw gamma correction together with sw-generated mips, so I think your patch is good enough. Anyway, a little comment in the code could be useful in the future.