depthbias

A place for users of OGRE to discuss ideas and experiences of utilising OGRE in their games / demos / applications.
ogreogre
Halfling
Posts: 59
Joined: Mon Aug 13, 2007 8:30 pm

depthbias

Post by ogreogre »

Hi all,
I hate to revisit this, because I've already browsed the related long threads. However, I'm seeing depth bias discrepancies within D3D. What puzzles me is that there are variations *within Ogre D3D* on different machines. I run Ogre D3D on all my testing machines, but one depth bias setting (for a simple decal texture) works fine on some machines and flickers on others. If I increase the bias to the point of causing artifacts on some machines, it still flickers on others...

I understand that tweaking the projection matrix is a better approach, but for various reasons that's not an option right now. Has anybody had the same issue, i.e. the same Ogre D3D depth bias setting having different effects on different machines?
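Here's roughly what I'm doing, as a minimal sketch (the material name and bias value are just placeholders, and depending on your Ogre version setDepthBias() may take one or two parameters):

Code: Select all

// Grab the decal material's pass and bias it so the decal draws over the
// underlying surface without z-fighting.
Ogre::MaterialPtr decalMat =
    Ogre::MaterialManager::getSingleton().getByName("MyDecalMaterial");
Ogre::Pass* decalPass = decalMat->getTechnique(0)->getPass(0);

// Constant bias; the value that looks right is what varies between machines.
decalPass->setDepthBias(1.0f);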

Thanks!
Steve
sinbad
OGRE Retired Team Member
Posts: 19269
Joined: Sun Oct 06, 2002 11:19 pm
Location: Guernsey, Channel Islands

Post by sinbad »

The other machines are probably running in 16-bit colour mode. The bias is applied in increments of the smallest representable depth value, but in 16-bit modes the depth buffer is also less precise, so the same bias value behaves differently.
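For a sense of scale, here's a quick back-of-the-envelope comparison of that smallest increment, assuming simple fixed-point depth buffers (real hardware and drivers may vary):

Code: Select all

#include <cstdio>

int main()
{
    // Smallest depth step for a fixed-point buffer: one part in (2^bits - 1)
    // of the [0,1] depth range.
    const double step16 = 1.0 / ((1 << 16) - 1);   // ~1.5e-5
    const double step24 = 1.0 / ((1 << 24) - 1);   // ~6.0e-8

    std::printf("16-bit depth step: %g\n", step16);
    std::printf("24-bit depth step: %g\n", step24);
    // The same bias count therefore covers roughly 256x more depth range in
    // a 16-bit buffer, while that buffer also resolves far less detail.
    return 0;
}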
ogreogre
Halfling
Posts: 59
Joined: Mon Aug 13, 2007 8:30 pm

Post by ogreogre »

Sinbad, thanks for the reply. From what you said, the right approach would be to detect whether the window was initialised as 16-bit and use different biases.

Currently I use all the Ogre defaults, so I don't even know what kind of window it created... How would I find out whether a window is 16-bit, or whether the depth buffer has lower precision?

Thanks,
Steve
sinbad
OGRE Retired Team Member
Posts: 19269
Joined: Sun Oct 06, 2002 11:19 pm
Location: Guernsey, Channel Islands

Post by sinbad »

RenderWindow::getColourDepth()
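For example, assuming the window was auto-created by Root (the bias values below are placeholders you'd tune per mode):

Code: Select all

// Query the auto-created window's colour depth and pick a bias accordingly.
// If you created the window yourself, use that RenderWindow pointer instead.
Ogre::RenderWindow* window = Ogre::Root::getSingleton().getAutoCreatedWindow();
unsigned int colourDepth = window->getColourDepth();

// Placeholder values: a 16-bit mode typically needs a different tuned bias
// than a 32-bit mode with its deeper depth buffer.
float decalBias = (colourDepth == 16) ? 4.0f : 1.0f;
decalPass->setDepthBias(decalBias);   // decalPass as in the earlier sketch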
ogreogre
Halfling
Posts: 59
Joined: Mon Aug 13, 2007 8:30 pm

Post by ogreogre »

Unfortunately... I found that the machines with flickering are using 32-bit colour depth. Which makes me wonder whether they may be using 32-bit colour but a 16-bit depth buffer. Is there a function that returns the number of bits in the depth buffer?

Thanks,
Steve
sinbad
OGRE Retired Team Member
Posts: 19269
Joined: Sun Oct 06, 2002 11:19 pm
Location: Guernsey, Channel Islands

Post by sinbad »

The depth buffer will always be 24-bit (24/8 depth/stencil) in 32-bit colour mode.

This suggests that the cards / drivers in question are implementing the depth bias slightly differently, which is possible since different cards can use different depth compression techniques. You'll either have to pick values that work for all of them (i.e. larger than you might need for some cards), or detect the cards that are causing you issues and pad the biases, perhaps?
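If it comes to padding, one low-tech way is to keep a base bias that works on the well-behaved machines and scale it by a per-install factor read from a config file on the machines that misbehave (a sketch; the file and key names are made up):

Code: Select all

// Base bias that works on most machines, scaled by an optional per-install
// fudge factor from a hypothetical tuning file.
const float baseBias = 1.0f;

Ogre::ConfigFile cfg;
cfg.load("decal_tuning.cfg");   // load() throws if the file is missing;
                                // guard or catch that in real code
Ogre::String scaleStr = cfg.getSetting("DepthBiasScale");
float biasScale = scaleStr.empty()
    ? 1.0f
    : Ogre::StringConverter::parseReal(scaleStr);

decalPass->setDepthBias(baseBias * biasScale);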