Hi all,
Hate to revisit this, because I've already browsed the related long threads, but I'm seeing depth bias discrepancies within D3D. What's puzzling me is that there are variations *within Ogre D3D* on different machines. I run Ogre D3D on all my test machines, but a given depth bias setting (for a simple decal texture) works fine on some machines and flickers on others. If I increase the bias to the point of causing artifacts on some machines, it still flickers on others...
I understand that tweaking the projection matrix is a better way, but for various reasons that's not an option right now. Has anybody had the same issue--the same Ogre D3D depth bias setting having different effects on different machines?
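For context, this is roughly the kind of per-pass setting I mean (a minimal sketch only: the material name and bias values are placeholders, and on older Ogre builds Pass::setDepthBias takes a single value rather than constant + slope-scale):

Code: Select all
#include <OgreMaterialManager.h>
#include <OgreMaterial.h>
#include <OgreTechnique.h>
#include <OgrePass.h>

// Push the decal slightly towards the camera in the depth buffer so it
// doesn't z-fight with the surface underneath. Material name and bias
// values are placeholders, not my real settings.
void applyDecalBias(float constantBias, float slopeScaleBias)
{
    Ogre::MaterialPtr mat =
        Ogre::MaterialManager::getSingleton().getByName("Decal/Simple");
    if (mat.isNull())
        return;

    Ogre::Pass* pass = mat->getTechnique(0)->getPass(0);
    pass->setDepthBias(constantBias, slopeScaleBias);
}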
Thanks!
Steve
Sinbad, thanks for the reply. From what you said, the right approach would be to detect whether the window was initialised as 16-bit and use different biases accordingly.
Currently I use all the Ogre defaults, so I don't even know what kind of window it created... How would I find out whether the window is 16-bit, or whether the depth buffer has less precision?
Thanks,
Steve
The depth buffer will always be 24-bit (24/8 depth/stencil) in 32-bit colour mode.
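To check what you actually got, something along these lines should do it (a rough sketch using RenderTarget::getMetrics on the window you created at startup; the logging is just illustrative):

Code: Select all
#include <Ogre.h>

// Rough sketch: report the colour depth of the render window. A 16-bit
// colour window generally means a 16-bit depth buffer too, which needs
// larger depth bias values.
void reportWindowDepth(Ogre::RenderWindow* window)
{
    unsigned int width = 0, height = 0, colourDepth = 0;
    window->getMetrics(width, height, colourDepth);

    Ogre::LogManager::getSingleton().logMessage(
        "Render window is " +
        Ogre::StringConverter::toString(colourDepth) + "-bit colour");

    if (colourDepth <= 16)
    {
        // Expect reduced depth precision here -> pad your biases.
    }
}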
This suggests that the cards / drivers in question implement the depth bias slightly differently, which is possible since different cards can use different depth compression techniques. You'll either have to pick values that work for all of them (ie larger than you might need on some cards), or detect the cards that are causing you problems and pad the biases for those, perhaps?
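If you go the detection route, the idea is roughly this (a rough sketch: it assumes a newer Ogre where RenderSystemCapabilities exposes getVendor(), older versions would have to parse the adapter / driver string instead, and the padding factors are made-up placeholders you'd tune per problem card):

Code: Select all
#include <Ogre.h>

// Rough sketch of "detect the card and pad the bias". The vendor cases and
// multipliers below are hypothetical; tune them against the machines that
// actually flicker.
float paddedDepthBias(float baseBias)
{
    const Ogre::RenderSystemCapabilities* caps =
        Ogre::Root::getSingleton().getRenderSystem()->getCapabilities();

    switch (caps->getVendor())
    {
    case Ogre::GPU_NVIDIA:
        return baseBias;          // baseline tuned on this hardware
    case Ogre::GPU_ATI:
        return baseBias * 2.0f;   // hypothetical padding for cards that flicker
    default:
        return baseBias * 1.5f;   // conservative guess for unknown hardware
    }
}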