GeForce4 Ti UBYTE4 question...
Posted: Thu Mar 17, 2005 6:09 am
I've been perusing the forums regarding problems getting hardware skinning to work on GF4 cards, because the cards apparently don't support the UBYTE4 vertex element type, which would be required for HW skinning to work. I just wanted to share some personal experience in the matter, since I have a GF4 Ti 4200.
When I was originally testing Ogre out, all the way back in the 0.15 days, I am almost *positive* that I saw the Skeletal Animation demo running with hardware skinning enabled under both DX and OGL. No graphical problems or anything. I was running unofficial drivers (61.77, Omega's modified version) as opposed to the official 66.93, due to DX9 problems I ran into when I upgraded (see http://www.ogre3d.org/phpBB2/viewtopic. ... highlight= if you're curious). I do, however, remember that while DX9 was very fast, OGL was extremely slow in comparison.
Fast-forward to today, and I've since upgraded my drivers to something more recent (nvidia 71.81, still unofficial but not "modified") that didn't cause DX problems in Ogre. OGL performance is now (in a non-trivial number of cases) faster than DX9. However, it was when I ran the Skeletal Animation demo again that I noticed it was using software skinning in DX. Like I said, I could have sworn it used HW skinning at some point, so this puzzled me. Even more of a conundrum was that HW skinning works perfectly fine in OGL. Looking at the log after running in OGL, it says "VET_UBYTE4 vertex element type: yes". The DX version says no.
I have scoured my entire machine for an old log file from before my driver update. Being a sloppy HDD-keeper, I seem to have builds of Ogre in its various versions here and there, but none of them has a .log file old enough. I suppose I could switch back to old drivers to test, but I'm not ready to go there yet.
Here's what I've boiled the possibilities down to:
- A: Nvidia's OGL drivers are causing OGL to report the capability incorrectly, and Ogre is actually doing software skinning even though it thinks it's doing HW skinning. The GF4 really, truly has no support for UBYTE4 vertex types.
- B: Nvidia's DX drivers are incorrectly setting the UBYTE4 cap to false, and the problem is simply driver-related. The GF4 actually does have UBYTE4 support.
I'm not really expecting any hard conclusions based on this; I just wanted to add to the body of knowledge about this problem, and wondered if Sinbad & crew might be able to make any sense of it. I'd be glad to go back to previous drivers and do some testing, if you guys think it'd be worth it, but I wanted to get some advice before reverting.
Thanks! Hope I've piqued someone's interest!