
Geforce4 Ti UBYTE4 question...

Posted: Thu Mar 17, 2005 6:09 am
by zolaar
I've been perusing the forums regarding problems getting hardware skinning to work on GF4 cards, because the cards apparently don't support the UBYTE4 vertex element type, which HW skinning requires. I just wanted to share some personal experience in the matter, having a GF4 4200.

When I was originally testing Ogre out, all the way back during the 0.15 days, I am almost *positive* that I saw the Skeletal Animation demo running with hardware skinning enabled under both DX and OGL. No graphical problems or anything. I was running unofficial drivers (61.77, Omega's modified version) as opposed to the official 66.93, due to DX9 problems I ran into when I upgraded (see http://www.ogre3d.org/phpBB2/viewtopic. ... highlight= if you're curious). I do remember, however, that while DX9 was very fast, OGL was extremely slow in comparison.

Fast-forward to today, and I've since upgraded my drivers to something more recent (nvidia 71.81, still unofficial but not "modified") that didn't cause DX problems in Ogre. OGL performance is now (in a non-trivial number of cases) faster than DX9. However, it was when I ran the Skeletal Animation demo again that I noticed it was using software skinning in DX. Like I said, I could have sworn it used HW skinning at some point, so this puzzled me. Even more of a conundrum was that HW skinning works perfectly fine in OGL. Looking at the log after running in OGL, it says "VET_UBYTE4 vertex element type: yes". The DX version says no.
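
For reference, here's roughly how the same cap can be queried from application code, rather than digging through the log. This is a minimal sketch, assuming the RSC_VERTEX_FORMAT_UBYTE4 capability flag is what that log line reflects (that's my reading of the headers, so take it with a grain of salt):

Code:

#include <OgreRoot.h>
#include <OgreRenderSystem.h>
#include <OgreRenderSystemCapabilities.h>

// Ask the active render system whether it advertises UBYTE4 vertex
// elements; this should match the "VET_UBYTE4" line in the Ogre log.
bool rendererSupportsUbyte4(Ogre::Root& root)
{
    const Ogre::RenderSystemCapabilities* caps =
        root.getRenderSystem()->getCapabilities();
    return caps->hasCapability(Ogre::RSC_VERTEX_FORMAT_UBYTE4);
}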

I have scoured my entire machine for an old log file from before my driver update. Being a sloppy HDD-keeper, I seem to have builds of Ogre in its various versions here and there, but none of them has a .log file old enough. I suppose I could switch back to old drivers to test, but I'm not ready to go there yet.

Here's what I've boiled the possibilities down to:
  • A: Nvidia's OpenGL drivers are reporting the capability incorrectly, and Ogre is actually doing software skinning even though it thinks it's doing HW skinning. The GF4 really, truly has no support for UBYTE4 vertex types.
  • B: Nvidia's DX drivers are incorrectly setting the UBYTE4 cap to false, and the problem is simply driver-related. The GF4 actually does have UBYTE4 support.
I figure the second possibility is much more likely than the first, considering even my ATI Rage128 from 5+ years ago can manage HW skinning in Ogre, in both DX and OGL.
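
One way to settle A vs. B would be to bypass Ogre entirely and ask the D3D9 runtime what the driver reports. Here's a minimal standalone sketch (my own throwaway code, not anything from Ogre) that checks the UBYTE4 bit in the device caps; if this prints "no" on a GF4, the DX driver genuinely isn't exposing the cap:

Code:

#include <d3d9.h>
#include <cstdio>

int main()
{
    // Create the bare D3D9 interface; no device or window is needed
    // just to read the adapter caps.
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) { std::printf("D3D9 unavailable\n"); return 1; }

    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // D3DDTCAPS_UBYTE4 is the declaration-type cap bit for UBYTE4
    // vertex elements, which hardware skinning wants for its blend indices.
    bool ubyte4 = (caps.DeclTypes & D3DDTCAPS_UBYTE4) != 0;
    std::printf("UBYTE4 vertex decl support: %s\n", ubyte4 ? "yes" : "no");

    d3d->Release();
    return 0;
}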

I'm not really expecting any hard conclusions based on this; I just wanted to add to the body of knowledge about this problem, and wondered if Sinbad & crew might be able to make any sense of it. I'd be glad to go back to previous drivers and do some testing, if you guys think it'd be worth it, but I wanted to get some advice before reverting.

Thanks! Hope I've piqued someone's interest!

Posted: Thu Mar 17, 2005 2:39 pm
by sinbad
The GeForce3/4 have never supported VET_UBYTE4 in DirectX, yet they've always supported it in GL.

The GF3/4 cards are bloody strange in DirectX. There are a number of things that they just don't support when running DirectX, including UBYTE4 and infinite far clip projection matrices, even when both of these things work fine in GL. It's either a freaky design thing with DirectX, or GL is providing emulation to compensate.
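
For anyone curious, the infinite far clip trick is just the ordinary projection matrix with the far distance taken to its limit. A minimal sketch in my own notation (D3D-style row vectors, z mapped to [0,1], not the actual Ogre code):

Code:

#include <cmath>
#include <cstring>

// Left-handed perspective matrix with the far plane at infinity,
// handy for stencil shadows so volumes can be extruded arbitrarily far.
// Taking f -> infinity in the usual matrix gives:
//   m[2][2] = lim f/(f-n)    = 1
//   m[3][2] = lim -n*f/(f-n) = -n
void buildInfinitePerspectiveLH(float fovY, float aspect, float n,
                                float m[4][4])
{
    const float yScale = 1.0f / std::tan(fovY * 0.5f);
    const float xScale = yScale / aspect;

    std::memset(m, 0, 16 * sizeof(float));
    m[0][0] = xScale;
    m[1][1] = yScale;
    m[2][2] = 1.0f; // limit of f/(f-n)
    m[2][3] = 1.0f; // w takes the view-space z
    m[3][2] = -n;   // limit of -n*f/(f-n)
}

The GF3/4 quirk is that a matrix like this upsets the DirectX path on those cards even though GL accepts the equivalent happily.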

I hate the GF3/4 cards in DirectX. They caused me massive problems when implementing skinning and stencil shadows, precisely because they inexplicably don't support some of the key features in one API. There are workarounds in the code just for these damn cards - I'd be glad if I never saw one ever again :(
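
To give a flavour of what such a workaround looks like, here's a minimal sketch (a hypothetical helper, not the actual Ogre code): when the UBYTE4 cap is missing, the four per-vertex blend indices can be re-uploaded as FLOAT4 instead, which every card accepts, at the cost of a 4x bigger buffer:

Code:

#include <cstddef>
#include <cstdint>
#include <vector>

// Expand packed UBYTE4 blend indices (4 bytes per vertex) into FLOAT4
// (4 floats per vertex). The vertex shader indexes the bone palette
// the same way; only the vertex declaration and buffer size change.
std::vector<float> expandBlendIndicesToFloat4(
    const std::uint8_t* ubyte4Indices, std::size_t vertexCount)
{
    std::vector<float> out;
    out.reserve(vertexCount * 4);
    for (std::size_t i = 0; i < vertexCount * 4; ++i)
        out.push_back(static_cast<float>(ubyte4Indices[i]));
    return out;
}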

Posted: Mon Apr 18, 2005 10:17 pm
by Sarev0k
I was wondering where the workarounds for the GF3/4 graphics cards live in the Ogre code. I suspect the GeForce4 4200 Go card in my notebook may be the reason I'm unable to see terrain in the Paging Scene Manager, while everyone else on my team can view terrain fine.

Do you think there's any correlation?