Cg Requirements

A place for users of OGRE to discuss ideas and experiences of utilising OGRE in their games / demos / applications.
psiegel
Kobold
Posts: 28
Joined: Sun Aug 29, 2004 5:51 pm
Location: Boston, MA

Cg Requirements

Post by psiegel »

I've been looking at the Cg plugin, and it seems like it's a pretty powerful tool. But before I go spending a lot of time learning Cg and incorporating it into my project, I wonder if someone could give me the low-down on what kind of hardware is needed for Cg use. What I mean is, will requiring the Cg plugin alter my app's minimum reqs at all? Do some older video cards not support it?

Paul
:wumpus:
OGRE Retired Team Member
Posts: 3067
Joined: Tue Feb 10, 2004 12:53 pm
Location: The Netherlands

Post by :wumpus: »

To really have fun with Cg (full support for both vertex and fragment shaders) you need a Shader Model 2.0 card like the ATI 9500+ or NVidia FX series.

Some earlier cards can do Shader Model 1.1, but it's fairly limited in the operations you can do. To make your game run on those, you can easily provide a fallback technique in your material that compiles under the older versions or just uses the fixed-function pipeline.
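Something like this rough material sketch is what I mean - the material name, program name and texture are just made-up placeholders; OGRE walks the techniques in order and uses the first one the card supports:

    // Made-up example material: the first technique references a Cg
    // fragment program, the second is a plain fixed-function fallback
    // that older cards pick up automatically.
    material Example/ShaderWithFallback
    {
        // Preferred technique: needs a card that supports the profile
        // the Cg program was compiled for (e.g. ps_2_0 / arbfp1)
        technique
        {
            pass
            {
                fragment_program_ref Example/MyFragmentProgram
                {
                }

                texture_unit
                {
                    texture placeholder.png
                }
            }
        }

        // Fallback technique: fixed-function only, runs anywhere
        technique
        {
            pass
            {
                texture_unit
                {
                    texture placeholder.png
                }
            }
        }
    }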
sinbad
OGRE Retired Team Member
Posts: 19269
Joined: Sun Oct 06, 2002 11:19 pm
Location: Guernsey, Channel Islands

Post by sinbad »

These two articles are a pretty good summary of support. Remember, Cg just generates low-level shaders from a convenient high-level language; no card 'supports' Cg, they just support the shaders that Cg compiles down to.

Desktop GPU Guide
Workstation GPU Guide

As you can see, minimum vertex and pixel shader support emerges pretty early on in the list these days, although you can't do as much with 1.x, as wumpus pointed out: vertex shader constant limits constrain you when you're doing hardware skinning, and not being able to sample more than once from a texture, or across texture coordinate sets, is a bummer in pixel shaders. Once you get to 2.x shaders it becomes a lot more fun; cards from about two years ago onward support them.
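For what it's worth, a Cg program declaration in a material script can list several target profiles, and the first one the current card/API supports gets used, so you can target 2.x and still fall back to 1.1 if the program fits within its limits. The source file and entry point below are just placeholder names:

    // Placeholder names throughout ('skinning.cg', 'main_vp').
    // The profiles are tried in order: vs_2_0 on a Shader Model 2.0
    // card under Direct3D, vs_1_1 on older D3D cards, arbvp1 under GL.
    vertex_program Example/HardwareSkinningVP cg
    {
        source skinning.cg
        entry_point main_vp
        profiles vs_2_0 vs_1_1 arbvp1
        includes_skeletal_animation true
    }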
Last edited by sinbad on Sat Apr 02, 2005 5:02 pm, edited 1 time in total.
psiegel
Kobold
Posts: 28
Joined: Sun Aug 29, 2004 5:51 pm
Location: Boston, MA

Post by psiegel »

Thanks for the info, guys. Of course, this just leads to more questions. Where can I find some documentation on what features 1.x shaders support vs. 2.x shaders?

:sinbad - Both those links point to the same place; was this a mistake?

Paul
sinbad
OGRE Retired Team Member
Posts: 19269
Joined: Sun Oct 06, 2002 11:19 pm
Location: Guernsey, Channel Islands

Post by sinbad »

psiegel wrote: Thanks for the info, guys. Of course, this just leads to more questions. Where can I find some documentation on what features 1.x shaders support vs. 2.x shaders?
For DirectX, you're best off looking at MSDN.

For GL, the standards essentially started at the 2.x level with ARB_vertex_program and ARB_fragment_program - prior to that there were only vendor-specific extensions like NV_texture_shader and NV_register_combiners. You can find details on all of them in the OpenGL Extensions Registry, although the papers there are a little dry, which is why I have Lengyel's OpenGL Extensions Guide book.
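To tie that back to the material scripts: when you declare a Cg fragment program you can list the Direct3D and GL profiles side by side, and arbfp1 is essentially the ARB_fragment_program path. The file and entry point names below are placeholders again:

    // Placeholder names ('lighting.cg', 'main_fp'). ps_2_0 is the
    // Direct3D 9 target; arbfp1 maps to ARB_fragment_program under
    // OpenGL, so the same Cg source covers both APIs.
    fragment_program Example/PerPixelLightingFP cg
    {
        source lighting.cg
        entry_point main_fp
        profiles ps_2_0 arbfp1
    }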
psiegel wrote: :sinbad - Both those links point to the same place; was this a mistake?
Doh, fixed it (the more useful one is the desktop GPU guide).