GPU Only Particle System (Geometry Shaders)

Noman
OGRE Retired Team Member
Posts: 714
Joined: Mon Jan 31, 2005 7:21 pm
Location: Israel
x 2
Contact:

GPU Only Particle System (Geometry Shaders)

Post by Noman »

Hello all

As part of the GSoC Geometry Shader Project I decided to add another important feature to the project - Render To Vertex Buffer support.

The feature allows you to save the contents of the vertex+geometry pipeline to a vertex buffer, and then use it later on. You can also use that output as the input of the next run, giving us iterative geometry on the GPU.

One of the possibilities this feature opens up is GPU-only particle systems, like Microsoft's ParticleGS demo. So, I decided to add the API to Ogre, implement it in the OpenGL render system, and create a port of the ParticleGS demo in Ogre.

And now, for the results:
[Screenshots of the GPU-only particle system demo]

Note that all of the geometry is completely generated in shaders! I do not use Ogre::ManualObject nor do I write vertex/index buffers manually. The only geometry that I do generate is the launcher particle that starts the system.

Particle GS Demo download
There is a binary download for those who want to try it out for themselves. You can download it here.
Demo requirements: a GeForce 8-class (or higher) graphics card. ATI has not implemented geometry shaders in their OpenGL drivers.

For those of you who aren't privileged enough to have a compatible card, you can also check out the video of the demo on YouTube, although my video conversion skills don't really do it justice.

If you want more in-depth information about the project, check out the discussion thread, or even grab the source code at the SVN branch:
https://svn.ogre3d.org/svnroot/ogre/bra ... eomshaders

This pretty much marks the end of the Google Summer of Code project, which I can now say has been a huge success for me.

Let the fireworks begin!
Zeal
Ogre Magi
Posts: 1260
Joined: Mon Aug 07, 2006 6:16 am
Location: Colorado Springs, CO USA

Post by Zeal »

Sigh... you make me want to upgrade to Vista...

Nice work though!
ppClarity
Gremlin
Posts: 194
Joined: Sat Sep 02, 2006 12:27 am
x 2

Post by ppClarity »

That's the point of his project, no Vista required :D
_tommo_
Gnoll
Posts: 677
Joined: Tue Sep 19, 2006 6:09 pm
x 3
Contact:

Post by _tommo_ »

ppClarity wrote:That's the point of his project, no Vista required :D
If you run it in OpenGL... which you can't if you have an ATI card.
Which, BTW, are now the best GPUs... so many people will still need Vista, I think ;)


Anyway, geometry shaders would be really useful in a game I've been thinking about... I want a DX10 GPU now...

good work!
OverMindGames Blog
IndieVault.it: the new Italian portal for Game Dev & Indie Games
Zeal
Ogre Magi
Posts: 1260
Joined: Mon Aug 07, 2006 6:16 am
Location: Colorado Springs, CO USA

Post by Zeal »

That's the point of his project, no Vista required
Wait, what? Ohhhh, OpenGL... Sorry, I don't follow OpenGL. So you're saying it will have all the neat 'DirectX 10' features, but still run on XP?

That's exciting!
Noman
OGRE Retired Team Member
Posts: 714
Joined: Mon Jan 31, 2005 7:21 pm
Location: Israel
x 2
Contact:

Post by Noman »

In fact, the development machine for this project uses windows XP!

I don't want to start an argument about which is the better card of the two, but NVIDIA has two big advantages for developers IMO:
1) Better OpenGL support (geometry shader / transform feedback support just proves it, but it has almost always been like this)
2) Better developer tools (NVPerfHUD is amazing).

I'm not sure which brand I'd rather have as a consumer (benchmarks etc), but as a developer I'm definitely an NVIDIA fan. And in the long run, this is probably a big boost on the consumer side, as a product that was mainly developed on machines with NVIDIA cards is likely to be tuned for them...
_tommo_
Gnoll
Posts: 677
Joined: Tue Sep 19, 2006 6:09 pm
x 3
Contact:

Post by _tommo_ »

Noman wrote:In fact, the development machine for this project uses windows XP!

I don't want to start an argument about which is the better card of the two, but NVIDIA has two big advantages for developers IMO:
1) Better OpenGL support (geometry shader / transform feedback support just proves it, but it has almost always been like this)
2) Better developer tools (NVPerfHUD is amazing).

I'm not sure which brand I'd rather have as a consumer (benchmarks etc), but as a developer I'm definitely an NVIDIA fan.
Yes, you are right, but I think that DX10 will become mass market thanks to AMD this time... not because of the excellence developers need, but because they released cards that "just work" and cost much less. You can run Crysis on Very High on the HD4850... and it costs half as much as the NVIDIA 280.
Anyway, I'm really surprised at how few users of this forum can fully support DirectX 10... we are all developers to some degree, but this standard has really been ignored. Poor Microsoft ;)
OverMindGames Blog
IndieVault.it: the new Italian portal for Game Dev & Indie Games
ahmedismaiel
OGRE Contributor
Posts: 217
Joined: Wed Jan 25, 2006 11:16 pm
Location: Redmond,WA

Post by ahmedismaiel »

Nice work, it ran like a charm at 400 FPS on an 8400GT.

Shouldn't there be a way to update the vertex count with what has been generated in the GS?
Noman
OGRE Retired Team Member
Posts: 714
Joined: Mon Jan 31, 2005 7:21 pm
Location: Israel
x 2
Contact:

Post by Noman »

The vertex count does get updated.
However, the triangle count is what the Ogre debug overlay shows.
Since the output of the generating pass is points, they cannot contribute to the triangle count. The display pass geometry shader turns those points into screen-aligned quads, but the software doesn't really have a way to know how many triangles were formed...
bharling
Gremlin
Posts: 166
Joined: Fri Jun 30, 2006 1:04 pm

Post by bharling »

Nice work, it ran like a charm at 400 FPS on an 8400GT.
Strange, it fails on my 8500GT:

Code: Select all

08:58:43: OGRE EXCEPTION(2:InvalidParametersException): Parameter called elapsedTime does not exist.  in GpuProgramParameters::_findNamedConstantDefinition at ..\src\OgreGpuProgram.cpp (line 1090)
Shame, I'm very interested in this kind of thing!
Was here
Noman
OGRE Retired Team Member
Posts: 714
Joined: Mon Jan 31, 2005 7:21 pm
Location: Israel
x 2
Contact:

Post by Noman »

bharling - I want to work this problem out with you; it's very important for me to solve it.

A couple of things:
- I noticed that I forgot to include the Cg runtime (cg.dll) in the package, so the demo was using whatever version of the runtime was installed on your computer. Perhaps it's an old one. I re-uploaded the demo, this time including cg.dll. Can you download it again and tell me if you still get problems?
- What version of the driver are you running? 8500 sounds like a laptop card. Can you download NVIDIA's transform feedback fractal demo and check if that works? Perhaps you need to update your driver?

If the transform feedback sample works, and mine still doesn't, can you please post the full Ogre.log for me to check?

Thanks!
KungFooMasta
OGRE Contributor
Posts: 2087
Joined: Thu Mar 03, 2005 7:11 am
Location: WA, USA
x 16
Contact:

Post by KungFooMasta »

Cool video! 8)
Creator of QuickGUI!
bharling
Gremlin
Posts: 166
Joined: Fri Jun 30, 2006 1:04 pm

Post by bharling »

Aha,

The revised version worked; FPS hovers around 220.

I'm on a desktop machine, my card is a GeForce 8500GT.

very cool, nice work ;)
Was here
alex.t
Halfling
Posts: 42
Joined: Wed Feb 27, 2008 7:08 pm

Post by alex.t »

Do you have any idea why an 8800GTS would give me an Unimplemented Exception? It says my card does not support geometry programs.
sinbad
OGRE Retired Team Member
Posts: 19265
Joined: Sun Oct 06, 2002 11:19 pm
Location: Guernsey, Channel Islands
x 66
Contact:

Post by sinbad »

alex.t wrote:Do you have any idea why an 8800GTS would give me an Unimplemented Exception? It says my card does not support geometry programs.
Drivers?