GPU Only Particle System (Geometry Shaders)
-
- OGRE Retired Team Member
- Posts: 714
- Joined: Mon Jan 31, 2005 7:21 pm
- Location: Israel
- x 2
Hello all
As part of the GSoC Geometry Shader Project I decided to add another important feature to the project - Render To Vertex Buffer support.
The feature allows you to save the contents of the vertex+geometry pipeline to a vertex buffer, and then use it later on. You can also use that output as the input of the next run, giving us iterative geometry on the GPU.
One of the possibilities this feature opens up is GPU-only particle systems, as Microsoft did in their ParticleGS demo. So, I decided to add the API to Ogre, implement it in the OpenGL render system, and create a port of the ParticleGS demo in Ogre.
And now for the results:
Note that all of the geometry is completely generated in shaders! I do not use Ogre::ManualObject nor do I write vertex/index buffers manually. The only geometry that I do generate is the launcher particle that starts the system.
Particle GS Demo download
There is a binary download for those who want to try it out for themselves. You can download it here.
Demo requirements: a GeForce 8 class (or higher) graphics card. ATI has not implemented geometry shaders in their OpenGL drivers.
For those of you who aren't privileged enough to have a compatible card, you can also check out the video of the demo on YouTube, although my video conversion skills don't really make it look good.
If you want more in-depth information about the project, check out the discussion thread, or even grab the source code at the SVN branch:
https://svn.ogre3d.org/svnroot/ogre/bra ... eomshaders
This pretty much marks the end of the Google project, which I can now say has been a huge success for me.
Let the fireworks begin!
-
- Gnoll
- Posts: 677
- Joined: Tue Sep 19, 2006 6:09 pm
- x 5
ppClarity wrote:
That's the point of his project, no Vista required

If you run it in OpenGL... which you can't if you have an ATI card.
Which, by the way, are now the best GPUs... though so many things still require Vista, I think.
Anyway, geometry shaders would be really useful in a game I've been thinking about... I want a DX10 GPU now...
Good work!
-
- OGRE Retired Team Member
- Posts: 714
- Joined: Mon Jan 31, 2005 7:21 pm
- Location: Israel
- x 2
In fact, the development machine for this project uses Windows XP!
I don't want to start an argument about which is the better card between the two, but NVIDIA has two big advantages for developers IMO:
1) Better OpenGL support (geometry shader / transform feedback support just proves it, but it has almost always been like this)
2) Better developer tools (NVPerfHUD = amazing).
I'm not sure which brand I'd rather have as a consumer (benchmarks, etc.), but as a developer I'm definitely an NVIDIA fan. And in the long run, this is probably a big boost on the consumer side too, as a product that was mainly developed on machines with NVIDIA cards is likely to be tuned for them...
-
- Gnoll
- Posts: 677
- Joined: Tue Sep 19, 2006 6:09 pm
- x 5
Noman wrote:
In fact, the development machine for this project uses windows XP!
I don't want to start an argument about which is the better card between the two, but NVIDIA has two big advantages for developers IMO:
1) Better OpenGL support (Geometry shader / transform feedback support just proves it, but it has almost always been like this)
2) Better developer tools (NVPerfHud = amazing).
I'm not sure which brand I'd rather have as a consumer (benchmarks etc), but as a developer I'm definitely an NVIDIA fan.

Yes, you are right, but I think that DX10 will become mass market thanks to AMD this time... not because of the excellence developers need, but because they released cards that "just work" and cost much less. You can run Crysis on Very High on an HD4850... and it costs half as much as the NVIDIA 280.
Anyway, I'm really surprised at how few users of this forum fully support DirectX 10... we are all developers to some degree, yet this standard has really been ignored. Poor Microsoft.
- OGRE Retired Team Member
- Posts: 714
- Joined: Mon Jan 31, 2005 7:21 pm
- Location: Israel
- x 2
The vertex count does get updated.
However, the triangle count is what the Ogre debug overlay shows.
Since the generating pass outputs points, they cannot contribute to the triangle count. The display pass geometry shader turns those points into screen-aligned quads, but the software has no way of knowing how many triangles were actually formed...
-
- Gremlin
- Posts: 166
- Joined: Fri Jun 30, 2006 1:04 pm
"Nice Work, worked like a charm at 400FPS on 8400GT"

Strange, it fails on my 8500GT:
Code:
08:58:43: OGRE EXCEPTION(2:InvalidParametersException): Parameter called elapsedTime does not exist. in GpuProgramParameters::_findNamedConstantDefinition at ..\src\OgreGpuProgram.cpp (line 1090)
Was here
-
- OGRE Retired Team Member
- Posts: 714
- Joined: Mon Jan 31, 2005 7:21 pm
- Location: Israel
- x 2
bharling - I want to work this problem out with you, it's very important to me to solve it.
A couple of things:
- I noticed that I forgot to include the Cg runtime (cg.dll) in the package, so the demo was using whatever version of the runtime was installed on your computer. Perhaps it's an old one. I re-uploaded the demo, this time including cg.dll. Can you download it again and tell me if you still get problems?
- What version of the driver are you running? 8500 sounds like a laptop driver. Can you download NVIDIA's transform feedback fractal demo and check whether it works? Perhaps you need to update your driver?
If the transform feedback sample works, and mine still doesn't, can you please post the full Ogre.log for me to check?
Thanks!