Personal Details
Name: Liu Lu
Email: luluathena@gmail.com
OGRE Forum username: luluathena
GTalk: luluathena@gmail.com
Project Proposal
There are two reasons for me to implement off-screen particles for OGRE. First, large particle systems are common for smoke, fog, and similar effects; they can easily fill the screen, and the FPS then drops quickly. Secondly, particle sprites are widely used in games, but when they intersect with solid objects in the scene, hard-edged artifacts usually appear, as in
Fig 1 from the ParticleFX demo. Both problems can be solved by off-screen particles: efficiency and visual quality are improved together.
Fig 1
Here I assume each particle has already been expanded into a polygon parallel to the screen (e.g. in a geometry shader); the algorithm itself runs in the pixel shader. Off-screen particles can greatly improve performance when particles nearly fill the screen. If the user chooses off-screen particles, they should also specify a down-sampling scale such as 2x2 or 4x4. The sampling function in step 2, the soft-particle level in step 3, and whether to use mixed-resolution rendering in steps 3-4 all influence the FPS and the visual result. These options are a bit complex for the user to combine, so to simplify the interface I want to group them into presets like "low_effect | medium_effect | high_effect" or something similar.
1. Render all solid objects in the scene and get the depth buffer.
I searched the forums to see how to get the depth buffer in OGRE. It seems RenderSystem does not expose this. The most common method is to use a floating-point render target and write the depth from a pixel shader. [1] uses a similar method: MRT is used to store the depth values into an FP32 render target while rendering the scene in the first pass, but that is hard to integrate. Another method is direct access to the values in the depth buffer, supported by Direct3D 10 and OpenGL, so it is not cross-platform. I'm still looking for a way to obtain the depth buffer from the shader without requiring the user to do anything.
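Whatever mechanism ends up providing the target, the depth-writing pass itself is simple. A minimal CPU sketch of what the pixel shader would write into the FP32 render target, assuming a normalized linear eye-space depth convention (the names and the near/far parameters are my assumptions, not OGRE API):

```python
def linear_depth(z_eye, z_near, z_far):
    # Normalized linear eye-space depth, as the depth-writing pixel
    # shader would output it into the FP32 render target.
    # z_eye, z_near, z_far are positive distances along the view axis.
    return (z_eye - z_near) / (z_far - z_near)

# With a 1..1000 frustum, a surface 500 units away lands near the
# middle of the [0, 1] range.
d = linear_depth(500.0, 1.0, 1000.0)
```

A linear depth is preferable to the post-projection z here, because the zFade comparison in step 3 behaves uniformly across the depth range.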
2. Down-sample the resulting depth buffer.
The choice of sampling function has to be application-defined. To avoid the halo problem, take the maximum depth of each a*a block as the sample, though the particles will then encroach slightly into objects. Alternatively, simple point sampling can be used. Both methods have inherent drawbacks, but the linear blending introduced in step 4 will hide a large part of the artifacts.
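The max-depth variant can be sketched as follows (a CPU reference over a depth buffer stored as a list of rows; the layout is my assumption):

```python
def downsample_max(depth, a):
    # Down-sample a full-resolution depth buffer by taking the maximum
    # depth of each a*a block, which avoids halos around object edges
    # at the cost of particles encroaching slightly into objects.
    # Point sampling would instead just keep depth[y*a][x*a].
    h, w = len(depth), len(depth[0])
    return [[max(depth[y * a + j][x * a + i]
                 for j in range(a) for i in range(a))
             for x in range(w // a)]
            for y in range(h // a)]

full = [[0.1, 0.2, 0.5, 0.5],
        [0.2, 0.9, 0.5, 0.5],
        [0.3, 0.3, 0.4, 0.4],
        [0.3, 0.3, 0.4, 0.4]]
small = downsample_max(full, 2)   # -> [[0.9, 0.5], [0.3, 0.4]]
```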
3. Depth test with off-screen target and edge detection.
To avoid artifacts between particles and scene objects, use a "contrast" function to compute zFade [1]; Fig 2 shows its shape. However, the full function can be time-consuming, since it is complicated and even contains a pow(). In GPU Gems 3, Chapter 23, they choose a cheaper approximation of [1]: zFade = saturate(scale * (myDepth - sceneDepth)). In most situations this function is good enough.
Fig 2
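The cheap linear fade is trivial to express. A CPU sketch, assuming myDepth and sceneDepth are in the same linear depth space (the scale default is an arbitrary choice of mine):

```python
def saturate(x):
    # HLSL saturate(): clamp to [0, 1].
    return min(max(x, 0.0), 1.0)

def z_fade(my_depth, scene_depth, scale=4.0):
    # The cheap approximation quoted above: the fade ramps linearly
    # from 0 (zero separation) to 1 (separation >= 1/scale), replacing
    # the expensive contrast function with its pow().
    return saturate(scale * (my_depth - scene_depth))
```

The particle's alpha is multiplied by this fade, so sprites dissolve smoothly as they approach solid geometry instead of being clipped hard.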
Alpha-blending states should then be specified and the particles accumulated into the off-screen target. For mixed-resolution rendering, we should run the Sobel algorithm over the depth render target to detect edges and store the result in another off-screen render target. Mixed-resolution rendering is not necessary if the blocky artifacts are acceptable.
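A CPU sketch of the Sobel edge pass over the low-resolution depth target (the threshold value and the conservative treatment of border pixels are my assumptions):

```python
def sobel_edges(img, threshold=0.5):
    # Mark pixels whose Sobel gradient magnitude exceeds `threshold`.
    # The resulting mask is the edge render target that step 4 uses to
    # decide which pixels get full-resolution particles.
    gx_k = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
    gy_k = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]
    h, w = len(img), len(img[0])
    edges = [[True] * w for _ in range(h)]  # borders marked as edges
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(gx_k[j][i] * img[y - 1 + j][x - 1 + i]
                     for j in range(3) for i in range(3))
            gy = sum(gy_k[j][i] * img[y - 1 + j][x - 1 + i]
                     for j in range(3) for i in range(3))
            edges[y][x] = (gx * gx + gy * gy) ** 0.5 > threshold
    return edges

# A depth step down the middle is detected; a flat region is not.
step = [[0.0, 0.0, 1.0, 1.0]] * 4
flat = [[0.5] * 4] * 4
```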
4. Composite the particle render target back onto the main frame buffer, upsampling.
First, render the low-resolution particles to the pixels that are not marked as edges. Then render the particles at full frame-buffer resolution to the pixels that are marked as edges, just like the figures in Chapter 23 of GPU Gems 3.
Fig 3
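The blend applied per pixel during the composite can be sketched as standard "over" alpha blending, with the particle alpha modulated by the zFade from step 3 (a simplification; the real pass samples the low-resolution target with bilinear filtering before blending, and the exact blend state depends on how alpha was accumulated off-screen):

```python
def composite(scene_rgb, particle_rgb, alpha, fade):
    # Blend one up-sampled particle pixel over the frame buffer.
    # `fade` is the zFade factor from step 3; at fade = 0 the scene
    # pixel is left untouched.
    a = alpha * fade
    return tuple(p * a + s * (1.0 - a)
                 for p, s in zip(particle_rgb, scene_rgb))

# Half-opaque white smoke over a black scene pixel:
out = composite((0.0, 0.0, 0.0), (1.0, 1.0, 1.0), 0.5, 1.0)
```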
The demos will show the results under the different options, and, as in Chapter 23 of GPU Gems 3, there will be tables showing the FPS differences.
Schedule
April 20-May 20
Discuss with mentor, learn shader language, read source code of OGRE and study on the interfaces.
May 21-May 30
Finish the detailed-design document.
May 31-July 16
Implement the algorithm, test the code.
May 31 - June 5: Step 1.
June 6 - June 15: Step 2.
June 16 - June 30: Step 3.
July 1 - July 10: Step 4.
July 11 - July 16: Test the code.
July 17-August 1
Scrub the code and write tests.
August 2-August 15
Make demos and improve documents.
Why You’re The Person For This Project
I’m a graduate student in the State Key Laboratory of CAD & CG at Zhejiang University in China, majoring in Computer Graphics. So far I have worked on a project on physics and collision simulation for cloth animation on the CPU, implemented in C++ and OpenGL, and I then rewrote the simulation for the GPU using CUDA. I have also written other programs such as a ray tracer and rigid-body collision. Recently I have been working on stereo correspondence; it is nearly complete. I have heard a lot about OGRE, but I haven’t used it before.
Why OGRE
I’m really interested in 3D graphics engines and games, so I checked the organizations’ list with those keywords and found OGRE. I have long wished to study OGRE, and I think this is a good opportunity. Luckily I found “Off-screen Particles” on the ideas list. There are some similarities between this algorithm and the cloth collision simulation I did before, and particle systems seem fun to me. That’s why I decided to apply to OGRE.
[1] Lorach, Tristan. 2007. "Soft Particles." In the NVIDIA DirectX 10 SDK. Available online at
http://developer.download.nvidia.com/SD ... les_hi.pdf
-----------------------------------------------------
I'm going to read the APIs and figure out which shader language I should choose for the implementation based on compatibility. Could anyone give me some advice?
Thank you for your time.