Post
by Falagard » Mon Apr 25, 2005 10:08 am
Okay, reading that first pdf, it looks like the following:
There are three possible ways to do it:
1. DirectX 9, and it looks like it's Shader Model 3.0 only, where he's using a fragment program (pixel shader) to "render" into a vertex buffer, which essentially creates new vertices, and then a vertex program to apply the displacement. There's some pseudo code there, but nothing concrete, and you're looking at very next-gen stuff if you want to do that. AFAIK even the current ATI cards don't support 3.0 yet, only NVidia's.
2. OpenGL - have they added support for these "UberBuffers" which allow you to do the same as above?
3. Software displacement. People have been doing progressive meshes in software for a long time now, so it's not a revelation. He may be right that the next step for future games will be to go from normal mapping to displacement mapping for more detail, but I tend to think it will work slightly differently: people will continue creating various LOD versions of a mesh but have a way to morph between the vertices of the higher and lower LOD meshes, as I've seen done in a few games currently in development.
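To make options 1 and 3 concrete, here's a rough CPU-side sketch of the two operations involved. This is just my own illustration, not code from the PDF: `displace` pushes a vertex along its normal by a height sampled from a displacement map (the sampling itself is elided, the `height` parameter stands in for it), and `morph` blends a high-LOD vertex toward its matched low-LOD vertex, which is the morphing scheme I mentioned above.

```cpp
#include <cassert>

// Minimal vector type for the sketch.
struct Vec3 {
    float x, y, z;
};

// Software displacement mapping: push a vertex along its unit normal by the
// height value sampled from a displacement map at that vertex's UV.
// 'height' stands in for the texture sample.
Vec3 displace(const Vec3& pos, const Vec3& normal, float height) {
    return { pos.x + normal.x * height,
             pos.y + normal.y * height,
             pos.z + normal.z * height };
}

// LOD morphing: linearly blend a high-detail vertex toward its matched
// low-detail vertex as t goes from 0 (full detail) to 1 (low detail).
Vec3 morph(const Vec3& hi, const Vec3& lo, float t) {
    return { hi.x + (lo.x - hi.x) * t,
             hi.y + (lo.y - hi.y) * t,
             hi.z + (lo.z - hi.z) * t };
}
```

The GPU version in the PDF does the same math, just with the displacement happening in a vertex program instead of on the CPU.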
So anyhow, I'm no expert, but there is some info in that PDF but it would definitely take some work to implement any of the above.
One thing I didn't realize is how much Charles Bloom's Galaxy tool could do. I had seen it in passing before, but never really thought of it as a tool to create low-tessellation meshes and normal maps. When I searched for tools, I found NVidia's tool and ATI's tool and that's it, but CB's tool includes source code. I've just downloaded it and am going to take a look cause... wow. I wouldn't mind getting it to read Ogre meshes and automatically generate a low poly mesh and normal map from a high poly mesh. One of the pains of using NVidia's and ATI's tools is getting anything INTO them, and anything useful out.
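For anyone unfamiliar with what these tools actually emit: a normal map stores a unit normal per texel, with each component remapped from [-1, 1] into a byte. A tiny sketch of that encoding (my own illustration of the standard convention, not code from any of the tools):

```cpp
#include <cstdint>

// Encode one component of a unit normal into a normal-map byte:
// map [-1, 1] to [0, 255], rounding to nearest.
uint8_t encodeComponent(float n) {
    return static_cast<uint8_t>((n * 0.5f + 0.5f) * 255.0f + 0.5f);
}

// Inverse mapping, which the shader applies when it samples the map:
// map [0, 255] back to [-1, 1].
float decodeComponent(uint8_t b) {
    return (b / 255.0f) * 2.0f - 1.0f;
}
```

So the flat "up" normal (0, 0, 1) becomes the familiar light-blue (128, 128, 255) you see when you open one of these maps in an image viewer.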
Clay