DX9 Displacement Maps

Posted: Mon Apr 25, 2005 1:52 am
by gfm
This might not be the right forum to ask this in, but I figured someone here might know something about it.. :)

Supposedly DirectX 9.0c introduces support for displacement maps. There's a whole API for working with them, and a page in the MSDN Library dedicated to describing the available features.

I've spent a few hours trying to do something with displacement maps, but I haven't gotten anywhere. On one of my systems (GeForce 6600 GT) I get a "NOT SUPPORTED" error when I call CreateTexture with the D3DUSAGE_DMAP usage flag. On another system (Radeon 9600) I don't get an error, but then nothing seems to happen when I go to render the object using displacement mapping.

Does anyone have a demo of DX9 displacement mapping in action? (With source?!) :)

Thanks!
-robert

Posted: Mon Apr 25, 2005 4:06 am
by neocryptek
A very long time ago someone was experimenting with a displacement-based terrain renderer. I don't think it was ever finished, but you can see what remains of it in ogreaddons, in the 'displacementmapterrain' directory. Obviously it isn't going to work with the latest Ogre, or be a straight DX9 example, but hey, maybe it'll shove you in the right direction! :)

-N30

Posted: Mon Apr 25, 2005 8:34 am
by tuan kuranes
PLSM2 does that for terrain rendering (I called it VertexCompression), using a vertex shader.

But it's a Vertex Shader 1.x implementation; new high-end cards that support Vertex Shader 3.0 should be able to do it in a better way.
(But you need Vertex Shader 3.0 - check your ogre.log for the version you support.)

Posted: Mon Apr 25, 2005 9:34 am
by gfm
neocryptek wrote:A very long time ago someone was experimenting with a displacement-based terrain renderer. I don't think it was ever finished, but you can see what remains of it in ogreaddons, in the 'displacementmapterrain' directory. Obviously it isn't going to work with the latest Ogre, or be a straight DX9 example, but hey, maybe it'll shove you in the right direction! :)

-N30
Thanks for the link!

I got it working using my old Ogre 0.14 installation (glad I kept it around!). Turns out this demo uses a Cg shader to do the displacement mapping. The concept is simple: pass the x and z coordinates into your vertex shader as a position vector, and pass the y coordinate in as a color intensity in a texture.
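
That idea can be sketched on the CPU in plain C++ (no Ogre or Cg; the function and struct names here are made up for illustration, not taken from the demo): x and z come from a flat grid, and y is looked up from a grayscale height "texture".

```cpp
#include <cstddef>
#include <vector>

// A terrain vertex: x and z come from the flat grid, y from the height map.
struct Vertex { float x, y, z; };

// heights is a width*depth array of intensities in [0,1] (the "texture");
// maxHeight scales the intensity into world units.
std::vector<Vertex> displaceGrid(const std::vector<float>& heights,
                                 std::size_t width, std::size_t depth,
                                 float maxHeight)
{
    std::vector<Vertex> verts;
    verts.reserve(width * depth);
    for (std::size_t z = 0; z < depth; ++z)
        for (std::size_t x = 0; x < width; ++x)
            verts.push_back({ float(x),
                              heights[z * width + x] * maxHeight,
                              float(z) });
    return verts;
}
```

The Cg version does the same per-vertex lookup on the GPU instead of in a loop.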

This isn't exactly what I was after, and I'm starting to wonder if I'm chasing something that doesn't actually exist. I thought DX9 had the ability to create geometry in the tessellator stage using a height-map texture. I thought it was possible to pass in a flat surface, or 4 points, and then, based on a height-map texture, render something that has hundreds of points. Is this not possible? All of the demos I've seen just offset existing geometry; they don't create it. I thought the technical term for that was "offset mapping", not "displacement mapping"?

-robert

Posted: Mon Apr 25, 2005 9:44 am
by qsilver
I haven't yet seen a DX9 displacement mapping demo, although the MSDN documentation seems to describe it pretty well. I did find a presentation from Game Developers Conference 2003 that might help:
http://www.gdconf.com/archives/2003/Doggett_Michael.pdf [slides]
http://www.gdconf.com/archives/2003/Dog ... ael_02.pdf [paper]

Posted: Mon Apr 25, 2005 9:50 am
by gfm
qsilver wrote:I haven't yet seen a DX9 displacement mapping demo, although the MSDN documentation seems to describe it pretty well.
You try implementing it! :) I couldn't get it to work. :)

Posted: Mon Apr 25, 2005 10:08 am
by Falagard
Okay, reading that first pdf, it looks like the following:

There are three possible ways to do it:

1. DirectX 9, and it looks like it's shader 3.0 only, where he's using a fragment program (pixel shader) to "render" into a vertex buffer, which essentially creates new vertices, and then a vertex program to apply the displacement. There's some pseudo code there, but nothing concrete, and you're looking at very next-gen stuff if you want to do that. AFAIK even current ATI cards don't support 3.0 yet, only NVidia.

2. OpenGL - have they added support for these "UberBuffers", which would allow you to do the same as above?

3. Software displacement. People have been doing progressive meshes in software for a long time now, so it's not a revelation. He may be right that the next step for future games will be to go from normal mapping to displacement mapping for more detail, but I tend to think it will work slightly differently: people will continue creating various LOD mesh versions but have a way to morph between the vertices of the higher and lower LOD meshes, as I've seen done in a few games currently in development.

So anyhow, I'm no expert, but there is some info in that PDF; it would definitely take some work to implement any of the above.

One thing I didn't realize is how much Charles Bloom's Galaxy tool can do. I had seen it in passing before, but never really thought of it as a tool to create low-tessellation meshes and normal maps. When I searched for tools, I found NVidia's tool and ATI's tool and that's it, but CB's tool includes source code. I've just downloaded it and am going to take a look because... wow. I wouldn't mind getting it to read Ogre meshes and automatically generate a low-poly mesh and normal map from a high-poly mesh. One of the pains of using NVidia's and ATI's tools is getting anything INTO them, and anything useful out.

Clay

Posted: Mon Apr 25, 2005 10:10 am
by :wumpus:
I'm quite sure the geometry processors can't actually generate geometry. (This might have changed with VS/PS 3.0, though, as it is possible to use textures in vertex programs there, but that would target only the newest NVidia cards.)

The vertex shaders 3.0 implementation of displacement mapping does not need any extra support in the rendersystem at all. It can all be done with shaders. You feed a texture to the vertex shader and it fetches the height values from that, just like with pixel shaders. Using this kind of displacement mapping you can indeed do things like morph gradually between LODs.

Posted: Mon Apr 25, 2005 10:47 am
by Falagard
Yeah, I think you're right about creating the verts; I had read something wrong. It has to do with patches and higher-order surfaces: they are tessellating the surfaces.

Basically, I think there's a new special type of texture that can be set for a patch, representing the displacement map (D3DDMAPSAMPLER), and then the vertex program can access this "texture" to modify vertex positions by looking it up. The creation of vertices happens earlier, in the patch tessellation stages.
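
So the pipeline is two stages: tessellation creates the vertices first, then each vertex samples the displacement map. A CPU-side sketch of that two-stage idea in plain C++ (illustrative names only - this is not the actual D3D9 patch API):

```cpp
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };

// Stage 1: "tessellation" - subdivide a flat quad of side length `size`
// into (n+1) x (n+1) vertices. This is where geometry gets created.
std::vector<Vec3> tessellateQuad(std::size_t n, float size)
{
    std::vector<Vec3> verts;
    verts.reserve((n + 1) * (n + 1));
    for (std::size_t j = 0; j <= n; ++j)
        for (std::size_t i = 0; i <= n; ++i)
            verts.push_back({ size * i / n, 0.f, size * j / n });
    return verts;
}

// Stage 2: "vertex program" - each vertex samples the displacement map
// (nearest-neighbour lookup here) and offsets its y coordinate.
void applyDisplacement(std::vector<Vec3>& verts,
                       const std::vector<float>& dmap,
                       std::size_t texSize, float scale, float quadSize)
{
    for (Vec3& v : verts) {
        std::size_t u = std::size_t(v.x / quadSize * (texSize - 1) + 0.5f);
        std::size_t w = std::size_t(v.z / quadSize * (texSize - 1) + 0.5f);
        v.y = dmap[w * texSize + u] * scale;
    }
}
```

The point is that stage 2 never adds vertices; it only moves the ones stage 1 produced, which matches what the hardware tessellator + D3DDMAPSAMPLER split seems to do.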

Oh, and yeah I just looked at Bloom's stuff and it would be easy to add a new mesh importer for Ogre .mesh files. Interesting. Very interesting.

Clay

Posted: Mon Apr 25, 2005 6:46 pm
by :wumpus:
Btw, is this NPatch stuff still hardware accelerated on modern cards? As far as I know it's very rarely used.

NVidia even dropped their hardware tessellation extension:

http://oss.sgi.com/projects/ogl-sample/ ... uators.txt

Posted: Mon Apr 25, 2005 8:44 pm
by sinbad
NPatches, if supported at all, are poorly supported. Nice idea, but since they're difficult to use whilst still maintaining a backwards-compatible route, the impetus to improve them really isn't there, I don't think. No one is going to base their engine on them until enough people have cards that support them well - which won't happen unless people write games using them ;)

Displacement mapping in a very loose sense is done in vs_1_1 in the TerrainSceneManager - the LOD morphing uses an extra vertex element to displace the vertices smoothly from LOD to LOD. But yeah, in vs_3_0 you can do it using a texture, which is a bit more convenient.
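
That LOD morph can be sketched like this in plain C++ (made-up names; the real TerrainSceneManager does this per vertex in the shader): each vertex carries a delta toward its height at the next-lower LOD, and a morph factor t blends it in as the camera moves away, so the LOD switch causes no visible pop.

```cpp
#include <vector>

struct MorphVertex {
    float y;      // height at the current (higher) LOD
    float delta;  // offset to the height this vertex has at the lower LOD
};

// t = 0 renders the high LOD exactly; t = 1 matches the low LOD,
// so by the time the mesh switches LODs the positions already agree.
std::vector<float> morphHeights(const std::vector<MorphVertex>& verts, float t)
{
    std::vector<float> out;
    out.reserve(verts.size());
    for (const MorphVertex& v : verts)
        out.push_back(v.y + v.delta * t);
    return out;
}
```

In vs_1_1 the delta rides along as an extra vertex element and t is a shader constant; in vs_3_0 you could fetch the heights from a texture instead.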