Frame-rate independent motion blur compositor

User avatar
SpaceDude
Bronze Sponsor
Bronze Sponsor
Posts: 822
Joined: Thu Feb 02, 2006 1:49 pm
Location: Nottingham, UK
x 3

Frame-rate independent motion blur compositor

Post by SpaceDude »

The motion blur compositor in the Ogre demos looks nice, but it has a problem: the amount of blur you see depends on the frame rate.

The way it works is that it creates a persistent texture, and every time a new frame is rendered, the new frame is blended with this texture by a given blend factor. So essentially what you end up with is a blend of all the previous frames, where the newer frames carry exponentially more weight.

If you have a very high frame rate you will not see as much blur as you would at a low frame rate. So I've been thinking of a way to make it frame-rate independent, and I came up with a solution that might be of interest to others.

Rather than using a fixed blend factor, we can use a factor that depends on the frame rate. Here is the equation:

Code: Select all

b = 1 - e^(-dt*a)

where:
dt = time since last render
b  = blend factor
a  = parameter used to adjust the amount of motion blur
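Why this is frame-rate independent: each frame the old accumulation keeps weight 1-b = e^(-dt*a), so after a fixed time T it has been attenuated by (e^(-dt*a))^(T/dt) = e^(-a*T), which no longer depends on dt. Here is a minimal standalone C++ check of that (the value of a is just an example):

Code: Select all

#include <cmath>
#include <cstdio>

// Weight left over from a frame that is 'seconds' old, when blending with
// b = 1 - exp(-dt*a) every frame at the given frame rate.
double oldFrameWeight(double fps, double a, double seconds)
{
	double dt = 1.0 / fps;
	double keepPerFrame = std::exp(-dt * a);   // 1 - b
	double frames = seconds * fps;             // number of blends in 'seconds'
	return std::pow(keepPerFrame, frames);     // = exp(-a * seconds)
}

int main()
{
	double a = 10.0; // example value for the adjustment parameter
	std::printf("10 FPS : %g\n", oldFrameWeight(10.0,  a, 1.0));
	std::printf("400 FPS: %g\n", oldFrameWeight(400.0, a, 1.0));
	// Both print ~4.54e-05 (= e^-10): the trail decays at the same rate
	// regardless of the frame rate.
	return 0;
}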
Tinnus
Halfling
Posts: 45
Joined: Fri Nov 02, 2007 3:23 am
x 4

Post by Tinnus »

Did you try it? :)

I'm not good with shaders, so I'm not sure if there's an exp() function in the pixel shader, but I suppose the function could be precalculated into a texture lookup beforehand :)
User avatar
sinbad
OGRE Retired Team Member
OGRE Retired Team Member
Posts: 19269
Joined: Sun Oct 06, 2002 11:19 pm
Location: Guernsey, Channel Islands
x 66

Post by sinbad »

Pixel Motion Blur is better of course :)
User avatar
SpaceDude
Bronze Sponsor
Bronze Sponsor
Posts: 822
Joined: Thu Feb 02, 2006 1:49 pm
Location: Nottingham, UK
x 3

Post by SpaceDude »

Tinnus, just had the chance to test it, and it does work as expected. I tested at both 400 FPS and 10 FPS (by introducing a sleep of 100 ms between renders) and the amount of blur I see is the same (i.e. the length of the trail). Obviously they look different because the higher frame rate samples frames more often, but this can't be avoided. Without this adjustment the blur trail is very long at 10 FPS and practically invisible at 400 FPS. So I'm quite happy with it.

Since it would be a waste to use the exp() function for every pixel, I calculate this on the CPU once before each rendered frame and pass the resulting blend factor to the shader as a parameter. So in fact the shader is unchanged, but the blur parameter is calculated dynamically based on the frame rate. The code goes something like this:

Code: Select all

void CGame::notifyMaterialSetup(Ogre::uint32 pass_id, Ogre::MaterialPtr &mat)
{
	if (pass_id == 666)
	{
		// Grab the fragment program parameters of the combine pass once
		m_BlurParameters = mat->getTechnique(0)->getPass(0)->getFragmentProgramParameters();
	}
}

void CGame::notifyMaterialRender(Ogre::uint32 pass_id, Ogre::MaterialPtr &mat)
{
	if (pass_id == 666)
	{
		unsigned long iRenderTime = Root::getSingleton().getTimer()->getMilliseconds();
		Real ElapsedTime = Real(iRenderTime - m_iLastRenderTime) / 1000.0;
		if (ElapsedTime < 0)
			ElapsedTime = 0;
		Real BlurAdjust = 10; // the 'a' parameter from the equation above
		// Weight kept by the accumulation texture: 1 - b = e^(-dt*a)
		Real Blur = Math::Exp(-ElapsedTime * BlurAdjust);
		m_BlurParameters->setNamedConstant("blur", Blur);
		m_iLastRenderTime = iRenderTime;
	}
}
sinbad, thanks for the link, but it depends on what you need it for. I want the whole screen to look blurry to make the player appear disoriented after being hit by a flashbang :) (I posted another thread about that). So for my purposes this shader works quite well.
jj
Halfling
Posts: 50
Joined: Thu Apr 03, 2008 1:33 am

Post by jj »

Hi,

Thanks for the code snippet. I was just trying it, but when the blur factor gets >= 1.0 the scene freezes... and with this piece of code it reaches that value from time to time....

Any ideas on how to solve this problem? Just clamping the value?
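Something like this is what I had in mind (just a sketch, clamping the value before it goes to the shader):

Code: Select all

Real Blur = Math::Exp(-ElapsedTime * BlurAdjust);
// Clamp so the accumulation texture is never kept at full weight, which is
// what freezes the scene when the elapsed time is 0 (blur == 1.0).
// The 0.99 is arbitrary.
if (Blur > 0.99)
	Blur = 0.99;
m_BlurParameters->setNamedConstant("blur", Blur);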

Thanks,
jj
jonnys
Halfling
Posts: 77
Joined: Thu Mar 27, 2008 7:40 pm

Post by jonnys »

SpaceDude wrote:Tinnus, just had the chance to test it, and it does work as expected. [...] So for my purposes this shader works quite well.


SpaceDude, I have been trying to integrate your snippet into the Ogre motion blur compositor, but I haven't been able to get it working. I have a few questions:

1. Where should those two functions be called?
2. How do you use them?
3. Which material should you use with the two functions?
4. What is m_BlurParameters declared as, and when is it declared?

Thanks a lot in advance.
nbeato
Gnome
Posts: 372
Joined: Thu Dec 20, 2007 1:00 am
Location: Florida
x 3

Post by nbeato »

jonnys wrote: 1. Where should those two functions be called?
2. How do you use them?
3. Which material should you use with the two functions?
4. What is m_BlurParameters declared as, and when is it declared?

Thanks a lot in advance.
You should look at the compositor demos to understand this better.

1) They are implementations of Ogre::CompositorInstance::Listener. You don't call them yourself; you let Ogre call them, and to do that you need to add a listener to the compositor instance (see the sketch after this list).

2) Ogre uses them to change material instances of compositors. They are mainly used to control shader parameters, but you can modify any part of the material (for that one frame).

3) You modify the motion blur compositor's material with the function described above. The shader itself stays the same (exactly what is in the post above).

4) Ogre::GpuProgramParametersSharedPtr
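For 1), a minimal sketch of the wiring (assuming your class, CGame in the snippet above, derives from Ogre::CompositorInstance::Listener, mViewport is your viewport, and you use the stock "Motion Blur" compositor from the samples):

Code: Select all

// Add the compositor to the viewport, enable it, and register the listener so
// Ogre calls notifyMaterialSetup/notifyMaterialRender for its passes.
Ogre::CompositorInstance* inst =
	Ogre::CompositorManager::getSingleton().addCompositor(mViewport, "Motion Blur");
Ogre::CompositorManager::getSingleton().setCompositorEnabled(mViewport, "Motion Blur", true);
inst->addListener(this); // 'this' being the CGame instance
The pass_id checked in the callbacks (666 above) has to match an identifier set on the render_quad pass in the compositor script, e.g. "identifier 666".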
jonnys
Halfling
Posts: 77
Joined: Thu Mar 27, 2008 7:40 pm

Post by jonnys »

Thanks nbeato,
but I have one more question.

What is m_iLastRenderTime declared as?

I get the following errors:

Code: Select all

1>------ Build started: Project: OgreTest, Configuration: Release Win32 ------
1>Compiling...
1>Main.cpp
1>..\src\Main.cpp(55) : error C2440: '=' : cannot convert from 'Ogre::GpuProgramParametersSharedPtr' to 'Ogre::GpuProgramParameters *'
1>        No user-defined-conversion operator available that can perform this conversion, or the operator cannot be called
1>..\src\Main.cpp(64) : error C2065: 'm_iLastRenderTime' : undeclared identifier
1>..\src\Main.cpp(64) : error C2514: 'Ogre::Real' : class has no constructors
1>        C:\OgreSDK\include\OgrePrerequisites.h(110) : see declaration of 'Ogre::Real'
1>..\src\Main.cpp(70) : error C2065: 'm_iLastRenderTime' : undeclared identifier
1>Build log was saved at "file://e:\Test\Ogre Test\obj\Release\BuildLog.htm"
1>OgreTest - 4 error(s), 0 warning(s)
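From the errors it looks like the members need to be declared roughly like this (just my guess, matching the snippet above):

Code: Select all

class CGame : public Ogre::CompositorInstance::Listener
{
	// ...
	Ogre::GpuProgramParametersSharedPtr m_BlurParameters; // shared pointer, not GpuProgramParameters*
	unsigned long m_iLastRenderTime;                       // set to getMilliseconds() once before first use
};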