Tinnus, I just had the chance to test it, and it does work as expected. I tested at both 400 FPS and 10 FPS (by introducing a 100 ms sleep between renders), and the amount of blur I see (i.e. the length of the trail) is the same. Obviously the two look different, because the higher frame rate samples frames more often, but that can't be avoided. Without this adjustment the blur trail is very long at 10 FPS and practically invisible at 400 FPS. So I'm quite happy with it.
Since it would be wasteful to evaluate exp() for every pixel, I calculate it once on the CPU before each rendered frame and pass the resulting blend factor to the shader as a parameter. So the shader itself is unchanged; only the blur parameter is now computed dynamically from the frame time. The code goes something like this:
void CGame::notifyMaterialSetup(Ogre::uint32 pass_id, Ogre::MaterialPtr &mat)
{
    if (pass_id == 666)
    {
        // Grab the blur pass's fragment program parameters once,
        // so we can update the blend factor every frame.
        m_BlurParameters = mat->getTechnique(0)->getPass(0)->getFragmentProgramParameters();
    }
}

void CGame::notifyMaterialRender(Ogre::uint32 pass_id, Ogre::MaterialPtr &mat)
{
    if (pass_id == 666)
    {
        unsigned long iRenderTime = Root::getSingleton().getTimer()->getMilliseconds();
        Real ElapsedTime = Real(iRenderTime - m_iLastRenderTime) / 1000.0;
        if (ElapsedTime < 0)
            ElapsedTime = 0;

        // Higher BlurAdjust = shorter trail; the exponential makes the
        // decay rate independent of the frame rate.
        Real BlurAdjust = 10;
        Real Blur = Math::Exp(-ElapsedTime * BlurAdjust);
        m_BlurParameters->setNamedConstant("blur", Blur);

        m_iLastRenderTime = iRenderTime;
    }
}
sinbad, thanks for the link, but it depends on what you need it for. I want the whole screen to look blurry to make the player appear disoriented after being hit by a flashbang

(I posted another thread about that). So for my purposes this shader works quite well.