

Has anyone got MyGUI working with it at all? Any tips? I don't see any UI at all
AshMcConnell wrote: Has anyone got MyGUI working with it at all? Any tips? I don't see any UI at all.
MyGUI can only render to one viewport at a time, so your best bet is to render MyGUI to a texture, then put a 3D plane with that texture in your game world. The added bonus is that it will even "seem" 3D.
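To make the render-to-texture idea a bit more concrete, here is a minimal Ogre-side sketch (not code from OgreOculus): it assumes a dedicated guiCamera for the GUI, a sceneMgr pointer, and a material named "GuiPlaneMat" whose texture_unit references the "GuiRTT" texture. Whether MyGUI actually draws into that extra viewport depends on how your MyGUI platform layer is wired up.
Code: Select all
// Sketch only: render the GUI into a texture, then show that texture on a plane
// in the game world. "guiCamera", "sceneMgr" and "GuiPlaneMat" are assumptions.
Ogre::TexturePtr guiTex = Ogre::TextureManager::getSingleton().createManual(
    "GuiRTT", Ogre::ResourceGroupManager::DEFAULT_RESOURCE_GROUP_NAME,
    Ogre::TEX_TYPE_2D, 1024, 768, 0, Ogre::PF_A8R8G8B8, Ogre::TU_RENDERTARGET);

Ogre::RenderTarget* rt = guiTex->getBuffer()->getRenderTarget();
Ogre::Viewport* guiVp = rt->addViewport(guiCamera);
guiVp->setClearEveryFrame(true);
guiVp->setBackgroundColour(Ogre::ColourValue(0, 0, 0, 0));
guiVp->setOverlaysEnabled(true); // let the GUI layer render into this viewport

// A plane in the world that displays the GUI texture.
Ogre::MeshManager::getSingleton().createPlane("GuiPlane",
    Ogre::ResourceGroupManager::DEFAULT_RESOURCE_GROUP_NAME,
    Ogre::Plane(Ogre::Vector3::UNIT_Z, 0), 1.0f, 0.75f);
Ogre::Entity* guiEnt = sceneMgr->createEntity("GuiPlaneEnt", "GuiPlane");
guiEnt->setMaterialName("GuiPlaneMat"); // its texture_unit should reference "GuiRTT"
sceneMgr->getRootSceneNode()
        ->createChildSceneNode(Ogre::Vector3(0, 0, -2))
        ->attachObject(guiEnt);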
Kojack wrote: Cool.
I'll try to get an update out soonish (very busy at work, we just started a new trimester and I'm teaching first and second year students) that adds in the new SDK features (chromatic aberration correction and yaw drift correction). I've already got the chromatic aberration shader working.
xrgo wrote: Hi! Can you share the shader with chromatic aberration please!?
Thanks in advance!
This is more or less the chromatic aberration shader. I've skipped the early break-out when the blue channel is outside the visible area, but the example code is there. Honestly, I can't see a difference between using it and not, but most of the demos are pretty simple. Thanks to Kojack for the starting point. Change the main_fp shader over to this and set the extra parameter (the second code block below shows how to pull the values from the HMD info):
Code: Select all
uniform float2 ScreenCenter;
uniform float2 LensCentre;
uniform float2 Scale;
uniform float2 ScaleIn;
uniform float4 HmdWarpParam;
uniform float4 ChromAbParam; // <= this is the new parameter.

float4 main_fp(float4 pos : POSITION,
               float2 iTexCoord : TEXCOORD0,
               uniform sampler2D RT : register(s0)) : COLOR
{
    float2 theta  = (iTexCoord - LensCentre) * ScaleIn; // scales to [-1, 1]
    float  rSq    = theta.x * theta.x + theta.y * theta.y;
    float2 theta1 = theta * (HmdWarpParam.x + HmdWarpParam.y * rSq +
                             HmdWarpParam.z * rSq * rSq +
                             HmdWarpParam.w * rSq * rSq * rSq);
    float2 tc = LensCentre + Scale * theta1; // (not used below; tcGreen recomputes the same value)

    // Detect whether the blue texture coordinates are out of range,
    // since these will be scaled out the furthest.
    float2 thetaBlue = theta1 * (ChromAbParam.z + ChromAbParam.w * rSq);
    float2 tcBlue = LensCentre + Scale * thetaBlue;
    // TODO: break out early here. Note the following lines are HLSL while the rest is Cg:
    //if (any(clamp(tcBlue, ScreenCenter - float2(0.25, 0.5),
    //              ScreenCenter + float2(0.25, 0.5)) - tcBlue))

    // Now do the blue texture lookup.
    float blue = tex2D(RT, tcBlue).b;

    // Do the green lookup (no scaling).
    float2 tcGreen = LensCentre + Scale * theta1;
    float green = tex2D(RT, tcGreen).g;

    // Do the red scale and lookup.
    float2 thetaRed = theta1 * (ChromAbParam.x + ChromAbParam.y * rSq);
    float2 tcRed = LensCentre + Scale * thetaRed;
    float red = tex2D(RT, tcRed).r;

    return float4(red, green, blue, 1);
}
Code: Select all
Ptr<HMDDevice> pHMD;   // obtained from the OVR device manager elsewhere
OVR::HMDInfo devinfo;
pHMD->GetDeviceInfo(&devinfo);
Ogre::Vector4 ChromAb = Ogre::Vector4(devinfo.ChromaAbCorrection[0],
                                      devinfo.ChromaAbCorrection[1],
                                      devinfo.ChromaAbCorrection[2],
                                      devinfo.ChromaAbCorrection[3]);
pParamsLeft->setNamedConstant("ChromAbParam", ChromAb);
pParamsRight->setNamedConstant("ChromAbParam", ChromAb); // same values for both eyes
amartin wrote: Is anyone else seeing distortion towards the top and bottom of the view using the Y FOV value provided by the SDK? The default value I'm getting is 125 degrees, but that gives me some visible distortion towards the top and bottom: objects tend to stretch and appear closer if they are at the edges of the screen. I tried manually performing the calculations from the docs for Y FOV, but that came up with a different answer and distortion in the opposite direction. At the moment I'm using a magic number of 1.9 radians (110 degrees), which pretty much removes the distortion but has no basis in any of the provided numbers. I would like to know if anyone else is seeing this issue; I'm trying to figure out whether I'm missing some steps or Ogre just handles things differently to some of the systems that already have integration.
Hi amartin,
I've identified some inconsistencies between the Oculus SDK samples and this ogreoculus project but I'm not quite good enough with C++/Ogre/shaders to resolve them to determine if they are indeed the source of the distortion problems. I hope others here can help. Below are the parameters the distortion fragment shader takes, how OgreOculus currently sets them, how the Oculus SDK sample sets them, and the values the SDK code computes for my setup.
Code: Select all
uniform float2 LensCentre;
uniform float2 Scale;
uniform float2 ScaleIn;
uniform float4 HmdWarpParam;
Code: Select all
pParamsLeft->setNamedConstant("HmdWarpParam", hmdwarp);
pParamsRight->setNamedConstant("HmdWarpParam", hmdwarp);
pParamsLeft->setNamedConstant("LensCentre", 0.5f+(m_stereoConfig->GetProjectionCenterOffset()/2.0f));
pParamsRight->setNamedConstant("LensCentre", 0.5f-(m_stereoConfig->GetProjectionCenterOffset()/2.0f));
Code: Select all
float w = float(VP.w) / float(WindowWidth),
h = float(VP.h) / float(WindowHeight),
x = float(VP.x) / float(WindowWidth),
y = float(VP.y) / float(WindowHeight);
float as = float(VP.w) / float(VP.h);
pPostProcessShader->SetUniform2f("LensCenter",
x + (w + Distortion.XCenterOffset * 0.5f)*0.5f, y + h*0.5f);
pPostProcessShader->SetUniform2f("ScreenCenter", x + w*0.5f, y + h*0.5f);
float scaleFactor = 1.0f / Distortion.Scale;
pPostProcessShader->SetUniform2f("Scale", (w/2) * scaleFactor, (h/2) * scaleFactor * as);
pPostProcessShader->SetUniform2f("ScaleIn", (2/w), (2/h) / as);
pPostProcessShader->SetUniform4f("HmdWarpParam",
Distortion.K[0], Distortion.K[1], Distortion.K[2], Distortion.K[3]);
Code: Select all
w: 0.5, h:1, x:0.5, y:0, as: 0.8
LensCenter: 0.712006, 0.5
ScreenCenter: 0.75, 0.5
scaleFactor: 0.583225
Scale: 0.145806, 0.23329
ScaleIn: 4, 2.5
HmdWarpParam: 1, 0.22, 0.24, 0
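As a quick cross-check on the Y FOV question above, here is a small self-contained version of the vertical-FOV formula from the SDK docs. The device numbers are illustrative DK1-style values (assumptions, not figures reported in this thread).
Code: Select all
// fovY = 2 * atan( (DistortionScale * VScreenSize / 2) / EyeToScreenDistance )
#include <cmath>
#include <cstdio>

int main()
{
    float vScreenSize         = 0.0935f; // metres, vertical panel size (assumed)
    float eyeToScreenDistance = 0.041f;  // metres (assumed)
    float distortionScale     = 1.71f;   // e.g. StereoConfig::GetDistortionScale() (assumed)

    float fovY = 2.0f * std::atan((distortionScale * vScreenSize * 0.5f) / eyeToScreenDistance);
    std::printf("fovY = %.3f rad (%.1f deg)\n", fovY, fovY * 180.0f / 3.14159265f);
    // With these numbers fovY comes out around 2.19 rad (~126 deg), close to the
    // 125 degrees amartin reports; with distortionScale = 1 it drops to ~98 deg.
    return 0;
}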
bullale wrote: First, not related to the distortion, the camera movement might seem more natural if we model the head. Our axis of rotation is the base of our head, so we might want to add some y (base of head to eye level) and -z (center of head to eye protrusion) to the camera positions. This is done in the samples in the SDK, but I don't know how to translate that to Ogre.
This is the easy part: simply adjust the y and z values when positioning the camera, in addition to setting the x for each camera:
Code: Select all
mCamera->setPosition(mStereoConfig->GetIPD() * -0.5f, 0.075f, 0.045f);
bullale wrote: I've identified some inconsistencies between the Oculus SDK samples and this ogreoculus project but I'm not quite good enough with C++/Ogre/shaders to resolve them to determine if they are indeed the source of the distortion problems. I hope others here can help.
...snip
I plugged these numbers into oculus.material. They are crazy wrong for the x-direction; they aren't so bad for y. They certainly alleviate the y-distortion. It would be nice to know how to set these numbers in code, and also how to calculate the x numbers properly for ogreoculus.
I think the problem with the x direction stems from the fact that they are using a square texture for their shader, which we are not. You are correct, though: we were not setting the scale in Y properly, and this was causing the vertical distortion. You can try:
Code: Select all
float scaleFactor = 1.0f / mStereoConfig->GetDistortionScale();
float aspectRatio = mCamera->getAspectRatio();
Ogre::Vector2 scaleIn = Ogre::Vector2(2.0f, 2.0f /aspectRatio);
Ogre::Vector2 scale = Ogre::Vector2(0.5f* scaleFactor, 0.5f * scaleFactor * aspectRatio);
pParamsLeft->setNamedConstant("Scale", scale);
pParamsLeft->setNamedConstant("ScaleIn", scaleIn);
Code: Select all
HmdWarpParam = {x=1.0000000 y=0.22000000 z=0.23999999 w=0.0}
ChromAbParam = {x=0.99599999 y=-0.0040000002 z=1.0140001 w=0.0}
Kojack wrote: Unfortunately work hit me pretty hard, I haven't even plugged my Oculus in for a month or so.
I managed to make one of my work projects require the Rift, otherwise I'm right there with you.
Kojack wrote: One thing I need to do is rewrite most of the way the compositors and shaders are handled, so real-time changes to properties like IPD are possible. Ogre's compositors don't like that. I've had it working in the past, but I didn't like it. Looks like there's no choice though.
I noticed the SDK updates the shader parameters on every frame. That seems terribly inefficient for parameters that rarely change. I think it would be more efficient to set an isDirty boolean and only update when the user manually changes some settings. Even then, I think a better model would be to rely on the user to use the official utilities to set IPD and other settings, and then assume the values obtained from the device on start-up are accurate.
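Here is a hypothetical sketch of the isDirty idea above; none of these names come from OgreOculus or the Oculus SDK, and it assumes Ogre 1.9's setNamedConstant overloads.
Code: Select all
#include <OgreGpuProgramParams.h>
#include <OgreVector2.h>

// Hypothetical dirty-flag holder for the distortion constants: setters mark the
// state dirty, and the GPU constants are only rewritten when something changed.
class DistortionParams
{
public:
    DistortionParams() : mIPD(0.064f), mScale(0.5f, 0.5f), mDirty(true) {}

    void setIPD(float ipd)                { mIPD = ipd;  mDirty = true; }
    void setScale(const Ogre::Vector2& s) { mScale = s;  mDirty = true; }
    float getIPD() const                  { return mIPD; }

    // Call once per frame; this is a cheap no-op unless a setter was used.
    void applyIfDirty(const Ogre::GpuProgramParametersSharedPtr& left,
                      const Ogre::GpuProgramParametersSharedPtr& right)
    {
        if (!mDirty)
            return;
        left->setNamedConstant("Scale", mScale);
        right->setNamedConstant("Scale", mScale);
        // ... LensCentre (mirrored per eye), ScaleIn, HmdWarpParam, ChromAbParam
        //     as in the earlier snippets ...
        mDirty = false;
    }

private:
    float         mIPD;   // metres; 0.064 is just a placeholder default
    Ogre::Vector2 mScale;
    bool          mDirty; // start dirty so everything gets pushed once at start-up
};
The setters would be called from whatever UI or key handler changes a setting, and applyIfDirty(pParamsLeft, pParamsRight) once per frame.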
amartin wrote: I've got my code in a public repo, but I won't post a link to that till after I've forked Ogre, otherwise nobody would be able to build it.
Based on my recent experience learning MSVC++, MinGW, Python, Ogre, and python-ogre: when releasing something open source, please do not bother with automatic build scripts. You'll have several types of people downloading, including those who just want sample code, those who just want the binaries to try it out, and those who are just starting out (like me). The former two types only need the project source files and the binaries. People like me do not benefit from automatic build scripts, especially since those scripts tend to be system-specific and age quickly. Instead, consider providing instructions on where to find the necessary headers and compiled libraries and what needs to be included and linked. If you want to host the necessary SDK elements because you are worried that a particular version might not be available to download in the future, that's good, but not at the expense of instructions. This kind of documentation can be tedious, but I can write most of it if you want.
amartin wrote: I'm not really sure what is wrong with it at the moment, but I'm guessing it is due to the difference in X-axis values we are seeing in the Scale and ScaleIn parameters.
I don't know if this is helpful, but the chromatic aberration was only in the x-direction (blue shifted +x), not in the y.
amartin wrote: You can try
Code: Select all
float scaleFactor = 1.0f / mStereoConfig->GetDistortionScale();
float aspectRatio = mCamera->getAspectRatio();
Ogre::Vector2 scaleIn = Ogre::Vector2(2.0f, 2.0f / aspectRatio);
Ogre::Vector2 scale = Ogre::Vector2(0.5f * scaleFactor, 0.5f * scaleFactor * aspectRatio);
pParamsLeft->setNamedConstant("Scale", scale);
pParamsLeft->setNamedConstant("ScaleIn", scaleIn);
Unfortunately, setNamedConstant does not allow inputs of type Ogre::Vector2 (at least not from python-ogre). This alternative may work, but I don't know how to do it in Python: http://www.ogre3d.org/docs/api/html/cla ... d10cc2101e. For reference, this is how I currently create a material and set shader constants from python-ogre using ctypes:
Code: Select all
import ctypes
import ogre.renderer.OGRE as ogre   # python-ogre (module path may differ with your install)

class CustomShader4(object):
    def __init__(self):
        # Cg vertex program source, built up as a string.
        s  = " void customColourVp(float4 position : POSITION,"
        s += "out float4 oPosition : POSITION,"
        s += "uniform float4x4 worldViewProj)"
        s += "{"
        s += " oPosition = mul(worldViewProj, position);"
        s += "}"
        customCasterMatVp = s

        # Cg fragment program source: output a single uniform colour.
        s  = " void customColourFp(\n"
        s += " uniform float4 MyColour,"
        s += "out float4 oColor : COLOR)"
        s += "{"
        s += " oColor = MyColour;"
        s += "}"
        customCasterMatFp = s

        grpName = ogre.ResourceGroupManager.DEFAULT_RESOURCE_GROUP_NAME

        vp = ogre.HighLevelGpuProgramManager.getSingleton().createProgram(
            "CustomColourVp",
            grpName,
            "cg",
            ogre.GPT_VERTEX_PROGRAM)
        vp.setSource(customCasterMatVp)
        vp.setParameter("profiles", "vs_1_1 arbvp1")
        vp.setParameter("entry_point", "customColourVp")
        vp.load()

        fp = ogre.HighLevelGpuProgramManager.getSingleton().createProgram(
            "CustomColourFp",
            grpName,
            "cg",
            ogre.GPT_FRAGMENT_PROGRAM)
        fp.setSource(customCasterMatFp)
        fp.setParameter("profiles", "ps_1_1 arbfp1")
        fp.setParameter("entry_point", "customColourFp")
        fp.load()

        mat = ogre.MaterialManager.getSingleton().create(
            "CustomColour",
            grpName)
        p = mat.getTechnique(0).getPass(0)
        p.setVertexProgram("CustomColourVp")
        p.getVertexProgramParameters().setNamedAutoConstant(
            "worldViewProj",
            ogre.GpuProgramParameters.ACT_WORLDVIEWPROJ_MATRIX)
        p.setFragmentProgram("CustomColourFp")

        # colours = [0.2, 0.2, 0.2, 0.2]  # a plain list won't do here;
        ## Python vs C++ difference -- we use ctypes buffers to pass the raw floats.
        storageclass = ctypes.c_float * (1 * 4)
        colours = storageclass(1.1)
        ctypes.memset(colours, 0, ctypes.sizeof(colours))
        colours[0] = 0.35
        colours[1] = 0.38
        colours[2] = 0.16
        colours[3] = 0.2
        p.getFragmentProgramParameters().setNamedConstantFloat(
            "MyColour", ctypes.addressof(colours), 1)
        #p.getFragmentProgramParameters().setNamedConstant("MyColour", ogre.Vector4(0.2,0.2,0.2,0.2))

    def update(self):
        pass

...
self.customShaders = []
...
s = CustomShader4()
self.customShaders.append(s)
ent = sceneManager.createEntity("robot", "robot.mesh")
ent.setMaterialName("CustomColour")
node = sceneManager.getRootSceneNode().createChildSceneNode()
node.attachObject(ent)
bullale wrote: Based on my recent experience learning MSVC++, MinGW, Python, Ogre, and python-ogre: when releasing something open source, please do not bother with automatic build scripts.
...snip
This kind of documentation can be tedious, but I can write most of it if you want.
I've got build instructions and a dependency list, and I have no intention right now of making a custom build script. The reason I want the fork available is that I am making modifications to Ogre in order to add features I need, so it will no longer compile with the SDK or the main Ogre source. I'm too early in my project for a binary release to be usable, and it's probably more useful if I push my changes to the Oculus example than post mine at the moment.
bullale wrote: Unfortunately setNamedConstant does not allow inputs of type Ogre::Vector2. This alternative may work but I don't know how to do it in Python: http://www.ogre3d.org/docs/api/html/cla ... d10cc2101e.
setNamedConstant supports Ogre::Vector2 just fine in 1.9.1; any code I post is copied directly from my running project, where I have tested it first. python-ogre, on the other hand, may not have support for it. You can try using a Vector4 in that case and see if it is smart enough to just pass what it needs. Using a Vector4 instead seems to work fine for me, but I can also pass in Vector2s.
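For reference, here is a hedged C++ (Ogre 1.9) illustration of the two routes being discussed: the Ogre::Vector2 overload amartin is using, and a raw-float overload of the kind the linked API page points at (roughly what the ctypes/addressof call in the python-ogre example above maps to). pParamsLeft and the numbers are taken from earlier snippets in the thread.
Code: Select all
// Two ways of pushing a float2 uniform from C++ (sketch, Ogre 1.9 style).
Ogre::Vector2 scale(0.145806f, 0.23329f);   // example values from the dump above

// 1) Direct overload, as used earlier in the thread (reported working in 1.9.1):
pParamsLeft->setNamedConstant("Scale", scale);

// 2) Raw-float overload: write 1 group of 2 floats.
float rawScale[2] = { scale.x, scale.y };
pParamsLeft->setNamedConstant("Scale", rawScale, 1, 2);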
bullale wrote: Why do you think it's necessary to update the IPD in real time?
So we can have something like a slider or other controls, letting the user adjust the settings in real time and see the change.
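As a purely hypothetical example of that kind of control, something like this in the input/frame handler would let the user nudge the IPD at runtime. mKeyboard, mIPD, mCameraLeft and mCameraRight are assumed names, and the y/z offsets are the head-model values from earlier in the thread.
Code: Select all
// Hypothetical runtime IPD adjustment (lives in the frame/input handler).
const float step = 0.001f;                       // 1 mm per key press
if (mKeyboard->isKeyDown(OIS::KC_ADD))      mIPD += step;
if (mKeyboard->isKeyDown(OIS::KC_SUBTRACT)) mIPD -= step;

// Re-apply the per-eye camera offsets (same y/z head-model offsets as above).
mCameraLeft->setPosition(-0.5f * mIPD, 0.075f, 0.045f);
mCameraRight->setPosition(0.5f * mIPD, 0.075f, 0.045f);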