My app targets iPad only, not iPhone. On a non-retina iPad it works great, but in the retina simulator I have input problems: it renders fine, but touches report coordinates based on the view size (2048x1536), which breaks things since Ogre thinks the display is 1024x768.
Here's my utility code, based on some Ogre sample code IIRC:
Code:
OIS::MouseState MainView::ConvertMultiTouch(const OIS::MultiTouchState &in)
{
    Ogre::Viewport *vp = m_renderTarget.getViewport(0);
    OIS::MultiTouchState state = in;
    int w = vp->getActualWidth();
    int h = vp->getActualHeight();
    int absX = state.X.abs;
    int absY = state.Y.abs;
    int relX = state.X.rel;
    int relY = state.Y.rel;

    // Rotate the touch coordinates to match the current interface orientation.
    UIInterfaceOrientation interfaceOrientation =
        [UIApplication sharedApplication].statusBarOrientation;
    switch (interfaceOrientation)
    {
        case UIInterfaceOrientationPortrait:
            break;
        case UIInterfaceOrientationLandscapeLeft:
            state.X.abs = w - absY;
            state.Y.abs = absX;
            state.X.rel = -relY;
            state.Y.rel = relX;
            break;
        case UIInterfaceOrientationPortraitUpsideDown:
            state.X.abs = w - absX;
            state.Y.abs = h - absY;
            state.X.rel = -relX;
            state.Y.rel = -relY;
            break;
        case UIInterfaceOrientationLandscapeRight:
            state.X.abs = absY;
            state.Y.abs = h - absX;
            state.X.rel = relY;
            state.Y.rel = -relX;
            break;
    }

    OIS::MouseState s;
    s.X = state.X;
    s.Y = state.Y;
    return s;
}
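For the retina mismatch specifically, UIKit reports touches in points while a retina render target is sized in pixels, so one workaround is to multiply the touch coordinates by the view's contentScaleFactor (1.0 on non-retina, 2.0 on retina iPads). A minimal sketch of that scaling; the scale is passed in as a plain float (on device it would come from view.contentScaleFactor), and the helper name is mine, not from OIS or Ogre:

```cpp
#include <cassert>

// Hypothetical helper: convert a touch position from UIKit points to
// render-target pixels. contentScale would be view.contentScaleFactor
// on iOS; it's a plain parameter here so the function has no UIKit
// dependency.
struct TouchPixels { int x; int y; };

TouchPixels pointsToPixels(float pointX, float pointY, float contentScale)
{
    TouchPixels p;
    p.x = static_cast<int>(pointX * contentScale);
    p.y = static_cast<int>(pointY * contentScale);
    return p;
}
```

With scale 2.0, a touch at (512, 384) in points lands at (1024, 768) in pixels, i.e. the same spot on the 2048x1536 target.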
What is the correct way to get touch coordinates into viewport space safely? This works great on non-retina devices, but ideally my code shouldn't have to check whether it's running on a retina device. I also suspect this only works when my render window is fullscreen.
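One way to avoid any device check at all (a sketch under the assumption that you know the bounds of the view that received the touch) is to normalize the touch into 0..1 against the view's bounds first, and only then scale by the viewport's actual size. That stays correct whatever the content scale, and also when the render window doesn't cover the whole screen:

```cpp
#include <cassert>

// Hypothetical helper: map a touch given in view-local coordinates to
// viewport pixels by normalizing first. viewW/viewH are the bounds of
// the view that received the touch (points on iOS); vpW/vpH would come
// from Ogre::Viewport::getActualWidth()/getActualHeight().
void touchToViewport(float touchX, float touchY,
                     float viewW, float viewH,
                     int vpW, int vpH,
                     int &outX, int &outY)
{
    float nx = touchX / viewW;   // 0..1 across the view
    float ny = touchY / viewH;   // 0..1 down the view
    outX = static_cast<int>(nx * vpW);
    outY = static_cast<int>(ny * vpH);
}
```

For example, a touch at (512, 384) in a 1024x768 view maps to (1024, 768) on a 2048x1536 viewport, with no retina branch anywhere.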
I also wondered whether Ogre 1.9 and the latest versions of the dependencies include any fixes relevant here.