iOS Input via UIView in wiki

d000hg
Goblin
Posts: 257
Joined: Tue Sep 02, 2008 9:41 pm
x 1

iOS Input via UIView in wiki

Post by d000hg »

http://www.ogre3d.org/tikiwiki/tiki-ind ... via+UIView

A couple of questions about this useful looking sample...

1) With all these PIMPL classes, how does one actually USE it? It looks like I simply do:

Code: Select all

UIInput input;
input.setInputEnabled(true);
But where in code? I'd guess maybe we put the object in the AppDelegate interface declaration, then just call setInputEnabled() in -go()?

2) It doesn't seem you get any identification of the touch IDs... doesn't that make it pretty bad for multi-touch, where you need to track touches moving around?
aleroy
Kobold
Posts: 31
Joined: Thu Feb 02, 2012 9:34 pm
x 6

Re: iOS Input via UIView in wiki

Post by aleroy »

d000hg wrote:1) With all these PIMPL classes, how does one actually USE it? It looks like I simply do:

Code: Select all

UIInput input;
input.setInputEnabled(true);
But where in code? I'd guess maybe we put the object in the AppDelegate interface declaration, then just call setInputEnabled() in -go()?
You could instantiate it in AppDelegate -go() ... but it's not advisable. The UIInput object is intended to live on the C++ side, and it should be created only once, when your app starts. I would suggest making it a member of OgreFramework (from the application example). That way the UIInput instance is created when Ogre starts up and destroyed when Ogre shuts down. That also puts it in a singleton object, so it's easy to access.
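For illustration, something along these lines (a rough sketch only, not the wiki code; OgreFramework is the singleton class from the application example, and the member and method names here are simplified):

Code: Select all

// Sketch: UIInput owned by the OgreFramework singleton.
class OgreFramework : public Ogre::Singleton<OgreFramework>
{
public:
    bool initOgre(Ogre::String wndTitle)   // simplified signature
    {
        // ... existing Ogre startup ...
        mInput.setInputEnabled(true);      // enable touch input once Ogre is up
        return true;
    }

private:
    UIInput mInput;   // created with the framework, destroyed with it
};

Any state can then reach it through OgreFramework::getSingletonPtr().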

I have a similar version of that code that instantiates on the Obj-C side ... from AppDelegate -go(). It's much better. I'll update the wiki article soon.

d000hg wrote:2) It doesn't seem you get any identification of the touch ids... doesn't that make it pretty bad for multi-touch where you need to track touches moving around?
Definitely. It would be better to respond to gestures than touches. When I update the article, I'll explain how to do that.
caseybasichis
Greenskin
Posts: 100
Joined: Wed Jan 25, 2012 7:50 pm
x 1

Re: iOS Input via UIView in wiki

Post by caseybasichis »

Aren't gestures more along the lines of hot keys - once a gesture is recognized, a command is dispatched?

Multi-touch is really an entirely different need, so suggesting that gestures would be better is rather confusing. If I wanted to arbitrarily drag two objects around, gestures wouldn't be of much help.

Is the issue that UIInput just won't transmit the information, or that Ogre is not set up to relay it? If it's the latter, would it be possible to encode the touch number into the coordinate, like:

(screenWidth * touchNumber) + x
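
To decode on the receiving end, something like this should work (a sketch, assuming integer pixel coordinates):

Code: Select all

// Encoding (touch side): pack the touch number into x.
int encodedX = (screenWidth * touchNumber) + x;

// Decoding (Ogre side): recover both values.
int touchNumber = encodedX / screenWidth;
int x           = encodedX % screenWidth;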

p.s. I am very excited to take a look at the gesture article
d000hg
Goblin
Posts: 257
Joined: Tue Sep 02, 2008 9:41 pm
x 1

Re: iOS Input via UIView in wiki

Post by d000hg »

I assume gestures and MT are separate on some level; however, they must be linked, because otherwise when someone did a pinch, the first touch would get treated as mouse movement or something.
Maybe iOS handles this, or maybe you just stop handling touches when there are >1 and let gestures kick in?

Regarding touch IDs, I imagine the only two issues are:
1) Finding a way to identify the touch from the UIEvent data
2) Representing this in a platform-agnostic way

But for many of us I'm sure an iOS-only version is just fine, so we can rip his PIMPL stuff apart and access the UIEvent however we please.
aleroy
Kobold
Posts: 31
Joined: Thu Feb 02, 2012 9:34 pm
x 6

Re: iOS Input via UIView in wiki

Post by aleroy »

Okay, I just posted the new code on the wiki. I still need to add some comments...

The original purpose of the snippet was to take advantage of the Cocoa Touch <UIKit> framework as a means to get multi-touch input into your C++ code ... which it did, but only just. I've had a lot more experience with iOS since posting it. For one, I discovered gesture recognizers. Gesture recognizers (GRs) do a lot of the heavy lifting for you: no need to track touches to figure out where a touch started, where it's moved, how fast it's moving, and in which direction. The pan GR is especially good for that; it works much like a MouseDrag event. As the touch slides across the screen, the pan GR sends a translation and a velocity vector (as well as the starting point). The tap and pinch GRs are equally friendly, as is long-press. The first three cover the needs of most games.
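
For a taste, a pan GR boils down to something like this (a minimal sketch, not the wiki code; the view and selector names are illustrative):

Code: Select all

// Setup, e.g. in viewDidLoad: attach a pan recognizer to the view.
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc]
        initWithTarget:self action:@selector(handlePan:)];
[self.view addGestureRecognizer:pan];

// Action method: called repeatedly as the touch slides across the screen.
- (void)handlePan:(UIPanGestureRecognizer *)gr
{
    CGPoint translation = [gr translationInView:self.view]; // where it's moved
    CGPoint velocity    = [gr velocityInView:self.view];    // how fast / which way
    // ... forward these to the C++ side ...
}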

So, how does it work?

First, notice that there is a singleton named iOS_UIManager. Its purpose is to provide an interface for manipulating UIKit UI stuff. Here, it has only one view controller to manage, iOS_InputVC. iOS_InputVC is a UIViewController responsible for setting up the gesture recognizers and dispatching GR actions to the C++ side. This happens by way of the InputVCDelegate protocol. The input view controller is provided with a weak reference to an iOS_InputImpl object (which conforms to the protocol). iOS_InputVC handles any GR actions (the ObjC equivalent of events) by grabbing whatever info it can from the GR before passing it through the respective protocol method (here, always to an instance of iOS_InputImpl). From there, it finally passes over to the C++ side in a call to one of the _notify***WasRecognized methods. The last class in the wiki snippet is a concrete State (from the Advanced Ogre Framework tutorial) which also privately implements InputPimpl. MyStateWithGestureSupport can then do whatever it wants with the data it receives ... including passing something back through to the iOS side (perhaps to change which gesture should be detected).
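
In sketch form, the hand-off looks roughly like this (an illustrative reconstruction, not the exact wiki code; I've filled in the *** with "Pan" and simplified the signatures):

Code: Select all

// InputVCDelegate: the ObjC protocol iOS_InputVC dispatches through.
@protocol InputVCDelegate <NSObject>
- (void)panWasRecognized:(UIPanGestureRecognizer *)gr;
@end

// iOS_InputImpl conforms to the protocol and crosses over to C++.
- (void)panWasRecognized:(UIPanGestureRecognizer *)gr
{
    CGPoint t = [gr translationInView:gr.view];
    // _listener is the C++ object that implements the _notify methods.
    _listener->_notifyPanWasRecognized(t.x, t.y);
}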

... and how do I implement it?

Well, if your project is based on the Advanced Ogre Framework, you can pretty much copy/paste everything over. However, it may be better to create a "Listener" class that implements InputImpl (rather than having the state implement it). Then make the listener a member of the state ... or of whichever class needs the input data.
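
Something in that direction (a sketch only; InputImpl's real interface is in the wiki snippet, and these names are mine):

Code: Select all

// A standalone listener, so the input plumbing stays out of the state class.
class GestureListener : public InputImpl
{
public:
    // Called from the ObjC side when a pan fires (illustrative signature).
    virtual void _notifyPanWasRecognized(float dx, float dy)
    {
        // react here, e.g. pan the camera by (dx, dy)
    }
};

class MyState : public AppState
{
    // ...
private:
    GestureListener mInputListener;   // the state owns the listener
};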

Oh, and you have to add two lines to AppDelegate.h.

Code: Select all

// Near the top; after #import <UIKit/UIKit.h>
#import "iOS_UIManager.h"

// ...

- (void) go
{
    // Make this the first line in this method.
    [iOS_UIManager getInstance];

    // ...
}

Hope that helps,
~aleroy~


{PS: sorry it took so long to get this up... I had to wean myself off Skyrim (glad I was finally able to play it).}
aleroy
Kobold
Posts: 31
Joined: Thu Feb 02, 2012 9:34 pm
x 6

Re: iOS Input via UIView in wiki

Post by aleroy »

{err, sorry for double-post, but I thought I should respond more directly to the responses}

caseybasichis wrote:Aren't gestures more along the lines of hot keys - once a gesture is recognized, a command is dispatched?

Multi-touch is really an entirely different need, so suggesting that gestures would be better is rather confusing. If I wanted to arbitrarily drag two objects around, gestures wouldn't be of much help.
Yes and no. The gestures themselves are not actually tied to any one action ... in that sense a gesture is a slightly more elaborate key. The Pan and Tap gestures are synonymous with drag and click; you get the same information. However, if you need the data to supplement another GUI system, then gestures are not the way to go. But it's really easy to add handlers for the multi-touch events ... I should probably append how to do that to the wiki.
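
For instance, raw touches can be caught in the view controller by overriding the standard UIResponder touch methods (a minimal sketch, not the wiki code):

Code: Select all

// Raw multi-touch events arrive via the UIResponder overrides.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches)
    {
        CGPoint p = [touch locationInView:self.view];
        // forward p.x, p.y (and the phase) to the C++ side here
    }
}

// touchesMoved:, touchesEnded: and touchesCancelled: follow the same pattern.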
caseybasichis wrote:Is the issue that UIInput just won't transmit the information, or that Ogre is not set up to relay it?
The UIKit documentation for touch events isn't clear on how to obtain the ID of a UITouch object. And yet, there are methods within UIKit which require a touch ID ... so there must be a way.
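
One possibility (an untested guess on my part): a UITouch object persists for the whole duration of its touch, so its pointer value could serve as a stable ID across began/moved/ended:

Code: Select all

// Assumption: the UITouch address is stable for the touch's lifetime, so it
// can act as an ad-hoc identifier. Whether it matches the IDs the UIKit
// methods expect is an open question.
for (UITouch *touch in touches)
{
    uintptr_t touchId = (uintptr_t)touch;
    CGPoint p = [touch locationInView:self.view];
    // forward (touchId, p.x, p.y) to the C++ side
}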
d000hg
Goblin
Posts: 257
Joined: Tue Sep 02, 2008 9:41 pm
x 1

Re: iOS Input via UIView in wiki

Post by d000hg »

@aleroy that looks very cool. One question though, have you removed your low-level touch code from the wiki? I reckon both are very useful so perhaps you could resurrect it... have one page for "iOS gestures" and another for "iOS multitouch"?

For instance, I use MyGUI, so poking a dragged touch/tap in is useful, but I might separately want to catch pinch and other multi-touch gestures. You say "The Pan and Tap gestures are synonymous with drag and click; you get the same information. However, if you need the data to supplement another GUI system, then gestures are not the way to go". But why not use pan/tap to inject 'mouse' actions into a UI framework?


ps: I sent you a PM.