RTT on iOS

Discussion of issues specific to mobile platforms such as iOS, Android, Symbian and Meego.
Post Reply
Posts: 38
Joined: Thu Jul 05, 2012 3:51 pm
x 4

RTT on iOS

Post by Carlyone »

I realise this is a topic that has been posted a few times and I have read those posts and tried to implement them.

However I cannot get RTT to work on the iPad. I am using an iPad Air and Ogre 1.9.

I have the capture from the device camera and the data in there saves as an Ogre Image correctly. I am using RTT code that I usually use for video playing on OSX but it's pretty much the same idea.

I believe the issue is that the data is non pow2, it is 1920 x 1080. To prove this I took two images one 1024x1024 and one 1024x768, the 1024x1024 renders on the texture fine but the other just renders black. Our build of ogre does render non pow2 textures and it does render non pow2 texture created after the initial load of ogre.

I have tried Copy and FBO for the RTT Modes and have tried FSAA at 0,2 and 4.


Code: Select all

// Get the pixel buffer from the system
        CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        // Lock it for processing
        CVPixelBufferLockBaseAddress(pixelBuffer, 0);
        int size = (int)CVPixelBufferGetBytesPerRow(pixelBuffer);
        size *= CVPixelBufferGetHeight(pixelBuffer);
        void* data = CVPixelBufferGetBaseAddress(pixelBuffer);
        // Unlock the pixel buffer
        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
       renderManager->setCurrentVideoFrame((unsigned char*)data, size);

        data = NULL;
Frame Listener

Code: Select all

 * Constructor
    // Create a new texture for our video
    _texturePtr = Ogre::TextureManager::getSingleton().createManual("RTTTexture",
                                                                    1920, 1080,

bool RTTFrameListener::frameRenderingQueued(const Ogre::FrameEvent& evt){
    int size = 0;
    unsigned char* frameData = renderManager->copyCurrentVideoFrame(&size);
    if(frameData != NULL){
        Ogre::PixelBox pb(1920, 1080, 1, Ogre::PF_BYTE_RGBA, frameData);
        Ogre::HardwarePixelBufferSharedPtr buffer = _texturePtr->getBuffer();
        free((unsigned char*)frameData);
    frameData = NULL;

Code: Select all

_rttFrameListener = new RTTFrameListener();
        // Attach it to a material
        Ogre::MaterialPtr material = Ogre::MaterialManager::getSingleton().getByName(rttMaterial);
        Ogre::TextureUnitState* ts = material->getTechnique((unsigned short)0)->getPass((unsigned short)0)->getTextureUnitState((unsigned short)0);
        //Now, attach the texture to the material texture unit (single layer) and setup properties
        ts->setTextureFiltering(Ogre::FO_LINEAR, Ogre::FO_LINEAR, Ogre::FO_NONE);
        ts = NULL;

Code: Select all

material RTTMaterial
    receive_shadows off

			depth_check off
            depth_write off
			cull_software none
			cull_hardware none
			lighting off

			ambient 0 0 0
			diffuse 1 1 1
			texture_unit RTTMaterial


Any help would be appreciated.

Kind Regards


Posts: 38
Joined: Thu Jul 05, 2012 3:51 pm
x 4

Re: RTT on iOS

Post by Carlyone »


I have returned to this issue as it has become quite critical to progressing our application, eyesee (http://www.pinnaclevl.com).

At the moment we adapt the Viewport settings to allow us to see through the ogre layer and onto the camera view but this causes a restriction on transparent materials cutting through everything and the solution is to use a dynamic texture updated from the device camera.

I am currently getting a white screen, a slight change from the black screen.

After a lot more searching through the forums I have tried turning off RTSS, copying the pixels manually and using memcpy to move the pixels over.

Copying the pixels manually and memcpy produces the frame from the camera but all distorted, my assumption for this was I was copying data of 1920x1080x4 into 2048x2048x4 which would make it not line up properly.

I made a loop to replace only a section in a 2048x2048 buffer but the result was still a white screen or a distorted image, I'm open to having got that loop wrong.

Code: Select all

int mainBufferWidth = 2048;
        int mainBufferHeight = 2048;
        int yOffset = mainBufferWidth - 1080;
        int xOffset = mainBufferHeight - 1920;
        int halfYOffset = yOffset/2;
        int halfXOffset = xOffset/2;
        int mainBufferYIndex = 0;
        int mainBufferXIndex = 0;

        int bufferYIndex;
        int bufferXIndex;

        for(int y = 0;y < 1080;y++){
            bufferYIndex = y * 1920 * 4;
            mainBufferYIndex = bufferYIndex + yOffset;
            for(int x = 0;x < 1920;x++){
                bufferXIndex = bufferYIndex + (x * 4);
                mainBufferXIndex = bufferXIndex + xOffset;
                _currentFrameData[mainBufferXIndex + 3] = data[bufferXIndex + 3];
                _currentFrameData[mainBufferXIndex + 2] = data[bufferXIndex + 2];
                _currentFrameData[mainBufferXIndex + 1] = data[bufferXIndex + 1];
                _currentFrameData[mainBufferXIndex + 0] = data[bufferXIndex + 0];
This is extremely slow but at this moment I just need something working.

If I take the frame data (straight from the camera) and load it into a dynamic image then save it it looks good. I also did a texture->unload() then texture->loadImage() which seems very wrong a cheeky but still got a white screen even though the system.

This was on Ogre 1.10. I am away to make the conversion to Ogre 2.0 (possibly 2.1 if GLES2 is playing well) and try this on there but would still appreciate any input anyone has.

Kind Regards


Post Reply