Hi all,
I'm using Ogre 1.7.3 and trying to copy the iPhone camera video to texture, using AVCaptureSession and HardwarePixelBuffer::blitFromMemory().
I have two problems with this approach:
1. Once blitFromMemory is called, the whole screen is stretched, as if the frame buffer were resized to 1024x512 and the screen could only display the top-left portion. I guess it's related to the AVCaptureSession data being 640x480 while the texture is 1024x512 (power-of-two limitation). With this method, the video texture is blank, although there's no error.
2. If I provide a 640x480 dstBox to blitFromMemory, I get the "data.getWidth() != data.rowPitch" exception (OgreGLESHardwarePixelBuffer.cpp, line 377).
It works if I copy pixel by pixel, but it is very slow.
Any suggestions? Thanks.
Man
Video texture via AVCaptureSession and blitFromMemory
- Halfling
- Posts: 45
- Joined: Tue Feb 15, 2011 7:23 am
Re: Video texture via AVCaptureSession and blitFromMemory
Sorry for bumping this, but blitFromMemory still has problems in Ogre 1.8 GLES2...
When I do blitFromMemory with scaling in GLES2, GLES2TextureBuffer::blitFromTexture() gets called. But this function is not yet implemented (it says "todo - add a shader attach..."). I'd like to know whether the Ogre team will work on this soon.
Thanks,
Man