I started a project with Ogre.
The aim is to display the live video stream of a stereo camera in a VR headset, and additionally to add objects that overlay the scene from the video stream.
Sadly I haven't done anything with Ogre before. I chose Ogre 2.2 because I found the OpenVR example from dark_sylinc. (https://www.ogre3d.org/tag/openvr)
I implemented drawing the stereo video stream in the following way, and I wonder if this is the proper way one would do this in Ogre:
Every frame:
- Load a video frame from an OpenCV capture or ROS...
- Resize it as a cv::Mat.
- Write the resized Mat to the "background" texture, which resides on the GPU as well as in RAM.
I've done it the way described in the docs: (https://ogrecave.github.io/ogre-next/ap ... anges.html)
I have to wait vaoManager->getDynamicBufferMultiplier() frames before the uploaded data is safe to touch again.
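For reference, the upload path I followed looks roughly like the sketch below, based on the Ogre 2.2 porting manual's StagingTexture example. Names like mVideoTexture, cvFrame, width and height are placeholders for my own members, and the pixel format is assumed to be RGBA8, so treat this as a sketch rather than exact code:

```
// Sketch: upload one resized cv::Mat (assumed CV_8UC4) into a TextureGpu
// via a StagingTexture, per the Ogre 2.2 manual.
Ogre::TextureGpuManager *textureMgr =
    root->getRenderSystem()->getTextureGpuManager();

Ogre::StagingTexture *stagingTexture = textureMgr->getStagingTexture(
    width, height, 1u, 1u, Ogre::PFG_RGBA8_UNORM );

stagingTexture->startMapRegion();
Ogre::TextureBox texBox = stagingTexture->mapRegion(
    width, height, 1u, 1u, Ogre::PFG_RGBA8_UNORM );

// cvFrame.step is the row pitch in bytes of the OpenCV Mat.
texBox.copyFrom( cvFrame.data, width, height,
                 static_cast<Ogre::uint32>( cvFrame.step ) );
stagingTexture->stopMapRegion();

// Queue the actual CPU -> GPU transfer for this frame.
stagingTexture->upload( texBox, mVideoTexture, 0u, 0, 0 );
textureMgr->removeStagingTexture( stagingTexture );
```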
Does anyone have a better, or the proper, approach for streaming content to an Ogre scene? You can also have a look at my repo: https://github.com/peetCreative/svr
Loading images is done in SVR.cpp, and writing them to the texture is done in OpenVrCompositorListener.cpp.
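The "wait getDynamicBufferMultiplier() frames" rule can be thought of as classic N-frames-in-flight multi-buffering: a slot written at frame f may still be read by the GPU until frame f + N, so you cycle through N slots. Here is a small self-contained illustration of that bookkeeping (generic C++, not Ogre API; the class name is hypothetical):

```cpp
#include <cstddef>
#include <vector>

// Hypothetical ring of N staging slots, where N would be the value of
// vaoManager->getDynamicBufferMultiplier(). A slot written at frame f
// is only safe to overwrite again at frame f + N, because the GPU may
// still be consuming the copies from the previous N - 1 frames.
class StagingRing
{
public:
    explicit StagingRing( std::size_t bufferMultiplier ) :
        mSlots( bufferMultiplier ), mFrame( 0 ) {}

    // Index of the slot the CPU should write this frame.
    std::size_t currentSlot() const { return mFrame % mSlots.size(); }

    // True once every slot has been written at least once, i.e. the
    // data written in frame 0 is now guaranteed to have been consumed.
    bool firstUploadRetired() const { return mFrame >= mSlots.size(); }

    void advanceFrame() { ++mFrame; }

private:
    std::vector<int> mSlots; // stand-ins for real staging buffers
    std::size_t      mFrame; // frames rendered so far
};
```

With a multiplier of 3 the slots cycle 0, 1, 2, 0, ..., so frame 3 reuses the slot written in frame 0, which is exactly the delay the docs require.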
If you find any programming or application-design faults, please let me know.

Thanks!