3D output
-
- Minaton
- Posts: 933
- Joined: Mon Mar 05, 2012 11:37 am
- Location: Germany
- x 110
3D output
It would be nice to have support for 3D displays as a core feature. There is an add-on: http://www.ogre3d.org/forums/viewtopic.php?f=5&t=32284
-
- OGRE Retired Team Member
- Posts: 972
- Joined: Mon Jun 02, 2008 6:52 pm
- Location: Berlin
- x 65
Re: 3D output
The problem I see with this is that there cannot be a solution that fits all projects. It might work well for fixed function pipeline, but by the time 2.0 comes out, who will still use fixed function? 10%?
Also, creating 3D outputs is something that is rather easy to do yourself (as the theory behind it is rather simple) and you find lots of examples online.
I'd rather have a sample for that. (Within the Compositor samples, I'd suggest)
- runfrodorun
- Gremlin
- Posts: 154
- Joined: Fri Jun 24, 2011 3:50 am
- Location: 192.168.0.1
- x 3
Re: 3D output
If you create two render targets fed from offset cameras, one for each eye of your display, this should already be possible.
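A minimal sketch of the camera-offset math behind that setup (plain C++, no Ogre types; the 0.064 m IPD in the test is just an illustrative default). In Ogre you would attach two cameras to the same node, offset each by half the eye separation, and give each its own viewport covering half the render window.

```cpp
#include <array>
#include <cassert>
#include <cmath>

struct Vec3 { double x, y, z; };

// Positions of the left and right eye cameras: the head centre shifted by
// half the inter-pupillary distance (IPD) along the head's local right axis.
// Each camera then renders into its own half of the window (or its own
// render target) to produce the side-by-side stereo pair.
std::array<Vec3, 2> stereoEyePositions(const Vec3& centre,
                                       const Vec3& rightAxis, // unit length
                                       double ipd)
{
    const double h = ipd * 0.5;
    Vec3 left { centre.x - rightAxis.x * h,
                centre.y - rightAxis.y * h,
                centre.z - rightAxis.z * h };
    Vec3 right{ centre.x + rightAxis.x * h,
                centre.y + rightAxis.y * h,
                centre.z + rightAxis.z * h };
    return { left, right };
}
```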
Sass not tolerated
LAY DOWN THE LAW!
When I don't understand a problem, I punch it in the face and try again.
-
- Gremlin
- Posts: 180
- Joined: Tue Nov 25, 2008 10:58 am
- Location: Kristiansand, Norway
- x 23
- Contact:
Re: 3D output
We do it as above; it works very well.
Just like those 'split screen' 3D images that the TV overlaps.
Like this: [image]
Works for TVs and other monitors that support 3D.
-
- OGRE Expert User
- Posts: 1227
- Joined: Thu Dec 11, 2008 7:56 pm
- Location: Bristol, UK
- x 157
Re: 3D output
I am getting an Oculus Rift in April, so I think I'll create some form of native support for it, but I won't know how it might fit into Ogre before I get the SDK.
- Kojack
- OGRE Moderator
- Posts: 7157
- Joined: Sun Jan 25, 2004 7:35 am
- Location: Brisbane, Australia
- x 534
Re: 3D output
Unfortunately there's not much detail around on how the Oculus SDK works. Is it alternating frame or side by side?
From what I've read it does require a special shader to correct for the distortion of the lens, so just rendering to it directly will look distorted.
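For what it's worth, the lens-correction warps used for this kind of HMD are typically a radial polynomial applied around the lens centre. A rough sketch below, with made-up placeholder coefficients (not values from any SDK):

```cpp
#include <cassert>
#include <cmath>

struct Vec2 { double x, y; };

// Radial "barrel" pre-distortion: scale each texture coordinate (relative to
// the lens centre) by a polynomial in r^2. A magnifying lens applies a
// pincushion distortion, so the renderer pre-warps the image the opposite
// way. k0..k3 are placeholder coefficients, not official values.
Vec2 preDistort(Vec2 uv, double k0, double k1, double k2, double k3)
{
    const double r2    = uv.x * uv.x + uv.y * uv.y;
    const double scale = k0 + r2 * (k1 + r2 * (k2 + r2 * k3));
    return { uv.x * scale, uv.y * scale };
}
```

In practice this runs as a fragment shader over the finished stereo frame; the C++ form here is just to make the shape of the math concrete.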
-
- OGRE Expert User
- Posts: 1227
- Joined: Thu Dec 11, 2008 7:56 pm
- Location: Bristol, UK
- x 157
Re: 3D output
Kojack wrote: Unfortunately there's not much detail around on how the Oculus SDK works. Is it alternating frame or side by side?
From what I've read it does require a special shader to correct for the distortion of the lens, so just rendering to it directly will look distorted.

Looking at videos and some other development information, it is rendered side by side, so we will not have to worry about alternating frames. Yes, a shader is definitely needed, which they provide, so I am hoping it will be as simple as creating an SRS for the RTSS.
- Kojack
- OGRE Moderator
- Posts: 7157
- Joined: Sun Jan 25, 2004 7:35 am
- Location: Brisbane, Australia
- x 534
Re: 3D output
We might need offset projection matrices too, since each eye's view doesn't fully overlap (so it's like real life).
Other than that, if it's side by side then it should be easy.
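The offset projection being described is the classic off-axis (asymmetric) frustum for parallel stereo cameras. A sketch under the assumption of a chosen convergence (focal) distance; the resulting bounds could then be fed to something like Ogre's Frustum::setFrustumExtents:

```cpp
#include <cassert>
#include <cmath>

// Off-axis frustum bounds (at the near plane) for one eye in a
// parallel-camera stereo rig. Both eyes share the same view direction; each
// frustum is sheared so the two images line up exactly at focalDist.
struct FrustumExtents { double left, right, bottom, top; };

FrustumExtents eyeFrustum(double fovY,      // vertical FOV, radians
                          double aspect,    // width / height
                          double zNear,
                          double focalDist, // zero-parallax distance
                          double eyeOffset) // +half-IPD for the right eye
{
    const double top   = zNear * std::tan(fovY * 0.5);
    const double halfW = top * aspect;
    const double shift = eyeOffset * zNear / focalDist;
    return { -halfW - shift, halfW - shift, -top, top };
}
```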
The next step would be adding a Novint Xio.
(full arm force feedback vr controller)
-
- Gremlin
- Posts: 180
- Joined: Tue Nov 25, 2008 10:58 am
- Location: Kristiansand, Norway
- x 23
- Contact:
Re: 3D output
I backed the Oculus Rift project on Kickstarter, and I must say I'm REALLY looking forward to getting it.
Support for it in Ogre would be very nice
- AshMcConnell
- Silver Sponsor
- Posts: 605
- Joined: Fri Dec 14, 2007 11:44 am
- Location: Northern Ireland
- x 16
- Contact:
Re: 3D output
Kojack wrote: The next step would be adding a Novint Xio.
(full arm force feedback vr controller)

That is an armful of awesome! Wow
- Klaim
- Old One
- Posts: 2565
- Joined: Sun Sep 11, 2005 1:04 am
- Location: Paris, France
- x 56
- Contact:
Re: 3D output
I'm incredibly tempted to get an Oculus Rift. I also have the money, and my game would benefit greatly from it. However, I'm trying not to succumb to the urge, as my game is far from finished and I don't think I'll have time to develop for the Oculus Rift this year. Or maybe if I find a clone.
-
- OGRE Expert User
- Posts: 1920
- Joined: Sun Feb 19, 2012 9:24 pm
- Location: Russia
- x 201
Re: 3D output
Drop me the link to the clone factory when you find it.
- Klaim
- Old One
- Posts: 2565
- Joined: Sun Sep 11, 2005 1:04 am
- Location: Paris, France
- x 56
- Contact:
- Kojack
- OGRE Moderator
- Posts: 7157
- Joined: Sun Jan 25, 2004 7:35 am
- Location: Brisbane, Australia
- x 534
Re: 3D output
Plus you need more time to play with Falcon 1.0.
- Klaim
- Old One
- Posts: 2565
- Joined: Sun Sep 11, 2005 1:04 am
- Location: Paris, France
- x 56
- Contact:
-
- Gnoblar
- Posts: 6
- Joined: Thu Mar 28, 2013 8:56 am
- Location: Japan
Re: 3D output (oculus rift)
Hello,
I am a graduate researcher at a robotics lab, and we are going to look into using the Oculus Rift for interfacing with our robot. I'd been planning on using the Unity3D engine, but since it was announced that the free version will not support the Rift, we've been looking into alternatives; Ogre3D seems great.
I would be interested in keeping in contact with anyone else who has access to the rift devkit and is going to be implementing it with Ogre3D.
-
- OGRE Expert User
- Posts: 1227
- Joined: Thu Dec 11, 2008 7:56 pm
- Location: Bristol, UK
- x 157
Re: 3D output
I will be receiving a Dev Kit and I will be integrating it with Ogre 3D; however, I can't promise time scales!
- Kojack
- OGRE Moderator
- Posts: 7157
- Joined: Sun Jan 25, 2004 7:35 am
- Location: Brisbane, Australia
- x 534
Re: 3D output
Making it work with Ogre will be easy; we just need more details from the SDK (mainly the specs of the distortion that needs to be applied as a post process to counteract the lenses).
Apparently the Oculus Rift will start shipping tomorrow, but still no sign of the SDK. I hope they aren't going to make open source devs second-class citizens and give Unity / UDK exclusive access at the beginning.
I don't have an order in (I got burnt on my last VR headset purchase; I'm waiting for more reviews or the consumer high-def version), but I'm planning on making a demo for others (with the actual headset) to try once the SDK is out.
-
- Gnoblar
- Posts: 6
- Joined: Thu Mar 28, 2013 8:56 am
- Location: Japan
Re: 3D output
They said that they will only be able to ship a couple thousand per week, so I'm not likely to get mine until the end of April. But it sounds like there are at least 3 or 4 of us who will get the hardware soon enough.
As you say, it does sound like it should be simple, as long as they are open about how things work with the sensors/shaders/distortion/etc.
Thank you, Kojack, for selflessly offering your experience for our benefit!
- Kojack
- OGRE Moderator
- Posts: 7157
- Joined: Sun Jan 25, 2004 7:35 am
- Location: Brisbane, Australia
- x 534
Re: 3D output
Apparently the Oculus is using a Hillcrest motion sensor and the libfreespace API.
The code looks pretty simple. There's a few calls to initialise the sensor, then in the game loop you call freespace_readMessage() and it returns packets. If a packet is of type FREESPACE_MESSAGE_BODYFRAME, you can read its fields to get the sensor state.
http://bazaar.launchpad.net/~merrill-ro ... _example.c
If they are still using that for the Oculus, getting the data should be easy. Putting it to use, however, will need some fine tuning.
I read on the Team Fortress wiki's Oculus Rift support page that you need to tell the software which eyepieces you have attached. There are different pieces (3 or so per eye) you can choose depending on your eyesight (near-sighted vs far-sighted). The software doesn't know on its own which you have, and each one affects the distortion needed to render correctly.
-
- Gnoblar
- Posts: 6
- Joined: Thu Mar 28, 2013 8:56 am
- Location: Japan
Re: 3D output
I just remembered an email that Oculus sent to the Kickstarter backers back in January about sensor fusion. It sounds fairly promising:
Oculus wrote: If all of this drift correction and sensor fusion business seems like a lot of work, don’t worry! In addition to raw data, the Oculus SDK provides a SensorFusion class that takes care of the details, returning orientation data as either rotation matrices, quaternions, or Euler angles. The SDK also includes a complete C++ “Oculus Room” example that demonstrates many different player input schemes integrated with Oculus head tracking.
Another thing that I remember reading, although I haven't been able to locate the source, is that the SDK does not currently support tracking of head position. So if you keep your head locked upright and move left/right/forward/back, nothing moves. Theoretically, all you need to do is integrate accelerometer data, but we've found that's very hard to get accurate for tracking the location of a robot; getting it accurate enough for the subtle movement of a head will be rough. Which I assume is why they haven't managed to implement it yet.
If you watch some of the demos carefully, they always instruct the user to look around/look behind them without moving their body. That's unfortunate because the head's positional trajectory is a fairly important depth cue. John Carmack mentioned it in an interview back when the Rift first came to light, so I assume it will be ready before the commercial rollout. I can imagine an FPGA might end up being necessary, and they'll need higher volume than the dev kit to make that cheap.
So that should make the camera rotation fairly easy. As I understand it, we should set up a virtual "bar" with a camera attached to either end of it and rotate the center of the bar based on the orientation returned from the SDK... does that sound right?
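That "bar" model is essentially rotating a fixed eye offset by the tracker's orientation quaternion. A quick sanity-check sketch (plain C++; the quaternion is assumed to be unit length):

```cpp
#include <cassert>
#include <cmath>

struct Quat { double w, x, y, z; };
struct Vec3 { double x, y, z; };

static Vec3 cross(const Vec3& a, const Vec3& b)
{
    return { a.y * b.z - a.z * b.y,
             a.z * b.x - a.x * b.z,
             a.x * b.y - a.y * b.x };
}

// Rotate v by unit quaternion q, using v' = v + 2w(u x v) + 2(u x (u x v)),
// where u is the quaternion's vector part.
Vec3 rotate(const Quat& q, const Vec3& v)
{
    const Vec3 u{ q.x, q.y, q.z };
    const Vec3 t  = cross(u, v);
    const Vec3 t2{ 2.0 * t.x, 2.0 * t.y, 2.0 * t.z };
    const Vec3 c  = cross(u, t2);
    return { v.x + q.w * t2.x + c.x,
             v.y + q.w * t2.y + c.y,
             v.z + q.w * t2.z + c.z };
}

// "Bar" model: each eye sits at +/-halfIpd along the head's local X axis,
// and the whole bar pivots about the head centre by the tracked orientation.
Vec3 eyePosition(const Quat& headOrient, const Vec3& headCentre, double offsetX)
{
    const Vec3 e = rotate(headOrient, { offsetX, 0.0, 0.0 });
    return { headCentre.x + e.x, headCentre.y + e.y, headCentre.z + e.z };
}
```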
- Kojack
- OGRE Moderator
- Posts: 7157
- Joined: Sun Jan 25, 2004 7:35 am
- Location: Brisbane, Australia
- x 534
Re: 3D output
Yep, it's only 3 degrees of freedom (pitch, yaw, roll), not position. Same with my Vuzix 1200VR headset; it has a 3DOF tracker.
There's a few ways to go for full head tracking. One is to use something like TrackIR or FreeTrack. It can do 6DOF, but rotations are limited (the reflectors or IR lights on the head unit need to roughly face the camera), so if you look backwards you'd lose positioning. (TrackIR is open-source hostile; FreeTrack would be better.)
The second and probably better way is using a Kinect. In half-body mode (where it doesn't track legs, so you can use it sitting at a desk) it can pretty accurately track head position regardless of your facing direction. It can do direction as well using facial tracking, but that won't work so well with the Oculus covering the face, and wouldn't be as good anyway. But the position would be fine.
Plus with a Kinect it could put your hands and arms into the virtual world (or whole body if you use it standing up).
- Kojack
- OGRE Moderator
- Posts: 7157
- Joined: Sun Jan 25, 2004 7:35 am
- Location: Brisbane, Australia
- x 534
Re: 3D output
Oculus discussion best continued over here: http://www.ogre3d.org/forums/viewtopic. ... 74#p486174
Since this isn't really a 2.0 feature.
-
- Gnoblar
- Posts: 6
- Joined: Thu Mar 28, 2013 8:56 am
- Location: Japan
Re: 3D output
I'm sorry for taking this thread OT. I think I must not have the necessary permissions for the OT lounge, and I didn't want to start a new thread with my first post.
- Kojack
- OGRE Moderator
- Posts: 7157
- Joined: Sun Jan 25, 2004 7:35 am
- Location: Brisbane, Australia
- x 534
Re: 3D output
Ok, ignore last link, it's now at: http://www.ogre3d.org/forums/viewtopic.php?f=5&t=76970