

“Vidusayura”: A Perception-Enhanced Immersive VR Solution with Ogre 3D
In this VR solution (a ship simulation system) we use our own computational ship models, based on our algorithms and constraints, so that any computational ship model that satisfies our algorithms can be incorporated into our database. We have developed several six-degrees-of-freedom computational ship models, such as the benchmark tanker “Esso Osaka” and an offshore patrol vessel closely resembling the “Jayasagara” class, which was locally built by Colombo Dockyard. We used the Ogre3D rendering engine with Hydrax, which is capable of realistic ocean wave rendering, to create a real-time interactive environment. We incorporated real-world geographical sceneries with cultural objects and moving/fixed targets. Several environmental conditions and a wide range of visibility effects, such as daytime, dusk, and night, were incorporated into our database.
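The paper does not reproduce the hydrodynamic equations of its ship models, so the following is only a minimal sketch of what a six-degrees-of-freedom state and a single integration step might look like; the field names and the explicit-Euler kinematic update are our assumptions, not the actual model:

```python
from dataclasses import dataclass

@dataclass
class ShipState6DOF:
    # Position and attitude of the ship, plus linear and angular rates.
    x: float = 0.0; y: float = 0.0; z: float = 0.0
    roll: float = 0.0; pitch: float = 0.0; yaw: float = 0.0
    u: float = 0.0; v: float = 0.0; w: float = 0.0   # linear velocities
    p: float = 0.0; q: float = 0.0; r: float = 0.0   # angular velocities

def step(state: ShipState6DOF, dt: float) -> ShipState6DOF:
    """One explicit-Euler kinematic step (illustrative only; a real model
    would first update the velocities from hydrodynamic forces)."""
    return ShipState6DOF(
        x=state.x + state.u * dt, y=state.y + state.v * dt, z=state.z + state.w * dt,
        roll=state.roll + state.p * dt, pitch=state.pitch + state.q * dt,
        yaw=state.yaw + state.r * dt,
        u=state.u, v=state.v, w=state.w, p=state.p, q=state.q, r=state.r,
    )
```

Any model exposing such a state after each step could, in principle, satisfy the database interface described above.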
We modeled a Sri Lankan harbor and incorporated it into our database. We used the British Admiralty chart “Galle Harbour and Approaches”, as shown in figure 1, and Google Earth images, shown in figure 2, for the 3D modeling. Subsequently the shoreline was finalized, and a sequence of digital pictures was taken from the sea while keeping a constant distance from the shoreline. The same process was repeated several times to obtain different image sequences at different distances from the shoreline. Various moving and fixed targets and cultural objects were observed in the sea around the selected harbor. The relative sizes of the observed objects were recorded with respect to a selected earth-fixed object, and digital pictures were taken from different distances.


We used 3D Studio Max to create the mesh models. All naval vessels, moving and fixed targets, cultural objects, and scenes of the navigation areas were modeled using 3D Studio Max. When modeling the navigation areas and the shoreline, major objects were modeled with polygonal meshes, and the remaining objects were placed using billboards. Throughout this 3D modeling we used appropriate textures to produce more realistic scenery. Most of the time we captured real textures with a digital camera and enhanced them with image-editing software; in several cases we used standard materials from the 3D Studio Max library. We used the OgreMax Scene Exporter, a 3DS Max plugin, to export the modeled scenery from 3D Studio Max to OgreMax scene files for Ogre 3D. Finally, the virtually created environment was compared with the real environment; the outcome is at a quite satisfactory level, as shown in figure 3 and figure 4.


Development of Immersive Environment
Panoramic Vision System
Panoramic images and videos are regularly used in various virtual reality applications such as autonomous navigation and virtual walkthroughs. This vision system is based on a client-server architecture. It supports real-time six-degrees-of-freedom autonomous navigation with a 300° field of view. The server computer sends the navigational instructions (latitude, longitude, altitude, roll, pitch, and yaw) to the six client computers as described below.
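The wire format of these navigational instructions is not specified in the paper; one plausible encoding (the field order, endianness, and fixed-size layout below are assumptions for illustration) packs the six values as 64-bit doubles:

```python
import struct

# Six float64 fields, big-endian: lat, lon, alt, roll, pitch, yaw (48 bytes).
NAV_FORMAT = "!6d"

def encode_nav(lat, lon, alt, roll, pitch, yaw) -> bytes:
    """Serialize one navigational instruction for transmission to a client."""
    return struct.pack(NAV_FORMAT, lat, lon, alt, roll, pitch, yaw)

def decode_nav(payload: bytes):
    """Recover (lat, lon, alt, roll, pitch, yaw) on the client side."""
    return struct.unpack(NAV_FORMAT, payload)
```

The server would broadcast the same fixed-size packet to every client socket, so each client can parse it without any framing negotiation.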

In each client computer the same virtual environment is loaded, and the position and orientation values are received from a parental node. Each virtual camera inherits its position and orientation from the parental node while maintaining 60° with respect to the adjacent virtual cameras. To create the 300° field of view (FOV), each virtual camera occupies a 60° angle of view. Navigational instructions (latitude, longitude, altitude, roll, pitch, yaw) are sent to the virtual cameras from the master computer over the network.
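The yaw offsets that tile a contiguous horizontal FOV can be computed generically; the helper below is a sketch under the stated 60°-per-camera geometry (note that 300° ÷ 60° implies five rendering cameras, while the text also mentions six clients, so the camera count is left as a parameter):

```python
def camera_yaw_offsets(n_cameras: int, fov_per_camera: float = 60.0):
    """Yaw offset (degrees) of each virtual camera relative to the parental
    node, so that n equal-width cameras tile a contiguous horizontal FOV
    centered on the node's heading."""
    half_span = (n_cameras - 1) * fov_per_camera / 2.0
    return [i * fov_per_camera - half_span for i in range(n_cameras)]
```

Each client would add its own offset to the yaw received from the parental node before rendering its tile.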
We developed a method to synchronize the virtual cameras. The server computer sends a data set containing latitude, longitude, altitude, roll, pitch, and yaw simultaneously to each client computer through the network. Each client PC places its virtual camera according to the given position and orientation and renders the scenery. After completing the rendering process, each client machine sends a message back to the server computer. The server computer then sends the next navigational instruction data set only after acquiring all six messages from the client computers. This synchronization mechanism reduces the frame rate slightly, but 2.66 GHz Intel Core 2 Duo iMacs with NVIDIA GeForce 9400M graphics cards were able to maintain more than 25 frames per second with complex scenery.
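The acknowledgement-driven lock-step described above can be sketched as follows; in-process queues stand in for the network sockets, and the six-client count matches the text (this is an illustration of the protocol, not the system's actual networking code):

```python
import queue
import threading

N_CLIENTS = 6
N_FRAMES = 3

to_clients = [queue.Queue() for _ in range(N_CLIENTS)]
acks = queue.Queue()
rendered = []                      # (client_id, frame) pairs, for inspection
rendered_lock = threading.Lock()

def client(cid: int) -> None:
    while True:
        nav = to_clients[cid].get()
        if nav is None:            # shutdown signal
            return
        with rendered_lock:
            rendered.append((cid, nav))   # stand-in for rendering the frame
        acks.put(cid)              # tell the server this client is done

threads = [threading.Thread(target=client, args=(i,)) for i in range(N_CLIENTS)]
for t in threads:
    t.start()

for frame in range(N_FRAMES):
    for q in to_clients:           # broadcast one navigational data set
        q.put(frame)
    for _ in range(N_CLIENTS):     # block until every client has acknowledged
        acks.get()

for q in to_clients:               # shut the clients down
    q.put(None)
for t in threads:
    t.join()
```

Because the server blocks on all six acknowledgements before broadcasting the next data set, no client can ever run more than one frame ahead of the others, which is exactly the property that keeps the tiled views coherent.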
The immersive environment is composed of four standard gaming PCs (three clients and a server) and a multi-projector seamless tiled display system, constructed using three multimedia projectors of 2500 ANSI lumens each. This large polygonal screen projected realistic visuals with a wide field of view, and a real-scale bridge was constructed and placed as shown in figure 5. This brings a sense of seriousness and realism to the user's perception and hence strengthens the ecological validity of the environment.


