PLATFORM

BRIDGES project: banner of platform page, users testing extended reality

The BRIDGES solution will result from further development of the existing Immersive Deck® platform, which has been researched and developed by the Technical University of Vienna and extended by Illusion Walk, and will build on the outputs, experiences, and knowledge gained from several major national and European RTD initiatives.

Within BRIDGES, we will enhance the technical and operational aspects of the Immersive Deck®, pilot and test it extensively, and assemble it into a flexible and scalable solution that can be applied across different domains.
The Immersive Deck® is a large-scale multi-user VR platform that provides an immersive experience for up to five users, who can walk and interact freely and untethered in an area of several hundred square meters.

The required equipment (a high-end Head-Mounted Display (HMD) tethered to a laptop carried in a backpack) is worn on the body, and rendering is performed locally for each user to minimize latency. Inside-out optical hand and finger tracking is performed by a low-cost Motion Capture (MoCap) suit or a stereo camera attached to the HMD, which allows hand movements to be tracked for haptic interaction. The movements of users, 3D interactions, and the positions of selected real-world objects are distributed over a wireless network in a server-client architecture. As a result, users see the effects of their interactions with objects and other users in real time.
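To make the server-client distribution concrete, here is a minimal sketch of how per-user pose updates could be relayed to the other clients over the wireless network. It is an illustration only: the message format, the use of UDP and JSON, and names such as PoseUpdate and relay_server are assumptions made for this example, not the Immersive Deck's actual networking code.

```python
# Hypothetical sketch of the server-client state distribution described above.
# Names (PoseUpdate, relay_server) are illustrative, not part of the Immersive Deck API.
import json
import socket
from dataclasses import dataclass, asdict

@dataclass
class PoseUpdate:
    user_id: int
    position: tuple      # (x, y, z) in meters, tracked on the user's backpack PC
    orientation: tuple   # quaternion (x, y, z, w) from the HMD
    timestamp: float     # client send time, used to discard stale packets

def relay_server(host="0.0.0.0", port=9000):
    """Receive pose updates from clients and rebroadcast them to every other known client."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    clients = set()
    latest = {}  # user_id -> most recent timestamp, to drop out-of-order packets
    while True:
        data, addr = sock.recvfrom(1024)
        clients.add(addr)
        update = PoseUpdate(**json.loads(data))
        if update.timestamp <= latest.get(update.user_id, 0.0):
            continue  # stale or duplicate packet, ignore
        latest[update.user_id] = update.timestamp
        payload = json.dumps(asdict(update)).encode()
        for client in clients:
            if client != addr:
                sock.sendto(payload, client)  # other users see the movement in real time
```

In such a setup, each backpack PC would run a matching client that sends its own PoseUpdate at the tracking rate and applies incoming updates to the avatars and objects of the other users.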

Immersive Deck: photos of the user experience, including multiple users and industrial training scenarios

BRIDGES will extend the Immersive Deck technologies and software tools on a number of levels:

Complex real-world objects will be brought into the virtual space by integrating an RGB-D sensor;

Device integration will be reimplemented to decouple the tracking framework from the rendering engine and the application (a sketch of this decoupling follows the list);

Networking will become more scalable, transparent, and consistent to support a larger number of concurrent users (up to 10);

The mapping process used to reconstruct the built environment will be optimized with methods from photogrammetry to become more accurate;

The multisensory output will be refined and extended with additional stimuli (e.g. heat, wind, smells).
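As a rough illustration of the decoupling mentioned in the second item above, the sketch below hides concrete tracking devices behind a common interface so the renderer never depends on a specific device SDK. All names (TrackingSource, MoCapSuitTracker, StereoCameraTracker, render_frame) are hypothetical and only show the pattern, not the BRIDGES implementation.

```python
# Illustrative sketch of decoupling device tracking from the rendering engine.
# All class and method names are hypothetical, not part of the Immersive Deck codebase.
from abc import ABC, abstractmethod
from typing import Tuple

Pose = Tuple[float, float, float]  # simplified: position only

class TrackingSource(ABC):
    """Common interface every tracking device implements, so the renderer
    never depends on a concrete device SDK."""
    @abstractmethod
    def poll_pose(self, user_id: int) -> Pose: ...

class MoCapSuitTracker(TrackingSource):
    def poll_pose(self, user_id: int) -> Pose:
        # A real implementation would query the suit's SDK; this is a stub.
        return (0.0, 0.0, 0.0)

class StereoCameraTracker(TrackingSource):
    def poll_pose(self, user_id: int) -> Pose:
        return (0.0, 1.6, 0.0)

def render_frame(tracker: TrackingSource, user_id: int) -> None:
    """The rendering side only sees the TrackingSource interface, so devices
    can be swapped or added without touching the engine or the application."""
    pose = tracker.poll_pose(user_id)
    print(f"render user {user_id} at {pose}")

# Swapping the device is a one-line change at composition time:
render_frame(MoCapSuitTracker(), user_id=1)
render_frame(StereoCameraTracker(), user_id=1)
```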
