Recently, we explained on our blog how Station IX uses Reflected Reality™ to produce the most accurate sense of depth and three-dimensionality without VR goggles or head-mounted displays (HMDs). We also explored what immersive media are and how they will impact industry in the coming years. To continue answering your questions, we thought it was important to explain how content is displayed inside Station IX.

Movie theatres abound with animated films filled with 3D characters, military training centres use 3D models and synthetic environments to familiarize soldiers with weapons and remote areas, and architects showcase their newly designed virtual spaces to stakeholders to encourage investment. 3D content is everywhere; it fills many needs, and there are many ways to present it. Station IX is one possibility.

Station IX is a large 3D immersive display system. Once converted using an image-rendering platform, or game engine, our customers’ CAD files, 3D models, BIM content, and more are playable inside Station IX.

[Image: A user inside Station IX]

How does Station IX run 3D content?

For the most part, if you can run your content on your computer, then we can run it in Station IX.  Typically, only three modifications to software applications are required to properly integrate content into Station IX:

  • Texture mapping
  • Multiple camera rendering
  • Parallel computing

Texture Mapping

Station IX uses projectors to display images onto a curved projection screen, which is then reflected by large curved mirrors. This means the image from each projector must be properly adjusted to account for how the image will appear to the user. In addition, the images from each projector need to be combined so that the user sees a single, continuous image reflected on seven mirrors. The process to achieve this is known as warping and blending.
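To give a rough sense of the blending half of this process, the sketch below cross-fades two horizontally adjacent projector images over a shared overlap region so the seam disappears. This is only a conceptual illustration in Python/NumPy under simplified assumptions; the actual warping and blending in Station IX is driven by Imagine 4D’s own calibration tools and measured, per-pixel blend masks.

```python
import numpy as np

def blend_pair(left_img, right_img, overlap_px):
    """Cross-fade two adjacent projector images over their shared overlap.

    left_img, right_img: H x W x 3 float arrays with values in [0, 1].
    overlap_px: width (in pixels) of the region both projectors cover.
    Conceptual sketch only -- a real system also warps each image and
    uses measured blend masks rather than a simple linear ramp.
    """
    ramp = np.linspace(1.0, 0.0, overlap_px)              # fade-out for the left image
    left = left_img.copy()
    right = right_img.copy()
    left[:, -overlap_px:] *= ramp[None, :, None]           # darken the left image's right edge
    right[:, :overlap_px] *= (1.0 - ramp)[None, :, None]   # darken the right image's left edge
    # Inside the overlap, the two contributions sum back to full brightness.
    return left, right
```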

[Image: How Reflected Reality™ is achieved]

Fortunately, Imagine 4D has developed the tools and software necessary to determine exactly how each image needs to be warped and blended. The only modification required to an existing application is to add the capability to map a captured image to a new image based on a set of coordinates that our tools provide. This process may seem daunting, but it is simply an example of texture mapping, which is a routine process in any application that presents images to a user.
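In practice, that mapping step amounts to a per-pixel lookup: each pixel of the warped output image fetches its colour from a source coordinate supplied by the calibration data. Here is a minimal NumPy sketch, assuming the coordinate maps are provided as integer pixel indices; the actual format and interpolation used by our tools may differ, and in a real application this lookup runs on the GPU as ordinary texture mapping.

```python
import numpy as np

def warp_image(rendered, map_x, map_y):
    """Warp a rendered frame using precomputed coordinate maps.

    rendered: H x W x 3 source image captured from a rendering camera.
    map_x, map_y: H_out x W_out integer arrays; output pixel (i, j) takes
    its colour from rendered[map_y[i, j], map_x[i, j]].
    Simplified sketch -- a GPU implementation would interpolate between
    source pixels instead of using nearest-neighbour lookups.
    """
    return rendered[map_y, map_x]
```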

Multiple Camera Rendering

Station IX immerses users in visuals of up to 360°. This means the single-camera rendering commonly used in most software applications cannot produce the necessary fields of view. Instead, multiple cameras, typically seven, are used to render visuals in Station IX.

In addition, the field of view of each camera needs to be set based on the number and arrangement of the cameras. If this is not done correctly, the visuals in Station IX will look stretched, flat, and lacking proper 3D depth.

This might sound like a lot of work, but there’s good news: achieving multiple-camera rendering with the correct fields of view is no more difficult than adapting a software application to work with a virtual reality (VR) headset. So, if you have content that works with VR already, you have essentially done all the work to get it ready for Station IX! The only work left is to add a few more cameras and set their fields of view to match our immersive environment. After that, your content is ready to be displayed using Reflected Reality™.
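To make the camera arrangement concrete, here is a small illustrative sketch of how seven cameras could be laid out to cover Station IX’s 280° horizontal field of view. The exact camera count, angles, and any overlap for a given integration come from Imagine 4D’s tools, so the numbers below are assumptions for illustration only.

```python
def camera_layout(total_fov_deg=280.0, num_cameras=7):
    """Return (yaw, horizontal FOV) pairs for each rendering camera.

    Splits the total field of view into equal adjacent slices, with yaw 0
    pointing straight ahead. Illustrative only -- real setups may overlap
    cameras slightly and must also match the vertical FOV to the screen.
    """
    slice_fov = total_fov_deg / num_cameras          # 40 degrees per camera here
    first_yaw = -total_fov_deg / 2 + slice_fov / 2
    return [(first_yaw + i * slice_fov, slice_fov) for i in range(num_cameras)]

for yaw, fov in camera_layout():
    print(f"camera yaw {yaw:+6.1f} deg, horizontal FOV {fov:.1f} deg")
```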

Parallel Computing

While not strictly required, we highly recommend taking advantage of parallel computing to render visuals in Station IX. Parallel computing is a type of computing architecture in which several computers execute an application simultaneously. This divides the workload across multiple computers, which allows larger and more intricate visuals to be displayed while keeping the application running smoothly.

Station IX displays extremely high-resolution visuals over a field of view of 280°, so a lot of processing power is needed to take full advantage of its capabilities. The best performance is achieved when multiple computers are used in tandem to render the different parts of a 3D scene. Unity and Unreal Engine, for example, have built-in capabilities for parallel computing, so software running on these game engines can readily be adjusted to add parallel computing capability.
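As a simplified illustration of the principle (not Unity’s or Unreal Engine’s actual clustering features), the sketch below spreads the per-frame work for the different cameras across worker processes on a single machine. In Station IX the same idea is applied across several computers, each rendering the views for its own projectors.

```python
from multiprocessing import Pool

def render_camera_view(camera_index):
    """Stand-in for the expensive per-camera render of the current frame."""
    # A real renderer would draw the 3D scene from this camera's viewpoint.
    return f"frame data for camera {camera_index}"

if __name__ == "__main__":
    # Render the seven camera views concurrently instead of one after another.
    with Pool(processes=7) as pool:
        frames = pool.map(render_camera_view, range(7))
    print(f"rendered {len(frames)} views in parallel")
```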

How can I integrate my own content in Station IX?

To help our customers integrate their content, we have developed software development kits (SDKs) with plug-and-play support. These make it easy to properly adjust the content’s camera settings to match the fields of view required within Station IX, and they also help set up parallel computing capability.

Once your rendering cameras have been set up and your computers are running an application in parallel, high-resolution 3D content is ready to be displayed inside Station IX. The combination of the large curved mirrors and projectors creates an accurate sense of depth and 3D. Station IX surrounds users in their content and provides the impression that they are looking through a window into another world. What our customers appreciate most is that they do not need VR goggles or HMDs to look into this new 3D world.

Content formats

We can work with a variety of content formats by adjusting our approach to rendering accordingly.

[Image: How to display content inside Station IX]

  1. 3D content is created and taken from data sources such as CAD files, BIM visualizations, CGI content, 3D videos, satellite imagery, and live camera footage.
  2. The content is then converted and optimized for Station IX using SDKs for image generators/game engines such as Unity and Unreal Engine.
  3. 3D content is now ready to be displayed inside of Station IX with near-retinal image quality, which creates an immersive environment.

All content created for Station IX is interactive, meaning users can manipulate the 3D data using a variety of tools, such as a keyboard, touchscreen, 3D mouse, or haptic gloves. This content includes:

  • Flyovers
  • Walkthroughs
  • Links to schematics
  • Training materials
  • Technical specifications
  • Videos

Why can’t files be plugged directly from a data source into Station IX?

CAD files, for example, do not allow us to set the rendering camera properties to what Station IX needs, nor do they allow for the large number of rendering cameras required to portray a 3D scene. We can, however, export the 3D models from CAD into a number of other programs we have already integrated into Station IX in order to view them in 3D.

Can I display recorded video and live camera footage in Station IX?

It is possible to display high-resolution video and live camera footage in Station IX, and there are a few ways that Imagine 4D can help customers with this content.

Showing video that appears 2D in Station IX is quite simple (think of a movie screen playing your video inside a virtual environment that can be moved around by the user).

Showing recorded or streamed video that appears 3D inside Station IX is slightly more difficult. The easiest way to achieve this would be to record a 360° video. Imagine 4D has created software that can play the correct portions of 360° videos so that they look 3D inside Station IX.
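One simplified way to think about playing the correct portion of a 360° video is to select, for each viewing direction, the slice of the equirectangular frame that covers it. The sketch below crops by yaw only and skips the full spherical re-projection that a production player, such as the one Imagine 4D has developed, would perform, so treat it as a rough illustration of the idea.

```python
import numpy as np

def crop_yaw_window(equirect_frame, center_yaw_deg, fov_deg):
    """Extract the horizontal slice of an equirectangular frame covering
    [center_yaw - fov/2, center_yaw + fov/2] degrees.

    equirect_frame: H x W x 3 array spanning 360 degrees horizontally,
    with yaw -180 degrees mapped to column 0.
    Rough illustration only -- a real player re-projects the sphere onto
    each camera's view instead of taking a flat crop.
    """
    h, w = equirect_frame.shape[:2]
    px_per_deg = w / 360.0
    left = int((center_yaw_deg - fov_deg / 2 + 180.0) * px_per_deg) % w
    width = int(fov_deg * px_per_deg)
    cols = np.arange(left, left + width) % w   # wrap around the 360-degree seam
    return equirect_frame[:, cols]
```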

Otherwise, a 3D perspective can also be achieved by recording the proper fields of view with your own camera(s). Although it is difficult to control these fields of view during filming, we can later edit or crop the video to meet the requirements of our display system.

One important note is that the human eye can see about 60 pixels per degree, and Station IX can display up to 50 pixels per degree. A 4K 360° video, however, only contains about 10 pixels per degree. So an HD or 4K video that looks phenomenal on your TV or PC monitor will appear very low resolution when properly set up in Station IX. The higher the resolution, the better the final product will look, so it is best to record or use 360° videos shot in 8K resolution or higher to take advantage of the full resolution of Station IX.
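These numbers follow from simple arithmetic: a 360° video spreads its horizontal pixels across the full 360°, so its angular resolution is roughly the horizontal pixel count divided by 360. A quick back-of-the-envelope check, using nominal horizontal resolutions, shows why we recommend 8K or higher:

```python
def pixels_per_degree(horizontal_px, fov_deg=360.0):
    """Approximate angular resolution of a panoramic video."""
    return horizontal_px / fov_deg

# Nominal horizontal resolutions for common panoramic video formats.
for label, width in [("4K (3840 px)", 3840), ("8K (7680 px)", 7680), ("16K (15360 px)", 15360)]:
    print(f"{label}: about {pixels_per_degree(width):.0f} pixels per degree across a 360° video")
```

Even 8K footage lands well below Station IX’s 50 pixels per degree, which is why the higher the source resolution, the better the final result will look.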

Do I have to create my own 3D content?

No, and our customers typically do not. We usually suggest one of three options:

  1. Let our team integrate your existing content into Station IX. We can also provide you with the SDKs you need to readily integrate your content on your own.
  2. Let us put you in touch with one of our strategic partners who create 3D content for your industry and already have experience providing content for Station IX.
  3. Let us create content for you using our own camera recording devices, or provide you with content we have already developed ourselves.

[Image: A view inside Station IX]

 

If you have 3D content that you would like to see inside of Station IX, or if you have questions, please contact us here.

For more general information regarding Station IX, please download our brochure.

 

 
