VR Toolkit

Software and hardware made/used by the group

Equipment available


On the hardware side, we are focusing on VR headsets, which offer excellent immersion at a reasonable cost. We currently have available an HTC Vive and an Oculus Quest.

The Vive is tethered to (and driven by) a VR-capable computer. The Quest (currently in test) is a standalone system.

On the software side, we are developing our prototypes using the Unity 3D game engine. Rather than extending existing astronomy software to support advanced displays, we have taken the approach of implementing scientific visualization in software that already handles technical aspects such as stereoscopy and user tracking.


Prototypes produced


SN2SNR

This demo focuses on the science of supernovae (SNe) and supernova remnants (SNRs). It was developed primarily to illustrate the results of a research project, and has been polished to be public-friendly and easy to navigate. The experience contains two parts. The SNR part shows the time evolution of the remnant from two initial conditions: it is a fully interactive 3D movie. The SN part shows the abundances of different elements in the supernova as iso-contours, letting you select both the visible elements and their colors. This demo uses predefined content and rendering: while the user can view the data from any angle and fully interact with it, the options for selecting which data is visible, and how it is displayed, are more limited. (This aspect is addressed by the Cube2 app described below.)
Screenshots of the SN2SNR VR experience.
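The element-selection idea above can be illustrated with a minimal sketch. This is not the SN2SNR code (which runs in Unity); the function and data names below are hypothetical, and a simple per-voxel selection stands in for a real iso-contour extraction:

```python
# Minimal sketch (hypothetical names): pick out the voxels of an element's
# abundance grid that lie above a chosen iso-level, tagging each element
# with its user-chosen display color.

def iso_select(grid, level):
    """Return (x, y, z) indices of voxels whose value exceeds `level`."""
    return [(x, y, z)
            for x, plane in enumerate(grid)
            for y, row in enumerate(plane)
            for z, value in enumerate(row)
            if value > level]

# Toy 2x2x2 abundance grids for two elements, plus user-chosen colors
abundances = {
    "Fe": [[[0.9, 0.1], [0.0, 0.0]], [[0.0, 0.0], [0.0, 0.8]]],
    "Si": [[[0.0, 0.7], [0.0, 0.0]], [[0.0, 0.0], [0.0, 0.0]]],
}
colors = {"Fe": "red", "Si": "blue"}

visible = {elem: iso_select(grid, level=0.5) for elem, grid in abundances.items()}
print(visible["Fe"])            # [(0, 0, 0), (1, 1, 1)]
print(colors["Si"], visible["Si"])  # blue [(0, 0, 1)]
```

In the actual app the selected voxels would be turned into a rendered surface rather than printed, but the user-facing controls map onto the same two parameters: which elements are visible and at what level.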

Cube2

This is the evolution of the original software developed by Gilles for 3D visualization of radio data cubes in VR. It is oriented toward researchers who want to inspect a scalar field defined densely in 3D space (either actual X/Y/Z space, or a configuration space with different dimensions). It allows the user to load any rectangular prism of data sampled on a Cartesian grid. Using an in-world UI based around physical gestures, users can slice and threshold the displayed data, colorize it, and set other rendering options.
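The slice and threshold operations can be sketched outside of Unity with a few lines of NumPy. This is an illustrative stand-in, not Cube2's actual implementation; the function name and defaults are assumptions:

```python
import numpy as np

def slice_and_threshold(cube, axis=2, index=None, vmin=None, vmax=None):
    """Extract a 2D slice from a 3D data cube and mask values outside
    a display threshold, returning a masked array ready for colorizing."""
    if index is None:
        index = cube.shape[axis] // 2  # default: middle slice
    plane = np.take(cube, index, axis=axis)
    mask = np.zeros_like(plane, dtype=bool)
    if vmin is not None:
        mask |= plane < vmin
    if vmax is not None:
        mask |= plane > vmax
    return np.ma.masked_array(plane, mask=mask)

# Example: a toy 8x8x8 cube with a single bright voxel at the center
cube = np.zeros((8, 8, 8))
cube[4, 4, 4] = 1.0
sl = slice_and_threshold(cube, axis=2, index=4, vmin=0.5)
print(sl.count())  # voxels in the slice surviving the threshold: 1
```

In VR the same two parameters (slice position and threshold range) are driven by controller gestures instead of function arguments, and the surviving values are mapped through a color table.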

This product has been tested with both simulation data (SNRs, large-scale-structure simulations) and observational data (galaxies in HI, stellar outflows in HI, dark matter distribution in the nearby universe).
Screenshot of the Cube2 GUI and a radio observation of a galaxy. Screenshot of the Cube2 GUI and an SNR simulation. Dark matter distribution viewed within the Cube2 program.

(PointData)

This software is intended for examining catalogue data with three or more attributes: for example, the X/Y/Z positions of stars in a cluster, or the Epeak/Lpeak/Liso values of gamma-ray bursts (GRBs). It was originally developed to look at GRB correlations for our colleague Maria Dainotti. The data points are displayed with different shapes and colors, adding further dimensions in which information, such as the GRB type, can be encoded.
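The attribute-to-visual mapping can be sketched as follows. The names and the style table are hypothetical, invented for illustration rather than taken from the app:

```python
# Illustrative sketch (names are assumptions, not the app's actual API):
# map catalogue rows with three numeric attributes plus a class label to
# plottable points, encoding the label as marker shape and color.

STYLE = {  # hypothetical coding of GRB type -> (shape, color)
    "long":  ("sphere", "orange"),
    "short": ("cube", "cyan"),
}

def to_points(catalogue):
    """Turn (Epeak, Lpeak, Liso, type) rows into styled 3D points."""
    points = []
    for epeak, lpeak, liso, kind in catalogue:
        shape, color = STYLE[kind]
        points.append({"pos": (epeak, lpeak, liso),
                       "shape": shape, "color": color})
    return points

catalogue = [(200.0, 1e52, 1e53, "long"), (600.0, 1e51, 1e52, "short")]
pts = to_points(catalogue)
print(pts[0]["shape"], pts[1]["color"])  # sphere cyan
```

Three attributes become the point's position; any further attributes are spent on shape, color, or size, which is how the extra dimensions mentioned above get encoded.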

This prototype is still largely unpolished and unfinished. It could be revisited and made into a more versatile and functional app.
A visiting researcher to the Astrophysical Big Bang Laboratory (ABBL), who is working on gamma-ray bursts, sees her main research finding (a fundamental plane in parameter space) for the first time in real 3D.

Sketchfab

Some of the 3D models from the SN2SNR research project have been uploaded to a gallery on Sketchfab, an online platform for sharing and discovering (mesh-based) 3D models. Sketchfab offers a VR mode, on mobile (e.g. with a Google Cardboard headset) or on a desktop PC (via WebVR). Although user motion and interaction are much more restricted than in a native VR app, these models are easy to distribute and make for a good first 3D experience.

Techniques


Visualization methods

We are looking forward to adding performance optimizations, allowing us either to load more complex data or to run our apps on more lightweight hardware.

As of now we have not looked into techniques appropriate for higher-rank data, e.g. vector fields or tensor fields.

User interactions

The data being shown is fixed ahead of time, but custom user interactions must be implemented in Unity (via C# scripts) so that the user can manipulate the 3D model and/or control its rendering. VR headsets rely chiefly on a pair of hand-held controllers, which are represented in the same space as the user and the data. A natural UI is therefore important to make people actually want to use the new system.

Immersive 3D brings its own set of opportunities and challenges, which we will keep exploring with the help of computer scientists. Our current prototypes are appropriate for the phase of data exploration. Lines of future investigation include allowing the user to easily perform quantitative analysis on the data.