The XR (eXtended Reality) Lab provides facilities related to Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) to researchers at the Faculty of Arts.
In this 36 m² experiment room, trackers, controllers and a headset can be tracked, allowing participants to interact with objects in physical space that are also represented in the virtual world.
Because of its external tracking, the Varjo Aero headset can only be used in the XR Lab's experiment room. This headset has superior visual quality and a 3.5 mm audio jack for an external microphone and headphones. The headset has built-in eye tracking; hand tracking is provided by a Leap Motion 2 controller mounted to the headset.
Because of its internal tracking, the HP Reverb G2 Omnicept can be used with a powerful gaming laptop to test participants outside the XR Lab. This headset has a built-in microphone, headphones, eye tracking and a heart rate monitor. Hands can be visualized using the controllers.
Body movements and gestures can be recorded using the Rokoko Smartsuit Pro II, Smart Gloves and Coil Pro. Facial expressions can be recorded using an iPhone 12 Pro's TrueDepth camera.
The Lenovo Tab P11 can be used for augmented reality applications.
The Insta360 X3 can be used to record 360-degree videos. These videos can also be viewed in VR headsets.
The Leica FlexLine TS10 Total Station and two Garmin eTrex 22x GPS handhelds can be used to map outdoor environments (such as archaeological sites).
The following videos provide an overview of several facilities.
This video provides an example of the VR environments we can design to make experiments more immersive.
This video demonstrates how eye tracking can be applied within a VR environment. The red cursor indicates where the VR user is looking. To measure how often a user looks at the avatar’s face, we can track how frequently the cursor touches the red sphere surrounding the avatar’s head. In an actual experiment, both the cursor and the red sphere are invisible.
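The "how often does the user look at the face" measure described above amounts to counting, per gaze sample, whether the gaze ray intersects the invisible sphere around the avatar's head. As a minimal illustration (not the lab's actual implementation, which runs inside the VR engine), a ray–sphere intersection test could look like this; all names and coordinates here are hypothetical:

```python
import math

def gaze_hits_sphere(origin, direction, center, radius):
    """Return True if a gaze ray intersects a sphere.

    origin:    (x, y, z) position of the viewer's eye.
    direction: (x, y, z) gaze direction (need not be normalized).
    center:    (x, y, z) center of the sphere around the avatar's head.
    radius:    sphere radius, in the same units as the coordinates.
    """
    # Vector from the ray origin to the sphere center.
    oc = tuple(c - o for c, o in zip(center, origin))
    # Normalize the gaze direction.
    norm = math.sqrt(sum(d * d for d in direction))
    d = tuple(x / norm for x in direction)
    # Distance along the ray to the point closest to the sphere center.
    t = sum(a * b for a, b in zip(oc, d))
    if t < 0:
        return False  # The sphere is behind the viewer.
    # Closest point on the ray, and its squared distance to the center.
    closest = tuple(o + t * x for o, x in zip(origin, d))
    dist_sq = sum((c - p) ** 2 for c, p in zip(center, closest))
    return dist_sq <= radius * radius

# Hypothetical example: count gaze samples that land on the face sphere.
head_center, head_radius = (0.0, 0.1, 2.0), 0.25
samples = [((0, 0, 0), (0, 0, 1)),   # looking straight at the avatar
           ((0, 0, 0), (1, 0, 0))]   # looking off to the side
face_looks = sum(gaze_hits_sphere(o, d, head_center, head_radius)
                 for o, d in samples)
print(face_looks)  # prints 1
```

Dividing such a count by the total number of gaze samples gives the proportion of time spent looking at the avatar's face.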
This video shows the result of a face capture session, in which we recorded an actor’s facial expressions and lip movements. These recordings were then converted into animations of an avatar’s face, enabling highly realistic lip synchronisation with the audio from the recording session.
This video shows an intermediate result from a motion capture session, in which we recorded an actor’s body movements. These recordings have been visualised using a simplified avatar.
We are always looking for participants. Sign up as a participant in the Radboud SONA system to take part in our lab experiments.
All research projects carried out in the CLS Experiment Lab, the Video Lab and the XR Lab must be assessed by the Ethics Assessment Committee Humanities (EACH).
Erasmus building, basement level
Erasmusplein 1
6525 HT Nijmegen
The XR Lab also provides technical support for developing VR and AR applications. For more information, please send an email to xrlab-let@ru.nl.