M. Handosa, H. Schulze, D. Gračanin, M. Tucker, M. Manuel (2018)

Extending Embodied Interactions in Mixed Reality Environments

Published in:

Virtual, Augmented and Mixed Reality: Interaction, Navigation, Visualization, Embodiment, and Simulation


Recent advances in mixed reality (MR) technologies provide a great opportunity to deploy and use MR applications for training and education. Users can interact with virtual objects, which helps them stay more engaged and acquire more information than with traditional approaches. MR devices such as the Microsoft HoloLens use spatial mapping to place virtual objects in the surrounding space and support embodied interaction with those objects. However, some applications require a range of embodied interactions beyond the capabilities of the MR device alone, for instance interacting with virtual objects using the arms, legs, and whole body, much as we interact with physical objects. We describe an approach that extends the functionality of the Microsoft HoloLens to support such an extended range of embodied interactions in an MR space by using the Microsoft Kinect V2 sensor device. Based on this approach, we developed a system that maps the skeletal data captured by the Kinect device into the coordinate system of the HoloLens device. We measured the overall delay of the developed system to evaluate its effect on application responsiveness. The system is currently being used to develop a HoloLens application for nurse aide certification in the Commonwealth of Virginia.
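The core technical step the abstract describes, mapping Kinect skeletal data into the HoloLens coordinate system, can be sketched as a rigid transform per joint. The following is a minimal illustration, not the authors' implementation: the rotation `R`, translation `t`, and the function name are hypothetical, and the calibration values are placeholders standing in for whatever device alignment the actual system performs. One common detail is that the Kinect reports positions in a right-handed frame while Unity/HoloLens uses a left-handed one, so a handedness flip precedes the transform.

```python
import numpy as np

# Hypothetical calibration: rotation R and translation t taking points from
# the Kinect camera frame to the HoloLens world frame (e.g. obtained by
# aligning a landmark visible to both devices). Values are illustrative only.
R = np.eye(3)                      # assumed: frames share orientation
t = np.array([0.5, 0.0, 1.2])      # assumed offset between devices, metres

def kinect_to_hololens(joint_kinect):
    """Map one skeletal joint position from Kinect to HoloLens coordinates.

    The z axis is negated first to convert the Kinect's right-handed frame
    to the left-handed frame used by Unity/HoloLens, then the rigid
    transform is applied.
    """
    p = np.asarray(joint_kinect, dtype=float).copy()
    p[2] = -p[2]           # handedness flip (right-handed -> left-handed)
    return R @ p + t       # rigid transform into the HoloLens world frame

# Example: a joint one metre in front of the Kinect sensor.
print(kinect_to_hololens([0.0, 0.0, 1.0]))  # -> [0.5 0.  0.2]
```

In a full pipeline this mapping would run for every joint of every tracked body at each Kinect frame, with the transformed positions streamed to the HoloLens application.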