Blog: Controller-Based Gesture Interactions in Virtual Reality


Alexandros Doumanoglou, Antonis Karakottas, Georgios Papadopoulos, Dimitrios Zarpalas, CERTH

For decades, the typical interface for human-computer interaction has been the combination of standard 2D displays for computer visualizations and keyboard/mouse devices for user input. The emergence of virtual reality (VR) technology is creating exciting new ways for humans to immerse themselves inside virtual environments. In modern VR implementations, head-mounted displays (HMDs) are the primary medium for computer visualizations. These helmet-like devices allow full six-degrees-of-freedom (6-DOF) navigation inside virtual environments by translating the user's head movements in the physical world into equivalent ones in the virtual environment. While VR headsets are the key enablers for inspecting the virtual world from any viewpoint in an immersive manner, typical keyboard/mouse interfaces are not convenient for that purpose. This is mainly because the HMD blocks the user's view of the physical devices, and partly because translating 2D mouse movements into 3D pointing actions feels unnatural.

As of today, hand controllers are the most common interface for human interaction in VR environments. Hand controllers combine gyroscopes, accelerometers, and tracking technologies to accurately estimate their position and orientation in the physical world. These estimates are then translated into pointing directions in the virtual world, enabling natural point-and-click interactions via the controller buttons. Beyond point-and-click, other innovative and interesting ways of interacting can be achieved by analyzing the controllers' motion in combination with button clicks. While this is the standard way to achieve engaging gameplay in VR games, this type of interaction has not yet been widely adopted in other contexts. There, controller motion gestures could provide an alternative way to perform actions inside the VR environment, bypassing the standard point-and-click pattern of stepping through menu choices before enabling a functionality, much as keyboard hotkeys serve as shortcuts for a series of mouse clicks in traditional 2D display setups.
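To make the point-and-click mechanism concrete, the following Python sketch shows how a tracked controller pose (a 3D position plus an orientation quaternion) can be turned into a pointing ray and intersected with a flat virtual panel. This is a minimal illustration of the general technique, not the actual code of any particular VR runtime; the coordinate conventions and variable names are assumptions.

```python
import numpy as np

def quat_to_forward(q):
    """Rotate the local forward axis (0, 0, -1) by a unit quaternion (w, x, y, z)."""
    w, x, y, z = q
    # Equal to R @ (0, 0, -1), where R is the quaternion's rotation matrix.
    return np.array([
        -2.0 * (x * z + w * y),
        -2.0 * (y * z - w * x),
        -(1.0 - 2.0 * (x * x + y * y)),
    ])

def ray_plane_hit(origin, direction, plane_point, plane_normal):
    """Intersect the pointing ray with a flat UI panel; None if pointing away."""
    denom = np.dot(direction, plane_normal)
    if abs(denom) < 1e-6:
        return None  # ray is parallel to the panel
    t = np.dot(plane_point - origin, plane_normal) / denom
    return origin + t * direction if t > 0 else None

# Hypothetical example: a controller held at 1.2 m, aimed at a panel 1 m ahead.
controller_pos = np.array([0.0, 1.2, 0.0])       # meters, tracking space
controller_rot = np.array([1.0, 0.0, 0.0, 0.0])  # identity quaternion (w, x, y, z)
hit = ray_plane_hit(controller_pos,
                    quat_to_forward(controller_rot),
                    plane_point=np.array([0.0, 1.2, -1.0]),
                    plane_normal=np.array([0.0, 0.0, 1.0]))
print(hit)  # -> [ 0.   1.2 -1. ], the point the user is aiming at
```

A real system would run this every frame and trigger the hovered UI element when a controller button is pressed; the same ray can also drive a visible laser-pointer beam as feedback.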

In INFINITY’s first pilot, we developed a Proof-of-Concept (PoC) office virtual environment that allows users to interact with the environment’s virtual elements using a combination of point-and-click and controller motion gestures. Opening action menus, confirming choices in dialog boxes, or starting and stopping videos in a video player can all be performed via dynamic controller motion gestures. A video demonstrating this PoC is provided below. During the first pilot, assessment was performed using questionnaires, with end users conveying a positive impression of the overall experience. For the next pilot we aim to build upon their feedback and comments and fully integrate the gesture interaction system with INFINITY’s I3CE environment.

Video: https://youtu.be/l4eCGz13iEI
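For readers curious how dynamic motion gestures like those in the PoC can be recognized, the sketch below shows one common approach: template matching with dynamic time warping (DTW) over the controller trajectory recorded while a button is held. It is an illustrative assumption, not the PoC's actual classifier; the gesture templates, normalization, and threshold are all hypothetical.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic-time-warping distance between two 3D trajectories (N x 3, M x 3)."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m] / (n + m)  # length-normalized

def normalize(traj):
    """Make the trajectory translation- and scale-invariant."""
    traj = traj - traj.mean(axis=0)
    scale = np.abs(traj).max()
    return traj / scale if scale > 0 else traj

def classify(trajectory, templates, threshold=0.25):
    """Return the best-matching gesture label, or None if nothing is close enough."""
    trajectory = normalize(trajectory)
    best_label, best_dist = None, threshold
    for label, template in templates.items():
        d = dtw_distance(trajectory, normalize(template))
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label

# Hypothetical templates, e.g. recorded while the controller trigger was held.
t = np.linspace(0.0, 2.0 * np.pi, 32)
templates = {
    "circle": np.stack([np.cos(t), np.sin(t), np.zeros_like(t)], axis=1),
    "swipe_right": np.stack([t / t.max(), np.zeros_like(t), np.zeros_like(t)], axis=1),
}
# A user's circle drawn at a different size still matches after normalization.
sample = np.stack([np.cos(t) * 1.4, np.sin(t) * 1.4, np.zeros_like(t)], axis=1)
print(classify(sample, templates))  # -> "circle"
```

DTW tolerates variations in gesture speed, while the normalization step tolerates where and how large the gesture is drawn, which matters because no two users perform a motion gesture identically.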
