iSpace Research Agenda & Vision in a Nutshell
Virtual reality software and hardware are becoming ever more affordable and powerful, and are increasingly being used in experimental research. In fact, the ability to conduct tightly controlled and repeatable experiments with naturalistic multi-modal stimuli in a closed action-perception loop suggests that VR could become an increasingly powerful yet flexible research tool.
Despite increasing computational power and rendering quality, though, it is debatable whether humans necessarily perceive, feel, think, and behave similarly in real and virtual environments. Yet such similarity is essential for achieving sufficient real-world transfer of experimental results gained in the lab, and for providing compelling experiences. What might be missing? What can we learn from this? How can we use this basic information to improve both technology and user experience?
How might we be able to “cheat intelligently” in VR and, e.g., provide users with a compelling sensation of being in (“presence”) and moving through (“vection”) simulated environments without the need for full physical locomotion or large, costly motion simulators? Why is it so hard to control all 4 degrees of freedom when flying a quadcopter drone or flying through VR with a standard gamepad or RC controller? And how could we use our knowledge about human embodied perception to design more effective yet affordable interfaces for both 2D (ground-based) and 3D (flying) locomotion, for both telepresence and immersive VR? Can the mere illusion of self-motion (“vection”) be sufficient to provide benefits similar to those of actual locomotion? That is, what is the functional significance of vection? How far can we get with visual cues alone? What benefits do we gain from multi-modal stimuli?
And last but not least, how could we leverage the rapidly increasing power of emerging technologies such as VR/XR and AI to make a positive contribution to our society and planet?
Summary of iSpace Main Research Program(s)
[Note that there are many other areas we work in and are interested in expanding into. See also our additional research topics and interests and our iSpace YouTube Playlist.]
VR4Good
Please see my TEDxEastVan talk and our diverse research topics and interests for an overview of the different projects and our vision of how we could use the growing potential of immersive VR (combined with other technologies and approaches, including biosensing) to create meaningful positive experiences.
Could Virtual Reality make us more human? | Bernhard Riecke | TEDxEastVan
Hands-Free 2D & 3D Locomotion Interfaces for Virtual Reality and Telepresence
Since ancient times, humans have dreamt of flying. Yet, despite technological advances, only a small fraction of humans (such as aircraft pilots) have the opportunity to fly themselves. Even for those lucky few, however, flight is mediated through an aircraft and its controls.
In this research program, I will investigate how we could use the power and increasing affordability of immersive virtual reality (VR) to provide a much wider audience with a truly embodied and compelling first-person locomotion experience, such as flying. VR offers the potential for humans to fly (or drive) unencumbered by any hand-held controllers, just as a bird does or as we do in dreams. However, with current hand-controller-based VR systems, users frequently experience motion sickness (35–95%), disorientation, high cognitive load, and reduced performance compared to natural walking. This critically impairs both user experience and commercial success. The goal of my research program is to tackle these critical challenges by designing motion sickness mitigation techniques and improved low-cost flying and driving interfaces.
Specifically, we will expand our recent research on the design, evaluation, and refinement of novel, highly efficient “leaning-based” interfaces, where one can simply lean in the direction one wishes to travel (similar to controlling a Segway), without the distraction of gamepads or other controllers. These interfaces will be designed to be accessible and easy to use for flying or driving through virtual environments, while reducing motion sickness and cognitive overload. Further, this program will investigate how to most effectively adapt these hands-free interfaces for use with remote-controlled “telepresence robots”, such as camera-equipped flying quadcopter drones or driving conference robots (e.g., the “Skype on wheels” robots increasingly used in workplaces). This research will allow users to “be” somewhere else (“(tele)presence”) while keeping their hands free for other tasks. For instance, a user may sit or stand comfortably while wearing a head-mounted display showing an immersive view from the perspective of their flying drone. Simply by leaning, they can fly in any direction they desire, while having their hands free to zoom in and capture a stunning video.
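To illustrate the basic idea behind such a leaning-based control mapping, here is a minimal sketch in Python. It assumes a hypothetical tracker that reports the user's horizontal head offset from a calibrated neutral pose; the dead zone, maximum lean, and power-curve parameters are illustrative assumptions, not the actual values or implementation used in our interfaces.

```python
import math

# Tunable parameters (illustrative values only, not those of the actual iSpace interfaces)
DEAD_ZONE_M = 0.02    # ignore lean offsets below 2 cm so users can stand comfortably
MAX_LEAN_M = 0.20     # lean offset (m) that yields maximum speed
MAX_SPEED_MPS = 5.0   # maximum travel speed (m/s)
EXPONENT = 1.6        # >1: fine control near neutral, fast travel at larger leans

def lean_to_velocity(lean_x: float, lean_y: float) -> tuple[float, float]:
    """Map a horizontal lean offset (m) of the tracked head/torso from its
    calibrated neutral pose to a 2D travel velocity (m/s), Segway-style."""
    magnitude = math.hypot(lean_x, lean_y)
    if magnitude < DEAD_ZONE_M:
        return (0.0, 0.0)  # inside the dead zone: stay put

    # Normalize the lean beyond the dead zone into [0, 1] and apply a power
    # curve so that small leans give slow, precise movement.
    t = min((magnitude - DEAD_ZONE_M) / (MAX_LEAN_M - DEAD_ZONE_M), 1.0)
    speed = MAX_SPEED_MPS * t ** EXPONENT

    # Travel in the direction of the lean.
    return (speed * lean_x / magnitude, speed * lean_y / magnitude)
```

For 3D (flying) interfaces, the same mapping could in principle be extended with a vertical axis (e.g., crouching/stretching or seat tilt) to control altitude. The dead zone lets users shift posture without unintentionally drifting, while the power curve trades precision near the neutral pose for speed at larger leans.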
This research can improve numerous applications, including architectural planning and walk-/flythroughs; virtual tourism; immersive education, training, flight simulation, entertainment, and gaming; and telepresent filming, photography, tourism, conferencing, and search-and-rescue. Improving telepresence can also help reduce the need to travel, benefiting workplace productivity and remote collaboration while reducing environmental footprint. This program builds on more than 20 years of experience in using immersive multi-modal VR to study human perception, cognition, and performance, and in applying this knowledge to design novel human-centred VR interfaces.
See also our iSpace YouTube Playlist
Below is a short intro video of the iSpace lab and a few graphics explaining our research agenda and vision (note that the video is from 2011; we have since added many additional research topics and interests). Enjoy!