Overview
The iSpaceLab at Simon Fraser University’s School of Interactive Arts and Technology (SIAT) conducts innovative research at the intersection of Virtual Reality/XR, Psychology/Cognitive Science, Informatics, Human Factors/HCI, and Art/Design. Our interdisciplinary team investigates the underlying perceptual, cognitive, and social processes that govern human interaction, moving far beyond the traditional desktop metaphor.
Our vision is to understand and design profoundly new ways for people to interact with and benefit from technology. We question how human perception and behaviour translate to virtual environments and explore how we can “intelligently cheat” perception to create compelling and effective experiences. This involves tackling fundamental challenges like cybersickness and developing more effective, embodied locomotion interfaces (including hands-free, leaning-based, and BCI) that support effortless spatial orientation.
Ultimately, our goal is to leverage these insights to design transformative technologies that have a positive impact. This could include fostering not just improved user experience and usability, but also enhanced well-being, empathy, awe, and social connection.
iSpace Research Agenda & Vision in a Nutshell
Virtual reality software and hardware are becoming increasingly affordable and powerful, and are increasingly being used in experimental research. In fact, the possibility of conducting tightly controlled and repeatable experiments with naturalistic multi-modal stimuli in a closed action-perception loop suggests that VR could become an increasingly powerful yet flexible research tool.
Despite increasing computational power and rendering quality, however, it is debatable whether humans necessarily perceive, feel, think, and behave similarly in real and virtual environments – which is essential for achieving sufficient real-world transfer of experimental results gained in the lab and for providing compelling experiences. What might be missing? What can we learn from this? How can we use this basic information to improve both technology and user experience?
How might we be able to “cheat intelligently” in VR and, for example, provide users with a compelling sensation of being in (“presence”) and moving through (“vection”) simulated environments without the need for full physical locomotion or large, costly motion simulators? Why is it so hard to control all four degrees of freedom when flying a quadcopter drone or flying through VR with a standard gamepad or RC controller? And how could we use our knowledge of human embodied perception to design more effective yet affordable interfaces for both 2D (ground-based) and 3D (flying) locomotion, for both telepresence and immersive VR? Can the mere illusion of self-motion (“vection”) provide benefits similar to those of actual locomotion? That is, what is the functional significance of vection? How far can we get with visual cues alone? What benefits do we gain from multi-modal stimuli?
And last but not least, how could we leverage the rapidly increasing power of emerging technologies such as VR/XR and AI to make a positive contribution to our society and planet?
[Note that there are many other areas we work in and are interested in expanding into. See also our additional research topics and interests and our iSpace YouTube Playlist.]
VR4Good
Please see my TEDxEastVan talk as well as the diverse research topics and interests and our Contact & Join Us pages for an overview of the different projects and our vision of how we could use the increasing potential of immersive VR (combined with other technologies and approaches, including biosensing, AI, and XR) to create meaningful positive experiences.
Could Virtual Reality make us more human? | Bernhard Riecke | TEDxEastVan
See also our iSpace YouTube Playlist.
Below is an older short intro video of the iSpace Lab and a few graphics explaining our research agenda and vision (note that the video is from 2011; since then we have added many additional research topics and interests). Enjoy!