iSpace Research Agenda & Vision in a Nutshell

Virtual reality software and hardware are becoming increasingly affordable and powerful, and are increasingly being used in experimental research. In fact, the possibility of conducting tightly controlled and repeatable experiments with naturalistic multi-modal stimuli in a closed action-perception loop suggests that VR could become an increasingly powerful yet flexible research tool.

Despite increasing computational power and rendering quality, though, it is debatable whether humans necessarily perceive, feel, think, and behave similarly in real and virtual environments – which is essential for achieving sufficient real-world transfer of experimental results gained in the lab, and for providing compelling experiences. What might be missing? What can we learn from this? How can we use this basic information to improve both technology and user experience?

How might we be able to “cheat intelligently” in VR and, e.g., provide users with a compelling sensation of being in (“presence”) and moving through (“vection”) the simulated environments without the need for full physical locomotion or large, costly motion simulators? Why is it so hard to control all 4 degrees of freedom when flying a quadcopter drone or through VR with a standard gamepad or RC controller? And how could we use our knowledge about human embodied perception to design more effective yet affordable locomotion interfaces for both 2D (ground-based) and 3D (flying) travel, for both telepresence and immersive VR? Can the mere illusion of self-motion (“vection”) be sufficient to provide similar benefits as actual locomotion? That is, what is the functional significance of vection? How far can we get with just visual cues? What benefits do we gain from multi-modal stimuli?
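To make the 4-degrees-of-freedom challenge concrete, consider the standard “Mode 2” layout used by most RC transmitters and gamepad drone controls: each thumbstick axis drives a different flight degree of freedom, so the pilot must coordinate all four continuous axes at once. A minimal sketch of that mapping (function and key names here are illustrative, not tied to any particular SDK):

```python
def mode2_mapping(left_x, left_y, right_x, right_y):
    """Map two thumbsticks (each axis normalized to [-1, 1]) to the four
    flight degrees of freedom of a conventional 'Mode 2' quadcopter layout.

    The mapping itself is trivial; the difficulty lies with the user, who
    must mentally integrate four independent continuous axes to fly a
    smooth trajectory.
    """
    return {
        "yaw":      left_x,   # left stick left/right  -> rotate left/right
        "throttle": left_y,   # left stick up/down     -> climb/descend
        "roll":     right_x,  # right stick left/right -> bank/strafe
        "pitch":    right_y,  # right stick up/down    -> forward/backward
    }
```

Note that nothing in this mapping resembles how we steer our own bodies, which is one reason such controllers impose high cognitive load compared to embodied interfaces.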

And last but not least, how could we leverage the rapidly increasing power of emerging technologies such as VR/XR and AI to make a positive contribution to our society and planet?

Summary of iSpace Main Research Program(s)

[Note that there are many other areas that we work in and are interested in expanding into. See also additional research topics and interests and our iSpace YouTube Playlist]


Please see my TEDxEastVan talk and the diverse research topics and interests for an overview of the different projects and our vision of how we could use the increasing potential of immersive VR (combined with other technologies and approaches, including biosensing) to create meaningful positive experiences.

Could Virtual Reality make us more human? | Bernhard Riecke | TEDxEastVan

Hands-Free 2D & 3D Locomotion Interfaces for Virtual Reality and Telepresence

Since ancient times, humans have dreamt of flying. Yet, despite technological advances, only a small fraction of humans (such as aircraft pilots) have the opportunity to fly themselves. Even for those lucky few, however, flight is mediated through an aircraft and its controls.

In this research program, I will investigate how we could use the power and increasing affordability of immersive virtual reality (VR) to provide a much wider audience with a truly embodied and compelling first-person locomotion experience, such as flying. Just like a bird or in dreams, VR offers the potential for humans to fly (or drive) unencumbered by any hand-held controllers. However, with currently prevailing hand-controller-based VR systems, users frequently experience motion sickness (35–95%), disorientation, high cognitive load, and reduced performance compared to natural walking. This critically impairs user experience and commercial success. The goal of my research program is to tackle these critical challenges by designing motion sickness mitigation techniques and improved low-cost flying and driving interfaces.

Specifically, we will expand our recent research on the innovative design, evaluation, and examination of novel and highly efficient “leaning-based” interfaces, where one can simply lean in the direction one wishes to travel (similar to controlling a Segway), without the distraction of gamepads or other controllers. These interfaces will be designed to be accessible and easy to use for flying or driving through virtual environments, while reducing motion sickness and cognitive overload. Further, this program will investigate the most effective adaptation of these hands-free interfaces for use with remote-controlled “telepresence robots”, such as camera-equipped flying quadcopter drones or driving conference robots (e.g., “Skype on wheels”, increasingly used in workplaces). This unprecedented research will provide an opportunity for users to “be” somewhere else (“(tele)presence”) while having their hands free for other tasks. For instance, a user may sit or stand comfortably while wearing a head-mounted display showing an immersive view from the perspective of their flying drone. Simply by leaning, they can fly in any direction they desire, while having their hands free to zoom in and capture a stunning video.
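A common way such leaning-based control can be realized (a minimal sketch under stated assumptions, not the lab’s actual implementation; the deadzone, lean range, and gain values below are purely illustrative) is to read the user’s lean angle from a head or torso tracker, ignore small postural sway via a deadzone, and map the remaining deflection nonlinearly to travel speed, so small leans give fine control and large leans give higher speeds:

```python
def lean_to_velocity(lean_deg, deadzone_deg=2.0, max_lean_deg=20.0,
                     max_speed=5.0, exponent=1.5):
    """Map a signed lean angle (degrees) to a signed travel speed (m/s).

    Leans inside the deadzone are treated as unintentional postural sway
    and produce no motion. Beyond the deadzone, a power-function transfer
    curve gives fine control near zero and faster travel at larger leans.
    All constants are illustrative assumptions, not calibrated values.
    """
    magnitude = abs(lean_deg)
    if magnitude <= deadzone_deg:
        return 0.0
    # Normalize the usable lean range to [0, 1], clamping at the max lean.
    usable = min(magnitude - deadzone_deg, max_lean_deg - deadzone_deg)
    normalized = usable / (max_lean_deg - deadzone_deg)
    speed = max_speed * normalized ** exponent
    # Preserve lean direction (e.g., forward vs. backward).
    return speed if lean_deg >= 0 else -speed
```

The deadzone and nonlinear transfer function matter for comfort: without them, natural body sway would translate into constant small visual motion, which can itself contribute to motion sickness.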

This research can improve numerous applications including architectural planning and walk/flythroughs; virtual tourism; immersive education, training, flight simulation, entertainment, and gaming; and telepresent filming, photography, tourism, conferencing, and search-and-rescue. Improving telepresence can also help reduce the need to travel, benefiting workplace productivity and remote collaboration while reducing environmental footprint. This program builds on more than 20 years of experience in using immersive multi-modal VR to study human perception, cognition, and performance, and applying this knowledge to design novel human-centred VR interfaces.

See also our iSpace YouTube Playlist

Below is a short intro video of the iSpace lab and a few graphics to explain our research agenda and vision (note that the video is from 2011; since then we have added many additional research topics and interests). Enjoy!
