Bernhard E. Riecke

Profile

Position:

Professor

Contact:

ber1 at sfu dot ca
http://www.siat.sfu.ca/faculty/Bernhard-Riecke/
www.sfu.ca/siat/people/faculty/bernhard-riecke.html

Affiliations:

Full Professor at SFU-SIAT (School of Interactive Arts and Technology)

Associate Member of SFU Cognitive Science Program

Brief Biography

I am a psychophysicist and cognitive scientist who is excited about studying how humans orient in virtual and real environments. I received my PhD in Physics from the University of Tübingen and spent a decade researching in the Virtual Reality group of the Max Planck Institute for Biological Cybernetics in Germany. After a post-doc in Psychology at Vanderbilt University, I joined the School of Interactive Arts & Technology at Simon Fraser University as an assistant professor in 2008. My research approach combines fundamental scientific research with an applied perspective of improving human-computer interaction.

I combine multi-disciplinary research approaches and immersive virtual environments to investigate what constitutes effective, robust, embodied and intuitive human spatial cognition, orientation and behaviour (and many other things, as you can see on the projects pages). This fundamental knowledge is used to guide the design of novel, more effective human-computer interfaces and interaction paradigms that enable similar processes in computer-mediated environments like virtual reality (VR) and multi-media. These improved interfaces can then enable and inspire further research, both fundamental and applied.


Research Interests

My research interests include:

  • Human multi-modal spatial cognition, spatial orientation, spatial updating, and navigation
  • Enabling robust and effortless spatial orientation in virtual environments
  • Self-motion perception, illusions (“vection”), and simulation; multi-modal contributions and interactions
  • Multi-modal cue integration: experimentation and theoretical modeling
  • Design and iterative evaluation and improvement of perceptually oriented, multi-modal human-computer interfaces and human-centered, effective virtual reality simulations
  • Immersion and presence
  • Multi-modal, interactive art/dance performances
  • and more…

Please see my TEDxEastVan talk and the diverse research topics and interests for an overview of the different projects and our vision of how we could use the increasing potential of immersive VR (combined with other technologies and approaches, including biosensing) to create meaningful positive experiences.

Could Virtual Reality make us more human? | Bernhard Riecke | TEDxEastVan

Below is a short (older) intro video explaining my overall research agenda and vision for the iSpace lab. See the Vision subpage for details.


Interested in Joining the iSpace Team?

I’m currently looking for bright and motivated PhD students to join our iSpace lab. See the Contact & Join Us subpage for details.

Projects

Cybersickness Survey: Key Factors and Prevention/Reduction Strategies

Welcome to our research project on VR-induced motion sickness (aka cybersickness). We are exploring the factors that cause dizziness and nausea while using VR to make virtual reality experiences more comfortable for everyone — join our study today and be a part of research on combating cybersickness! Who Can Participate? We are looking for individuals who: Have over two years of experience...


Cybersickness & Benchmarking Tutorial at ISMAR 2024

Half-day tutorial at ISMAR 2024 on “Cybersickness: Understanding the Challenge and Building Solutions with Standardized Benchmarks”. Join us for an in-depth exploration of cybersickness, a persistent challenge in virtual reality (VR). In this tutorial, we’ll delve into the causes and effects of cybersickness, drawing on the latest research and theories. We’ll...


1st International Workshop on Standardization in Cybersickness Research

1st International Workshop on Standardization in Cybersickness Research: “Establishing Standards for Cybersickness Measurement and Mitigation: A Community-Driven Approach” Date & time: Monday morning 8am-noon, 21 October 2024 at ISMAR 2024 Location: Redmond room, see ISMAR schedule Join remotely using this Zoom link or check on the ISMAR website for updates (contact us if you don't rec...


Pathways to flourishing

Leveraging Virtual Reality for cultivating compassion, resilience, social connectedness, and healthy habits in emerging adults facing chronic health challenges. About half of youths with chronic physical conditions develop anxiety and/or depression, causing significant distress and disruption within their lives over many years. This underscores their need for well-being tools, particularly ones t...


Awedyssey: VR for promoting and enhancing well-being

We are investigating and creating a new virtual reality (VR) experience, ‘Awedyssey’, for the promotion and enhancement of well-being. Today, digital technology pervasively intersects with our daily lives, and VR stands out as a digital tool capable of fostering positive emotions like awe, self-transcendence, and authentic social connection. Connecting with nature is very important for our mental...


VR Sickness Benchmark System

Tackling VR Sickness: A Novel Benchmark System for Assessing Contributing Factors and Mitigation Strategies through Rapid VR Sickness Induction and Recovery Abstract This research introduces a novel VR sickness benchmark system, designed to address the lack of standardized tools for assessing and mitigating VR sickness. It aims to rectify the inconsistencies and limitations prev...


Multisensory Contributions to Illusory Self-Motion (Vection)

Beyond the Eye: Multisensory Contributions to the Sensation of Illusory Self-Motion (Vection) Abstract Vection is typically defined as the embodied illusion of self-motion in the absence of real physical movement through space. Vection can occur in real-life situations (e.g., ‘train illusion’) and in virtual environments and simulators. The vast majority of vection research focuses on vectio...


InExChange

InExChange: Fostering Genuine Social Connection through Embodied Breath Sharing in Mixed Reality InExChange is an interactive mixed reality experience centering around an inflatable vest which conveys a physical sense of shared breathing on the diaphragm between two or more participants. The experience is composed of three acts in which the participants' breaths are transformed into metapho...


Designing with Biosignals Workshop at ACM DIS 2023

Designing with Biosignals: Challenges, Opportunities, and Future Directions for Integrating Physiological Signals in Human-Computer Interaction ABSTRACT Biosensing technologies are a rapidly increasing presence in our daily lives. These sensor-based technologies measure physiological processes including heart rate, breathing, skin conductance, brain activity and more. Researchers are exploring...


ETC - Embodied Telepresent Connection

Embodied Telepresent Connection (ETC): Exploring Virtual Social Touch Through Pseudohaptics ETC (Embodied Telepresent Connection) is an artistic VR project exploring ways of eliciting a feeling of embodied connection telepresently through pseudohaptics. This project emerged during the beginning of COVID-19-related lockdowns when our social interactions began to inhabit nearly exclusively virtual ...


Virtual Earthgazing

During months-long missions, Astronauts experience various aspects of social isolation and confinement. Our main aim is to study the feasibility of a virtual reality sensory stimulation augmentation called Virtual Earthgazing (VE). The VE experience is designed to elicit an “overview effect” of the Earth and induce the feeling of awe, which has been shown to expand time perception, increase e...


Flow

Flow is a prototype of a real-time interactive installation with Brain-Computer Interface (BCI) technology, aimed at evoking social connection through bio-signal sharing. The Muse™ headband streams the user’s EEG (brain wave) and PPG (breathing) data into the Unity 3D engine using LSL (Lab Streaming Layer). The generated visualizations interact subtly with the participant’s breathing pattern...
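
The installation itself runs in Unity; the sketch below only illustrates the LSL side of such a pipeline, in Python. It assumes the Muse data is already published as LSL streams (e.g., via a bridge such as muselsl or BlueMuse); the stream types, parameters, and function names are illustrative assumptions, not the actual Flow code.

```python
# Minimal sketch of consuming Muse-style LSL streams (assumed to be published
# by a bridge such as muselsl or BlueMuse); not the actual Flow implementation.
from pylsl import StreamInlet, resolve_byprop

def open_inlet(stream_type="EEG", timeout=10.0):
    """Find the first LSL stream of the given type and open an inlet to it."""
    streams = resolve_byprop("type", stream_type, timeout=timeout)
    if not streams:
        raise RuntimeError(f"No LSL stream of type {stream_type!r} found")
    return StreamInlet(streams[0])

def read_samples(inlet, n=64):
    """Pull n samples; each entry is (timestamp, one value per channel)."""
    data = []
    for _ in range(n):
        sample, ts = inlet.pull_sample(timeout=1.0)
        if sample is not None:
            data.append((ts, sample))
    return data

if __name__ == "__main__":
    eeg = open_inlet("EEG")   # e.g., Muse channels TP9, AF7, AF8, TP10 (assumed)
    ppg = open_inlet("PPG")   # breathing-related signal, as described above (assumed)
    print(f"EEG samples: {len(read_samples(eeg))}, PPG samples: {len(read_samples(ppg))}")
```

A Unity client would consume the same streams through an LSL plugin following essentially the same resolve-then-pull pattern, feeding the pulled values into the generated visuals.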


Synedelica

Reality, reimagined: Synedelica reimagines what is possible with immersive technology, providing a new perspective on reality. In this synesthetic mixed reality experience, visuals of the real world are modulated by sound, attuning immersants to the beauty hidden in the seemingly mundane. Synedelica shows the world in a new light, rekindling childlike wonder and encouraging exploration. ...


Autoethnographic Close Reading of Self-transcendent VR Experiences

Sipping the Virtual Elixir: An autoethnographic close reading of Ayahuasca Kosmik Journey, a self-transcendent virtual experience. Recently, self-transcendent experiences have been gaining interest in the research community because of their ability to support wellbeing. Experiences of self-transcendence can be transformative, leading to a diminishment of self/ego and the feeling of unity with n...


Novel Cybersickness Measures and Countermeasures BoF

Novel Cybersickness Measures and Countermeasures: Birds of a Feather session at SIGGRAPH 2022. Interested in connecting & joining? If you're interested in connecting with others engaged or interested in novel cybersickness measures and countermeasures, you could join our online interactive Birds of a Feather session at SIGGRAPH 2022, on Fri Aug 05, 10 am-11:30 am. Log in through h...


Telepresence

How can we improve telepresence systems (such as conference robots) so they are not just "zoom on wheels" but actually allow users to feel more present and navigate more easily around remote environments? FeetBack: Augmenting Robotic Telepresence with Haptic Feedback on the Feet. Telepresence robots allow people to participate in remote spaces, yet they can be difficult to manoeuvre with people ...


HyperJump flying to combat motion sickness

HyperJumping in Virtual Vancouver: Combating Motion Sickness by Merging Teleporting and Continuous VR Locomotion in an Embodied Hands-Free VR Flying Paradigm. Motion sickness, unintuitive navigation, and limited agency are critical issues in VR/XR impeding widespread adoption and enjoyable user experiences. To tackle these challenges, we present HyperJump, a novel VR interface merging advantages ...


Design Strategies for Genuine Connection

There is a prominent interest in the potential of technology for mediating social connection, with a wealth of systems designed to foster the feeling of connection between strangers, friends, and family. In this project, we are exploring this design landscape to derive a transitional definition of mediated genuine connection and design strategies employed by artists and designers to support the f...


Concurrent locomotion and interaction in VR

Can more embodied and leaning-based interfaces help support concurrent locomotion and interaction in VR when physical walking isn't feasible? Physical walking is often considered the gold standard for VR travel whenever feasible. However, especially for larger-scale virtual travel the free-space walking areas are typically too small, thus requiring handheld controllers to navigate, which ...


Star-Stuff: a way for the universe to know itself

Available on Oculus AppLab at https://www.oculus.com/experiences/quest/3367089710082568/ Inspired by Carl Sagan, Star-Stuff: a way for the universe to know itself is an immersive experience created to remind immersants of their fundamental connection to humanity and the Universe. This hybrid VR artwork brings two people together remotely or in a co-present installation. In both cases, the ...


SIRIUS - Virtual Earthgazing to mitigate effects of sensory isolation

SIRIUS (Scientific International Research in Unique Terrestrial Station) is a series of on-land isolation experiments modelling long-term spaceflight in order to assess the psychophysiological effects of isolation on a crew and prepare for long-duration spaceflights, such as a trip to Mars. An 8-month-long isolation study commenced in Moscow on Nov 4th, 2021, where a crew of 6 people (from Roscosm...


Leaning-based interfaces improve ground-based VR locomotion

Hand-held VR controllers are widely available and used; however, they can contribute to unwanted side-effects, such as increased cybersickness, disorientation, and cognitive load. Here, we show how a leaning-based interface ("HeadJoystick") can help improve user experience, usability, and performance in diverse ground-based navigation, including three complementary tasks: reach-the-target, follow-th...
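
As a rough illustration of how a leaning-based interface of this kind can work (a sketch of the general idea, not the published HeadJoystick implementation), the snippet below maps the offset of the tracked head from a calibrated neutral position onto a virtual travel velocity, with a dead zone and a non-linear gain; all parameter names and values are assumptions.

```python
import numpy as np

def leaning_velocity(head_pos, neutral_pos, dead_zone=0.02, gain=8.0,
                     exponent=1.5, v_max=5.0):
    """Map head offset from a calibrated neutral point (metres) to velocity (m/s).

    Illustrative only: the dead zone suppresses postural sway, and the
    non-linear gain gives fine control near the centre and faster travel
    when leaning further. Parameter values are assumed, not from the paper.
    """
    offset = np.asarray(head_pos, dtype=float) - np.asarray(neutral_pos, dtype=float)
    offset[1] = 0.0                 # ignore vertical head motion for ground-based travel
    dist = np.linalg.norm(offset)
    if dist < dead_zone:
        return np.zeros(3)
    direction = offset / dist
    speed = min(gain * (dist - dead_zone) ** exponent, v_max)
    return direction * speed

# Example: leaning ~10 cm forward from the calibrated neutral position
print(leaning_velocity([0.0, 1.2, 0.10], [0.0, 1.2, 0.0]))
```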


Virtual Transcendent Dream

Flying dreams have the potential to evoke a feeling of empowerment (or self-efficacy, confidence in our ability to succeed) and self-transcendent experience (STE), which have been shown to contribute to an individual’s overall well-being. However, these exceptional dreaming experiences remain difficult to induce at will. Inspired by the potential of Virtual Reality (VR) to support profound emoti...


Breath of Light

"One must first come to know, through observing oneself, just what one does with breathing." (Elsa Gindler) Breath of Light is a generative interactive installation, exhibited at the 13th Shanghai Biennale. The installation aims to foster a feeling of connection and awareness through the process of breathing synchronization. Each of the two participants generates their own light with their breat...


Integrating Continuous and Teleporting VR Locomotion into a Seamless "HyperJump" Paradigm

Here we propose a hybrid interface that allows users to seamlessly transition between a slow 'continuous' mode and a fast 'hyperjump' mode. The interface aims to maintain the immersion, presence, accuracy, and spatial updating of continuous locomotion while adding travel efficiency and minimizing cybersickness. Continuous locomotion in VR provides uninterrupted optical flow, which mimics re...
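
The sketch below shows one way such a hybrid could be wired up (an illustration of the concept under assumed parameters, not the actual HyperJump implementation): below a speed threshold the viewpoint translates continuously, while above it the same average speed is delivered as discrete forward jumps at a fixed interval, which suppresses sustained optical flow at high speeds.

```python
def hybrid_step(position, forward, speed, dt, state,
                jump_threshold=5.0, jump_interval=0.3):
    """One locomotion update: continuous translation at low speed,
    discrete forward jumps covering the same distance at high speed.

    Illustrative sketch only; the threshold and interval are assumptions.
    position: current (x, y, z); forward: unit direction vector;
    state: dict carrying the time accumulated since the last jump.
    """
    if speed < jump_threshold:
        # Continuous mode: smooth optical flow, like ordinary joystick travel.
        state["since_jump"] = 0.0
        return tuple(p + f * speed * dt for p, f in zip(position, forward))

    # Jump mode: accumulate time and teleport forward in discrete steps.
    state["since_jump"] += dt
    if state["since_jump"] >= jump_interval:
        state["since_jump"] -= jump_interval
        step = speed * jump_interval      # same average speed, no sustained flow
        return tuple(p + f * step for p, f in zip(position, forward))
    return position

# Example usage at 90 fps:
state = {"since_jump": 0.0}
pos = (0.0, 1.7, 0.0)
for _ in range(90):
    pos = hybrid_step(pos, (0.0, 0.0, 1.0), speed=8.0, dt=1 / 90, state=state)
print(pos)   # three 2.4 m jumps, i.e. about 7.2 m of forward travel after one second
```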


VR Locomotion Interfaces Survey: How to Move in VR?

There are a multitude of different VR locomotion interfaces out there, all with their own pros and cons. In fact, there are far too many to investigate them all in one behavioural study, so let's ask diverse VR experts for their opinion... Interested in supporting research on VR locomotion interfaces and helping the VR community better understand the pros and cons of different interfaces? We created a surve...


FaceHaptics: Robot Arm based Versatile Facial Haptics for Immersive Environments

Beyond audiovisual cues in VR: Using an HMD-mounted robot arm for versatile facial haptics Abstract: FaceHaptics is a novel haptic display based on a robot arm attached to a head-mounted virtual reality display. It provides localized, multi-directional and movable haptic cues in the form of wind, warmth, moving and single-point touch events and water spray to dedicated parts of the face ...


BioSpaceVR

Experience space like never before: An awe-inspiring VR experience that takes place in space where the Sun and stars react to biosensors. BioSpaceVR seeks to provide a virtual self-transcendent experience. Self-transcendent experiences are characterized by the feeling of unity with others and the world. Our project is a bio-responsive VR experience that creates an almost childlike experience of ...


Immersive Installation for Creative Expression and Public Performance: Transcending Perception

Artist Statement: Transcending Perception is an interactive Virtual Reality (VR) installation developed by John Desnoyers-Stewart that allows participants to collaborate in the creative, improvisational production of multisensory experiences. Bodies and space are transformed into instruments which translate presence into performance. This installation reminds participants that they are cre...


Body RemiXer

Extending Bodies to Stimulate Social Connection in an Immersive Installation. Body RemiXer connects bodies through movement. It is an experiential, projection-based Virtual Reality installation that explores novel forms of embodied interaction between multiple participants, where their bodies mix into a shared embodied representation, producing a playful interaction that aims to support the feeling o...


Lucid Loop: A Virtual Deep Learning Biofeedback System for Lucid Dreaming Practice

Can VR and neurofeedback deep learning art help enhance attention and lucid dreaming practice? Lucid dreaming, knowing one is dreaming while dreaming, is an important tool for exploring consciousness and bringing awareness to different aspects of life. We created a system called Lucid Loop: a virtual reality experience where one can practice lucid awareness via neurofeedback. Visuals are creativ...


Connecting through JeL – bio-responsive VR for interpersonal synchronization

Can a bio-responsive generative art installation foster interpersonal synchronization and connection? JeL is a bio-responsive, immersive, interactive, generative art installation designed to encourage physiological synchronization between the immersants. In this project, we will be exploring how novel forms of interaction can be included in immersive technology to foster the feeling of connection...
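
"Physiological synchronization" can be quantified in many ways; as a purely illustrative sketch (an assumed metric, not the measure used in JeL itself), the snippet below computes a windowed correlation between two participants' breathing signals.

```python
import numpy as np

def breathing_synchrony(breath_a, breath_b, fs=10.0, window_s=15.0):
    """Windowed Pearson correlation between two breathing signals.

    A simple, assumed metric for 'physiological synchronization'; returns
    one correlation value per non-overlapping window.
    """
    a = np.asarray(breath_a, dtype=float)
    b = np.asarray(breath_b, dtype=float)
    n = int(window_s * fs)
    scores = []
    for start in range(0, min(len(a), len(b)) - n + 1, n):
        wa, wb = a[start:start + n], b[start:start + n]
        wa, wb = wa - wa.mean(), wb - wb.mean()
        denom = np.sqrt((wa ** 2).sum() * (wb ** 2).sum())
        scores.append(float((wa * wb).sum() / denom) if denom else 0.0)
    return scores

# Example: two slightly phase-shifted synthetic breathing traces sampled at 10 Hz
t = np.arange(0, 60, 0.1)
print(breathing_synchrony(np.sin(0.3 * t), np.sin(0.3 * t + 0.4)))
```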


Embodied & Intuitive Flying for VR, Gaming, and TeleOperation

Flying has been a dream of mankind for millennia, but flying interfaces for VR, gaming, and teleoperation (e.g., drones) typically rely on cumbersome double-joystick gamepads and do not allow for intuitive and embodied flying experiences. Here, we develop low-cost embodied flying interfaces that adapt leaning-based motion-cueing paradigms, thus freeing up the hands for additional tasks beyond just na...


NaviBoard: Efficiently Navigating Virtual Environments

Here we propose a novel and cost-effective setup of a leaning-based interface ("NaviBoard") that allows people to efficiently navigate virtual environments, with performance levels matching the gold standard of free-space walking and without any increase in motion sickness. Abstract: Walking has always been the most common locomotion mode for humans in the real world. As a result, it has also been co...


3D User Interfaces Course

SIGGRAPH 2018: 3D User Interfaces for Virtual Reality and Games: 3D Selection, Manipulation, and Spatial Navigation. 3-hour course presented at SIGGRAPH 2018. In this course, we will take a detailed look at different topics in the field of 3D user interfaces (3DUIs) for Virtual Reality and Gaming. With the advent of Augmented and Virtual Reality in numerous application areas, the need and interest...


VR/MR/AR 4 Good: Creating with a Purpose

Interested in connecting & joining? If you're interested in connecting with others engaged or interested in the xR4Good field, you could join the xR4Good Facebook group, or fill out this online signup sheet and state what you're interested in. Introduction: Over the past five years, we have seen awareness and creation of Virtual, Mixed, and Augmented Reality (VR/MR/AR or, the ‘immersive realit...


Connected through "AWE": creating immersive experiences for social connection

Do you get enough “awe” in your life? In our busy day-to-day lives, we often take our experiences for granted. While we have the technology to connect with one another, like smartphones, we don’t necessarily get outside into nature, or stargaze. Such activities may offer common awe-inspiring moments, and we now understand that feeling awe is associated with all sorts of social and well...


Navigation Interface Tutorial

Navigation Interfaces for Virtual Reality and Gaming: Theory and Practice. First version held at IEEE VR 2017, Sunday, March 19, 1:30pm - 5:00pm; updated variants of the tutorial will be presented at ACM CHI 2018 (slides) and IEEE VR 2018. At the Spatial Cognition 2018 conference, we will present a new tutorial on Spatial Navigation Interfaces for Immersive Environments focusing more on the...


Gamified Research

Gamifying Research - Researchifying Games: While traditional experimental paradigms offer tight stimulus control and repeatability, they tend to be a bit boring and removed from many real-world situations, which can limit the real-world transferability of results. How can we bring together the methodological strengths of research with the intrinsic motivation of playfulness and gaming? The ...


Navigational Search in VR: Do Reference Frames Help?

Would the rectangular reference frame of a CAVE help to reduce disorientation and improve navigation performance in VR? Here, we show that simply providing the rectangular reference frame of a room (as a simple wireframe cuboid), but not a CAVE, improved navigational search performance. Despite recent advances in virtual reality, locomotion in a virtual environment is still restricted becau...


Virtual Earthgazing - towards an overview effect in Virtual Reality

How can we use immersive VR to give people pivotal positive experiences without having to send them out into space?   “We went to the Moon as technicians, we returned as humanitarians” reflected Edgar Mitchell after his space flight. This describes the overview effect – a profound awe-inspiring experience of seeing Earth from space resulting in a cognitive shift in worldview, le...


Lean and Elegant Motion Cueing in VR

How do we best design locomotion interfaces for VR that provide "enough" physical motion cues (vestibular/proprioceptive) while still being effective, affordable, compact, and safe? Despite amazing progress in computer graphics and VR displays, most affordable and room-sized VR locomotion interfaces provide only limited physical motion cues (e.g., vestibular & proprioceptive cues). To provide...


Pulse Breath Water

Pulse Breath Water is an immersive virtual environment manipulated by the pulse of a participant’s breath that provokes and challenges the interaction between a user and the substantial element of the environment: water. The system “senses” the participant, while the participant’s breathing feeds the system. The process is a symbiotic play between internal human processes [biosensing t...


Lost Spirit

Flight after death: Lost Spirit is an experiential Virtual Reality (VR) game whereby the player is transported into the spirit world as they take flight to the afterlife. Experience flight, weightlessness, and wonder. In Lost Spirit, you are stuck in limbo, a world between the living and the dead. You will drift and fly through different environments, each corresponding to different...


Immersive & Embodied Teleoperation Interfaces

Developing virtual interfaces for embodied tele-operation and locomotion. How can we best design and implement an embodied telepresence system for tele-robotics, so we can safely explore remote, hard-to-reach, or potentially hazardous areas or situations? The goal of the "TeleSpider" project is to design and implement a telepresence system where users can remotely operate a robotic spid...


state.scape: EEG-based Responsive Art Installation

State.scape: Using EEG-based brain-computer interfaces for a responsive art installation. State.scape is an interactive installation in which audio-visuals are generated from users' affective states (engagement, excitement, and meditation). The installation relies on a brain-computer-interface-based virtual environment and sonification, which both served as a platform for the exploration of users...


Biofeedback in VR - SOLAR

Resonance in Virtual Environments: hacking biofeedback for altering users' affective states. How can we combine immersive virtual environments (VE) with biofeedback and gamification to foster relaxation, de-stressing, and meditative states? That is, instead of increasing sensory overload, can we use the immersive and affective potential of VE and gamification to assist especially novice meditato...


Motion Seats for VR

Using motion seats for enhancing locomotion and immersion in VR How can we provide a "moving experience" through VR without having to use a full-scale motion platform? Could a compact and relatively low-cost "motion seat" provide some of the same benefits, thus reducing cost, complexity, space & safety requirements? Despite considerable advances in Simulation and Virtual Real...


VR in Architecture Design & Review

How can we use immersive Virtual Reality and embodied locomotion interfaces to design more cost- and space-efficient solutions for effective presentation and communication of architectural designs and ideas? Our overall goal is to iteratively design and evaluate a novel embodied VR system that enables users to quickly, intuitively, and precisely position their virtual viewpoint in 3D space...


Transition into VR: TransLocation

How can we ease users' transition from the real surroundings into the virtual world? Many of today’s virtual reality (VR) setups are very much focused on technical aspects rather than on the benefits of a coherent user experience. This work explores the idea of enhancing the VR experience with a transition phase. On a physical level, this transition offers the user a meaningful metaphor for en...


Cross-Disciplinary 'Immersion' Framework

Describing media as 'immersive' is ambiguous.  From debilitating addiction to therapeutic relief, media engagement holds a clear duality in its effect on humanity... Without an interdisciplinary characterization of "immersion", why do we allow this concept to be so readily invoked in discussions of books, visual art, video games, virtual reality systems and more? While "immersion" into tradit...


The Tabletop Makerspace

The tabletop Makerspace was a Mitacs internship project conducted in collaboration with Science World. A set of classroom tools was developed to support ‘Maker’ activities at the museum. The tools included a home-built 3D printer and a set of electronics kits for working with the Arduino microcontroller. An introduction to electronics workshop was developed with local makers and Science World ...


Gyroxus Gaming Chair for Motion Cueing in VR

Can self-motion perception in virtual reality (VR) be enhanced by providing affordable, user-powered minimal motion cueing? Introduction & Motivation:  Can self-motion perception in virtual reality (VR) be enhanced by providing affordable, user-powered minimal motion cueing? To investigate this, we compared the effect of different interaction and motion paradigms on onset latency and intensi...


Embodied Self-Motion Illusions in VR

How can we provide humans with a believable sensation of being in and moving through computer-generated environments (like VR, computer games, or movies) without the need for costly and cumbersome motion platforms or large free-space walking areas? That is, how can we "cheat intelligently" by providing a compelling, embodied self-motion illusion ("vection") without the need for full physical mo...


Dynamic Visual Cues for Spatial Updating

Why is object recognition from novel viewpoints facilitated when the observer moves around the object, rather than when the object itself rotates? According to the prevailing opinion, "spatial updating" of our mental spatial representation is supposed to be the underlying process. Here, we provide first evidence challenging this notion, in that dynamic visual cues alone might be sufficient or at least contrib...


Navigational Search in VR: Do we need to walk?

Do we need full physical motions for effective navigation through Virtual Environments? Recent results suggest that translations might not be as important as previously believed, which could enable us to reduce overall simulation effort and cost. Physical rotations and translations are the basic constituents of navigation behavior, yet there is mixed evidence about their relative importance for co...


Spatial Cognition in VR vs. real world

Comparing spatial perception/cognition in real versus immersive virtual environments: it doesn't compare! Virtual reality (VR) is increasingly used in psychological research and applications – but does VR really afford natural human spatial perception/cognition, which is a prerequisite for effective spatial behavior? Using judgment of relative direction (JRD) tasks, Riecke & McNamara (Psychonom...
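
For readers unfamiliar with JRD tasks: participants imagine standing at one object while facing a second, then point to a third; performance is typically scored as the absolute angular error between the pointed direction and the correct direction. The sketch below shows that standard scoring in Python (a common formulation, not code from the cited study).

```python
import math

def jrd_angular_error(standing, facing, target, pointed_deg):
    """Absolute angular error (deg) for one judgment-of-relative-direction trial.

    standing/facing/target are (x, y) object positions; pointed_deg is the
    participant's pointing response relative to the imagined facing direction
    (positive = counter-clockwise). Illustrative scoring sketch only.
    """
    def bearing(frm, to):
        return math.degrees(math.atan2(to[1] - frm[1], to[0] - frm[0]))

    correct = bearing(standing, target) - bearing(standing, facing)
    error = (pointed_deg - correct + 180.0) % 360.0 - 180.0   # wrap to [-180, 180)
    return abs(error)

# Example: the target lies 90° to the left of the imagined facing direction,
# and the participant points 75° to the left -> 15° error.
print(jrd_angular_error((0, 0), (0, 1), (-1, 0), pointed_deg=75.0))
```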


iSpaceMecha

Collaboration between the iSpace lab at SIAT and Mechatronics Undergraduate Interns to design and build a unique, virtual reality multi-modal motion simulator The iSpace program is centered on investigating what constitutes effective, robust, and intuitive human spatial orientation and behaviour. This fundamental knowledge will be applied to design novel, more effective human-computer interfaces ...


Path Integration in 3D

Switching Spatial Reference Frames for Yaw and Pitch Navigation: We're used to navigating on the ground plane, and have developed specific strategies to do so. How do these change if we move in a vertical plane (roller-coaster-like, including head over heels motions)? Can we still maintain orientation and remember where we came from, even though such upwards or downwards (pitch) motions are less ...
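To make the underlying geometry concrete, here is a minimal, hypothetical path-integration sketch (not the experiments' code): orientation and position are accumulated through yaw or pitch turns and forward translations, and the point-to-origin ("homing") vector is then expressed in the traveler's own body frame. Axis conventions and the example path are assumptions.

```python
# Hypothetical sketch of 3D path integration (not the experiments' code):
# accumulate position and orientation through yaw/pitch turns and forward
# translations, then express the point-to-origin ("homing") vector in the
# traveler's own body frame.
import numpy as np

def yaw(deg):    # turn about the vertical (z) axis
    r = np.radians(deg)
    return np.array([[np.cos(r), -np.sin(r), 0.0],
                     [np.sin(r),  np.cos(r), 0.0],
                     [0.0,        0.0,       1.0]])

def pitch(deg):  # turn about the lateral (y) axis (head-over-heels)
    r = np.radians(deg)
    return np.array([[ np.cos(r), 0.0, np.sin(r)],
                     [ 0.0,       1.0, 0.0      ],
                     [-np.sin(r), 0.0, np.cos(r)]])

pos = np.zeros(3)                 # position in world coordinates
R = np.eye(3)                     # world-from-body rotation
forward = np.array([1.0, 0.0, 0.0])

# Two-segment excursion: yaw 90 deg, move 5 m; pitch 90 deg, move 5 m.
for turn, distance in [(yaw(90), 5.0), (pitch(90), 5.0)]:
    R = R @ turn
    pos = pos + R @ (forward * distance)

home_egocentric = R.T @ (-pos)    # direction back to the start, body frame
print(home_egocentric)            # ~[-5, 0, -5]: behind and "below" oneself
```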


Auditory Navigation Displays

Can spatial auditory cues enable us to remain oriented while navigating real or virtual environments? Non-visual navigation interfaces are crucial for the blind, who suffer great reductions in mobility because of the difficulty of navigating new environments. Sighted users may also benefit from these types of displays when they are navigating but can't see the screen of their mobile devi...
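As a hedged illustration of how such a display could work (a sketch under assumed conventions, not this project's implementation): compute the relative bearing from the listener to the next waypoint and map it to a simple constant-power stereo pan, so the goal appears to sound from the direction the user should turn towards.

```python
# Hedged illustration (not this project's implementation): map the relative
# bearing of the next waypoint to a constant-power stereo pan so the goal
# appears to sound from the direction the user should turn towards.
import math

def relative_bearing_deg(listener_xy, heading_deg, target_xy):
    """0 = straight ahead, positive = to the left, negative = to the right."""
    dx = target_xy[0] - listener_xy[0]
    dy = target_xy[1] - listener_xy[1]
    absolute = math.degrees(math.atan2(dy, dx))
    return (absolute - heading_deg + 180) % 360 - 180

def stereo_gains(bearing):
    """Constant-power pan: bearing in [-90, 90] deg -> (left_gain, right_gain)."""
    pan = max(-90.0, min(90.0, bearing)) / 90.0     # -1 (right) .. +1 (left)
    angle = (pan + 1.0) * math.pi / 4.0             # 0 .. pi/2
    return math.sin(angle), math.cos(angle)

# Example: facing "north" (90 deg), the waypoint at (-1, 1) is 45 deg to the
# left, so the left channel is louder than the right.
b = relative_bearing_deg((0, 0), 90.0, (-1, 1))
print(round(b), [round(g, 2) for g in stereo_gains(b)])   # 45 [0.92, 0.38]
```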


Sympathetic Guitar

Do humans respond socially to abstract, expressive human-computer interfaces? To interact with the Sympathetic Guitar is to use a familiar and comfortable Western musical interface to feel an instant connection to the musical culture and style of the East. The prototype senses guitarists' hand motions and performance dynamics to augment a standard classical guitar with a digital drone...


Spatial Updating With(out) Physical Motions?

How important are physical motions for effective spatial orientation in VR? Most virtual reality simulators have a serious flaw: users tend to get easily lost and disoriented as they navigate. According to the prevailing opinion, this is because physical motion cues are absolutely required for staying oriented while moving. In this study, we investigated how physical motion cues contribute ...


Sonic Cradle

Sonic Cradle suspends the body in a completely dark chamber, which encourages experiences comparable to mindfulness meditation. Users compose peaceful soundscapes in real-time using only their breathing. [vimeo 35764652] Introduction and demo of the Sonic Cradle Sonic Cradle is a relaxing human-computer interaction paradigm designed to foster meditative attentional patterns. The current p...
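A minimal, purely illustrative breath-to-sound mapping in the spirit of this paradigm (not Sonic Cradle's actual code): a normalized respiration signal is monitored, and deep, sustained exhales gradually fade in additional soundscape layers while shallow breathing lets them decay. Thresholds, signal format, and function names are assumptions.

```python
# Purely illustrative breath-to-sound mapping in the spirit of this paradigm
# (not Sonic Cradle's actual code). Assumes a respiration signal normalized
# to 0..1 (0 = fully exhaled, 1 = fully inhaled); deep, sustained exhales
# fade in the quietest soundscape layer, otherwise all layers decay gently.
def update_soundscape(breath_window, layer_gains, depth_threshold=0.6, fade=0.02):
    depth = max(breath_window) - min(breath_window)      # breath depth
    exhaling = breath_window[-1] < breath_window[0]      # signal is falling
    if depth > depth_threshold and exhaling:
        quietest = min(range(len(layer_gains)), key=lambda i: layer_gains[i])
        layer_gains[quietest] = min(1.0, layer_gains[quietest] + fade)
    else:
        for i, gain in enumerate(layer_gains):
            layer_gains[i] = max(0.0, gain - fade / 4)
    return layer_gains

gains = [0.0, 0.0, 0.0]
print(update_soundscape([0.9, 0.7, 0.4, 0.2], gains))    # deep exhale fades in a layer
```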



Publications

Schulte-Pelkum, J., Riecke, B. E., von der Heyde, M., & Bülthoff, H. H. (2004). Influence of display device and screen curvature on perceiving and controlling simulated ego-rotations from optic flow. Tech. rep., Max Planck Institute for Biological Cybernetics, Tübingen, Germany.
Riecke, B. E. (1998). Untersuchung des menschlichen Navigationsverhaltens anhand von Heimfindeexperimenten in virtuellen Umgebungen (Investigating human navigation using homing experiments in virtual environments) [Master’s thesis]. Eberhard-Karls-Universität Tübingen, Fakultät für Physik.
Teramoto, W., & Riecke, B. E. (2010). Dynamic visual information facilitates object recognition from novel viewpoints. Journal of Vision, 10(13), 1–13. https://doi.org/10.1167/10.13.11
Riecke, B. E., Hendrik A.H.C. van Veen, and H. H. Bülthoff. 2000. “Visual Homing Is Possible without Landmarks: A Path Integration Study in Virtual Reality.” In Perception and Action in Virtual Environments, edited by M. von der Heyde and H. H. Bülthoff, 97–134. Max Planck Institute for Biological Cybernetics, Germany: Cognitive and Computational Psychophysics Department.
Riecke, B. E. (2003). How far can we get with just visual information? Path integration and spatial updating studies in virtual reality (Vol. 8). Logos. http://www.logos-verlag.de/cgi-bin/buch/isbn/0440
Riecke, B. E., Jacqueline D. Jordan, Mirjana Prpa, and Daniel Feuereissen. 2014. “Underlying Perceptual Issues in Virtual Reality Systems: Does Display Type Affect Self-Motion Perception?” Talk presented at the 55th Annual Meeting of the Psychonomic Society (Psychonomics), Los Angeles, USA.
Warren, J. (2014). Exploring Context: Using Teacher Perspective to Guide Tangible Multi-Touch Tabletop Design [MSc Thesis]. Simon Fraser University.
Riecke, B. E. (2008). Consistent Left-Right Reversals for Visual Path Integration in Virtual Reality: More than a Failure to Update One’s Heading? Presence: Teleoperators and Virtual Environments, 17(2), 143–175. https://doi.org/10.1162/pres.17.2.143
Riecke, B. E. (2016). Using Spatialized Sound to Enhance Self-Motion Perception in Virtual Environments and Beyond: Auditory and Multi-Modal Contributions. Canadian Acoustics, 33(3), 148–149.
Riecke, B. E., Väljamäe, A., & Schulte-Pelkum, J. (2009). Moving sounds enhance the visually-induced self-motion illusion (circular vection) in virtual reality. ACM Transactions on Applied Perception (TAP), 6, 7:1-7:27. https://doi.org/10.1145/1498700.1498701
Riecke, B. E. (2009). Cognitive and higher-level contributions to illusory self-motion perception (“vection”): does the possibility of actual motion affect vection? Japanese Journal of Psychonomic Science, 28(1), 135–139. http://ci.nii.ac.jp/naid/110007482465
Kelly, J. W., Riecke, B. E., Loomis, J. M., & Beall, A. C. (2008). Visual control of posture in real and virtual environments. Perception & Psychophysics, 70(1), 158–165. https://doi.org/10.3758/PP.70.1.158
Akyüz, A. O., Fleming, R., Riecke, B. E., Reinhard, E., & Bülthoff, H. H. (2007). Do HDR displays support LDR content?: a psychophysical evaluation. ACM Transactions on Graphics (TOG), 38.1-38.7. https://doi.org/10.1145/1275808.1276425
Riecke, B. E., Schulte-Pelkum, J., Avraamides, M. N., Heyde, M. V. D., & Bülthoff, H. H. (2006). Cognitive factors can influence self-motion perception (vection) in virtual reality. ACM Transactions on Applied Perception (TAP), 3, 194–216. https://doi.org/10.1145/1166087.1166091
Riecke, B. E., Heyde, M. V. D., & Bülthoff, H. H. (2005). Visual cues can be sufficient for triggering automatic, reflexlike spatial updating. ACM Transactions on Applied Perception (TAP), 2, 183–215. https://doi.org/10.1145/1077399.1077401
Riecke, B. E., van Veen, H. A. H. C., & Bülthoff, H. H. (2002). Visual homing is possible without landmarks: a path integration study in virtual reality. Presence: Teleoperators and Virtual Environments, 11, 443–473. https://doi.org/10.1162/105474602320935810
Prpa, Mirjana, Kivanc Tatar, Bernhard E. Riecke, and Philippe Pasquier. 2017. “The Pulse Breath Water System: Exploring Breathing as an Embodied Interaction for Enhancing the Affective Potential of Virtual Reality.” In Virtual, Augmented and Mixed Reality (VAMR 2017), edited by S. Lackey and J. Chen, 10280:153–72. Lecture Notes in Computer Science. Cham: Springer.
Stepanova, Ekaterina R., Markus von der Heyde, Alexandra Kitson, Thecla Schiphorst, and Bernhard E. Riecke. 2017. “Gathering and Applying Guidelines for TeleSpider Design for Urban Search and Rescue Applications on a Mobile Robot.” In Human-Computer Interaction. Interaction Contexts (HCI 2017), edited by M. Kurosu, 10272:562–81. Lecture Notes in Computer Science. Cham: Springer.
Quesnel, D., DiPaola, S., & Riecke, B. E. (2018). Deep Learning for Classification of Peak Emotions within Virtual Reality Systems. International SERIES on Information Systems and Management in Creative EMedia (CreMedia), 2017/2, 6–11. http://www.ambientmediaassociation.org/Journal/index.php/series/article/view/274
Riecke, B. E., Feuereissen, D., & Rieser, J. J. (2009). Auditory self-motion simulation is facilitated by haptic and vibrational cues suggesting the possibility of actual motion. ACM Transactions on Applied Perception (TAP), 6, 20:1-20:22. https://doi.org/10.1145/1577755.1577763
Kaltner, S., Jansen, P., & Riecke, B. E. (2017). Stimulus size matters: do life-sized stimuli induce stronger embodiment effects in mental rotation? Journal of Cognitive Psychology, 29(6), 701–716. https://doi.org/10.1080/20445911.2017.1310108
Pang, Carolyn E. 2013. “Technology Preferences and Routines for Distributed Families Coping with a Chronic Illness.” MSc Thesis, Surrey, BC, Canada: Simon Fraser University. http://summit.sfu.ca/item/12921.
Riecke, B. E., & McNamara, T. P. (2017). Where you are affects what you can easily imagine: Environmental geometry elicits sensorimotor interference in remote perspective taking. Cognition, 169, 1–14. https://doi.org/10.1016/j.cognition.2017.07.014
Quesnel, D., & Riecke, B. E. (2018). Are You Awed Yet? How Virtual Reality Gives Us Awe and Goose Bumps. Frontiers in Psychology, 9, 1–22. https://doi.org/10.3389/fpsyg.2018.02158
Wrainwright, N., Stepanova, E. R., Aguilar, I., Kitson, A., & Riecke, B. E. (2019, June). Transcending the Lab: Supporting Self-Transcendent Experiences in VR [Talk]. FCAT Undergraduate Conference, Surrey City Hall, BC, Canada. https://www.sfu.ca/fcat/events/ugc.html
Kitson, Alexandra, and Bernhard E. Riecke. 2018. “Going Beyond: Lucid Dreaming as a Lens into Transformative Experience Design for Virtual Reality.” Symposium presentation presented at the 23rd Annual CyberPsychology, CyberTherapy & Social Networking Conference, Gatineau, Canada, June. http://interactivemediainstitute.com/cypsy23/.
Stepanova, Ekaterina, Denise Quesnel, Alexandra Kitson, Mirjana Prpa, Ivan Aguilar, and Bernhard E. Riecke. 2018. “A Framework for Studying Transformative Experiences through VR.” Symposium presentation presented at the 23rd Annual CyberPsychology, CyberTherapy & Social Networking Conference, Gatineau, Canada, June. http://interactivemediainstitute.com/cypsy23/.
Stepanova, E. R., Quesnel, D. T., & Riecke, B. E. (2019). Space - a Virtual Frontier: How to Design and Evaluate a Virtual Reality Experience of the Overview Effect Promoting The Feeling of Connectedness. Frontiers in Digital Humanities - Human-Media Interaction, 6(7), 1–22. https://doi.org/10.3389/fdigh.2019.00007
Singhal, Samarth. 2017. “Designing Communication Technologies for Couples to Support Touch Over Distance.” MSc Thesis, Surrey, BC, Canada: Simon Fraser University. http://summit.sfu.ca/item/17577.
Unterman, Benjamin. 2017. “Framing Effects: The Impact of Framing on Copresence in Virtual Theatre.” PhD Thesis, Surrey, BC, Canada: Simon Fraser University. http://summit.sfu.ca/item/17226.
Cramer, Emily. 2015. “A Code of Many Colours: A Rationale, Validation and Requirements for a Sound-Based Letter Colour-Code That Might Support Some Children with Dyslexia in Spelling Certain Words.” MSc Thesis, Surrey, BC, Canada: Simon Fraser University. http://summit.sfu.ca/item/15715.
Choo, Amber. 2015. “Virtual Reality Game Design for the Reduction of Chronic Pain Intensity in Clinical Settings.” MSc Thesis, Surrey, BC, Canada: Simon Fraser University. http://summit.sfu.ca/item/15695.
Meilinger, T., Riecke, B. E., & Bülthoff, H. H. (2014). Local and Global Reference Frames for Environmental Spaces. Quarterly Journal of Experimental Psychology, 67(3), 542–569. https://doi.org/10.1080/17470218.2013.821145
Wang, Sijie. 2010. “Comparing Tangible and Multi-Touch Interfaces for a Spatial Problem Solving Task.” MSc Thesis, Surrey, BC, Canada: Simon Fraser University. https://theses.lib.sfu.ca/thesis/etd6352.
Riecke, Bernhard E., and Jörg Schulte-Pelkum. 2013. “Perceptual and Cognitive Factors for Self-Motion Simulation in Virtual Environments: How Can Self-Motion Illusions (‘Vection’) Be Utilized?” In Human Walking in Virtual Environments, edited by Frank Steinicke, Yon Visell, Jennifer Campos, and Anatole Lécuyer, 27–54. New York: Springer. doi: 10.1007/978-1-4419-8432-6_2.
Kitson, A., Prpa, M., & Riecke, B. E. (2018). Immersive Interactive Technologies for Positive Change: A Scoping Review and Design Considerations. Frontiers in Psychology, 9, 1–19. https://doi.org/10.3389/fpsyg.2018.01354
Riecke, B. E., & Jordan, J. D. (2015). Comparing the effectiveness of different displays in enhancing illusions of self-movement (vection). Frontiers in Psychology, 6(713). https://doi.org/10.3389/fpsyg.2015.00713
Riecke, Bernhard E., and Jörg Schulte-Pelkum. 2015. “An Integrative Approach to Presence and Self-Motion Perception Research.” In Immersed in Media: Telepresence Theory, Measurement and Technology, edited by Frank Biocca, Jonathan Freeman, Wijnand IJsselsteijn, Matthew Lombard, and Rachel Jones Schaevitz, 187–235. Springer. doi: 10.1007/978-3-319-10190-3_9.
Lawson, B.D., and Bernhard E. Riecke. 2014. “The Perception of Body Motion.” In Handbook of Virtual Environments: Design, Implementation, and Applications, edited by Kelly S. Hale and Kay M. Stanney, 2nd ed., 163–95. Ch. 7. CRC Press.
Stepanova, E. R., Quesnel, D. T., & Riecke, B. E. (2019). Understanding AWE: Can a virtual journey, inspired by the Overview Effect, lead to an increased sense of interconnectedness? Frontiers in Digital Humanities - Human-Media Interaction, 6(9), 1–21. https://doi.org/10.3389/fdigh.2019.00009
Keshavarz, B., Phillip-Muller, A. E., Hemmerich, W., Riecke, B. E., & Campos, J. J. (2018). The effect of visual motion stimulus characteristics on vection and visually induced motion sickness. Displays, 58, 71–81. https://doi.org/10.1016/j.displa.2018.07.005
Macaranas, Anna. 2013. “The Effects of Intuitive Interaction Mappings on the Usability of Body-Based Interfaces.” MSc Thesis, Surrey, BC, Canada: Simon Fraser University. https://theses.lib.sfu.ca/thesis/etd7660.
Kaltner, Sandra. 2015. “Verkörperte mentale Rotation: objektbasierte und egozentrische Transformationen vor dem Embodiment-Ansatz.” PhD thesis, Regensburg: University of Regensburg. http://epub.uni-regensburg.de/32447/.
Pennefather, P., Rizzotti, P., Desnoyers-Stewart, J., Stepanova, K., Riecke, B., Danenkov, L., Ryzhov, V., Saroyan, J., Beltran, W., & Chak, R. (2020). A Fun Palace: A Mixed Reality Event Through the Looking Glass of Cybernetics. Cybernetics and Human Knowing, 27(2), 61–80.
Palmisano, S., Nakamura, S., Allison, R. S., & Riecke, B. E. (2020). The Stereoscopic Advantage for Vection Persists Despite Reversed Disparity. Attention, Perception, & Psychophysics, 82(4), 2098–2118. https://doi.org/10.3758/s13414-019-01886-2
Desnoyers-Stewart, J., Smith, M. L., & Riecke, B. E. (2020). Transcending the Virtual Mirror Stage: Embodying the Virtual Self through the Digital Mirror. In E. Papadaki (Ed.), Radical Immersion: Navigating between virtual/physical environments and information bubbles (pp. 156–167). https://gala.gre.ac.uk/id/eprint/31046/
Riecke, B. E. 2011. “Compelling Self-Motion Through Virtual Environments Without Actual Self-Motion – Using Self-Motion Illusions (‘Vection’) to Improve User Experience in VR.” In Virtual Reality, edited by Jae-Jin Kim, 149–76. Ch. 8. London, UK: InTechOpen. https://www.intechopen.com/books/virtual-reality/compelling-self-motion-through-virtual-environments-without-actual-self-motion-using-self-motion-ill.
Schulte-Pelkum, J. 2007. “Perception of Self-Motion: Vection Experiments in Multi-Sensory Virtual Environments.” PhD thesis, Ruhr-Universität Bochum. https://hss-opus.ub.ruhr-uni-bochum.de/opus4/frontdoor/deliver/index/docId/2735/file/diss.pdf.
Stahn, A. C., M. Basner, Bernhard E. Riecke, T. Hartley, T. Wolbers, K. Brauns, A. Friedl-Werner, et al. 2021. “Hippocampus in I1YMP: Spatial Cognition and Hippocampal Plasticity during Long-Duration Low-Earth Orbit Missions.” Presentation presented at the NASA HRP IWS 2021 meeting.
Zielasko, D., & Riecke, B. E. (2021). To Sit or Not to Sit in VR: Analyzing Influences and (Dis)Advantages of Posture and Embodied Interaction. Computers, 10(6), 1–20. https://doi.org/10.3390/computers10060073
Keshavarz, B., Riecke, B. E., Hettinger, L. J., & Campos, J. L. (2015). Vection and visually induced motion sickness: How are they related? Frontiers in Psychology, 6(413), 1–11. https://doi.org/10.3389/fpsyg.2015.00413
Riecke, B. E., Feuereissen, D., Rieser, J. J., & McNamara, T. P. (2015). More than a Cool Illusion? Functional Significance of Self-Motion Illusion (Circular Vection) for Perspective Switches. Frontiers in Psychology, 6(1174), 1–13. https://doi.org/10.3389/fpsyg.2015.01174
Palmisano, S., & Riecke, B. E. (2018). The search for instantaneous vection: An oscillating visual prime reduces vection onset latency. PLOS ONE, 13(5), 1–26. https://doi.org/10.1371/journal.pone.0195886
Desnoyers-Stewart, J. (2022). Star-Stuff [Oculus AppLab]. https://www.oculus.com/experiences/quest/3367089710082568/
Desnoyers-Stewart, J. (2022). Star-Stuff [Hoame VR]. Hoame VR. https://www.oculus.com/experiences/quest/3367089710082568/
Riecke, B. E., Trepkowski, C., & Kruijff, E. (2016). “Human Joystick”: Enhancing Self-Motion Perception (Linear Vection) by using Upper Body Leaning for Gaming and Virtual Reality (1; ISpaceLab Technical Report, pp. 1–12). Simon Fraser University. http://ispace.iat.sfu.ca/publications/
Freiwald, J. P., Schmidt, S., Riecke, B. E., & Steinicke, F. (2022). The Continuity of Locomotion: Rethinking Conventions for Locomotion and its Visualization in Shared Virtual Reality Spaces. ACM Transactions on Graphics, 41(6), 211:1-211:14. https://doi.org/10.1145/3550454.3555522
Gehrke, Lukas. 2015. “Brain Dynamics Underlying Physical vs. Optical Flow Rotation.” MSc Thesis, Berlin, Germany: Technical University Berlin.
Sigurdarson, Salvar. 2014. “The Influence of Visual Structure and Physical Motion Cues on Spatial Orientation in a Virtual Reality Point-to-Origin Task.” MSc Thesis, Surrey, BC, Canada: Simon Fraser University. http://summit.sfu.ca/item/14532.
Milne, Andrew P. 2014. “What Makes a Maker: Common Attitudes, Habits and Skills from the Do-It-Yourself (DIY) Community.” MSc Thesis, Surrey, BC, Canada: Simon Fraser University. http://summit.sfu.ca/item/14578.
Sproll, Daniel. 2013. “Influence of Ethnicity, Gender and Answering Mode on Reference Frame Selection for Virtual Point-to-Origin Tasks.” BSc Thesis, Osnabrück, Germany & Surrey, BC, Canada: Universität Osnabrück & Simon Fraser University.
Feuereissen, D. 2008. “VR: Getting the Reality Part Straight – Does Jitter and Suspension of the Human Body Increase Auditory Circular Vection?” Bachelor’s Thesis, Department of Computer Science in Media. http://www.kyb.mpg.de/publication.html?publ=5071.
Kitson, A., Riecke, B. E., & Gaggioli, A. (2019). Digital Wellbeing: Considering Self-transcendence. ACM CHI 2019 Workshop on “Designing for Digital Wellbeing,” 1–4. https://digitalwellbeingworkshop.wordpress.com/position-papers/
Riecke, B. E., LaViola, J. J., Jr., & Kruijff, E. (2019, submitted). Topics in Virtual Reality: 3D Selection, Manipulation, Spatial Navigation, and Cybersickness. ACM SIGGRAPH 2019 Courses.
Sproll, D., Freiberg, J., Grechkin, T., & Riecke, B. E. (2013). Paving the way into virtual reality - a transition in five stages. IEEE Symposium on 3D User Interfaces, 175–176. https://doi.org/10.1109/3DUI.2013.6550235
Vidyarthi, J., & Riecke, B. E. (2013). Mediated Meditation: Cultivating Mindfulness with Sonic Cradle. Proceedings of the 2013 Annual Conference on Human Factors in Computing Systems ALT.CHI, 2305–2314. https://doi.org/10.1145/2468356.2468753
Desnoyers-Stewart, John. 2022. “Star-Stuff: A Way for the Universe to Know Itself.” In SIGGRAPH ’22  Immersive Pavilion, 1–2. Vancouver, BC, Canada: ACM. https://doi.org/10.1145/3532834.3536198.
Zielasko, Daniel, and Bernhard E. Riecke. 2020. “Sitting vs. Standing in VR: Towards a Systematic Classification of Challenges and (Dis)Advantages.” In 2020 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 297–98. Atlanta, GA, USA: IEEE. https://doi.org/10.1109/VRW50115.2020.00067.
Riecke, Bernhard E., and Daniel Zielasko. 2020. “Towards an Affordance of Embodied Locomotion Interfaces in VR: How to Know How to Move?” In 2020 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 295–96. Atlanta, GA, USA: IEEE. https://doi.org/10.1109/VRW50115.2020.00066.
Zielasko, Daniel, and Bernhard E. Riecke. 2020. “Either Give Me a Reason to Stand or an Opportunity to Sit in VR.” In 2020 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 283–84. Atlanta, GA, USA: IEEE. https://doi.org/10.1109/VRW50115.2020.00060.
Zielasko, Daniel, and Bernhard E. Riecke. 2020. “Can We Give Seated Users in Virtual Reality the Sensation of Standing or Even Walking? Do We Want To?” In 2020 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 281–82. Atlanta, GA, USA: IEEE. https://doi.org/10.1109/VRW50115.2020.00059.
Riecke, Bernhard E, and Daniel Zielasko. 2021. “Continuous vs. Discontinuous (Teleport) Locomotion in VR: How Implications Can Provide Both Benefits and Disadvantages.” In 2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 373–74. Lisbon, Portugal: IEEE. https://doi.org/10.1109/VRW52623.2021.00075.
Quesnel, D., & Riecke, B. E. (2017). Awestruck: Natural Interaction with Virtual Reality on Eliciting Awe. 205–206. https://doi.org/10.1109/3DUI.2017.7893343
Freiberg, Jacob, Timofey Grechkin, and Bernhard E. Riecke. 2013. “Do Walking Motions Enhance Visually Induced Self-Motion Illusions in Virtual Reality?” In IEEE Virtual Reality, 101–2. Lake Buena Vista, FL, USA: IEEE. https://doi.org/10.1109/VR.2013.6549382.
Calderon, Nadya, Bernhard E. Riecke, and Brian Fisher. 2012. “Augmenting Visual Representation of Affectively Charged Information Using Sound Graphs.” In Poster Abstracts of IEEE VisWeek 2012, 113–14. Seattle, USA: IEEE.
Jordan, Jacqueline D., Mirjana Prpa, Daniel Feuereissen, and Bernhard E. Riecke. 2014. “Comparing the Effectiveness of Stereo Projection vs 3D TV in Inducing Self-Motion Illusions (Vection).” In , 128. Vancouver, Canada. https://doi.org/10.1145/2628257.2628360.
Milne, Andrew P. 2013. “Conducting Research on Makers with Future Science Leaders: Experiences from a Museum Enrichment Program.” In . New York, NY, USA.
Heyde, M. von der, B. E. Riecke, D. W. Cunningham, and H. H. Bülthoff. 2001. “Visual-Vestibular Sensor Integration Follows a Max-Rule: Results from Psychophysical Experiments in Virtual Reality.” In VisionScienceS01, edited by K. Nakayama et al, 142. VisionScienceS Meeting. Sarasota, Florida, United States.
Heyde, M. von der, B. E. Riecke, D. W. Cunningham, and H. H. Bülthoff. 2001. “No Visual Dominance for Remembered Turns - Psychophysical Experiments on the Integration of Visual and Vestibular Cues in Virtual Reality.” In VisionScienceS01, edited by K. Nakayama et al. Vol. 1. VisionScienceS Meeting. Sarasota, Florida, United States. https://doi.org/10.1167/1.3.188.
Desnoyers-Stewart, John, Megan L. Smith, and Bernhard E. Riecke. 2019. “Transcending the Virtual Mirror Stage: Embodying the Virtual Self through the Digital Mirror.” In . London, UK. https://youtu.be/L6ykZLiGC8o.
Spartin, L., & Desnoyers-Stewart, J. (2022). Digital Relationality: Relational aesthetics in contemporary interactive art. EVA London 2022, 1–8. https://www.youtube.com/watch?v=tbj1K7z1Zcc
Desnoyers-Stewart, J. (2022). Star-Stuff: A Shared Immersive Experience in Space. ISEA 2022, 1–8.
Pennefather, P. P., & Desnoyers-Stewart, J. (2022). The Fun Palace: Designing Human Experiences at Mixed Reality Events to Increase Engagement. ISEA 2022, 1–5.
Prpa, M., Cochrane, K., & Riecke, B. E. (2016). Hacking Alternatives in 21st Century: Designing a Bio-Responsive Virtual Environment for Stress Reduction. In S. Serino, A. Matic, D. Giakoumis, G. Lopez, & P. Cipresso (Eds.), Pervasive Computing Paradigms for Mental Health (pp. 34–39). Springer International Publishing. https://doi.org/10.1007/978-3-319-32270-4_4
Kitson, A. J., Muntean, R., DiPaola, S., & Riecke, B. E. (2022). Lucid Loop: Exploring the Parallels between Immersive Experiences and Lucid Dreaming. DIS ’22 Proceedings of the 2022 Conference on Designing Interactive Systems (ACM DIS ’22), 1–16.
Stepanova, E. R., Desnoyers-Stewart, J., Höök, K., & Riecke, B. E. (2022). Strategies for Fostering a Genuine Feeling of Connection in Technologically Mediated Systems. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 1–26. https://doi.org/10.1145/3491102.3517580
Liu, P., Stepanova, E. R., Kitson, A. J., Schiphorst, T., & Riecke, B. E. (2022). Virtual Transcendent Dream: Empowering People through Embodied Flying in Virtual Reality. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 1–18. https://doi.org/10.1145/3491102.3517677
Desnoyers-Stewart, J., Stepanova, E. R., Pasquier, P., & Riecke, B. E. (2019). JeL: Connecting Through Breath in Virtual Reality. Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems, 1–6. https://doi.org/10.1145/3290607.3312845
Zhang, Y., Riecke, B. E., Schiphorst, T., & Neustaedter, C. (2019). Perch to Fly: Embodied Virtual Reality Flying Locomotion with a Flexible Perching Stance. Proceedings of the 2019 on Designing Interactive Systems Conference, 253–264. https://doi.org/10.1145/3322276.3322357
Wilberz, A., Leschtschow, D., Trepkowski, C., Maiero, J., Kruijff, E., & Riecke, B. E. (2020). FaceHaptics: Robot Arm based Versatile Facial Haptics for Immersive Environments. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 1–14. https://doi.org/10.1145/3313831.3376481
Kitson, A. J., Desnoyers-Stewart, J., Miller, N., Adhikari, A., Stepanova, E. R., & Riecke, B. E. (2020). Can We Trust What’s Real? Using Fiction to Explore the Potential Dissociative Effects of Immersive Virtual Reality. Ethics of MR’20  Workshop at ACM CHI 2020 (Exploring Potentially Abusive Ethical, Social and Political Implications of Mixed Reality Research in HCI), Honolulu, HI, USA.
Adhikari, A., Zielasko, D., Bretin, A., von der Heyde, M., Kruijff, E., & Riecke, B. E. (2021). Integrating Continuous and Teleporting VR Locomotion into a Seamless “HyperJump” Paradigm. 2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 370–372. https://doi.org/10.1109/VRW52623.2021.00074
Jones, B., Maiero, J., Mogharrab, A., Aguliar, I. A., Adhikari, A., Riecke, B. E., Kruijff, E., Neustaedter, C., & Lindeman, R. W. (2020). FeetBack: Augmenting Robotic Telepresence with Haptic Feedback on the Feet. Proceedings of the 2020 International Conference on Multimodal Interaction, 194–203. https://doi.org/10.1145/3382507.3418820
Joksimovic, S., Gasevic, D., Kovanovic, V., Riecke, B. E., & Hatala, M. (2015). Social presence in online discussions as a process predictor of academic performance. Journal of Computer Assisted Learning.
Bülthoff, H. H., B. E. Riecke, and H. A. H. C. van Veen. 2000. “Do We Really Need Vestibular and Proprioceptive Cues for Homing.” Invest. Ophthalmol. Vis. Sci. (ARVO) 41 (4): 225B225.
Schulte-Pelkum, J., B. E. Riecke, M. von der Heyde, and H. H. Bülthoff. 2003. “Screen Curvature Does Influence the Perception of Visually Simulated Ego-Rotations.” Journal of Vision, 3(9), poster presented at VSS. https://doi.org/10.1167/3.9.411.
Riecke, B. E, M. von der Heyde, and H. H Bülthoff. 2001. “How Real Is Virtual Reality Really? Comparing Spatial Updating Using Pointing Tasks in Real and Virtual Environments.” Journal of Vision 1 (3): 321a. http://www.kyb.mpg.de/publication.html?publ=629.