We presented our art paper “Body RemiXer: Extending Bodies to Stimulate Social Connection in an Immersive Installation” online at SIGGRAPH 2020. The presentation is available on YouTube below, and the full paper is published in Leonardo 53(4).
You can learn more about this project to connect people through abstract bodies and touch here or download the full paper below.
Daniel Zielasko and Bernhard Riecke will be organizing the 1st Workshop on Seated VR on Sunday morning, March 22, just before the IEEE VR 2020 conference. We call for 1–2 page extended abstracts from authors interested in presenting a lightning talk at the workshop. See our Seated VR workshop website for details.
Here’s my own presentation from the workshop, “Towards an Affordance of Embodied Locomotion Interfaces in VR: How to Know How to Move?” We’ll post links to all the talks soon.
Emerging Media BC Community of Practice Event on Friday April 12th 2019: Presentations & VR demos
The April 2019 meeting (details and RSVP here) will be hosted at the School of Interactive Arts and Technology (SIAT) at the SFU Surrey campus. The theme of this meeting is “VR4Good”, based on the 4Good projects of the students at SIAT, with a focus on the benefits of facilitating such VR projects. Several of the VR4Good projects that were made during the Semester in Alternate Realities intensive (15-credit) course on the SFU Surrey campus will be presented.
These works were created by students from different SFU departments who joined our Semester in Alternate Realities. They are part of an emerging field of development and research that uses VR and other emerging technologies to increase awareness and act as a catalyst for improving our world. These projects are the culmination of 14 weeks of an intense, iterative, agile, reflective, and adaptive teaching and learning process. The four distinct yet interconnected “for good” Virtual Reality experiences focus on themes related to improving social and environmental conditions, not only by drawing awareness to issues in the world that require our attention, but also by providing unique first-person experiences that we could not have without immersive technology.
The Semester in Alternate Realities team is proud to present our students’ 1st showcase of their Virtual Reality for Good projects (“VR4Good”) in the Mezzanine on our SFU Surrey campus, on February 1st from 1:00–4:30 pm. These works were created by students from different SFU departments who joined our Semester in Alternate Realities. They are part of an emerging field of development and research that uses VR and other emerging technologies to increase awareness and act as a catalyst for improving our world. Join us and be our first public user-testers for four short and unique VR projects.
Additional showcases with different projects will follow later this semester, likely on March 8th (2nd showcase) and April 4th (final showcase).
Rising Waters: Is this your future?
Rising Waters is an exploratory, context-driven game set in a future in which pollution has devastated the planet. Guided by narrative and context clues, the player navigates the world while their robotic suit automatically obtains samples, revealing how a low-lying coastal city has been damaged by careless treatment of the environment. Once the mission is complete, the player exits the experience and reflects on their own environmental impact.
Goal: The goal of this project is to prompt participants to consider the long-term impact of their actions on the environment through visual, auditory, and experiential learning. By situating the participant in a familiar location (city banners help localize the experience), participants learn how a city can be affected when global and local pollutants are not addressed promptly: changes in the biosphere can contribute to rising sea levels and a degradation of the overall quality of the environment.
Core User Experience: The core user experience centers around narrative and experiential learning through interaction with the built environment. In Rising Waters, the focus of attention should first and foremost be on how the user interacts with the built environment, making the core experience exploratory in nature. By having participants interact with the environment, Rising Waters aims to confront the user’s initial sense of awe with one of dread and shock. Ideally, the experience presents itself as something new, driving a sense of urgency that imbues the user with a desire to complete the experience.
Restless Sleep - A Waking Coma Experience: To find the forest through the eyes of others
“Restless Sleep” is an innovative VR experience built around a fabulated narrative in which the user’s mind connects with the consciousness of a coma patient in order to see and feel through a comatose perception. The experience is composed of both performative and virtual elements, incorporating interactive responses between haptics and visuals, to guide users into the full embodiment of a coma patient who is constantly feeling and interacting with their physical environment.
Goal: The goal of the project is to foster empathy towards coma patients. Many people question how conscious coma patients are of their surroundings and how much outside stimulation they can actually interpret. By going through a VR coma experience (pre, during, and post), users are put into the shoes of a coma patient, where they can learn how coma patients are treated and see that such patients have minds of their own and are not fully unconscious.
Core User Experience: The core user experience we want to accomplish is to foster empathy for coma patients and develop an understanding of what they can feel and how bystander actions can influence the mental state and mental interpretations of the patient. We want users to walk away from the experience with a sense of introspective contemplation, and hopefully with knowledge of how best to handle themselves around people in comas (or even those in other states of altered consciousness). This is because we identified a surprising amount of misinformation and ignorance surrounding comatose states, as well as many first-hand accounts from comatose patients. The most impactful was the story of a girl who was in a two-week medically induced coma and her recounting of the hallucinatory dreams she had while in the coma. While we deliberately didn’t want to subject users to this, we did want to bring attention to the feeling of helplessness one feels, and how it is analogous to being in VR, especially in a public setting.
BARRIERS is a virtual experience that allows the user to view interactions through the eyes and ears of a non-native English speaker. After biting into a “magical” cookie, the participant is transported to a fantasy world. In order to achieve their goal of finding a washroom, the immersant must interact with nearby talking animals.
Goal: The main emotion we are trying to evoke is the vulnerability of relying on other people to be willing to help you when they don’t understand what you are saying. What makes our project unique is its ability to make anyone feel as though they do not understand the language, since it is made up. This ties in with the course theme because it evokes empathy in the user and gives them an experience they would not otherwise have access to.
Core user experience:
Through BARRIERS, we hope to place users in the vulnerable position of relying on other people for help. This experience aims to recreate the frustration that comes with not being able to speak English and the gratitude that arises when an individual takes the time to explain and use different communication techniques to foster understanding.
The Pitch of Red: Every colour has a voice
The Pitch of Red is an immersive VR experience that allows the user to experience synesthesia — a condition that blends two or more senses together. In our experience, users are able to hear the sounds of the colours they see as they wander through the mind of Wassily Kandinsky — an artist who had such a condition.
Goal: The project tries to bring the user closer to the notion of mixing human senses. This blending of senses can be experienced by anyone to some extent; it is mostly based on what a person associates with a certain sound, colour, taste, or other medium. The number of people who experience synesthesia is relatively small; however, the impact they have made in history and their way of perceiving the world are worth studying. The project focuses on communicating the idea of mixing human senses, also known as synesthesia, to those who have never heard of it or experienced it.
Core user experience: For The Pitch of Red we aim to provide immersants with a greater understanding of what it is like to have the type of synesthesia that associates colours with sounds, or at the very least, to be able to describe this synesthesia to someone who may not know what it means. During the experience they will be granted a “superpower”: being able to hear every colour they look at. We expect users to feel curious, excited, and, intentionally, a bit overwhelmed. After finishing our VR experience, we would like users to come away with a better understanding of what this particular type of synesthesia might feel like in real life.
Aesthetically, we want the experience to allow the immersant to see an alternate reality through the eyes of Wassily Kandinsky. With that said, one of the difficulties of VR is the lack of haptic touch, so our focus with The Pitch of Red is on simulating the relationship between sight and sound, as sketched below.
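To make the colour-to-sound idea concrete, here is a minimal illustrative sketch of one way such a mapping could work: take the colour the user is looking at, extract its hue, and map that hue onto a pitch. This is not the actual project code (the student project was built in Unity3D); the `hue_to_frequency` helper, the chosen frequency range, and the WAV-file output are hypothetical choices for illustration only.

```python
import colorsys
import math
import struct
import wave

def hue_to_frequency(r, g, b, f_min=220.0, f_max=880.0):
    """Map an RGB colour (0-1 floats) to a pitch in Hz.

    The hue (0-1) is spread logarithmically across the two octaves
    between f_min and f_max; the exact range is an arbitrary choice.
    """
    hue, _lightness, _saturation = colorsys.rgb_to_hls(r, g, b)
    return f_min * (f_max / f_min) ** hue

def write_tone(path, frequency, duration=0.5, sample_rate=44100, volume=0.4):
    """Write a short sine tone to a mono 16-bit WAV file."""
    n_samples = int(duration * sample_rate)
    with wave.open(path, "w") as wav:
        wav.setnchannels(1)
        wav.setsampwidth(2)          # 16-bit samples
        wav.setframerate(sample_rate)
        for i in range(n_samples):
            sample = volume * math.sin(2 * math.pi * frequency * i / sample_rate)
            wav.writeframes(struct.pack("<h", int(sample * 32767)))

if __name__ == "__main__":
    # Example: a saturated red and a saturated blue produce different pitches.
    for name, rgb in [("red", (1.0, 0.0, 0.0)), ("blue", (0.0, 0.0, 1.0))]:
        freq = hue_to_frequency(*rgb)
        print(f"{name}: {freq:.1f} Hz")
        write_tone(f"{name}_tone.wav", freq)
```

In an actual VR implementation the colour would come from whatever surface the user’s gaze ray hits, and the tone would be synthesized continuously rather than written to a file; the sketch only shows the core hue-to-pitch idea.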
Together with my colleague Patrick Pennefather, we’re currently designing and getting ready to co-teach a brand-new 15-credit course entitled “Semester in Alternate Realities”.
In this project-based course, participants will be challenged to develop solutions using technologies such as VR (e.g., Oculus Rift, HTC Vive) and immersive multi-modal media installations. In addition to focusing on the co-construction of digital prototypes affording meaningful experiences in “alternate realities”, our objective is to stimulate documented reflection and discussion throughout the process. Participants can expect to work collaboratively, be matched according to the skills they bring, and be provided time and resources to learn new techniques and approaches, soft and hard skills, and processes to conduct user research. Participants will get the opportunity to reflect on future technologies and their potential impact on the world, improve their presentation skills, and publicly showcase their projects. To incorporate diverse perspectives, students from different disciplines are invited to apply and, in their application, argue how they could contribute to the course and the co-construction of team projects (application deadline: end of October 2018).
This semester’s design challenge is “creating for good”: Use alternate realities techniques and technologies, guiding theoretical frameworks, and appropriate processes, project management and collaboration approaches to iteratively ideate, design, prototype, and evaluate an interactive alternate realities experience that affords meaningful experiences for the betterment of humanity and/or our planet.
To give you an idea of what projects might look like, view the VR project videos or this AWE video from previous student teams in courses that Bernhard and Patrick have taught. There is increasing evidence that the immersive nature of VR makes it a powerful medium for “doing good,” and it is particularly well-suited for helping people develop compassion and empathy. In this course, we will explore the potential of doing good using alternate realities (which are booming around the world and particularly in Vancouver).
We’ll have our first Student Virtual Reality Showcase on Friday February 1st, ca. 12:30–4:30 pm, on the SFU Surrey Campus Mezzanine, where the 18 students from different SFU departments will show their first immersive projects.
My TEDxEastVan talk from September 16th is now finally online! It was entitled “Could Virtual Reality Make us More Human” and includes some of our recent research, future directions, and examples from my teaching on immersive environments. Enjoy!
It was such an honour and amazing experience to present at TEDxEastVan on September 16th! Below are some first pictures, the video will be released later in October and will be posted here.
My TEDx talk was titled “Could Virtual Reality Make us More Human” and included some of our recent research and examples from my teaching on immersive environments. Here’s some of the ideas in a nutshell:
Virtual reality is becoming increasingly accessible and affordable, and offers the unique opportunity to provide first-hand and embodied experiences. How could we use this potential to go beyond entertainment and gaming, for creating positive or even transformational experiences we might otherwise not be able to have? And how could we democratize the medium and put this powerful technology into the creative hands of more people?
I’m honoured, exciTED (and a bit nervous) to have been selected to present at TEDxEastVan on September 16th.
My TEDx talk will be titled “Could Virtual Reality Make us More Human” and will include some of our recent research and examples from my teaching on immersive environments. Here are some of the ideas I’ll put forth:
Virtual reality is becoming increasingly accessible and affordable, and offers the unique opportunity to provide first-hand and embodied experiences. How could we use this potential to go beyond entertainment and gaming, for creating positive or even transformational experiences we might otherwise not be able to have? And how could we democratize the medium and put this powerful technology into the creative hands of more people?
For those who couldn’t make it out to our Virtual Reality — Going Beyond showcase from the IAT 445 “immersive environments” course, here are some of the project videos. Thanks to all the students & TA Alex for all the great work and inspiring projects!
Forlorn
Canvas of Sound
Echo
Adrift
Finding Home
The Cave
Here are some pictures and project posters from our Virtual Reality — Going Beyond showcase from the IAT 445 “immersive environments” course that I taught with lots of great help from TA & PhD student Alex Kitson. Project videos will follow soon…
Pictures from showcase:
Project posters:
IAT 445 Project showcase on Friday June 23, 2017, 10am-2:30pm
On Friday June 23, 2017, the students from my course on “immersive environments” (IAT 445) will be presenting their final projects in the Mezzanine on our SFU Surrey campus, from about 10am — 2:30pm.
9 student teams will showcase their own immersive Virtual Reality projects that they developed in the popular game engine Unity3D and will present using the Oculus Rift head-mounted display.
Some projects draw from contemporary indie/art computer games like Dear Esther, Journey, or Stanley’s Parable and cinema/television.
Students were tasked to design for a purposeful and immersive user experience; this semester’s design challenge for students was Going Beyond: “Use Unity3D and guiding frameworks (e.g., immersion, presence, user-centered systems design, etc.) to iteratively ideate, design, prototype, and evaluate an immersive and interactive virtual environment experience that ‘goes beyond’: How could you provide interesting, inspiring, or meaningful experiences in VR? That is, what experiences could you provide in VR that are otherwise difficult, dangerous, or hard to experience? Instead of using VR as only a pastime and ultimate sensory-overload tool to wow people, how could you use it for something more interesting, novel, exciting, or meaningful?” Be prepared for some exciting showcases!
In case you can’t make it to the interactive project showcase, you can join the public project video presentation session on Thursday June 29th at 2:30pm, in Surrey room #5380, or wait for the best videos to be posted online.
“We went to the Moon as technicians; we returned as humanitarians,” reflected Edgar Mitchell after his space flight. This describes the overview effect: a profound, awe-inspiring experience of seeing Earth from space that results in a cognitive shift, leading to a more conscious and caring view of our planet. Experiencing Earth from space first-hand made many astronauts realize that Earth is fragile and without borders, leading to a feeling of connectedness to humanity and our planet (see astronauts’ quotes). Such an awareness shift could have a positive impact on our society and planet, especially if we had a tool that allowed more people to experience it without the risk, cost, and environmental footprint associated with actual space flight.
To this end, the iSpace Lab investigates how we could best use the potential of immersive virtual reality to give people a glimpse of the overview effect without having to send more rockets to space. At the same time, we use virtual reality as a tool allowing us to better understand the experience and underlying triggers of the overview effect phenomenon.
To this end, we will:
(1) design a set of introspective, physiological, and behavioural evaluation research tools to better understand the overview effect phenomenon and how immersive VR could serve to induce it; and
(2) pilot these research tools as an essential part of our larger research program through the creation of a VR environment, gaining a deeper understanding of the personal experience of the overview effect delivered through VR, and, as a result, deriving strategies for the design of pivotal VR experiences with the long-term goal of inducing positive social change in the population.
Below is our first video explaining the overall idea of the Earthgazement project. Thanks to the IAT 344 student team: Joanna Chou, Katarina Shao, Lien Chou, & Sidi Zhong!
Here’s the official SFU 1-minute coverage of my IAT 445 Immersive Environments course:
And below are some of the videos from the showcase of our Immersive Environments course (IAT 445). For more info, see Immersive Environments. We’ll offer the course again in Summer 2017 as a compact (intersession) course.
Parallel Minds
The Reef
The Painter
Retrograde
Project showcase on Friday December 9th 2016, 10am-2pm
On Friday December 9th 2016, the students from my course on “immersive environments” (IAT 445) will be presenting their final projects in the Mezzanine on our SFU Surrey campus, from about 10:00am — 2pm.
IAT 445 immersive environments showcase Fall 2016
10 student teams will showcase their own immersive Virtual Reality projects that they developed in the popular game engine Unity3D and will present using the Oculus Rift DK2 head-mounted display.
Some projects draw from contemporary indie/art computer games like Dear Esther, Journey, or Stanley’s Parable and cinema/television. Students were tasked to design for a purposeful and immersive user experience; this semester’s design challenge for students was evoking a strong yet meaningful feeling of empathy: “Use Unity3D and guiding frameworks (e.g., immersion, presence, user-centered systems design, etc.) to iteratively ideate, design, prototype, and evaluate an immersive and interactive virtual environment that evokes empathy in a meaningful way. This could be empathy towards humans as well as non-human animals, plants (e.g., trees), or even inanimate natural objects (mountains).” So be prepared for some exciting showcases!
In case you can’t make it to the interactive project showcase, you can join the public project video presentation session on Thursday December 15th at 4:30pm, in Surrey room #2600 (the large theatre), or wait for the best videos to be posted online.
Here’s the video recording of the whole session of the 16 project presentations from my “Quantitative Research Methods” (IAT 802) course:
Here’s the list of speakers and topics:
Elgin McLaren: Attention Retention: The effectiveness of neurofeedback systems for cueing sustained attention
Jeff Ens: Music and the role of dimensional complexity in similarity judgements
Arron Ferguson: Choose Wrong, Someone Dies: Measuring Engagement with Ethical Choices and Character Consistency in Interactive Narrative
Duc-Minh Pham: Body-based Navigation: A Promising Locomotion Technique in Immersive Virtual Environment.
Ray Pan: “Split, Horizontal or Overlapped?”: Comparing Social Presence and Body Ownership in Shared Video Views for Long Distance Relationships
Denise Quesnel: Are you awed yet? Objective and subjective indicators of awe, using virtual reality content
Mia Cole: Time to Relax: No effects to the stress response after short-term use of an EEG-based brain-computer interface.
Maha El Meseery: Will tracking user interactions during visual exploration helps improve their analysis efficiency?
Ted Nguyen Vo: Moving in a Box: A Visual Cue for Virtual Reality Locomotion
Fatemeh Salehian Kia: Motive or Strategic Student: Comparing 3 Types of Visual Feedbacks on Students’ Performance with Different Learning Styles in Online Discussions
Junwei Sun: Assessing Input Methods and Cursor Displays for 3D Positioning with HMDs
Narges Ashtari: Exploring factors which affect architects design exploration structure in CAD spaces
Stephanie Wong: Easy A: assessing student’s ability to cheat with smartwatch
Abraham Hashemian: Leaning-Based 360 Locomotion Interfaces: How good are they for navigation in Virtual Reality
Serkan Pekcetin: Measuring the Effect of Binaural Audio on the Sense of Direction in Virtual Environments
Xintian Sun: Where Was It? Evaluating Spatial Memory in Different Backgrounds from Static and Moving Viewpoints
Our conference paper just received an honourable mention at the ACM SUI 2016 conference in Tokyo. Congratulations to all!
Using a custom-designed foot haptics system and evaluating it in a multi-part study, we show that adding walking-related auditory cues (footstep sounds), visual cues (simulating the bobbing head motions of walking), and vibrotactile cues (via vibrotactile transducers and bass-shakers under participants’ feet) could all enhance participants’ sensation of self-motion (vection) and involvement/presence. Compared to seated joystick control, standing leaning enhanced self-motion sensations.
The “overview effect” is an awareness shift experienced by astronauts when they see the Earth from space and realize how fragile it is. It is described as a profound effect leading to a more conscious and caring view of our planet. Could we use immersive Virtual Reality (and some other tricks) to give people a glimpse of this experience without having to spend all the money and fossil fuel to send rockets out into space? We’ll post more info about this project on our Virtual EarthGazing project page soon. Feel free to contact us if you’re interested in collaborating.
Here’s a nice overview video of the overall topic (thanks to the Planetary Collective):
Here’s a video of Mirjana’s presentation on “Living In A Box: Potentials and Challenges of Existence in VR” from the Consumer Virtual Reality (CVR) show in Vancouver (May 2016). Way to go Mirjana!
For more information on her projects, see the Pulse Breath Water project page.
Here’s a video of my presentation at the 2016 International Psychonomics Conference in Granada about an online spatial orientation study and the rather unexpected response patterns that we observed — and how we might be able to make sense of them.
You can find more info about this project on this page.
Here’s the reference for the talk:
Riecke, B. E., Stepanova, E. R., & Kitson, A. (2016). New response patterns in point-to-origin tasks depending on stimulus type and response mode. Talk presented at the International Meeting of the Psychonomic Society, Granada, Spain.
Last year Alex Kitson gave a great presentation at the 2nd International Workshop on Movement and Computing (MoCo) in Vancouver, co-located with ISEA.
Here, at last, is a recording and the full reference: Kitson, A., Riecke, B. E., & Stepanova, E. R. (2015). (pp. 100–103). Presented at MOCO ’15 – 2nd International Workshop on Movement and Computing, Vancouver, Canada: ACM. doi:10.1145/2790994.2791014
http://dl.acm.org/citation.cfm?id=279. Enjoy!
Here’s a recording of an invited talk I just gave about some aspects of our theoretical framework on spatial orientation and reference frame conflicts.
The talk was entitled “Qualitative Modeling of Spatial Orientation Processes and Concurrent Reference Frame Conflicts using Logical Propositions” and was presented at the International Workshop on Models and Representations in Spatial Cognition at the Hanse-Wissenschaftskolleg in Delmenhorst, Germany, March 3–4, 2016.
I just presented 2 papers at the 2015 ACM Spatial User Interaction Symposium in LA. Below are references & a simple video recording for those who couldn’t make it down to LA.
Kruijff, E., Riecke, B. E., Trepkowski, C., & Kitson, A. (2015). Upper Body Leaning can affect Forward Self-Motion Perception in Virtual Environments (pp. 103–112). Presented at the SUI ’15: Symposium on Spatial User Interaction, Los Angeles, CA, USA: ACM. doi:10.1145/2788940.2788943, http://dl.acm.org/citation.cfm?id=2788943
https://youtu.be/kZUkhI2UI7s
Kitson, A., Riecke, B. E., Hashemian, A. M., & Neustaedter, C. (2015). NaviChair: an embodied interface to navigate virtual reality (pp. 123–126). Presented at the SUI ’15: Symposium on Spatial User Interaction, Los Angeles, CA, USA: ACM. doi:10.1145/2788940.2788956, see http://dl.acm.org/citation.cfm?id=2788940.2788956 for the full paper
On Friday June 26th, the students from the summer 2015 offering of my “immersive environments” course (IAT 445) will be presenting their final projects in the Mezzanine on our SFU Surrey campus, from about 10:30 am to about 1:30 pm.
10 student teams will showcase their own immersive Virtual Reality projects that they developed in the popular game engine Unity3D and will present using the Oculus Rift DK2 head-mounted display.
Some projects draw from contemporary indie/art computer games like Dear Esther, Journey, or Five Nights at Freddy’s and cinema/television. Students were tasked to design for a purposeful and immersive user experience; this semester’s design challenge for students was evoking a strong yet meaningful emotional or visceral response: “Use Unity3D and guiding frameworks (e.g., immersion, presence, user-centered systems design, etc.) to iteratively ideate, design, prototype, and evaluate an immersive and interactive virtual environment that evokes a strong yet meaningful emotional or visceral response in the users.” So be prepared for some exciting showcases!
Cheers & hope to see you there,
Bernhard
P.S. In case you can’t make it to the interactive project showcase, you can join the public project video presentation session on Friday July 3rd at 10:30am, in Surrey room #2600 (the large theatre), or wait for the best videos to be posted online.