This course focuses on psychological theory and application relevant to interacting with current and emerging digital technologies. Topics will typically include interfacing and communicating with artificial intelligence, perception and cognition in digital spaces such as virtual and augmented reality, and how we can feel “present” in our digital experiences. This course will be taught in a Virtual Reality Classroom.
Note: This course requires students to have a Virtual Reality Head Mounted Display (HMD).
The first of its kind in Canada, the class, which started Sept. 14, filled its 20 spots (standard for a fourth-year psych course) in a matter of days…
“Immersion in media is a topic that’s been around for a long time, but it takes on a whole different level when you talk about it in VR,” Chaston says, noting it will play a role in everything from work and play to shopping as retailers set up VR stores.
After diving deep into what VR is and how it works, the course will focus on Chaston’s research into using VR nature scenes to lower stress levels. The class is set up as a three-hour block, and students have already been invited to a couple of VR “events” to ensure they are comfortable in the space. The first day was an introduction, including basic etiquette for behaviour in VR. While most class time will be spent in VR, there will be time for group work using other, more traditional online formats like Google Meet, so that students aren’t wearing headsets for three hours straight. As note-taking is tough in VR, notes will be provided separately.
Chaston credits Anna Nuhn (who has since left MRU) and Erik Christiansen at the Riddell Library and Learning Centre and MRU psychology professor Dr. Evelyn Field, PhD, for their help over the past year in developing the course.
My half-hour presentation on virtual reality in higher education (with an emphasis on social VR), which I gave to my university’s Senate Committee on Academic Computing, was well received. As promised, here is a copy of my PowerPoint slides from yesterday’s presentation (60 slides, 191MB in size; it’s so large because I included a number of animated GIFs). I would have liked to embed the slides in this blogpost, but none of the methods I tried worked, so you will have to download the slides and run them on your own computer!
I had been sorely tempted to set this so that you had to join my Patreon in order to get it, which would have cost you at least $1.00, but in the end I decided that it would be seen by more people if I made it free to download. However, I do reserve the right at a future point to start doing this for some of my content! From the bottom of my heart, I would like to thank my Patreon patrons; their support means the world to me! I am going to rack my brains to see if I can come up with new perks for you!
Conducting experiments in VR can sometimes be difficult, involving the purchase and setup of sometimes expensive hardware (particularly if multiple headsets need to be bought). University budgets can only go so far, even at the best of times. One way to get around this is to use existing commercial social VR platforms and their users as volunteers (who, of course, already have their own equipment).
This is a form of what is called crowdsourcing: dividing up a task among a large group of volunteers. In this case, researchers at Northeastern University in Boston, Massachusetts conducted a small demonstration experiment to show that recruiting study volunteers via VRChat was feasible, publishing a paper at a computer science conference held last year. The research paper is unfortunately not free to access and read, but you can always use your friendly local public or academic library to obtain a copy of it! Here’s the citation:
Saffo, D., Yildirim, C., Di Bartolomeo, S., & Dunne, C. (2020). Crowdsourcing virtual reality experiments using VRChat. Conference on Human Factors in Computing Systems – Proceedings, 1–8. https://doi.org/10.1145/3334480.3382829
According to the conference paper’s abstract:
Research involving Virtual Reality (VR) headsets is becoming more and more popular. However, scaling VR experiments is challenging as researchers are often limited to using one or a small number of headsets for in-lab studies. One general way to scale experiments is through crowdsourcing so as to have access to a large pool of diverse participants with relatively little expense of time and money. Unfortunately, there is no easy way to crowdsource VR experiments. We demonstrate that it is possible to implement and run crowdsourced VR experiments using a preexisting massively multiplayer online VR social platform—VRChat. Our small (n = 10) demonstration experiment required participants to navigate a maze in VR. Participants searched for two targets then returned to the exit while we captured completion time and position over time. While there are some limitations with using VRChat, overall we have demonstrated a promising approach for running crowdsourced VR experiments.
One of the features which attracted the researchers to VRChat was the ability to build custom virtual worlds or rooms:
VRChat also has a special feature that sparked our interest: it allows users to upload custom rooms built with Unity by using a proprietary VRChat SDK. The SDK contains special triggers and event handlers that can be triggered by users, in addition to giving the possibility to upload rooms made of and containing any kind of 3D models made by a creator. We started asking ourselves if we could leverage the vast amount of VRChat users who already own VR equipment and use them as experiment participants by building a custom room that contained the implementation of our experiment, in order to run crowdsourced experiments in VRChat.
And so they built a maze and ran a simple experiment:
The participants in the experiment were asked to run through a VR maze, find two targets inside the maze, and go back to the exit. The experiment was run using two points of view, immersive and non-immersive, and compared the timing between a group of self-declared gamers and non-gamers. Our reasoning for choosing this experiment over others was that it was simple enough to avoid having too many variables influencing the results, and it would give us a quick way to evaluate the process of conducting a user study on the platform.
A researcher would then visit a public world in VRChat, asking the users present if they would be willing to run the maze.
After joining a public world, we began by looking for users using HMDs. We did this by asking users directly if they were using VR, or by observing their in-game movements as VR users have full head and sometimes hand tracking. We found that most users we approached were willing and eager to participate. After users had joined our world, they would spawn in a waiting room where we could give them further instructions. At this stage researchers conducting a user study may also present digital consent forms for participants to read and sign.
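To make the data-capture step concrete: the paper says the researchers recorded each participant’s completion time and position over time while they ran the maze. Here is a minimal post-hoc analysis sketch of my own (purely illustrative, not the authors’ actual code), assuming each participant’s log is simply a list of (timestamp-in-seconds, x, y, z) samples from maze entry to return to the exit:

```python
import math

def completion_time(log):
    """Elapsed time from the first to the last position sample."""
    return log[-1][0] - log[0][0]

def path_length(log):
    """Total distance travelled, summed over successive position samples."""
    total = 0.0
    for (t0, x0, y0, z0), (t1, x1, y1, z1) in zip(log, log[1:]):
        total += math.dist((x0, y0, z0), (x1, y1, z1))
    return total

# Hypothetical logs for two participants, as (timestamp, x, y, z) samples:
gamer = [(0.0, 0, 0, 0), (5.0, 4, 0, 0), (12.0, 4, 0, 3)]
non_gamer = [(0.0, 0, 0, 0), (9.0, 4, 0, 0), (25.0, 4, 0, 3)]

print(completion_time(gamer))      # 12.0 seconds
print(path_length(gamer))          # 7.0 metres (4 + 3)
print(completion_time(non_gamer))  # 25.0 seconds
```

With logs in this shape, comparing the self-declared gamer and non-gamer groups reduces to comparing summary statistics (mean completion time, mean path length) between the two lists of participants.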
The researchers noted that, at the time of the proof-of-concept experiment, they were somewhat limited by the relatively narrow scope of what they could build using the then-available version of the VRChat SDK (software development kit). However, they noted that the next-generation graphical SDK (called Udon) offered the ability to build more complex interactive worlds, thereby expanding the possible uses for VR experiments.
The researchers also noted the relative ease and cost effectiveness with which VRChat could be used for academic research into the growing field of social or collaborative virtual reality:
It is particularly exciting to note that VRChat can also be used to implement collaborative VR studies. Previously, such studies would require custom multiplayer platform development. VRChat not only provides an SDK to create worlds but also all the network capabilities to have several concurrent users all in the same virtual space.
UPDATE 2:02 p.m.: I’ve just discovered a recent five-minute YouTube video featuring the Northeastern researchers, explaining the concept of using existing social VR platforms for their experiments:
This video mentions and summarizes a second, follow-up research paper, which I have not yet read (again, you will have to pay to access this conference paper; you should be able to obtain a copy via your local public or academic library). Here’s the citation for you:
Saffo, D., Di Bartolomeo, S., Yildirim, C., & Dunne, C. (2021). Remote and collaborative virtual reality experiments via social VR platforms. Conference on Human Factors in Computing Systems – Proceedings. https://doi.org/10.1145/3411764.3445426
I’m quite eager to read this second research paper! According to the description of the YouTube video, a preprint of this conference paper and all supplemental materials are available at the following URL: osf.io/c2amz (so you might not need to pay for a copy via interlibrary loan/document delivery from your local library, after all).
Of course, it’s not just VRChat that could be repurposed as an academic testbed. Any number of commercially available social VR platforms can be used as cost-effective platforms to conduct VR experiments! The researchers at Northeastern University are to be commended for their proof-of-concept work, and I very much look forward to seeing other uses of social VR platforms in various areas of academic virtual reality research.
Fisk University, a private, historically Black university located in Nashville, Tennessee, will launch a virtual human cadaver lab for its pre-med and biology students this fall. The cadaver laboratory will use the social VR platform ENGAGE, in a partnership between Fisk University, HTC VIVE, T-Mobile, and VictoryXR (an educational content creation company that uses ENGAGE as its platform).
Inside the lab, students will examine the internal organs of various human systems, and the professor can even remove the organs from the body and pass them around for students to hold and open. Students will be able to enlarge an organ until it is big enough to step inside, to better learn how it works. In addition to organ systems, the cadavers will also include complete skeletal and muscle structures.
“With this cadaver lab, our pre-med students will no longer need to rely on other universities for advanced anatomy and biology classes,” said Dr. Shirley Brown, Dean of Fisk University. “Virtual reality technology takes our university to a level equal to the most advanced schools in the country.”
In the past, Fisk University has not purchased cadavers due to the high cost and maintenance. But with a virtual cadaver lab, the university can offer state-of-the-art scientific learning that’s affordable and easy to maintain. Virtual cadavers do not degrade, and over time additional specialties can be added to the software such as surgical procedures, comparative learning between human and animal as well as microbiology at the cellular level.
Here’s a two-minute promotional video for the project:
The…costs to own a cadaver lab are on the order of millions of dollars. Not all universities can afford that. At the moment there is a slightly better alternative, which is using ultra-realistic synthetic cadavers that can also simulate some motions of the human body (e.g. the heart pumping), but each one costs $60,000–100,000. This means that a university must still invest a great deal of money to own them.
We all know that virtual reality can replicate real objects pretty well, so VictoryXR had the idea of reproducing a cadaver lab in virtual reality: apart from the fixed cost of the 3D elements, this laboratory would scale well with the number of students and would require almost no maintenance cost. This is a very smart solution to make education more accessible for medical students. Thanks to this, many more universities would be able to afford a virtual cadaver lab, even in non-wealthy countries. We always talk about VR being able to democratize education, and this is one bright example of how it can do that.
Tony came away from his brief demo favourably impressed:
I had just a short demo with the virtual lab, and I think that it is a good start for Fisk University and VictoryXR. I don’t think that at the moment it can replace the real experience with a cadaver, because you miss all the tactile sensations, the weight, and also the creepiness of having a real organ in your hands. But it can be a good substitute for starting to learn about the human body, to observe the organs in detail, and to gain confidence in holding a bone or a body part of someone else. It could support a good course, after which the students might need just a few final lessons with real corpses in another location. It is a good way of giving value to many medical schools, not only in the U.S. but in the whole world, especially the ones that can’t afford real or synthetic cadavers.
What impressed me the most is the potential that this solution can have in the future. There are things that VR can give to students that are hardly possible in real life. The fact that you can enter an organ with your teacher and examine it at both the macro and micro level is one amazing thing, for instance. The possibility of organizing minigames (like the puzzle) that are engaging and improve learning efficiency via interactivity is something that VR enables, and that would be too creepy to do in real life. The possibility of doing many simulated surgeries on the cadavers, with the ability to repeat every operation at no additional cost, is another cool thing.
Thanks to Chris Madsen/DeepRifter of ENGAGE for the heads up, and Tony Vitillo/SkarredGhost for his report and pictures! You can read Tony’s review in full here, and I strongly recommend you follow his blog as well as my own!