Taking a Moment to Catch My Breath and Figure Out Where I’m Going Next

So, as I have mentioned, I haven’t been blogging much lately, because I have been so busy with my full-time paying job as an academic librarian at my employer, the University of Manitoba in Winnipeg, Canada. Now that the annual rush of training hundreds of students on how to use the university libraries effectively and efficiently has ended, my attention turns to my other big project: specifying hardware and software for a virtual reality lab, which we are calling the XR Lab (the XR stands for eXtended Reality, a sort of umbrella term used for virtual reality, augmented reality, mixed reality, and what Apple is now calling spatial computing).

The purpose of this lab is to provide virtual reality and augmented reality hardware and software (both VR/AR experiences and content creation tools) to University of Manitoba faculty, staff, and students to support their teaching, learning, and research. I have been working on this project for the past two and a half years, and it is a weird feeling to finally see the computers removed from the room which we have designated as the future home of the XR Lab, in preparation for the necessary room renovations (which are to start soon, and are supposed to be completed by spring next year):

The former computer lab which will be renovated to create the XR Lab

In the meantime, I have been cross-training another Libraries staff member on the hardware and software which I am proposing for the XR Lab. In other words, if (God forbid!) I should get run over by a bus, the idea is that somebody will be able to give VR/AR demos in my place. There is a lot of information which has to be shared! For example, our last training session included a section on how to set the correct interpupillary distance (IPD) on both the Vive Pro 2 and Meta Quest 3 headsets (thankfully, the Apple Vision Pro scans your eyes and sets the IPD automatically!).

Just another day in the office: the Vive Pro 2 VR headset is sitting on the Windows desktop PC it is tethered to on the right, the Meta Quest 3 is to the left near the back of the table, and the Apple Vision Pro is sitting at the centre, near the front of the table.

There are a lot of balls to juggle, and I must confess that I often feel exhausted and even overwhelmed at times. When I come home from work, the last thing I want to do is write a blog post! So my formerly feverish blogging pace has unfortunately slowed to a crawl. Also, my blogpost viewing stats are way, waaay down. Where I used to get 1,500 views a day, now I’m lucky to reach even one third of that:

Partly it’s because the metaverse hype cycle has crested and crashed (and everyone has jumped on the artificial intelligence bandwagon), and partly it’s because longform blogs seem to be an increasingly outdated—even quaint—means of communication in the current short-attention-span era of Instagram pictures and TikTok videos.

Which means I seriously need to pause and think about the direction I want to take this blog in, and who I want my audience to be. One of the things that I have always said is that, in a blog that literally has my name in the URL, anything I want to talk about here is on topic! However, I am wondering if perhaps I have cast my net a little too broadly, and it might be time to narrow the focus of the RyanSchultz.com blog somewhat.

I don’t think that I will cease blogging completely; I still feel the need to write, but I need to reflect a bit on what I want to write about, and why. I still do get a sense of accomplishment when I craft a well-written blog post on a topic that I care about and, as always, I read and appreciate all the comments and feedback I receive on my blogposts!

So please bear with me as I figure out where I am going next with (gestures broadly) all this.

It can be difficult to choose the next direction in which to go (Image by Rama Krishna Karumanchi from Pixabay)

IMPORTANT HOUSEKEEPING ANNOUNCEMENT AND APOLOGY: Why I Am Putting The RyanSchultz.com Blog on the Back Burner for the Foreseeable Future

When you’re up to your ass in alligators, it’s hard to remember that your initial objective was to drain the swamp.

Modern proverb, possibly Cajun

A picture of the equipment setup in the temporary virtual reality demonstration room in Elizabeth Dafoe Library, with a Meta Quest 3 headset (left, the white headset), and the Vive Pro 2 headset with the “wand” controllers (centre front, the black headset). You can see on the wall-mounted computer monitor behind them a view of the Sansar world No Spectators: The Art of Burning Man – 2nd Floor, a gallery experience by the Smithsonian.

So, as you might have noticed, I haven’t been blogging very much lately (again).

There are a few reasons why, chief among them that I have been through a library move. The building which houses the university science library where I work full-time has been closed, and both the staff and collections have been moved to other locations. The building is going to be completely gutted and renovated over the next two-and-a-half to three years. Moving a large library is a MAJOR undertaking, folks! And just days after the move in June, as luck would have it, we hosted a science librarians conference, which had attendees coming from all over North America. The last month has been hectic! I haven’t even had an opportunity to unpack most of my moving boxes in my new office!

But another reason why I haven’t been writing much lately is that the virtual reality lab project I am working on is starting to ramp up. While plans for the necessary room renovations for the future home of the XR (Extended Reality) Lab are proceeding (with a projected ready date of January 2025), I have been given a smaller room in the main arts and humanities library to set up a temporary virtual reality demonstration room, equipped with a wireless Meta Quest 3 VR/AR headset, plus a Vive Pro 2 PCVR setup, attached to a Windows PC with a good graphics card (see image above).

I have been spending most of last week and this week previewing and reviewing a curated selection of apps and experiences, and drafting a “menu” for both the Meta Quest 3 and the Vive Pro 2, which I will be giving to Libraries staff so they can decide what VR/AR experiences they would like to have. Most of them are brand new to virtual reality and augmented reality, so I still need to work out the best procedures for giving these demos: cleaning the hardware between users, helping users avoid VR sickness, and so on.

In fact, I have spent so much time hopping in and out of various VR apps to draw up the menus, that I have often given myself VR sickness, something which surprised me, as a virtual reality veteran! I have been using a wide variety of headsets since January 2017, and I am usually able to be in VR for two hours at a time!

I discussed this at the first meeting of the University of Manitoba VR/AR/MR/XR Group (a new group I helped organize, for U of M faculty, staff, and students working in virtual reality, augmented reality, mixed reality, extended reality, spatial computing—and whatever other umbrella term they come up with next!), and the head of the computer science department told me that, in his opinion, part of the problem is that many newer app developers don’t put the same amount of care and attention into designing affordances that the earliest VR apps had. He has a good point.

In other words, some VR/AR developers are just throwing stuff together using the new and improved content creation tools, without really doing proper testing. I do think that there is some merit in this idea, based on my own experience over the past two weeks. So I am finding that I am having to take breaks from all my VR/AR activity until the nausea passes. And it has reminded me that I definitely need to keep VR sickness top of mind when giving demos!

Along with off-the-shelf apps (educational and non-gaming, although some apps might have a gamification component) from both the Quest store (for the Meta Quest 3) and the Steam store (for the Vive Pro 2), I am also including in my menus some examples of educational worlds which people have created in various social VR platforms. Some examples are the NASA Apollo moon-landing exhibit in Sansar, The Universe microscopic-to-macroscopic experience in Resonite, and the Ancient Athens Acropolis and Agora worlds, which have been moved from AltspaceVR to VRChat. There’s a lot of content out there! I want Libraries staff to be able to experience as much of it as possible, to get a sense of the possibilities of this technology. (Right now, I am focused on free apps and experiences, but eventually I will have a budget to purchase software.)

So, I have been extremely busy, and sometimes I do feel a bit overwhelmed. Quite often, when I come home from work, the last thing I want to do is sit in front of a computer, and especially put on another virtual reality headset! So my trusty Valve Index, with the Knuckles controllers, is quietly collecting dust on my computer desk at home.

So I apologize for the lack of blog posts lately, but as you can see, I’m trying to keep a lot of plates spinning at the moment! I am going to have to put this blog on the back burner for the foreseeable future. Thank you for your patience and understanding.

Kandyan Plate Spinners (CC BY-SA 2.0 Antony Stanley, from Flickr)

UPDATED! Meta Announces the Meta Horizon Operating System for Future Third-Party VR/AR/MR Headsets, and Partnerships with ASUS, Lenovo, and Xbox (Also: Reports of Slower-Than-Expected Sales for the Apple Vision Pro)

On April 22nd, 2024, Meta (the company formerly known as Facebook) made an announcement titled A New Era for Mixed Reality:

Today we’re taking the next step toward our vision for a more open computing platform for the metaverse. We’re opening up the operating system powering our Meta Quest devices to third-party hardware makers, giving more choice to consumers and a larger ecosystem for developers to build for. We’re working with leading global technology companies to bring this new ecosystem to life and making it even easier for developers to build apps and reach their audiences on the platform.

This new hardware ecosystem will run on Meta Horizon OS, the mixed reality operating system that powers our Meta Quest headsets. We chose this name to reflect our vision of a computing platform built around people and connection—and the shared social fabric that makes this possible. Meta Horizon OS combines the core technologies powering today’s mixed reality experiences with a suite of features that put social presence at the center of the platform.

Of course, this also includes the Meta Quest Store, which will apparently be renamed the Meta Horizon Store:

Developers and creators can take advantage of all these technologies using the custom frameworks and tooling we’ve built for creating mixed reality experiences, and they can reach their communities and grow their businesses through the content discovery and monetization platforms built into the OS. These include the Meta Quest Store, which contains the world’s best library of immersive apps and experiences—we’re renaming it to the Meta Horizon Store.

And, as you might expect with a company whose profits still largely derive from social media based on surveillance capitalism, you’d best believe that Meta wants to make sure that it inserts itself into all the social aspects of this technology, as it licenses the tech to other companies:

The Horizon social layer currently powering Meta Quest devices will extend across this new ecosystem. It enables people’s identities, avatars, and friend groups to move with them across virtual spaces and lets developers integrate rich social features into their apps. And because this social layer is made to bridge multiple platforms, people can spend time together in virtual worlds that exist across mixed reality, mobile, and desktop devices. Meta Horizon OS devices will also use the same mobile companion app that Meta Quest owners use today—we’ll rename this as the Meta Horizon app.

It looks very much as though the word Quest is going to be replaced by the word Horizon throughout (much as Oculus was replaced by Quest previously). I guess those Meta marketing people need to justify their paycheques by constant rebranding! Gotta keep it fresh! Personally, I think they should have stuck with Oculus… 😉

Also part of this announcement are three key partnerships with third-party hardware developers:

  • ASUS and its Republic of Gamers subsidiary “will use its expertise as a leader in gaming solutions to develop an all-new performance gaming headset.”
  • Lenovo will apparently focus on education and the workplace: “Lenovo will draw on its experience co-designing Oculus Rift S, as well as deep expertise in engineering leading devices like the ThinkPad laptop series, to develop mixed reality devices for productivity, learning, and entertainment.”
  • Meta will also be working with Xbox to create a limited-edition Meta Quest (Microsoft and Meta also worked together recently to bring Xbox cloud gaming to the Quest).

Reactions to this news on Reddit have varied. One person on the r/VisionPro subreddit (hardly an impartial source!) commented, “Feels more closed than Apple. And also less developer friendly.” (As if Apple doesn’t have its own walled-garden approach to its technology.)

Also mentioned in Meta’s announcement was that software developed through the Quest App Lab will be featured in the newly-renamed Horizon Store:

As we begin opening Meta Horizon OS to more device makers, we’re also expanding the ways app developers can reach their audiences. We’re beginning the process of removing the barriers between the Meta Horizon Store and App Lab, which lets any developer who meets basic technical and content requirements ship software on the platform. App Lab titles will soon be featured in a dedicated section of the Store on all our devices, making them more discoverable to larger audiences.

I think that this is good news for smaller developers, who often struggle to get the word out about their products. (Of course, Meta will get a cut of any sales through its store!)

In an Engadget report, Devindra Hardawar writes:

Think of it like moving the Quest’s ecosystem from an Apple model, where one company builds both the hardware and software, to more of a hardware free-for-all like Android. The Quest OS is being rebranded to “Meta Horizon OS,” and at this point it seems to have found two early adopters. ASUS’s Republic of Gamers (ROG) brand is working on a new “performance gaming” headsets, while Lenovo is working on devices for “productivity, learning and entertainment.” (Don’t forget, Lenovo also built the poorly-received Oculus Rift S.)

As part of the news, Meta says it’s also working on a limited-edition Xbox “inspired” Quest headset. (Microsoft and Meta also worked together recently to bring Xbox cloud gaming to the Quest.) Meta is also calling on Google to bring over the Google Play 2D app store to Meta Horizon OS. And, in an effort to bring more content to the Horizon ecosystem, software developed through the Quest App Lab will be featured in the Horizon Store. The company is also developing a new spatial framework to let mobile developers created mixed reality apps.

Devindra does have a good point; Apple has long been opposed to opening up its hardware to third parties (and it would appear, based on recent media reports, that sales of the eye-wateringly pricey Apple Vision Pro are not as brisk as the company had hoped):

Apple has dropped the number of Vision Pro units that it plans to ship in 2024, going from an expected 700 to 800k units to just 400k to 450k units, according to Apple analyst Ming-Chi Kuo.

Orders have been scaled back before the Vision Pro has launched in markets outside of the United States, which Kuo says is a sign that demand in the U.S. has “fallen sharply beyond expectations.” As a result, Apple is expected to take a “conservative view” of headset demand when the Vision Pro launches in additional countries.

Kuo previously said that Apple will introduce the Vision Pro in new markets before the June Worldwide Developers Conference, which suggests that we could see it available in additional areas in the next month or so.

Apple is expecting Vision Pro shipments to decline year-over-year in 2025 compared to 2024, and the company is said to be “reviewing and adjusting” its headset product roadmap. Kuo does not believe there will be a new Vision Pro model in 2025, an adjustment to a prior report suggesting a modified version of the Vision Pro would enter mass production late next year.

According to Apple industry analyst Ming-Chi Kuo, initial sales of the high-end Apple Vision Pro have “fallen sharply beyond expectations.”

I find it an absolutely fascinating time to be working in virtual reality, augmented reality, mixed reality, and spatial computing! While Apple has aimed for the high end with its US$3,500 headset, Meta has focused its attention on the low end, with a wireless headset that costs one-seventh the price of the Apple Vision Pro! (Of course, you could also use the Quest 3 as a PCVR headset, but most people don’t do that.)

I never would have predicted that we’d have two firmly-set goalposts at each end of the field, instead of companies releasing a mass of options in the middle of the field! This leaves a huge gap between the ultra-low-end Meta Quest 3 and the ultra-high-end Apple Vision Pro, and I do believe that there is certainly opportunity for companies to fill that gap, with existing hardware (e.g. the Valve Index, the Vive Pro 2, etc.), as well as some new devices which fall in between the two extremes.

I think that Meta is very smart to partner up with third parties who already have some experience in this space (notably Lenovo), and from those partnerships, new products will spring up to address that gap. While we will likely not see the fruits of these new partnerships until 2025 or 2026, interesting times are ahead!


UPDATE April 26th, 2024: I sometimes post my blogposts to the various virtual world and virtual reality Discord servers I belong to, in order to drive a bit more traffic to my blog (I don’t do it nearly as often as I used to, though). And PK, on the MetaMovie Discord server, made the following insightful and thought-provoking comment on this announcement from Meta/Facebook:

I want someone to dig into what sort of access Meta would have to data on these third-party headsets, potentially, through various software that would be required. I think it’s existential that we need to keep metaverse data out of their hands.

Even now, having failed with five or six different social VR attempts so far, they still manage to collect 1/3 of every virtual transaction in VRChat, at least those using Quest headsets, which is the majority of users now. Their [i.e., VRChat’s] creator economy is only in beta so far, but thanks to Facebook and Steam, and Apple for pushing this model, we don’t have the thriving virtual economy we would have had by now, because even taking 1% of every transaction just for monopolizing app downloads, that would be too much. A third is robbery, but because [Meta CEO Mark] Zuckerberg could afford to make mobile headsets affordable without worrying about profits so far, they’re now cornering commerce in this space. I don’t think it’s safe to trust them with our future, and so I’m very skeptical about these sorts of initiatives.

PK is correct; it is troubling that the walled-garden gatekeepers like app store owners (Meta, Google/Android, and Apple) are each taking a cut of any in-world transactions. It has a chilling effect on anybody trying to make money within VRChat (of course, the social VR platform has long had a booming economy going on outside of VRChat, with places such as the Virtual Market series of avatar shopping events and the VRCMods Discord server, where avatar buyers and sellers can connect).

Linden Lab was luckily able to avoid this entire mess by creating its own in-world economy within Second Life well before the advent of Google Play and Apple’s App Store—but now that they are actively working on a new mobile Second Life app for Android and iOS, it will be interesting to see whether Second Life, too, will be impacted by other players like Meta wanting to take their cut. (Probably not, since you can do things like buy Linden dollars directly from the Second Life website.)

Interesting times lie ahead! As drag queen RuPaul likes to say on her hit reality TV show, RuPaul’s Drag Race (and my guilty pleasure!):

Mama Ru raises her opera glasses and says, “I can’t wait to see how this turns out.”

Thank you to PK of the MetaMovie Discord, for giving me permission to quote them directly!

Entering the RadyVerse: A Look at Five VR and AI Projects for Training Healthcare Workers at the University of Manitoba’s Rady Faculty of Health Sciences

One of the virtual reality labs being used to train nursing students in the College of Nursing at the University of Manitoba

As many of my readers already well know, I am the computer science and agriculture librarian at the Jim Peebles Science and Technology Library at the University of Manitoba in Winnipeg, Manitoba, Canada, and I have been writing about “news and views on social VR, virtual worlds, and the metaverse” (as the tagline of the RyanSchultz.com blog states) since July 31st, 2017. I have now been actively and avidly reporting on this space on my blog for almost seven years, sharing news and events in the rapidly-evolving metaverse!

So it was that I had already written on my blog (albeit somewhat in passing) about the University of Manitoba’s College of Nursing, which has been training new nursing students using the UbiSim software since the Fall 2022 term. Here’s a one-minute YouTube video about that work:

However, today I wanted to give you all an update on some newer innovations in the use of VR (and AI!) in healthcare education at my employer, the University of Manitoba.

Yes, the RadyVerse launch even had a cake! Carbs take priority, people!!! 😉

One month ago, on Friday, March 15th, 2024, I attended a special afternoon event located at the University of Manitoba’s Bannatyne Campus (the downtown, health-sciences-focused campus, next door to Winnipeg’s main hospital complex, the Health Sciences Centre). This event was the official launch of a new initiative of the Max Rady Faculty of Health Sciences, called the RadyVerse. According to the announcement:

The RadyVerse is an exciting initiative of the Rady Faculty of Health Sciences that combines virtual reality (VR), artificial intelligence and machine learning to create immersive and controlled simulations for students, educators and clinicians. The integration aims to empower an interprofessional community, promote collaboration and enhance skill development in a risk-free setting.

Dr. Nicole Harder speaking at the RadyVerse launch event (with Dr. Lawrence Gillman, seated)

In an article published in UM Today, the University of Manitoba’s online newspaper, one of the speakers at the launch described the purpose of the event, and the benefits of using VR in the College of Nursing programs:

Dr. Nicole Harder, associate dean, undergraduate programs and professor in the College of Nursing, and Mindermar Professor in Human Simulation, Rady Faculty of Health Sciences, described the launch event as a “technology fair” that will give faculty, staff and students the opportunity to participate in interactive demonstrations.

“People will be able to try on the VR headsets and step into the immersive world. We’ll also have monitors where we can screencast and show others what they see in the VR, and how this will be used as an educational tool,” Harder said.

“VR has been used in other universities for some time, but not to the same extent. In the College of Nursing, it is embedded into our curriculum.”

The college recently expanded its VR simulation training to its programming in The Pas and Thompson through a partnership with the University College of the North. This allows students from different parts of the province to work together on a simulated clinical case in one virtual room.

As more disciplines become involved, interprofessional teams will not even need to be in the same physical space when collaborating, Harder said.

“VR is a great tool for learning clinical decision-making, problem solving, empathy and communication.”

One of my Libraries colleagues tries out the UbiSim nursing simulation software
Kimberly Workum of the College of Nursing, at the Bodyswaps demonstration workstation

The launch event had five stations intended to showcase how the faculty is using virtual reality and artificial intelligence to educate and train the next generation of healthcare professionals: doctors, nurses, pharmacists, rehabilitation therapists, etc. U of M faculty, staff, students, reporters, and the general public were invited to try out the technology for themselves, and get a taste of how it works. The five stations were:

  • The previously mentioned UbiSim VR software, used for training nurses in simulated but realistic nursing scenarios, where students can practice their skills within a safe and controlled environment;
  • Bodyswaps, another initiative of the College of Nursing, which provides experiential, soft-skills training (e.g. how to talk with patients and family members in various scenarios);
  • An artificial intelligence (AI) tool called OSCE GPT, which uses a specially-trained large language model (LLM) to simulate patients, in order to allow healthcare professionals to practice their patient interview skills, and give them feedback on how to improve;
  • Lumeto, social-VR-based roleplay software for up to 4 users at once, used to train healthcare workers in interprofessional collaboration skills; and
  • Acadicus (a VR program for education which I had written about in 2019 on my blog), which is being used by Dr. Lawrence Gillman. According to the UM Today article:
People could try out the Acadicus software, being used by Dr. Gillman’s team to train doctors

One of the stations will be led by Dr. Lawrence Gillman, associate professor of surgery at the Max Rady College of Medicine and director of the Clinical Learning and Simulation Program at Bannatyne campus.

Gillman has a crisis-based simulation and trauma resuscitation program in development that he will soon be using to teach his residents. At the launch, he’ll demonstrate what trainers and learners will be able to do.

“This VR program is basically a playground where you can create your own sim lab in a virtual environment. You can create whatever scenarios or places you want, and people can participate together in person, or even from a distance,” Gillman said.

“Basically, we create medical crises that people can practice in and then make mistakes in simulation rather than real life.”

A user tries out Lumeto

I visited all five workstations, had ample opportunity to test out most of these applications first-hand, and spoke to my U of M coworkers about these projects. In fact, you can even catch a glimpse of me standing behind Dr. Gillman as he guides a user through the Acadicus software, in the video attached to this CTV News report of the RadyVerse event (see the red arrow in the screen capture I took from that video):

(I didn’t even know about this until a friend who watched CTV News told me!)

There’s just so much exciting stuff going on right now! There are so many VR initiatives taking place on campus, oftentimes in isolation, which is a shame. For example, I wonder how many of the healthcare professionals at the RadyVerse launch were aware that the UM Libraries is working on setting up a VR lab for faculty, staff, and student use (an initiative which is now well underway). And that the Department of Computer Science also has plans to set up a VR lab for its students. And I believe that the university’s Centre for the Advancement of Teaching and Learning is also working on something to do with VR…like I said, there’s a lot going on.

Therefore, I hope to be able to use some of my own “soft skills” and abilities to help set up improved communication channels and venues at the university, so we can all learn from each other as we beaver away on our separate projects and programs! I believe that there is so much in-house expertise and experience which we can share with each other. I know that I would benefit from this, and I suspect others would as well.

The RadyVerse event was a fantastic opportunity to learn more about some of the other virtual reality and artificial intelligence work taking place at the University of Manitoba, and I hope to report on future developments in this exciting edtech as it rolls out across campus. These are exciting times to be a VR and AI enthusiast at the University of Manitoba!