Last Friday, a new social networking app for the Apple Vision Pro launched! It is called AuraTap, a project developed by Phil Traut and Artur Sychov, according to the About page in the visionOS app:
(Apologies for the always-tilted nature of the screenshots I took of AuraTap in my Apple Vision Pro and transferred to my MacBook Pro to write this blogpost! I still have to learn how to tilt my head properly so that the resulting screen captures are STRAIGHT, sigh!)
Even more interesting is that they make use of the sometimes-terrifyingly accurate spatial Personas feature in the visionOS operating system native to the AVP, as shown in this promotional YouTube video:
You meet someone new on Apple Vision Pro. Face to face. 2 minutes. That’s the idea behind AuraTap, a new app.
If there’s a mutual vibe, you keep going. If not, you’re on to the next.
I love how quickly you can meet people, make connections, and explore each other’s profiles.
It also features one of the most beautiful UX designs I’ve tried on visionOS. Everything feels incredibly polished.
The app launches this Friday, March 27.
They’re hosting an in-app launch party this Friday at 1 p.m. PDT. Hope you can make it.
Here are a few more screen captures I took this morning, to show you what the app looks like. Now, obviously, this is a brand-new app, and the target audience of Apple Vision Pro users is, well, let’s be honest, still rather small, but even so, to already get 238 users signed up since last Friday is kinda impressive (and yes, I set up my own profile too, why not?).
Here’s my AuraTap profile. It’s easy to use your own spatial Persona to create a profile picture. It’s so new I was even able to get my first name, Ryan, as a username!
In fact, AuraTap is so new that I really didn’t have an opportunity to do more than set up a profile and browse through other users’ profiles! The idea is that you can connect with other AVP users who are also online, and have a brief chat (at least, that’s what I think it does). I mean, it’s obviously intended for professional networking, but it’s also giving me speed dating vibes, LOL! So maybe it’s going to be like so many social networking sites before it, where the users decide what it’s going to be used for.
And if the name of one of the developers, Artur Sychov, rings a bell to my regular readers, it’s because he was one of the developers of a social VR program I wrote about on my blog, called Somnium Space (you can see all my blogposts about Somnium Space here). One thing that I can say about Artur is that he seems to be involved in many different projects! In addition to Somnium Space and now AuraTap, he’s also been working on his own virtual reality headset, called the Somnium VR1.
An interesting development! Along with InSpaze (which I have written about many times before on my blog), I look forward to testing it out and meeting some new people. The product is so new that it doesn’t even appear to have a website yet, but if I find one, I will update this blogpost to add it. And, if you already own an Apple Vision Pro, just search for “AuraTap” in the App Store, and you’ll find this free program.
UPDATE 3:25 p.m.: And the AuraTap website is now up! According to the description on the website, it works as follows:
MEET A PERSONA: Never met one before? Here’s your chance. We’ll randomly connect you with someone new — and you’ve got 5 minutes to decide if it’s a match. If you both Tap to Connect, the timer disappears, and you’ll be permanently linked, ready to find each other anytime in your contacts.
CROSS YOUR FINGERS: Here’s the catch: the other person won’t know you’ve tapped to connect. So bring your best behaviour, hope they want to stay in touch too — and in the meantime, keep your fingers crossed!
STAY IN TOUCH WITH YOUR CONNECTIONS: This is the address book you’ve always wished for — a place to rediscover past connections and spark new ones. See who’s online, invite them for a quick chat, or browse their business card with socials and more. It’s not just networking — it’s the future of networking.
AuraTap’s Connect Book keeps tabs on people you’ve chosen to connect with.
This, to me, sounds a little weird, and more like a dating app than a professional networking app. But then, it also sounds kinda fun. The big difference between AuraTap and InSpaze, as far as I can tell, is that all connections in AuraTap are one-on-one, as opposed to meetings of groups of up to eight people (and, of course, there’s the random aspect of AuraTap). Who knows, maybe Artur and Phil are on to something. I’d dearly love to know what inspired this particular app design!
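If you’ll indulge me in a bit of speculation (and please note, this is purely my own guesswork about the design; Artur and Phil haven’t published anything about how AuraTap works under the hood), the “cross your fingers” mechanic boils down to one simple rule: a tap is recorded silently, and nothing is revealed to either side unless both people tap before the timer runs out. Here’s a minimal sketch of that rule in Swift, with all names invented:

```swift
// A purely hypothetical sketch of AuraTap's "Tap to Connect" rule
// (invented names throughout; the real app's internals are unpublished):
// a tap is recorded silently, and neither side learns anything unless
// BOTH parties tap before the five-minute timer runs out.
struct MatchSession {
    let userA: String
    let userB: String
    private var tapped: Set<String> = []
    private(set) var isLinked = false

    /// Records a tap and returns what the tapping user is allowed to see:
    /// `true` only once the tap turns out to be mutual.
    mutating func tapToConnect(from user: String) -> Bool {
        guard user == userA || user == userB else { return isLinked }
        tapped.insert(user)
        isLinked = (tapped.count == 2) // revealed only when both have tapped
        return isLinked
    }
}

var session = MatchSession(userA: "Ryan", userB: "FriedGuy")
print(session.tapToConnect(from: "Ryan"))     // false: FriedGuy sees nothing
print(session.tapToConnect(from: "FriedGuy")) // true: timer disappears, you're linked
```

Everything else in the app (the five-minute timer, the carousel, the Connect Book) presumably hangs off that one hidden, mutual-consent flag.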
UPDATE April 2nd, 2026: Well, I finally connected with my first person using AuraTap! We chatted for about two and a half minutes (sorry, but I forgot to take a screenshot!). He was from the Philippines and worked in IT. It was the first chat for both of us!
I clearly do not know what I am doing, because I couldn’t see anywhere that I could actually add him to my Connect Book to talk to later if I wanted to. When that 2-1/2-minute chat abruptly ended, and I was waiting for somebody new to chat with, I was presented with a carousel of other potential people to chat with (those who were online), like FriedGuy here:
Again, apologies for the off-kilter angle of this shot! I’m going to have to tilt my head a bit more to the left side before I do any screen captures in future… 😉
While I’m not completely sold on the random interaction model, it is kinda fascinating to see who you might connect with. So I will continue to pop in over the coming Easter long weekend, just to see who I end up chatting with.
UPDATE 12:19 p.m.: Okay, so AuraTap is basically ChatRoulette, but with Apple Personas rather than your webcam. Apparently, they used GenAI to generate ideas for a Personas-based app, and this was the one that Artur and Phil decided to run with.
EDITORIAL NOTE: In writing this blogpost, I have only used generative AI one time (to help me remember the actual names of Steam VR Home and Meta Horizon Home, typing in a description of what these environments do into Anthropic’s Claude AI to obtain the actual names). In other words, I used Claude as a sort of dictionary where you describe what it is you’re looking for, and ask for its proper name. Works like a charm!
I do not rely on using GenAI tools to write or edit my blog posts, unless otherwise specified in a note such as this, at the top of the post. And yes, I have been regularly using an em dash in my writing long, long before ChatGPT came along! 😉
I was surprised to discover, while finger-swiping and pinching my way through the Apple Vision Pro subreddits I follow using the Pioneer for Reddit app (while wearing the AVP, of course!), that people were already celebrating the two-year anniversary of the device’s release in the United States. We Canadians and citizens of about a dozen other countries were only able to get our hot little hands on AVPs later, of course (I had a particularly tortured road until I finally was able to use mine, as explained here, including several frustrating and time-consuming incidents trying to communicate with both Apple’s and UPS’s AI-powered chatbots in efforts to speak with an actual live human being). But, as usual, I digress.
I have been thinking a lot lately about why I am so enamoured with my Apple Vision Pro, and how it compares to the many previous Windows PCVR and standalone VR/AR headsets I have used since January 2017 (Oculus Rift, Oculus Quest 1/2/3, Valve Index, Vive Pro 2). Also, I have been thinking a lot about how I have been using those different headsets, and again, why my use of the AVP has been such a radical departure from previous virtual reality gear. So this blogpost is my attempt to summarize all those thoughts, and get them down on—hmmm, well, not paper, exactly, but pixels?—to share them with you, my faithful blog readers. (By the way, I very much appreciate those of you who do actually take the time to read my ramblings!)
Any sufficiently advanced technology is indistinguishable from magic. —Arthur C. Clarke
First, the technology of the Apple Vision Pro makes the device feel like magic, and I still feel that sense of awe and appreciation while wearing it every day. Shortly after my first week of use, in a message I excitedly shared with my friends on Second Life, first quoted here on my blog, I stated:
The Apple Vision Pro makes every single VR headset I have used to date feel like one of those red plastic View-Masters I used to play with as a kid in the 1960s. The “screen door” effect so evident in earlier VR headsets (where you can see individual pixels, making everything slightly blurry) is COMPLETELY, UTTERLY gone.
The Apple Vision Pro’s display resolution is 50 times denser than the iPhone’s, and it is such a startling leap forward that, as I often like to joke, it makes all the older VR/AR headsets I have ever worn feel like a cheap plastic View-Master toy!
After decades of working on Microsoft Windows computers, I used the Apple Vision Pro (and in particular, what I consider its killer feature, Mac Virtual Display) to switch almost completely to macOS and the Apple ecosystem. Let me walk you through a typical workday. I arrive at my cubicle in the librarians’ shared office space, turn on my MacBook Pro, and unpack and set up my Apple Vision Pro. I remove my prescription eyeglasses, put my AVP on, adjust the straps across the back and top of my head for a comfortable fit, and select my usual environment, Mount Hood, the tallest mountain in Oregon:
My preferred Apple Vision Pro Environment for work is Mount Hood, Oregon, because I like to be surrounded by pine forest.
I can adjust how much my chosen Environment blends with my cubicle office space by twisting the knob on the upper right of my AVP. Most times, I like to have it set at around 90 to 95%, so that I feel I am surrounded by forest, with the lake and Mount Hood at my back, but enough of the real world still pokes through so I can, for example, easily grab my insulated Winnipeg Folk Festival coffee mug (with an environmentally-friendly metal straw, so I can take a sip more easily while wearing my AVP!). When I use my Apple Magic Keyboard, it automatically highlights itself as my hands hover over it, pulling itself out of the forested ground when I look down. Everything just works. It’s magic.
Usually, I have the Apple Music app pinned to my right side, and I select a playlist (usually instrumental new age music, but it can vary depending on my mood).
Sorry, any screen captures I take in my Apple Vision Pro always tend to be a bit lopsided! I need to learn how to angle my head correctly.
I pop in my Apple AirPods, and then look at my MacBook Pro. A virtual Connect button hovers over the MacBook Pro’s screen, I tap my finger and thumb together to select it, et voilà! A large, adjustable ultra-high-definition screen appears over my desk, a sharp, crystal-clear wide screen where I can rearrange my macOS windows to my heart’s content: Outlook for email, Word for whatever report I am working on, my latest PowerPoint presentation, my Firefox web browser, etc.
I now spend between four and six hours of my workday in this productivity cocoon. If I need to get up (say, to reheat my coffee in the microwave), I unplug the AVP battery from its power cable, place the battery in my left front pocket, and walk around the office. As I walk, I pass out of the Mount Hood environment, which stays anchored at my workstation like a virtual office partition. If, on my way to the microwave, I happen to look behind me, I can still see my huge Mac Virtual Display and the Apple Music window hanging in midair at my desk.
This setup gives me two things: focus and pain relief.
First, the ability to isolate myself (literally throwing an immersive, three-dimensional virtual environment around myself) gives me the ability to focus on the task at hand, and I find it helps with my overall productivity. I can even get into a much-desired flow state. (Interestingly, the second-edition Apple Vision Pro with the higher-end M5 processing chip seems to have completely alleviated a problem I had with the original-model AVP, which was that I would develop eyestrain at about the two-hour mark while using it with the Mac Virtual Display feature. The new dual-loop Dual Knit headband is also an improvement over the original, single-band knit headband.)
Second, I have a couple of deteriorating joints in the cervical part of my spine, which unfortunately limits how much time I can spend sitting in front of a desktop computer monitor and keyboard. I have noticed that I can work for longer periods of time, with less neck and shoulder pain, when using the Mac Virtual Display feature on my Apple Vision Pro with my MacBook Pro, than I can in any other workstation setup (including just my MacBook Pro with an external monitor). I am truly grateful that the technology is now sufficiently advanced to help alleviate my pain!
As far as I am concerned, the Mac Virtual Display feature is THE killer app on the Apple Vision Pro. While I have been browsing the AVP subreddits and downloading and installing various apps, I find I use the Virtual Display far more than any other app or program (at least, right now). No other VR headset can give me what the AVP offers, or even come close. The thousands of dollars I have spent on the first and now second editions of the Apple Vision Pro over the past two years have been worth every. single. penny. I cannot imagine living and working without this device.
With all the Windows PCVR and standalone VR/AR headsets I have used, I was always hopping from one app to another (usually a metaverse platform like Sansar or VRChat, because that is my personal hobby and my research interest). I spent very little time in places like Steam VR Home or Meta Horizon Home, where you can see your library of installed VR/AR applications and games, launch them, and switch between apps. But in the Apple Vision Pro, with the Mac Virtual Display feature, I find I am using the device more like a filter or environment through which I am doing actual work with pre-existing programs like Microsoft Office, as opposed to loading and running virtual-reality-native apps. You can see immediately how this is a big difference. I would never for a second even think of using my Meta Quest 3 headset to edit a document in Microsoft Word, or fire off an email, yet I do those sorts of things without a second thought in my Apple Vision Pro.
Which leads me to my next important point: why the relative lack of AVP-native apps and programs is not as serious a problem as it would appear at first glance. When you use the device as a filter, or an environment, as you do with the Mac Virtual Display feature, you are using it with the much richer library of apps and programs available on macOS. Add to that the thousands of iOS apps you can run in flat-screen mode on the AVP (e.g. Firefox, my go-to web browser), and you can see why I am not too terribly concerned about this issue.
But it would appear that many consumers are concerned about how (relatively) slowly new, native-AVP apps and programs are being added to the Apple App Store. In a post made four days ago to the r/VisionPro subreddit, someone asked:
So I finally pulled the trigger and bought an Apple Vision Pro, and honestly… wow. The hardware is insane. The display, hand tracking, eye tracking, immersion – it genuinely feels like a glimpse into the future. Watching films, browsing the web, even basic spatial apps feel miles ahead of anything else I’ve tried.
That said, I can’t shake one big concern: developer support is thin.
Right now it feels like there are hardly any apps that are actually built for Vision Pro. Yes, iPad apps technically work, but that’s not the same as native spatial experiences that really show off what this thing can do. After the initial “this is amazing” honeymoon phase, you start noticing how limited the ecosystem still is.
My worry is this: if Vision Pro doesn’t gain real traction, Apple could quietly scale it back or pivot, and developers will have even less incentive to build for it. That becomes a vicious circle — fewer users → fewer apps → even fewer users.
I really want this platform to succeed because the tech absolutely deserves it. But at the moment it feels like we’re relying on Apple’s long-term commitment and patience more than anything else.
Curious what other Vision Pro owners (or devs) think. Are we just early and impatient, or is the lack of native apps a genuine red flag?
This question sparked some developers and other users to weigh in, with some very insightful commentary, which I wanted to share here with you:
I think Apple knew this going in and that’s why this device is almost like a prototype in a way. They need it in consumers’ hands to know what it will turn into. They knew the price point wasn’t for general consumption, but the only way to mold this thing into a future device for the masses that has better battery, less weight, and more importantly, costs less, was to get it into the hands of people and watch it do its thing.
Hi, Vision Pro developer here. Long response incoming (TLDR at bottom). You and other users have responded with what I think is a correct analysis that there’s an economics issue in that people won’t buy the Vision Pro until there’s sufficient app support, while developers can’t afford to make a dedicated Vision Pro app until there’s a sufficient user base. I can maybe provide some more perspective on some other aspects of Vision Pro development.
I truly believe that spatial computing is the future of computing, but it won’t be with the current version of Vision Pro. Essentially, I see this iteration of Vision Pro as a (very) cool device for media consumption and a dev tool. In the future, Apple (or some other company, but my money is almost always on Apple) will likely release the product that breaks through with consumers, whether it be the upcoming glasses or some vastly improved Vision Pro, and then developers will begin work making the apps for that eventual product. My personal development projects on Vision Pro are done with the certainty that they will be made at a financial loss to myself, but in the hope that learning how to build streamlined apps and leverage the capabilities of the current device will allow me to be better positioned to be a developer for the breakthrough model. As a developer, this is the time to be experimenting with 3D user experience, to learn what works and what doesn’t as an interaction model for experiences as immersive as Vision Pro allows.
There are also problems with what Apple allows developers to do. In truth, there’s very little freedom to push the device to its limits and make something really imaginative and unique. Apple has set out strict privacy considerations (which are good, broadly speaking, but might be overkill at this point) that lock developers into predefined paradigms that Apple approves of. Of course, Apple’s own apps don’t have to obey these restrictions, which allows them to make apps that feel magical, like Encounter Dinosaurs. Having attended the Vision Pro Developer conference for the past two years, I can tell you that there are significant frustrations among the developer community over the restrictions Apple has placed.
From where I’m sitting, I think the interest among developers for Vision Pro is reasonably high, but most can’t afford to build for it until there are some big changes in the market. I think in the near future there won’t be more than a smattering of new native apps, mostly made by the passionate developers who see the potential, but once Apple releases the product that clicks for consumers the dam will open up. This will probably result in a flood of apps for this current generation of Vision Pro, as I think Apple has nailed the software side of this, and just needs to work on building a physical frame that consumers want to put on their head.
TLDR: Be patient. At some point spatial computing will likely take off on a future Vision Pro-like model, and then the developers will come.
Developers aren’t going to invest heavily in the platform until there are more users. Apple knows this. Apple is getting the OS and dev tools maturing while they work towards more consumer-friendly versions of their Vision line. They needed the hardware out and in users’ and developers’ hands to really start moving forward. Traction will come; I sincerely don’t think there’s anything to worry about there.
I agree wholeheartedly with the second commenter, the developer who stated that “people won’t buy the Vision Pro until there’s sufficient app support, while developers can’t afford to make a dedicated Vision Pro app until there’s a sufficient user base.” It’s a classic chicken-and-egg problem, which is why what I said earlier is so important. The number of available apps and programs for the Apple Vision Pro doesn’t really matter at this point (at least, for me), because I am pretty much using it as an immersive environment through which I am running other programs. To date, the only native-AVP apps I have been running regularly have been the previously-mentioned Pioneer for Reddit app, InSpaze, and Explore POV! (I have, however, been avidly collecting dozens of free and inexpensive AVP apps based on recommendations posted to the r/AppleVisionPro and r/VisionPro subreddits! One day, probably when I am on my upcoming research and study leave, I will start to explore more AVP-native programs and apps. In fact, two days ago, Google finally released a version of its popular YouTube video-watching app for the Apple Vision Pro!)
As I said up top, Mac Virtual Display is the killer feature I use most often. And that is what makes my use of the Apple Vision Pro so dramatically and drastically different from previous VR/AR headsets. It’s a productivity tool first, and, with my continuing neck and shoulder pain, it’s also been a pain management tool second: an unexpected but not unwelcome way to get through an eight-hour workday with as little discomfort as possible. I am eternally grateful that the technology has evolved enough, just in time, to help me stay productive despite my pain! And for those two reasons alone, it is worth every single penny I have spent on this device. As I said before, I am all in.
The upgraded Apple Vision Pro has been a Godsend, and worth every penny I have spent!
PLEASE NOTE: This is now a somewhat edited first draft of the notes I was frantically taking during this livestream, because I wanted to get the information out there on this very interesting application of the Apple Vision Pro! Yesterday I came across this announcement of how the Apple Vision Pro was being used in research to determine its effectiveness as a support for those suffering from anxiety and depression. As an avid AVP user, as a subscriber to Explore POV, and as a mental health consumer, I was definitely not going to miss this presentation, which was being streamed on LinkedIn (a first for me; usually I am on Microsoft Teams or Zoom for this sort of online event).
I was originally thinking I would go in using my AVP’s Virtual Display feature with my MacBook Pro (my usual work setup lately, what with my neck and shoulder pain), and then I thought: naaah, let’s not overcomplicate things. Apparently, this is also being streamed to YouTube, which I will look for later. UPDATE: Added the YouTube link at the end.
Any omissions and errors are my fault; sorry guys, I can only take notes so quickly!
The speakers in the livestream were:
Hala Darwish, Associate Professor, School of Nursing/Neurology/Neuroscience Graduate Program, University of Michigan (currently conducting research, still in its very earliest stages)
Jeremy Dalton, XRHQ (moderator of the event; formerly PwC Head of Immersive Technologies)
James Hustler, Explore POV (3D video creator, whose app was chosen as the Apple Vision Pro App of the Year 2025, https://exploreimmersive.com)
Event description: By immersing patients in breathtaking natural environments using the Apple Vision Pro, research is now underway to discover whether these experiences can support those suffering from anxiety and depression. Join James Hustler, creator of the award-winning Explore POV app, and Dr. Hala Darwish, Associate Professor at the University of Michigan, for a live discussion hosted by Jeremy Dalton from XRHQ. Together, we will explore the technology, the clinical thinking that inspired it, and what it could mean for the future of digital therapeutics.
(Unfortunately, I missed the first few minutes while I was fiddling with my sound settings, and trying to get my earbuds to work properly, so I missed Jeremy’s and Hala’s introductions.)
James Hustler travels the world to record amazing 3D videos and share them via his subscription service, Explore POV (which I have written about before here). He had been living in a motor home in New Zealand during the pandemic, when he had started recording 3D videos to share with friends.
Hala is in the early stages of research, interested in the relationship between mental health and the environment. Many people do not have access to certain environments (e.g. those living in an urban environment with very little nature). Also, people can have access issues (e.g. a disability). Hala looked into VR as an alternative to real-life nature experiences, and in 2019, when she started, the tech wasn’t quite ready (they tried 360-degree videos, and she felt they didn’t really work well: low resolution, motion sickness, etc.). She then tried computer-generated nature graphics for patients with MS (multiple sclerosis). In 2023, Apple announced the Apple Vision Pro, and Hala had a demo. The decision was made to switch from 360-degree VR video to 180-degree VR video.
James: The VR 360-degree video format is not new, but until recently, it hasn’t been at a high enough resolution to create a true sense of presence, i.e. the point where it changes from an intellectual response to an emotional sense of actually being there. Explore POV is now recording at 16K resolution, and experimenting with Apple Immersive Video. The goal is to capture a scene so that the user feels like it’s lifelike and real to them.
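To put some rough numbers on why resolution is the threshold James describes (this is my own back-of-the-envelope arithmetic, not a figure from the livestream): a 16K frame is about 15,360 pixels across, so spread over a 180-degree field of view, that works out to

$$\frac{15{,}360\ \text{pixels}}{180^{\circ}} \approx 85\ \text{pixels per degree},$$

comfortably above the roughly 60 pixels per degree often cited as the limit of normal human visual acuity. (In practice, how the stereo pair is packed, and the headset’s own display, will whittle that number down, but it helps explain why 16K source footage can finally read as “being there” rather than as watching a video.)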
Hala: transporting the individual to these natural environments does appear to have health benefits (mental and physical health, stress relief, etc.). In addition to anxiety and depression relief, Hala’s area of research, VR is also being used for the treatment of phobias (exposure therapy), performance anxiety (e.g. fear of public speaking), and as a method of pain management and distraction, among other uses.
James, when asked about feedback on his videos: Explore POV was created as a travel app, but people by the hundreds are contacting him about their mental and emotional responses to the VR video scenes, telling him it’s the first time they’ve climbed a mountain or paddled a kayak. People have told James that they use the Explore POV app to relax after a stressful day’s work. This sort of feedback has opened James’ eyes to the possibilities of 3D video in VR. He had originally approached his work as a technical challenge (e.g. how do I create the highest-resolution 3D videos in VR?). He stressed that all these responses are anecdotal, and that we need scientific evidence.
Hala, in talking about her research: we want to run clinical trials (but we are currently testing feasibility and safety with a limited number of patients with progressive multiple sclerosis and depression). If we give AVPs to patients to use at home (e.g. those with a disability), how are they going to be able to use the headsets? The first study splits the patients into three groups in a cross-over design: one group gets standard treatments first, then VR treatment; the second group gets VR treatment first, then standard treatments; and the third group gets standard treatments only, with no VR intervention. It is an early-stage feasibility study, with 14, 14, and 12 patients in the three groups. She is also interested in researching longer-term responses to VR treatment.
Hala: in my opinion, exposure to natural scenes in VR appears to be a good adjunct to standard therapy. It’s still too early to come to any definitive conclusion. We first want to see if it has an impact on stress and anxiety levels, and then eventually expand to a larger number of patients (right now it’s a small number).
Which environments create the greatest impact? James: we would expect results similar to those of previous academic research studies using real-life nature scenes (e.g. MRI brain scans after exposure to nature, such as taking a hike). There is already a good body of academic literature dealing with the impact of real-life nature on people’s anxiety and depression.
But we don’t have anything beyond anecdotal results for the use of nature in VR so far, nothing scientific; this research is still in its very earliest stages. For example, one early patient had a very good response to a desert environment (but it’s only a sample size of one!). James: if we’re aiming for calmness, certain VR video environments would probably help with that (e.g. flowing water, watching a sunset while sitting on a mountain, etc.). But again, at this point it’s purely anecdotal.
I asked a question in the text chat during the livestream, which was relayed to James, the creator of Explore POV: has James created specific VR video environments for Hala’s research? The answer was no, not yet. However, they’ve now shot approximately 200 videos in 20 different countries for the Explore POV app (I think he said 200, but it was hard to take accurate notes!). He notes that they are a small, nimble team who can rapidly adjust to meet any requests from Hala’s research team, if needed in the future.
My question got asked!!
In response to a question from another user about the use of Apple’s SharePlay feature, where you can share an experience together with other Apple users via their Personas: James would love to add this feature, if he can. Yes, he would love to make Explore POV more of a multiplayer experience, if possible. He talks about people sequentially experiencing the same VR video in Explore POV, and thereby “sharing” the experience with others (e.g. a father and his daughter, if I remember correctly).
James: for people who can’t physically travel due to disability or for some other reason, the technology is unlocking experiences that they might never have otherwise. He thinks that it’s an amazing position to be in, where we can give some of these people a taste of visiting remote places, with impacts in not just healthcare but also conservation, education, etc.
Hala: the academic research process is slow due to recruitment bottlenecks, but she estimates two years for the duration of the study (before results are published). She notes that, most of the time, the people who most need the nature exposure do not have the opportunity to access it (for example, they cannot afford an Apple Vision Pro).
(Unfortunately, Hala crashed out of the stream soon after her comments, and the other two speakers wrapped it up!)
Conservation, education, and healthcare are the three areas James wants to focus on with Explore Immersive. In addition to working with Hala on her research study, he’s working on conservation and education applications as well. He hopes to start new partnerships in these three key areas, and wants to make Explore POV more than “just a travel app.”
Here’s the 53-minute YouTube video, in case you missed the livestream (unfortunately, you do have to actually go over and watch it on YouTube, as I am not allowed to embed it into my blogpost). Sorry! I do very strongly urge you to go over and watch it, though; it was amazing and inspiring.
WARNING: This is yet another one of my Ryan-Schultz-patented meandering editorial blog posts, written during the week I turn 62 years old. I promise you, I will soon return to regularly-scheduled programming about (as the tagline of my blog now states) “News and Views on Social VR, Virtual Worlds, and the Metaverse, plus Artificial Intelligence and Generative AI’s Impact on the Metaverse.” (Hey, at least, this time, I didn’t write a whole goddamn paragraph for the blogpost title. 😜)
My birthday always falls in the very coldest of winter weather here in Winnipeg, and today has been the coldest day this season by far:
According to the Environment Canada website, with the windchill factored in, it feels like -48°C (that works out to -54°F for you metric-system-averse Americans). Even worse, it’s going to stay this cold for at least the next seven days, according to the forecast:
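(For the record, that figure is just the standard conversion formula at work:

$$T_{\text{F}} = T_{\text{C}} \times \tfrac{9}{5} + 32 = (-48 \times 1.8) + 32 \approx -54.4\,^{\circ}\text{F}.$$

Metric or imperial, it is brutally cold out there.)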
This is the time of year when we intrepid Winnipeggers, bundled up in layers covering every square centimetre of skin except for the eyes, stumble between our homes and our cars, and then rush from our cars to our workplaces, mumbling the following hallowed mantra: “noearthquakesnovolcanoesnoearthquakesnovolcanoesnoearthquakesnovolcanoes….”
But fear not! While I beaver away in my (thankfully heated) cubicle at the University of Manitoba Libraries, I am surrounded by the sights and sounds of gentle waves rippling along a sandy beach in Bora Bora, one of the Apple Vision Pro’s expertly-designed immersive Environments:
The clouds gently hover, and the palm trees sway, as I work away on my MacBook Pro, using the Virtual Display feature in my Apple Vision Pro headset. Simply by reaching up and turning the upper right knob on my AVP, I banish my drab workspace surroundings in wintry Winnipeg, and replace them with a tropical paradise!! (Drinks with umbrellas not included; they would frown upon that at work.)
I have already written at length about my continuing neck and shoulder pain, due to a couple of deteriorating joints in the cervical part of my spine, the first serious sign that my aging body is starting to wear out. However, having now had several months’ experience with this discomfort, I know that the two biggest triggers of that pain are:
Sitting too long in front of a desktop computer or notebook computer, hunched over my keyboard; and
Getting stressed, which makes my neck and shoulder muscles tighten up until my shoulders ache.
So, I now spend between four and six hours per workday using Mac Virtual Display on my trusty Apple Vision Pro headset with my MacBook Pro. I have discovered that, instead of looking down at a small screen at arm’s length, which makes my neck sore, I can work for longer stretches looking up and ahead at a large, clear, ultra-high-definition screen hovering in the space over my desk, which is designed to appear as if I were viewing it from about 1.8 metres (6 feet) away:
Focal distance in the context of VR headsets refers to the distance at which the lenses allow your eyes to focus comfortably. In the case of the Apple Vision Pro, the actual focal distance is set at around six feet.
This means that, regardless of the virtual distance of an object in the digital space, your eyes will focus as if that object were six feet away.
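To translate that into optometrists’ terms (my own back-of-the-envelope figuring, not anything from Apple’s documentation): lens focus is measured in diopters, the reciprocal of the focal distance in metres, so a fixed focal plane at roughly six feet (about 1.8 m) works out to

$$P = \frac{1}{d} \approx \frac{1}{1.8\ \text{m}} \approx 0.55\ \text{diopters}.$$

Your eyes converge on virtual windows at all sorts of apparent distances, but their focus stays locked at that single 0.55-diopter plane, the vergence-accommodation mismatch that is often blamed for eyestrain in headsets.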
Also, when I upgraded my AVP from the first edition (with the M2 chip) to the refreshed model (which contains a top-of-the-line M5 chip), I noticed that the eyestrain I used to experience after about an hour and a half of wearing the unit has completely disappeared. Hooray! And the new dual-strap knit band fits much more comfortably on my big fat head. Aside from the occasional neck-wrenching mishap, the Apple Vision Pro is worth every single penny I have spent on it. And I will be first in line to purchase the next edition of this wonderful headset. As I said before, I am all in.
Thankfully, I have now received the final report from the Ergonomics Office at my university, with a detailed shopping list of recommended equipment to purchase. Like many of my younger work colleagues, I will be getting an adjustable-height sit/stand desk, risers to place my MacBook Pro and my brand-new Dell Windows notebook at the proper eye height, new desktop monitor holders and keyboard trays, etc. I am also learning (with the help of my ergonomist and my physiotherapist) how to take regular breaks, to stretch, walk around the office, and do some neck, shoulder, chest, and upper-back strengthening exercises.
The good news is that, because of all these changes, I am now in less frequent pain than I was a few months ago. But it has come at a cost. You see, I need to save what I like to call my “good neck” hours for my paying job as an academic librarian, which means that I have had to cut back significantly on my extracurricular, after-hours activities that used to require me to spend similarly long stretches of time sitting in front of a desktop computer at home.
One of those activities that I have had to cut back on is, unsurprisingly, my beloved virtual world of Second Life. Trying to navigate my small army of avatars and alts through all the Advent calendars and Christmas gifts in December just about did me in last month, and I have decided that my body is telling me that I desperately need to rebalance my real life/Second Life ratio a little bit, and spend more time in (gasp!) the real world. 😜
Speaking of the real world, I have maintained my boycott of mainstream social media platforms, in order to continue to focus on my good mental and emotional health. And for the same reason, I am not really paying attention to the traditional news media right now, either; if I have zero personal control over it, I simply don’t want to know. Every so often, my eyes hover over a newspaper headline at the supermarket checkout line with the latest story about Trump and Greenland; I grimace and roll my eyes, and promptly move on with my day, focusing on those things I do have some control over (like my job, my friends, my community, and my obsessive little hobbies like Second Life). I have found that, simply by avoiding toxic social media and if-it-bleeds-it-leads news media, and the doomscrolling both trigger, I have never been in a better headspace overall, and I intend to continue this approach moving forward into what appears to be yet another year of batshit craziness, train wrecks, and dumpster fires.
Trump who? Greenland what?? Don’t care, not my circus, not my monkeys.
I find I don’t miss Facebook, Instagram, and Twitter/X/whatever the fuck Apartheid Clyde is calling it this week, at all, and I spend precious little time on Mastodon, Bluesky, and Substack (although I do check the latter from time to time, mostly for AI/GenAI news). The only social media spots I pop into now are selected subreddits on Reddit (like r/AppleVisionPro and r/VisionPro), Primfeed (think Twitter/X, but only by and about Second Life), and now Tumblr (for the wonderfully creative Heated Rivalry fan art, memes, and fan-edited music videos using clips from the TV show). Even a couple of Discord servers devoted to Heated Rivalry have popped up, where fans share fanfic recommendations! It lifts my spirit and makes me happy.
For the past four weeks, ever since my SL friend first told me about Heated Rivalry and suggested I watch the show, I have been riding a wave of feel-good hormones like serotonin and oxytocin from the Crave TV series about a hidden love affair between two professional-league hockey players (I wrote about it here). And I am not the only one feeling that heady rush after watching the show! Many commenters in posts on the r/HeatedRivalry subreddit talk about the impact the show has had on them, and many have watched the entire TV series multiple times. The best and most concise summary of this phenomenon (which one joker suggested we call “the Heated Rivalry Mass Psychosis Event” 😂) is that watching the show makes you feel as though you are falling in love. There are many Reddit and Tumblr posts from people who, like me, feel that the show has given them an important insight into their lives and how they are living them.
The following Reddit post is one example I saved because I could relate to it so much:
One Heated Rivalry fan’s emotional response to the show. I could 100% relate to this person saying that they had cut themselves off from dating, romance, and intimacy, because watching the TV show made me realize the exact same thing about myself. I could share with you dozens of other examples from Tumblr and Reddit about how the show has impacted viewers. This show has genuinely struck a chord with many people in the LGBTQIA+ community (and probably in the straight world, too).
And—just as I had with the movie Brokeback Mountain, almost exactly 20 years ago—after watching the Heated Rivalry television series, I bought and read books 2 and 6 of Nova Scotia author Rachel Reid’s book series Game Changers (the original source material for season 1 and the already-greenlit season 2 of Heated Rivalry), and then dove head-first into the Heated Rivalry/Game Changers-inspired fan fiction posted to Archive of Our Own (AO3 for short). Here’s a link to the HR/GC fan fiction on AO3, sorted in descending order by kudos (fan likes). WARNING: please note that many of these fanfics have an adult content warning for explicit gay sex scenes! One of the ironic things I find about explicit gay fan fiction (also called slash fiction) is that it is written and read primarily by straight women (although, of course, it also has many fans among the LGBTQIA+ community).
Connor Storrie (left) plays Ilya Rozanov and Hudson Williams plays Shane Hollander in the surprise hit Crave TV series Heated Rivalry (showing on HBO Max in the U.S.)
Yesterday, my hometown newspaper, the Free Press, devoted a full two-page spread to how Heated Rivalry has become a major pop-culture moment, with ripples spreading far beyond the queer community and fan fiction writers. I had to laugh when I read a column (original; archived version if you hit a paywall) where three FP reporters were discussing their squeamishness about watching the gay sex scenes in Heated Rivalry. Straight people clutching their pearls over depictions of gay sex in mainstream media are just so funny to me.
I mean, c’mon, people. For God’s sake, if you’ve ever watched Bridgerton, there’s just as much (non-genital but ass-showing) nudity and (non-X-rated) sex happening there, and nobody needs to fetch their smelling salts for that! We do the exact same things in bed that you do, straight people (and no, on second thought, I am NOT gonna spell it out for you here). 😉 Please get OVER yourself.
Okay, end of rant…switching to other topics.
I have two things coming up that I wanted to share with you, my faithful blog readers. First, I have been honoured to be asked to be one of the keynote speakers at the 2026 Virtual Worlds Best Practices in Education conference, taking place in the virtual world of Second Life March 19-21, 2026. Of course, I said yes! I haven’t picked a topic or even a presentation title yet, but expect an announcement soon-ish.
Second, although it is not official official (and I really should wait until I get the official letter from university administration, which I was told should happen around the end of March), the University of Manitoba Libraries has approved my application to take a one-year Research and Study Leave (at full salary) starting later this year, where I am relieved of my regular academic librarian duties and can work on a special project. Academic librarians at the University of Manitoba are members of the faculty union, and just like the professors, we have the right (and the opportunity) to pursue research. Again, more details later. I’ve only mentioned this to a couple of people so far, but I think I can share that much detail at this time.
So 2026 is going to be a very interesting year for me, on several fronts! Heated Rivalry has inspired me to make some significant choices and changes already (some of which you will hear about, and others you won’t). Wish me luck!