My Notes from an XRHQ Live Streaming Event on LinkedIn and YouTube — Pixels & Pills: Breaking Research on Immersive Treatment for Mental Health, using the Apple Vision Pro and Explore POV (January 29th, 2026)

PLEASE NOTE: This is now a somewhat edited first draft of the notes I was frantically taking during this livestream, because I wanted to get the information out there on this very interesting application of the Apple Vision Pro! Yesterday I came across this announcement of how the Apple Vision Pro was being used in research to determine its effectiveness as a support for those suffering from anxiety and depression. As an avid AVP user, as a subscriber to Explore POV, and as a mental health consumer, I was definitely not going to miss this presentation, which was being streamed on LinkedIn (a first for me; usually I am on Microsoft Teams or Zoom for this sort of online event).

I was originally thinking I would go in using my AVP’s Virtual Display feature with my MacBook Pro (my usual work setup lately, what with my neck and shoulder pain), and then I thought: naaah, let’s not overcomplicate things. Apparently, this is also being streamed to YouTube, which I will look for later. UPDATE: Added the YouTube link at the end.

Any omissions and errors are my fault; sorry guys, I can only take notes so quickly!

The speakers in the livestream were:

Hala Darwish, Associate Professor, School of Nursing/Neurology/Neuroscience Graduate Program, University of Michigan (currently conducting research, still in its very earliest stages)

Jeremy Dalton, XRHQ (moderator of the event; formerly PwC Head of Immersive Technologies)

James Hustler, Explore POV (3D video creator, whose app was chosen as the Apple Vision Pro App of the Year 2025, https://exploreimmersive.com)

Event description: By immersing patients in breathtaking natural environments using the Apple Vision Pro, research is now underway to discover whether these experiences can support those suffering from anxiety and depression. Join James Hustler, creator of the award-winning Explore POV app, and Dr. Hala Darwish, Associate Professor at the University of Michigan, for a live discussion hosted by Jeremy Dalton from XRHQ. Together, we will explore the technology, the clinical thinking that inspired it, and what it could mean for the future of digital therapeutics.


(Unfortunately, I missed the first few minutes while I was fiddling with my sound settings, and trying to get my earbuds to work properly, so I missed Jeremy’s and Hala’s introductions.)

James Hustler travels the world to record amazing 3D videos and share them via his subscription service, Explore POV (which I have written about before here). He had been living in a motor home in New Zealand during the pandemic, when he had started recording 3D videos to share with friends.

Hala is in early-stage research, interested in the relationship between mental health and the environment. Many people do not have access to certain environments (e.g. those living in urban areas with very little nature), and others face access issues (e.g. a disability). Hala looked into VR as an alternative to real-life nature experiences, but in 2019, when she started, the tech wasn’t quite ready (they tried 360-degree videos, and she felt they didn’t really work well: low resolution, motion sickness, etc.). She then tried computer-generated nature graphics for patients with MS (multiple sclerosis). In 2023, Apple announced the Apple Vision Pro, and Hala had a demo. The decision was made to switch from 360-degree VR video to 180-degree VR video.

James: The 360-degree VR video format is not new, but until recently, it hasn’t been at a high-enough resolution to create a true sense of presence, i.e. the shift from an intellectual response to an emotional response of being there. Explore POV is now recording at 16K resolution and experimenting with Apple Immersive Video. The goal is to capture a scene so that it feels lifelike and real to the user.

Hala: transporting the individual to these natural environments does appear to have health benefits (mental and physical health, stress relief, etc.). In addition to anxiety and depression relief, Hala’s area of research, VR is also being used for the treatment of phobias (exposure therapy), performance anxiety (e.g. fear of public speaking), and as a method of pain management and distraction, among other uses.

James, when asked about feedback to his videos: Explore POV was created as a travel app, but people by the hundreds are contacting him about the mental and emotional response to the VR video scenes, telling him it’s the first time they’ve climbed a mountain or paddled a kayak. People have told James that they use the Explore POV app to relax after a stressful day’s work. This sort of feedback has opened James’ eyes to the possibilities of 3D video in VR. He had originally approached his work as a technical challenge (e.g. how do I create the highest-resolution 3D videos in VR?). He stressed that all these responses are anecdotal, and that we need scientific evidence.

Hala, in talking about her research: we want to run clinical trials (but we are currently testing feasibility and safety with a limited number of patients with progressive multiple sclerosis and depression). If we give AVPs to patients to use at home (e.g. with a disability), how are they going to be able to use the headsets? The first study splits the patients into three groups. It’s a cross-over study: one group gets standard treatments first, then VR treatment; the second group gets VR treatment first, then standard treatments. The third group gets just standard treatments, with no VR intervention. It is an early-stage feasibility study, with 14, 14, and 12 patients in the three groups. She is also interested in researching longer-term responses to VR treatment.

Hala: in my opinion, exposure to natural scenes in VR appears to be a good adjunct to standard therapy. It’s still too early to come to any definitive conclusion. We first want to see if it has an impact on stress and anxiety levels, and then eventually expand to a larger number of patients (right now it’s a small number).

Which environments create greater impacts? James: we would expect results similar to those of previous academic research studies using real-life nature scenes (e.g. MRI brain scans after exposure to nature, such as taking a hike). There is already a good body of academic literature dealing with the impact of real-life nature on people’s anxiety and depression.

But we don’t have anything beyond anecdotal results for the use of nature in VR so far, nothing scientific; this research is still in its very earliest stages. For example, one early patient had a very good response to a desert environment (but it’s only a sample size of one!). James: if we’re aiming for calmness, certain VR video environments would probably help with that (e.g. flowing water, or watching a sunset while sitting on a mountain). But again, at this point it’s purely anecdotal.

I asked a question in the text chat during the livestream that was actually asked of James, the creator of Explore POV, which was: Has James created specific VR video environments for Hala’s research? The answer was no; James has not yet created specific VR video environments for Hala’s research. However, they’ve now shot approximately 200 videos in 20 different countries so far for the Explore POV app (I think he said 200, but it was hard to take accurate notes!). He notes that they are a small, nimble team who can rapidly adjust to meet any requests from Hala’s research team, if needed in the future.

My question got asked!!

In response to a question from another user about the use of Apple’s SharePlay feature, where you can share an experience together with other Apple users via their Personas: James would love to add this feature, if he can. Yes, he would love to make Explore POV more of a multiplayer experience, if possible. He talks about people sequentially experiencing the same VR video in Explore POV, and thereby “sharing” the experience with others (e.g. a father and his daughter, if I remember correctly).

James: for people who can’t physically travel due to disability or for some other reason, the technology is unlocking experiences that they might never have otherwise. He thinks that it’s an amazing position to be in, where we can give some of these people a taste of visiting remote places, with impacts not just in healthcare but also in conservation, education, etc.

Hala: the academic research process is slow due to recruitment bottlenecks, but she estimates 2 years for the duration of the study (before results are published). She notes that most of the time, the people who most need the nature exposure do not have the opportunity to access it (for example, they cannot afford an Apple Vision Pro).

(Unfortunately, Hala crashed out of the stream soon after her comments, and the other two speakers wrapped it up!)

Conservation, education, and healthcare are the three areas James wants to focus on with Explore Immersive. In addition to working with Hala on her research study, he is also working on conservation and education applications. He hopes to start new partnerships in these three key areas, and wants to make Explore POV more than “just a travel app.”

Here’s the 53-minute YouTube video, in case you missed the livestream (unfortunately, you do have to actually go over and watch it on YouTube, as I am not allowed to embed it into my blogpost). Sorry! I do very strongly urge you to go over and watch it, though; it was amazing and inspiring.

UPDATE: Some More Thoughts After Nine Weeks Using the Apple Vision Pro

You might have noticed that I haven’t been blogging much at all lately. There’s a good reason for that. I have been so completely occupied with my full-time paying job as an academic librarian during this busy fall semester that I have had little time or energy left over to write up blog posts when I get home from work.

I have, however, been making very good use of my new Apple Vision Pro VR/AR headset! I am currently using it anywhere from two to five hours a day. At work, I use the Virtual Display feature on the AVP to create a large, sharp, crystal-clear virtual screen for my MacBook Pro notebook computer (and I often listen to playlists on an Apple Music screen set up to the right of my work screen). And at home, I am using the Apple Vision Pro as my primary means of consuming 2D and 3D movies and television shows. I find that it is a great improvement over using my iPad! I have also been spending time in a few other AVP apps over the past two months, notably Explore POV and InSpaze.

Screen capture from the Explore POV website: the immersive videos are stunning

Explore POV (website) is a subscription service where you can download and watch stunning short immersive videos (only 2–5 minutes long) of various locations around the world. Unfortunately, over the past two months, I have encountered some problems with the service, twice losing all my downloaded videos when my Apple Vision Pro unexpectedly reset. I have also found it to be a somewhat expensive service for the still relatively small number of videos available. While I have thoroughly enjoyed the time I have spent in Explore POV, and even though the content is top-notch and I want to support the creator who is traveling the world and sharing his videos, I simply cannot justify spending the equivalent of a streaming service subscription (currently CA$12.99 per month) for such a small library of content, and so I have decided to let my membership lapse. (Perhaps I can come back later when there is more to watch. Please note that there are free videos which you can watch in Explore POV, to give you a taste of what it has to offer.)

The other app which I have been using a lot is InSpaze, which is a way to connect with other Apple Vision Pro users (as well as iPhone users who wish to participate, although obviously the latter don’t get an immersive experience). InSpaze has become the gathering place where AVP users can compare notes on apps, report on and troubleshoot problems with the device, and just talk about anything. It neatly solves one problem most Apple Vision Pro users have: they are often the only person in their social circle who owns the device, so they have no other users to chat with (I myself fall into this category!).

InSpaze in windowed mode (seen in the Mount Hood environment in my Apple Vision Pro)
InSpaze in immersive mode (with apologies for the blurry picture quality; note that even in immersive, 3D mode, the Personas of the people participating still appear in flatscreen mode, in tiny windows). However, the Personas are “sitting” on seats in a circle in a three-dimensional room, and the sound is spatial (for example, the voice of the Persona sitting to your left sounds as if it is coming from your left).

You can choose to either represent yourself by a cartoon avatar (using one of several built-in options), or use your Persona (your real-life head and hands, scanned by the Apple Vision Pro as part of the device setup). Most people opt for the latter. The avatars and Personas appear in flat-screen windows, as 3D Personas are still only available in FaceTime calls.

I actually had my very first FaceTime chat using 3D personas, and having the people you are talking with appear in your space as three-dimensional heads and hands was certainly a noteworthy and memorable experience! (I actually jumped in fright when one of them appeared to be sitting on my sofa, right next to me!! When we high-fived, there was even a flash of light between our palms! It felt like I had stepped straight into the future.)

I experienced my first-ever FaceTime call with 3D Personas with these gentlemen (we set up the meeting via InSpaze), and let me tell you, a simple 2D picture does not even BEGIN to do it justice! My mind was BLOWN.

Unfortunately, Apple is still keeping close tabs on its spatial Personas feature, and has not (as yet) allowed other platforms (such as InSpaze) to make use of the technology. But there’s something inevitable about it, and I wouldn’t be surprised at all if it eventually makes its way into InSpaze and other apps as well. It’s just amazing. In fact, I would say that it’s one of the most amazing things about the Apple Vision Pro.

Another truly amazing feature, which officially rolled out with the visionOS 2.0 update for the Apple Vision Pro (although it was available before that with third-party apps such as the Spatial Media Toolkit), is the ability to convert a two-dimensional photo or video into 3D! I have spoken with several InSpaze users who have been experimenting with this new feature, and they tell me how emotional it makes them to see an old photo of a long-lost relative or pet converted to three dimensions; it somehow makes them feel that the person or pet is right there next to them again.

There are drawbacks, of course. People have run tests, taking 2D and 3D versions of the same picture, then converting the 2D image to 3D and comparing the converted and taken-directly three-dimensional images. These comparisons show small details where the conversion algorithm—as amazing as it is!—fails. For example, a flag appearing behind leaves on a tree in the foreground shows up as a rag hanging in the tree in the 2D-to-3D converted picture. But you have to really look to find these kinds of errors, comparing them to the shot-in-3D picture. (This is another topic we discuss in our rooms in InSpaze, of course!)


I have deliberately chosen to be an early adopter of the Apple Vision Pro, and I do not regret that decision one bit. I have been spending quite a bit of time in the r/AppleVisionPro and r/VisionPro subreddits since the device was first released in the United States in February, following along with every new development and debate. Some of these early subreddit users reported that the curved front glass of their Apple Vision Pros developed cracks, and showed us pictures. Of course, this occurrence, while still rare, happened often enough that it was dubbed “Crackgate” by Reddit users, who shared tales of how they had to navigate Apple support to get a repair or replacement unit under their AppleCare warranties!

Well, guess what?

I have spent a LOT of money on my Apple Vision Pro, and I have been treating it like gold. One morning at work, while unpacking it from its carrying case, I inspected the front glass as usual—only to discover a tiny crack right in the middle of the curved front glass! I immediately made an appointment with the Genius Bar at my local Apple Store and brought it in for them to look at (I was told I was the first person in my province to bring back an AVP with such a problem). Given that the crack was purely cosmetic (the device still worked flawlessly) and very small (about half a centimeter in length), we agreed that we would monitor it. They took pictures of the crack, and I decided—for now—not to opt for repair or replacement. (I did talk about it in a couple of InSpaze rooms, and I was told that I should follow up with a call directly to the Apple Vision Pro team in the Apple Support app, an option which I will probably pursue, especially as it appears to me that said crack is growing slightly!)

Also, I am somewhat upset that a device on which I have spent a grand total of CA$7,700 (a 1-terabyte model Apple Vision Pro, with a carrying case and 2 years of AppleCare warranty, plus provincial and federal sales taxes) has ALREADY developed a half-centimeter crack after only 7 weeks of daily use! So stay tuned for further details on my own personal Crackgate…and yes, I have been told by people in the InSpaze rooms where I talked about this that I should follow through on this, just on sheer principle!

The half-centimeter crack in the front glass of my Apple Vision Pro (see blue arrow)—and no, I don’t look this bad in real life, it’s just the reflection from the curved glass front!!!

I consider myself lucky that I am not among the many AVP users who have had problems finding a comfortable fit for their devices; I have been using the medium-sized solo knit band without issue, and my initial fit of the facial shield has been quite comfortable for extended periods of use. (It probably helps that I have been wearing VR headsets of one kind or another since January of 2017, so I am very used to the front weight!)

However, I have received regular warnings that my eyes were too close to the lenses, and I have also noticed a certain amount of eyestrain setting in after about 1-1/2 to 2 hours of continuous use (something which I had never really encountered much before with other VR headsets). After reading this document from Apple Support, I realized that I could swap out the light seal cushion on the facial shield for a slightly thicker version (which was included in the original box), and since I did that, it seems to have alleviated the problem. However, I only did this late last week, and I still need to spend some more extended time in my AVP to confirm that this fix alone was enough to alleviate my eyestrain problem.

Overall, while I have had some pains and problems with my Apple Vision Pro, I have been quite happy with it. I’m not sorry that I jumped in early this time, despite the problems I have encountered to date. When I wear it, I feel like I have taken an important step into the future. Also, because the AVP is still so new and its user base is still so small, there is an exciting feeling of being part of a small, vibrant community where connections to developers are still quick and easy. One fellow new user posted the following comment to the VisionPro subreddit, which I agree with wholeheartedly:

There are many issues with the Vision Pro at its current level of development. However right now is also a “golden time” which will eventually go away so I am going to savor it while I can.

What I find so “golden” is the interaction between the VP users and app developers. I had an issue which I posted to a developer’s post on another website. I received a response 5 minutes later and the issue seems to have been resolved. Yesterday I had a video conference with a developer in London and I was able to make some suggestions which he appreciated for making the app better.

When filing technical support issues (I have filed thousands of them over the years) the time to respond is normally measured in days or longer. I haven’t received a response to a support request that I sent to one company over a week ago. Actual resolution of the problem once identified can take years. A macOS bug which I reported over 2 years ago was only just fixed with Sequoia. Once the VP transitions away from this exciting development phase these interactions with developers and fast resolution time will go away. You will talk with support who will be the ones who talk with the developers. Fixes will appear in a software release at an unknown date.

I am going to savor these direct connections while they last.

Many new app developers hang out in InSpaze or post new apps and updates in the AppleVisionPro and VisionPro subreddits, sharing TestFlight links to beta versions of apps. In fact, I belatedly realized that I could even use the existing TestFlight invitation which I had already received for the mobile Second Life app on my iPad Pro!

I checked on TestFlight on my Apple Vision Pro (which uses the same AppleID), and sure enough, there was the link to the mobile iOS SL client! And so it was that I might just be the first and only person in the world who has run the iOS mobile version of Second Life in a flatscreen window in my Apple Vision Pro!

I might well be the first person in the world to run the mobile client for Second Life in a window inside an Apple Vision Pro! Old virtual worlds meet new virtual reality 😉

In short, my experience overall so far has been nothing short of magic! And I cannot wait to see where we go from here.


UPDATE Oct. 11th, 2024: I just wanted to add a couple of updates to my original blogpost. Unfortunately, neither is good news.

First, switching to the slightly thicker light seal cushion has meant that I no longer get warnings that my eyes are too close to the Apple Vision Pro lenses (in fact, every so often I now get a warning that my eyes are too far away!). However, I still find that, after about two hours of wearing the device, a certain amount of eyestrain still sets in. I’m not sure what the exact cause is, but it is not something which I have experienced with previous virtual reality headsets (in those cases, the problems were more with either the fit or with VR sickness/nausea, not with eyestrain).

It might be because of the mismatch between what my eyes and my brain are telling me. For example, the virtual screen hovering above my MacBook Pro is both superimposed over my machine and yet also appears behind it in the physical space in my office. I have been experimenting with turning on one of the virtual environments to fill in the space around the virtual screen (I tend to use the Mount Hood environment), while still leaving a view of the edge of my desk:

In this screen capture I just took in my Apple Vision Pro, I have turned up the Mount Hood environment so it surrounds the virtual display of my MacBook Pro screen, but you can still make out the window in my office in the upper right-hand corner of the picture. I could continue to turn the right-hand knob on top of my AVP to completely replace my office view with Mount Hood, but then I wouldn’t be able to find my coffee cup!
You can just barely make out my blue coffee mug on my desk, next to my glasses. I can actually drink from the coffee mug while wearing my Apple Vision Pro, but find I have to tilt my head back to an awkward degree to finish the coffee, so I have added a metal straw, which works much better!

Sharing screen captures of what I can see in my Apple Vision Pro could not be easier; all I have to do is press both the top left and top right buttons on my device simultaneously to save a snapshot to the Photos app on my AVP. Then, I go into the Photos app on my Apple Vision Pro, select Share, then use AirDrop to transmit the image to my MacBook Pro to insert into this blog post!

The second piece of bad news is that the half-centimetre crack in the front curved glass of my Apple Vision Pro appears to be getting wider, so I am once again going to have to contact Apple Support. My new friends in InSpaze have advised me, instead of recontacting my local Apple Store, to talk directly to somebody on the Apple Vision Pro team via the Apple Support app. I’ve been told that they will have much more experience in dealing with such problems (and their possible solutions) than my local store (especially since, as I have said before, I’m the first person in my whole province of 1.4 million people to present them with a device crack!). And this time around, I am going to use my AppleCare warranty coverage to firmly insist on repair or replacement. A crack in such a new and expensive device is NOT okay.

But I am feeling a bit weary of having to face yet another verbal joust with Apple’s AI chatbot to get through to an actual, live, human support person. And—as I stated at the start of this blogpost—I am already feeling absolutely exhausted and completely worn out by the demands of my job, so I might postpone that battle to next week, after I have spent a relaxing (Canadian) Thanksgiving long weekend resting, and recharging my mental batteries.