Some Thoughts on the Apple Vision Pro, Two Years After Its Release

Photo by Sam Grozyan on Unsplash

I was surprised to discover, finger-swiping and pinching my way through the Apple Vision Pro subreddits I follow using the Pioneer for Reddit app (while in the Apple Vision Pro, of course!), that the device was already celebrating the second anniversary of its release in the United States. We Canadians and citizens of about a dozen other countries were only able to get our hot little hands on AVPs later, of course (I had a particularly tortured road until I was finally able to use mine, as explained here, including several frustrating and time-consuming incidents trying to communicate with both Apple’s and UPS’s AI-powered chatbots in an effort to speak with an actual live human being). But, as usual, I digress.

I have been thinking a lot lately about why I am so enamoured with my Apple Vision Pro, and how it compares to the many previous Windows PCVR and standalone VR/AR headsets I have used since January 2017 (Oculus Rift, Oculus/Meta Quest 1, 2, and 3, Valve Index, HTC Vive Pro 2). I have also been thinking a lot about how I have been using those different headsets, and why my use of the AVP has been such a radical departure from previous virtual reality gear. So this blogpost is my attempt to summarize all those thoughts and get them down on—hmmm, well, not paper, exactly, but pixels?—to share them with you, my faithful blog readers. (By the way, I very much appreciate those of you who do actually take the time to read my ramblings!)


Any sufficiently advanced technology is indistinguishable from magic.
—Arthur C. Clarke

First, the technology of the Apple Vision Pro makes the device feel magical, and I still feel that sense of awe and appreciation while wearing it every day. Shortly after my first week of use, in a message I excitedly shared with my friends on Second Life (first quoted here on my blog), I stated:

The Apple Vision Pro makes every single VR headset I have used to date feel like one of those red plastic View-Masters I used to play with as a kid in the 1960s. The “screen door” effect so evident in earlier VR headsets (where you can see individual pixels, making everything slightly blurry) is COMPLETELY, UTTERLY gone.

The Apple Vision Pro’s display is roughly 50 times more pixel-dense than an iPhone screen, such a startling leap forward that I often joke it makes all the older VR/AR headsets I have ever worn feel like cheap plastic View-Master toys!
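For the curious, here is roughly where that fifty-fold figure comes from. The following is just a back-of-the-envelope sketch, using figures I am assuming from third-party display analyses rather than official Apple specifications (roughly 3,386 pixels per inch for the AVP’s micro-OLED panels, versus roughly 460 ppi for a recent iPhone). The key point is that pixel density per unit area scales with the square of the linear density:

```python
# Back-of-the-envelope pixel-density comparison.
# Assumed figures, NOT official Apple specs: ~3,386 ppi for the
# Vision Pro's micro-OLED panels (reported in third-party display
# analyses) and ~460 ppi for a recent iPhone display.
AVP_PPI = 3386
IPHONE_PPI = 460

linear_ratio = AVP_PPI / IPHONE_PPI  # how many times finer, per inch
areal_ratio = linear_ratio ** 2      # pixels per area scale with the square

print(f"Linear density ratio: {linear_ratio:.1f}x")  # prints ~7.4x
print(f"Areal density ratio: {areal_ratio:.0f}x")    # prints ~54x
```

By area, that works out to something like 54 Vision Pro pixels packed into the space of a single iPhone pixel, which is in the same ballpark as the fifty-fold figure I like to throw around.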

After decades of working on Microsoft Windows computers, I used the Apple Vision Pro (and in particular, what I consider its killer feature, Mac Virtual Display) to switch almost completely to macOS and the Apple ecosystem. Let me walk you through a typical workday. I arrive at my cubicle in the librarians’ shared office space, turn on my MacBook Pro, and unpack and set up my Apple Vision Pro. I remove my prescription eyeglasses, put my AVP on, adjust the straps across the back and top of my head for a comfortable fit, and select my usual environment, Mount Hood, the tallest mountain in Oregon:

My preferred Apple Vision Pro Environment for work is Mount Hood, Oregon, because I like to be surrounded by pine forest.

I can adjust how much my chosen Environment blends with my cubicle office space by twisting the Digital Crown, the knob on the upper right of my AVP. Most of the time, I like to have it set at around 90-95%, so that I feel I am surrounded by forest, with the lake and Mount Hood at my back, but enough of the real world still pokes through that I can, for example, easily grab my insulated Winnipeg Folk Festival coffee mug (with an environmentally-friendly metal straw, so I can take a sip more easily while wearing my AVP!). When I use my Apple Magic Keyboard, it automatically highlights itself as my hands hover over it, rising out of the forested ground when I look down. Everything just works. It’s magic.

Usually, I have the Apple Music app pinned to my right side, and I select a playlist (often instrumental new age music, though it varies with my mood).

Sorry, any screen captures I take in my Apple Vision Pro always tend to be a bit lopsided! I need to learn how to angle my head correctly.

I pop in my Apple AirPods and then look at my MacBook Pro. A virtual Connect button hovers over the MacBook Pro’s screen; I tap my finger and thumb together to select it, et voilà! A large, adjustable, ultra-high-definition virtual screen appears over my desk, sharp and crystal-clear, where I can rearrange my macOS windows to my heart’s content: Outlook for email, Word for whatever report I am working on, my latest PowerPoint presentation, my Firefox web browser, and so on.

I now spend between four and six hours of my workday in this productivity cocoon. If I need to get up (say, to reheat my coffee in the microwave), I unplug the AVP battery from its power cable, place the battery in my left front pocket, and walk around the office. As I walk away, I pass out of the Mount Hood environment, which stays anchored at my desk like a virtual office partition. If, on my way to the microwave, I happen to look behind me, I can still see my huge Mac Virtual Display and the Apple Music window hanging in midair at my workstation.

This setup gives me two things: focus and pain relief.

First, the ability to isolate myself (literally throwing an immersive, three-dimensional virtual environment around myself) lets me focus on the task at hand, and I find it helps with my overall productivity. I can even get into a much-desired flow state. (Interestingly, the second-edition Apple Vision Pro with the higher-end M5 chip seems to have completely alleviated a problem I had with the original-model AVP, which was that I would develop eyestrain at about the two-hour mark while using the Mac Virtual Display feature. The new Dual Knit Band, with its two loops, is also an improvement over the original single-loop knit headband.)

Second, a couple of joints in the cervical part of my spine are deteriorating, which unfortunately limits how much time I can spend sitting in front of a desktop computer monitor and keyboard. I have noticed that I can work for longer periods, with less neck and shoulder pain, using the Mac Virtual Display feature on my Apple Vision Pro with my MacBook Pro than in any other workstation setup (including just my MacBook Pro with an external monitor). I am truly grateful that the technology is now sufficiently advanced to help alleviate my pain!

As far as I am concerned, the Mac Virtual Display feature is THE killer app on the Apple Vision Pro. While I have been browsing the AVP subreddits and downloading and installing various apps, I find I use the Virtual Display far more than any other app or program (at least, right now). No other VR headset can give me what the AVP offers, or even come close. The thousands of dollars I have spent on the first and now second editions of the Apple Vision Pro over the past two years have been worth every. single. penny. I cannot imagine living and working without this device.


With all the Windows PCVR and standalone VR/AR headsets I have used, I was always hopping from one app to another (usually a metaverse platform like Sansar or VRChat, because that is my personal hobby and my research interest). I spent very little time in places like SteamVR Home or Meta Horizon Home, where you can see your library of installed VR/AR applications and games, launch them, and switch between them. But in the Apple Vision Pro, with the Mac Virtual Display feature, I find I am using the device more like a filter or environment through which I do actual work with pre-existing programs like Microsoft Office, as opposed to loading and running virtual-reality-native apps. You can see immediately what a big difference this is. I would never for a second even think of using my Meta Quest 3 headset to edit a document in Microsoft Word, or fire off an email, yet I do those sorts of things without a second thought in my Apple Vision Pro.

Which leads me to my next important point: why the relative lack of AVP-native apps and programs is not as serious a problem as it might appear at first glance. When you use the device as a filter or an environment, as you do with the Mac Virtual Display feature, you are drawing on the much richer library of apps and programs available on macOS. Add to that the thousands of iPhone and iPad apps you can run in flat-screen mode on the AVP (e.g. Firefox, my go-to web browser), and you can see why I am not too terribly concerned about this issue.

But it would appear that many consumers are concerned about how (relatively) slowly new, native-AVP apps and programs are being added to the Apple App Store. In a post made four days ago to the r/VisionPro subreddit, someone asked:

So I finally pulled the trigger and bought an Apple Vision Pro, and honestly… wow. The hardware is insane. The display, hand tracking, eye tracking, immersion – it genuinely feels like a glimpse into the future. Watching films, browsing the web, even basic spatial apps feel miles ahead of anything else I’ve tried.

That said, I can’t shake one big concern: developer support is thin.

Right now it feels like there are hardly any apps that are actually built for Vision Pro. Yes, iPad apps technically work, but that’s not the same as native spatial experiences that really show off what this thing can do. After the initial “this is amazing” honeymoon phase, you start noticing how limited the ecosystem still is.

My worry is this: if Vision Pro doesn’t gain real traction, Apple could quietly scale it back or pivot, and developers will have even less incentive to build for it. That becomes a vicious circle — fewer users → fewer apps → even fewer users.

I really want this platform to succeed because the tech absolutely deserves it. But at the moment it feels like we’re relying on Apple’s long-term commitment and patience more than anything else.

Curious what other Vision Pro owners (or devs) think. Are we just early and impatient, or is the lack of native apps a genuine red flag?

This question prompted several developers and other users to weigh in with some very insightful commentary, which I wanted to share here with you:

I think Apple knew this going in and that’s why this device is almost like a prototype in a way. They need it in consumers’ hands to know what it will turn into. They knew the price point wasn’t for general consumption, but the only way to mold this thing into a future device for the masses that has better battery, less weight, and more importantly, costs less, was to get it into the hands of people and watch it do its thing.

Hi, Vision Pro developer here. Long response incoming (TLDR at bottom). You and other users have responded with what I think is a correct analysis that there’s an economics issue in that people won’t buy the Vision Pro until there’s sufficient app support, while developers can’t afford to make a dedicated Vision Pro app until there’s a sufficient user base. I can maybe provide some more perspective on some other aspects of Vision Pro development.

I truly believe that spatial computing is the future of computing, but it won’t be with the current version of Vision Pro. Essentially, I see this iteration of Vision Pro as a (very) cool device for media consumption and a dev tool. In the future, Apple (or some other company, but my money is almost always on Apple) will likely release the product that breaks through with consumers, whether it be the upcoming glasses or some vastly improved Vision Pro, and then developers will begin work making the apps for that eventual product. My personal development projects on Vision Pro are done with the certainty that they will be made at a financial loss to myself, but in the hope that learning how to build streamlined apps and leverage the capabilities of the current device will allow me to be better positioned to be a developer for the breakthrough model. As a developer, this is the time to be experimenting with 3D user experience, to learn what works and what doesn’t as an interaction model for experiences as immersive as Vision Pro allows. 

There are also problems with what Apple allows developers to do. In truth, there’s very little freedom to push the device to its limits and make something really imaginative and unique. Apple has set out strict privacy considerations (which are good broadly speaking, but might be overkill at this point) that lock developers into predefined paradigms that Apple approves of. Of course Apple’s own apps don’t have to obey these restrictions, which allows them to make apps that feel magical, like Experience Dinosaurs. Having attended the Vision Pro Developer conference for the past two years, I can tell you that there are significant frustrations among the developer community over the restrictions Apple has placed.

From where I’m sitting, I think the interest among developers for Vision Pro is reasonably high, but most can’t afford to build for it until there are some big changes in the market. I think in the near future there won’t be more than a smattering of new native apps, mostly made by the passionate developers who see the potential, but once Apple releases the product that clicks for consumers the dam will open up. This will probably result in a flood of apps for this current generation of Vision Pro, as I think Apple has nailed the software side of this, and just needs to work on building a physical frame that consumers want to put on their head.

TLDR: Be patient. At some point spatial computing will likely take off on a future Vision Pro-like model, and then the developers will come.

Developers aren’t going to invest heavily in the platform until there are more users. Apple knows this. Apple is getting the OS and dev tools maturing while they work towards more consumer-friendly versions of their Vision line. They needed the hardware out and in users’ and developers’ hands to really start moving forward. Traction will come, I sincerely don’t think there’s anything to worry about there.

I agree wholeheartedly with the second commenter, the developer who stated that “people won’t buy the Vision Pro until there’s sufficient app support, while developers can’t afford to make a dedicated Vision Pro app until there’s a sufficient user base.” It’s a classic chicken-and-egg problem, which is why what I said earlier is so important. The number of available apps and programs for the Apple Vision Pro doesn’t really matter at this point (at least, for me), because I am pretty much using it as an immersive environment through which I run other programs. To date, the only native-AVP apps I have been using regularly are the previously mentioned Pioneer for Reddit app, InSpaze, and Explore POV! (I have, however, been avidly collecting dozens of free and inexpensive AVP apps based on recommendations posted to the r/AppleVisionPro and r/VisionPro subreddits! One day, probably when I am on my upcoming research and study leave, I will start to explore more AVP-native programs and apps. In fact, two days ago, Google finally released a version of its popular YouTube video-watching app for the Apple Vision Pro!)

As I said up top, Mac Virtual Display is the killer feature I use most often, and that is what makes my use of the Apple Vision Pro so dramatically different from previous VR/AR headsets. It’s a productivity tool first and, with my continuing neck and shoulder pain, a pain-management tool second: an unexpected but not unwelcome way to get through an eight-hour workday with as little discomfort as possible. I am eternally grateful that the technology has evolved enough, just in time, to help me stay productive despite my pain! For those two reasons alone, it is worth every single penny I have spent on this device. As I said before, I am all in.

The upgraded Apple Vision Pro has been a godsend, and worth every penny I have spent!

ANNOUNCEMENT: My One-Year Research and Study Leave Project

Photo by Jaredd Craig on Unsplash

Open Educational Resources (OER) are teaching, learning and research materials in any medium—digital or otherwise—that either reside in the public domain or have been released under an open license that permits no-cost access, use, adaptation and redistribution by others with no or limited restrictions. While many think of OER as referring predominantly to open textbooks, OER includes a vast variety of resources, such as videos, images, lesson plans, coding and software, and even entire courses. In order for a resource to be considered open, it must fulfill the following criteria:

Modifiable: The resource must be made available under an open license that allows for editing. Ideally, it should also be available in an editable format.

Openly-licensed: The resource must explicitly state that it is available for remixing and redistribution by others. Some open licences may include restrictions on how others may use the resource (see: Creative Commons).

Freely Available: The resource must be available online at zero cost.

—definition adapted from Introduction to Open Educational Resources, Open Education Alberta.

Not long ago, in my 62nd-birthday blogpost, I wrote:

…although it is not official official (and I really should wait until I get the official letter from university administration, which I was told should happen about the end of March), the University of Manitoba Libraries has approved my application to take a one-year Research and Study Leave (at full salary) to start later this year, where I am relieved of my regular academic librarian duties, and can work on a special project. Academic librarians at the University of Manitoba are members of the faculty union, and just like the professors, we have the right (and the opportunity) to pursue research. Again, more details later. I’ve only mentioned this to a couple of people so far, but I think I can share that much detail at this time.

Well, I am very happy to announce that it is now official official: I have formally been approved to take a one-year research and study leave, at full salary, from my employer, the University of Manitoba Libraries, to pursue a special project.

What is that special project, you may ask? Well, I’m just going to quote from my approved application form:

During my Research Leave, I will create a comprehensive Open Educational Resource (OER) addressing a critical gap in scholarly literature: a rigorous, pedagogically-sound introduction to virtual worlds, social virtual reality, and the metaverse, with particular emphasis on applications in higher education. This project builds directly on my expertise as the writer of a popular blog on the topic over the past eight years (https://ryanschultz.com), as well as the owner and moderator of an associated Discord server, representing over 700 members who are actively using various metaverse platforms. The research phase will involve a literature review, plus case study analysis of specific metaverse platforms. The OER will consist of several modules, including topics such as: the history of the concept of the metaverse; how the current wave of generative AI will impact the metaverse, etc. This project requires a dedicated research leave because the rapidly-evolving nature of the field requires intensive, concentrated research and focus. Released under a Creative Commons license, this resource will serve UM faculty and the global educational community, providing a freely-adaptable foundation for teaching, learning, and research.

Yep, that’s right, folks! I am taking a full year off from my regular academic librarian duties to write a book about what I know best, and have been blogging about for many years now: virtual worlds, social VR, and the metaverse! (I will also throw in a little bit about artificial intelligence and generative AI, as they apply to those topics.)

My leave runs from July 1st, 2026 through to June 30th, 2027, and the best part of it is, since it’s about the metaverse, I can literally work from anywhere: at home in Winnipeg, while visiting the rest of my family in Alberta, on the beach at Bora Bora (highly unlikely, although the Apple Vision Pro provides a suitable substitute in a pinch!), etc. The only rule is you have to vacate your current office at the university for whoever is filling in for you while you’re away on research leave, which seems pretty reasonable to me. However, I will be borrowing some of the VR/AR equipment which I had purchased on previous years’ travel and expense funds (T&E funds for short; essentially, extra money allocated to faculty and librarians for things like conference travel, books, computers, etc.):

Because part of this research work will involve social VR, I will have to move some virtual reality equipment purchased on previous years’ T&E funds from my office in Elizabeth Dafoe to my home. This equipment will be returned to my office after my Leave ends.

Oh, and I also have to promise that I will come back to my job at the University of Manitoba Libraries after my leave ends, which is fine, since I am planning to stay until I retire at age 65, in January 2029. This will, of course, be the last research leave I take before I do retire.

Best of all, after my OER is complete, anybody can use it for teaching, learning, and research purposes, including editing, remixing, and repurposing it (the exact rights will depend on which Creative Commons license I choose to publish it under).

Watch for updates on this project as I get closer to July 1st. Stay tuned!

Photo by Windows on Unsplash

Why Second Life is My Radio Station

Classical guitarist Joaquin Gustav performs on The Rooftop at NO DUMPIRE on Saturday morning.

Ladies and gentlemen and fabulous people of all genders on the internet, I have been having a VERY bad couple of weeks. I was in a car accident two weeks ago, which aggravated the neck and shoulder pain I experience due to the deterioration of a couple of joints in the cervical part of my spine. On top of the stress of dealing with my worsening pain, and the additional stress of dealing with insurance agents and arranging to get my car repaired, this week I accidentally deleted several directories in my Microsoft OneDrive cloud storage while transferring files from my old Windows notebook to my new one. On Friday, I had a meltdown on the phone with my university’s tech support while struggling with my neck and shoulder pain, made worse—of course!—by stress. Frustrated, overwhelmed, embarrassed, and in pain, I finally threw in the towel, took the rest of the day off sick, and went home to bed. I am ashamed of how poorly I have been coping with everything happening in my life these past two weeks.

All of which is a very roundabout way of saying that I am grateful for Second Life, which is still, to this day, one of my preferred escapes when reality becomes a bit too much. I know that some of my readers are probably wondering why I choose to spend much of my free time in a 22-year-old virtual world, which the mainstream media likes to portray as quaint, outdated, and populated by weirdos. (Hey, as I say, embrace your weirdness. Be a professional weirdo. This world is not served by billions of cookie-cutter humans who think alike, look alike, and act alike.) But I digress.

Second Life is the perfect model of a fully-mature, ever-evolving metaverse, which many newer entrants would be wise to study, learn from, and emulate. One thing that the mainstream media gets wrong is the reason for Second Life’s appeal. That appeal—what keeps its userbase coming back—is not its weirdness (although that is certainly part of it). Second Life’s main appeal is that it is an unparalleled blank canvas for people to be whoever or whatever they want, and create whatever they want. And nowhere is that more evident than in SL’s vibrant music scene.

For example, first thing this morning, my main avatar, Vanity Fair, ascended the ladder to The Rooftop, one of several venues located in a region called NO DUMPIRE, created and maintained by Zed, whose usual avatar is a dinkie (i.e. a tiny avatar) raccoon. There I enjoyed a one-hour live music set from classical guitarist Joaquin Gustav, chatting with friends in SL while sipping my morning coffee.

Second Life is packed with music venues, where I can park Vanity (or another avatar from my small army of alts) in a club to hear a deejay, a live musical performer, or a singer/songwriter. The options range from fancy ballrooms like LOVE to the decidedly anti-consumerist dumpster chic of NO DUMPIRE’s many venues.

As I write this, Joaquin has packed up his guitar, and DJ Zed is now spinning an eclectic set of chill music:

DJ Zed
The Rooftop at NO DUMPIRE during Zed’s deejay set

It beats the hell out of just turning on the radio and listening to whatever limited selection of music the station serves up. Why do that, when you can support a live deejay or musical artist in Second Life, AND serve a fabulous look? Here is Vanity Fair dancing to the tail end of a song during Zed’s DJ set:

Second Life and its many clubs, venues, festivals, and other events expose me to musical artists and deejays from all around the world (for example, Joaquin Gustav hails from Buenos Aires, Argentina, a far cry from wintry Winnipeg, Canada, where I live). I can join SL groups (like Joaquin’s) for my favourite artists, so I’m alerted when and where they are performing next. And I can tip those performers whose music sparks joy and gives pleasure, using Linden dollars, which artists like Joaquin can cash out into their local currency. AND NO ADVERTISING.

And it’s not just on a Windows, macOS, or Linux desktop that you can log in and listen to a virtual world venue’s music stream. Second Life’s new Mobile client (available for both Android and iOS phones and tablets) lets you bring your music with you wherever you happen to be! Even on Mobile, the sound quality is excellent.

So, that is why Second Life is my radio station. Ladies and gentlemen (and fabulous people of all genders), I hereby rest my case. 😉 Thank you for coming to my TED Talk.

My Top Ten Most Popular Blog Posts in 2025

I was curious this morning, so I went into WordPress and checked out my blog viewer statistics. I used to check them much more frequently during the heady days of the metaverse boom and bust, circa 2019 to 2022, when traffic to my blog surged to unprecedented levels and then crashed, as shown by a screenshot I took of my WordPress statistics last March:

One notable event during that period was Facebook rebranding itself as Meta on October 28th, 2021, amid Mark Zuckerberg’s expensive push to transform his company into a metaverse powerhouse (with mixed but undeniable success, most notably on the hardware side with its Quest line of wireless VR/AR headsets).

Then, in 2022, the hype cycle for artificial intelligence started: an unexpected surge of interest driven by a tidal wave of new generative AI tools like DALL-E and ChatGPT, and the world seemed to move on—as seen in a noticeable decline in visitors to the RyanSchultz.com blog! (It also didn’t help that through most of 2024 and 2025 I was swamped at my paying job as an academic librarian, and had to put my blog on hiatus for a while when I went on half-time sick leave for six months for treatment of burnout. As a result, I barely posted anything during most of 2025.)

Anyway, as I said previously, I was curious, so I checked to see what my top ten most popular (i.e. most viewed) blog posts were last year.

One of the things that never ceases to amaze and amuse me is how much traffic certain blogposts receive: the relatively rare ones where I write about sexual topics! Let me be clear: while I am not a prude, I am also not that terribly interested in writing about adult/sex-oriented metaverse platforms, because I find pixelsex boring. Therefore, I will leave that particular niche of the metaverse to others to chronicle. 😉

Anyway, 2025 was no exception to the rule, with three of my top ten most popular blogposts being about such adult topics (by the way, all three links are quite safe for work):

Which just goes to prove the old adage: sex sells. Or, at least, that there’s lots of people searching for sex on the internet. (I really should rent out prime advertising space on that sex-in-VRChat blog post; I bet I’d make some income off that! Even though I freely admit in that blogpost that I have absolutely no idea where the sex is in VRChat. Sorry, folks. 😂😆🤣)

In the number two slot is the count of people who went straight to my homepage at https://ryanschultz.com, without clicking on a link to a specific blogpost. No surprise there.

My coverage of the many changes in Second Life during 2025 (most notably, the new Avatar Welcome Kit with its LeLutka Lite heads and Legacy Basic bodies) takes up four of the top ten spots:

And, as I have said previously, I’m cutting back on my Second Life coverage (even though I have already broken that rule once so far in 2026!).

And the final two slots in my Top Ten go to the two lists I maintain: the first of virtual worlds and social VR platforms, and the second of non-combat, open-world exploration games.

Both lists seem to be referred to a lot by other writers on the internet (and, increasingly, by generative AI tools which scrape the web, including my blog). For example, ChatGPT has referred its users to my blog 448 times in 2025 (and, to be honest, I’m not quite sure how to feel about that):

Speaking of ChatGPT and other generative AI tools, I do have plans to write more often on this topic, both as it applies to the ever-evolving metaverse, and more generally as well. Stay tuned!