Unity Drops a Bombshell: What Will Be the Impact on Social VR Platforms?

A collage of Twitter (sorry, X) statements from smaller game developers announcing they are dropping Unity after the company’s announcement earlier this week (source)

On Tuesday, Unity dropped a bombshell on software developers: a new fee structure that will charge devs using its popular game engine on a per-install basis, with less than four months' advance notice. Ars Technica reported:

For years, the Unity Engine has earned goodwill from developers large and small for its royalty-free licensing structure, which meant developers incurred no extra costs based on how well a game sold. That goodwill has now been largely thrown out the window due to Unity’s Tuesday announcement of a new fee structure that will start charging developers on a “per-install” basis after certain minimum thresholds are met…

This is a major change from Unity’s previous structure, which allowed developers making less than $100,000 per year to avoid fees altogether on the Personal tier. Larger developers making $200,000 or more per year, meanwhile, paid only per-seat subscription fees for access to the latest, full-featured version of the Unity Editor under the Pro or Enterprise tiers.
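To make the mechanics of the change concrete, here is a rough back-of-the-envelope sketch in Python of how the new fee might be estimated. The thresholds ($200,000 trailing-twelve-month revenue and 200,000 lifetime installs, both of which must be crossed) and the $0.20-per-install rate reflect the figures reported for the Personal tier at announcement time, but treat all of the numbers, and the function itself, as illustrative assumptions rather than Unity's final terms:

```python
# Rough estimate of Unity's announced per-install "Runtime Fee".
# Thresholds and the per-install rate are as reported for the Personal
# tier at announcement time; they are illustrative, not authoritative.

def estimate_monthly_fee(trailing_revenue_usd: float,
                         lifetime_installs: int,
                         new_installs_this_month: int,
                         fee_per_install: float = 0.20) -> float:
    """Return the estimated fee owed on this month's new installs."""
    # As announced, the fee only kicks in once BOTH thresholds are crossed.
    if trailing_revenue_usd < 200_000 or lifetime_installs < 200_000:
        return 0.0
    return new_installs_this_month * fee_per_install

# A free-to-play title with heavy install volume but thin revenue:
print(estimate_monthly_fee(250_000, 1_500_000, 500_000))  # 100000.0
```

Note how quickly this scales for free-to-play games: half a million new installs in a month works out to roughly $100,000, exactly the kind of figure the developer quoted below is worried about, and it is owed regardless of whether those installs generated any revenue.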

“There’s no royalties, no fucking around,” Unity CEO John Riccitiello memorably told GamesIndustry.biz when rolling out the free Personal tier in 2015. “We’re not nickel-and-diming people, and we’re not charging them a royalty. When we say it’s free, it’s free.”

Now that Unity has announced plans to nickel-and-dime successful Unity developers (with a fee that is not technically a royalty), the reaction from those developers has been swift and universally angry, to put it mildly. “I can say, unequivocally, if you’re starting a new game project, do not use Unity,” Necrosoft Games’ Brandon Sheffield—a longtime Unity Engine supporter—said in a post entitled “The Death of Unity.” “Unity is quite simply not a company to be trusted.”

Sheffield goes on to say:

…I can say, unequivocally, if you’re starting a new game project, do not use Unity. If you started a project 4 months ago, it’s worth switching to something else. Unity is quite simply not a company to be trusted.

What has happened? Across the last few years, as John Riccitiello has taken over the company, the engine has made a steady decline into bizarre business models surrounding an engine with unmaintained features and erratic stability.

Ultimately, it screws over indies and smaller devs the most. If you can afford to pay for higher tiers, you don’t pay as much of this nickle and dime fee, but indies can’t afford to on the front end, or often it doesn’t make sense in terms of the volume of games you’ll sell, but then you wind up paying more in the long term. It’ll squash innovation and art-oriented games that aren’t designed around profit, especially. It’s a rotten deal that only makes sense if you’re looking at numbers, and assume everyone will keep using your product. Well, I don’t think people will keep using their product unless they’re stuck. I know one such developer who is stuck, who’s estimating this new scheme will cost them $100,000/month on a free to play game, where their revenue isn’t guaranteed.

Unity is desperately digging its own grave in a search for gold. This is all incredibly short-sighted and adds onto a string of rash decisions and poorly thought through schemes from Unity across the last few years.

And it’s not just games that are affected by this news; many metaverse platforms use Unity too, and it remains to be seen how this news will impact them. Among the social VR platforms I have blogged about that rely on the Unity game engine are:

  • Anyland
  • Bigscreen
  • ChilloutVR
  • Engage
  • Lavender
  • NeosVR
  • Rec Room
  • Sinespace/Breakroom
  • Somnium Space
  • VRChat

(Ironically, the social VR platform Sansar deliberately decided not to use a third-party game engine, precisely to avoid the kind of blindsiding Unity developers experienced this week. Not that it helped with uptake of the platform.)

So, I posted the following question to the most knowledgeable (and opinionated!) group of metaverse experts I know, the over 700 members of the RyanSchultz.com Discord server. Here’s a sample of some of their comments:

The devs at VRChat say, on Reddit, that nothing will change; the statement came from a member of VRChat staff. We shall see…

Other comments and responses to the news, from my Discord, are:

Lots of big-name devs are swearing off of Unity, dropping it even for projects already in progress.

For Neos itself I’m actually worried the least. For years they have planned to eventually move away from Unity, so the way the FrooxEngine actually interfaces with Unity is quite minimal. But like, most other VR Social games don’t have the “luxury” of running on two Engines frankensteined together. VRC will probably have to pay for it, the likes of Chillout are likely still far too small for that… But it still sucks that they have that lingering over their head now as the platform continues to grow.

Yeah, I mean, this is exactly why you shouldn’t rely too heavily on a third-party like this, because they can pull the rug out from underneath you…I am quite sure that VRChat is going to be okay. It’s the smaller, more niche metaverse platforms I’m a little worried about.

Sansar’s in-house engine looks pretty good right now, eh?

Okay, so it’s clear to me that this IS gonna have a large impact on any company that uses Unity. Question: how hard is it to move from Unity to, say, Unreal, or Godot? Is it an impossible task?

For an existing game? You’re usually basically re-writing it from scratch at that point.

For an existing project, it’s like remaking it from the ground up. An open engine similar to Unity would be a much better choice probably, for example Stride 3D.

The skinny seems to be that Unity will undo this, but trust will have been broken.

The last commenter makes an excellent point: even if Unity responds to the backlash by retreating from this decision, the damage has already been done, and the trust between Unity and developers has been broken.

The comments over on Reddit have also been uniformly negative. Again, here are just a couple of examples:

Whatever Unity does, they already lost the trust of devs. Even if they retract, it will be “for now”. Fuck them.

and:

Cost per license sold? Sure. That’s fine, you can just bake it into the cost of the game.

Cost per install? Charged to the developer/distributor???? Fuck no. You have no idea how much money each customer will cost you.

Initially, Unity stated the fee would apply every time the game was installed, or reinstalled. Then they backtracked that, but installs on multiple devices will have the fee charged multiple times. Install it on your PC? That’s a fee. Now also on your Steam Deck? That’s another fee. Your laptop? Fee again. Replaced your PC? Have another fee! And god forbid someone remembers that PC cafes are a thing. There’s zero information about how a “device” will be kept track of, so potentially just changing the hardware in a device will cause the fee to reset.

Piracy is a huge unknown. Unity says developers will simply have to trust that Unity’s anti-piracy solution works.

You just don’t do business like that, ESPECIALLY when you make this change retroactively. Companies are going to have to retool their entire profit estimation for something they cannot even account for.

Anyway, it will be interesting to watch as developments unfold over the next few weeks. Unity is a part of so much software development work (it’s even said to be a part of the upcoming Apple Vision Pro VR/AR headset!), so there will definitely be ripple effects. And, of course, the only people guaranteed to make money off this are the lawyers, so expect to see the lawsuits fly! Stock up on popcorn…

Nonverbal Communication in Social VR: Recent Academic Research

Gestures (like this peace sign) are an example of nonverbal communication (Photo by Dan Burton on Unsplash)

In the real world, much of our communication is non-verbal: facial expression, gaze, gestures, body movements, even spatial distance (proxemics).

While older, flat-screen virtual worlds such as Second Life are somewhat limited in the forms of nonverbal communication available (most people rely on text or voice chat), modern VR equipment and social VR platforms allow for more options:

  • Hand/finger movement: most VR headsets have hand controllers; the Valve Index has Knuckles hand controllers which allow you to move your fingers as well as your hands;
  • Body movement: the Vive pucks can be attached to your waist, hips, feet, and other parts of your body to track their movement in real time;
  • Eye movements/gaze: for example, the Vive Pro Eye VR headset can track the blinking and movement of the eyes;
  • Facial expression: add-ons such as the Vive Facial Tracker (which attaches to your VR headset) allow you to convey lower face and mouth movements on your avatar.

In addition, many social VR platforms also employ emoticons, which can be pulled up via a menu and displayed over the head of the avatar (e.g. the applause emoji in AltspaceVR), as well as full-body pre-recorded animations (e.g. doing a backflip in VRChat). The use of all these tools, in combination or alone, allows users in social VR to approach the level of non-verbal communication found in real life, provided they have the right equipment and are on a platform which supports that equipment (e.g. NeosVR, where you can combine all these into an avatar which faithfully mimics your facial and body movements).

Two recently published research papers investigate nonverbal communication on social VR platforms, adding to the growing academic literature on social VR. (I am happy to see that social VR is starting to become a topic of academic research!)


Maloney, D., Freeman, G., & Wohn, D. Y. (2020). “Talking without a Voice”: Understanding Non-Verbal Communication in Social Virtual Reality. Proceedings of the ACM on Human-Computer Interaction, 4(CSCW2). https://doi.org/10.1145/3415246

Unfortunately, there is no open-access version of this conference proceeding available; you’ll have to obtain a copy from your local academic or public library. This paper, by Divine Maloney and Guo Freeman of Clemson University and Donghee Yvette Wohn of the New Jersey Institute of Technology, consists of two parts:

  • conducting unobtrusive observations of 61 public events held in AltspaceVR over the span of four weeks, to see what non-verbal interactions were being used naturally on the platform; and
  • interviewing 30 users of social VR platforms (of which I was one!), where the paper’s authors read through the transcribed interview data to acquire a picture of how social VR users used, perceived, and experienced non-verbal communication for further analysis.

In the first study of the two, the authors noted the following different kinds of nonverbal communication:

  • the use of movement to indicate that someone was paying attention. These included nodding behaviors and moving the body or head toward the person or object that was subject of attention;
  • the use of applause to indicate approval;
  • pointing and patting one’s own chest as a form of directing attention either at a remote object/person or oneself;
  • behaviours such as waving, dancing, and kissing, which were mostly used in social grooming contexts (dancing was also used as entertainment);
  • and finally, the behaviour of trolls: interpersonal provocation and social disruptions.

The thirty interviews conducted were then analyzed to answer two research questions:

Using quotes from users’ own accounts, in this section we present our findings as two parts. First, to answer RQ2 (How do people perceive and understand non-verbal communication in social VR?), we identified three common themes that demonstrated how users perceive and understand non-verbal communication in social VR: as more immersive and embodied interactions for body language; as a similar form of communication to offline face-to-face interaction in terms of spatial behavior, hand behavior, and facial expressions; and as a natural way to initiate communication with online strangers.

Second, to answer RQ3 (How, if at all, does non-verbal communication affect interaction outcomes in social VR?), we described the social consequences of interacting through non-verbal communication in social VR for various user groups, including marginalized users such as cis women, trans women, and disabled users. We specially highlighted how non-verbal communication in social VR afforded privacy and social comfort as well as acted as a protection for marginalized users.

Unsurprisingly, the researchers discovered that most participants considered non-verbal communication to be a positive aspect in their social VR experience. Those surveyed highly praised body tracking (either just the hands and head, or in some cases the whole body), as it allowed for a more immersive and embodied form of non-verbal communication than those in traditional, flatscreen virtual worlds.

In addition to supporting more immersive and embodied interactions for body language, participants also considered non-verbal communication in social VR similar to offline face-to-face interaction in terms of spatial behavior, hand behavior, and facial expressions. This familiarity and naturalness greatly contributed to their generally positive perceptions.

Participants also viewed non-verbal communication in social VR as positive and effective because it became a less invasive way to start interactions with online strangers (e.g. waving hello at someone you’ve just met). Nonverbal communication also afforded some users a sense of privacy and social comfort, and in some cases, became an effective protection for them to avoid unwanted interactions, attention, and behaviors (especially with LGBTQ people and women).

The paper made three design recommendations for improved nonverbal communication in social VR platforms: providing support for facial tracking (which is already on its way with products like the Vive Facial Tracker); supporting more accurate hand and finger tracking (again, already underway with the Knuckles controllers for the Valve Index); and enabling alternative modes of control, especially for users with physical disabilities. While most of the study participants highly praised full body tracking in social VR, disabled users in fact complained about this feature and demanded alternatives.

The conference paper concludes:

Recently, commercial social VR applications have emerged as increasingly popular digital social spaces that afford more naturally embodied interaction. How do these novel systems shape the role of non-verbal communication in our online social lives? Our investigation has yielded three key findings. First, offline non-verbal communication modalities are being used in social VR and can simulate experiences that are similar to offline face-to-face interactions. Second, non-verbal communication in social VR is perceived overall positive. Third, non-verbal interactions affect social interaction consequences in social VR by providing privacy control, social comfort, and protection for marginalized users.


Tanenbaum, T. J., Hartoonian, N., & Bryan, J. (2020). “How do I make this thing smile?”: An Inventory of Expressive Nonverbal Communication in Commercial Social Virtual Reality Platforms. Conference on Human Factors in Computing Systems – Proceedings, 1–13. https://doi.org/10.1145/3313831.3376606

This paper is available free to all via Open Access. In this conference proceeding, Theresa Jean Tanenbaum, Nazely Hartoonian, and Jeffrey Bryan of the Transformative Play Lab at the Department of Informatics at the University of California, Irvine, did a study of ten social VR platforms:

  • VRChat
  • AltspaceVR
  • High Fidelity (which shut down in January of 2020)
  • Sansar
  • TheWave VR (this social VR platform shut down in early 2021)
  • vTime XR
  • Rec Room
  • Facebook Spaces (since shut down and replaced by Facebook Horizon)
  • Anyland
  • EmbodyMe

For each platform, investigators answered the following eight questions:

  1. Can the user control facial expressions, and if so, how? (Pre-baked emotes, puppeteering, etc.)
  2. Can the user control body language, and if so, how? (Pre-baked emotes, puppeteering, postures, etc.)
  3. Can the user control proxemic spacing (avatar position), and if so, how? (Teleport, hotspots, real world positioning, etc.) How is collision handled between avatars? (Do they overlap, push each other, etc.)
  4. How is voice communication handled? Is audio spatialized, do lips move, is there a speaker indicator, etc.
  5. How is eye fixation/gaze handled? (Do avatars lock and maintain gaze, is targeting gaze automatic, or intentional, or some sort of hybrid, do eyes blink, saccade, etc.)
  6. Are different emotions/moods/affects supported, and how are they implemented? (Are different affective states possible, and do they combine with other nonverbal communications, etc.)
  7. Can avatars interact physically, and if so, how? (Hugging, holding hands, dancing, etc.) What degree of negotiation/consent is needed for multi-avatar interactions? (One-party, two-party, none at all?)
  8. Are there any other kinds of nonverbal communication possible in the system that have not been described in the answers to the above questions?

The results were a rather complete inventory of nonverbal communication in social VR, with the goal of cataloguing common design elements for avatar expression and identifying gaps and opportunities for future design innovation. Here is the table from the paper (which can be viewed in full size at the top of page 6 of the document).

An inventory of non-verbal communication in ten social VR platforms (source)

VR development is proliferating rapidly, but very few interaction design strategies have become standardized…

We view this inventory as a first step towards establishing a more comprehensive guide to the commercial design space of NVC [non-verbal communication] in VR. As a design tool this has two immediate implications for designers. First, it provides a menu of common (and less common) design strategies, and their variations, from which designers may choose when determining how to approach supporting any given kind of NVC within their platform. Second, it calls attention to a set of important social signals and NVC elements that designers must take into consideration when designing for Social VR. By grounding this data in the most commonly used commercial systems, our framework can help designers anticipate the likelihood that a potential user will be acquainted with a given interaction schema, so that they may provide appropriate guidance and support.

Our dataset also highlights some surprising gaps within the current feature space for expressive NVC. While much social signaling relies upon control of facial expression, we found that the designed affordances for this aspect of NVC to be mired in interaction paradigms inherited from virtual worlds. Facial expression control is often hidden within multiple layers of menus (as in the case of vTime), cannot be isolated from more complex emotes (as in the case of VR Chat), hidden behind opaque controller movement (as in Facebook Spaces), or unsupported entirely. In particular, we found that with the exception of dynamic lip-sync, there were no systems with a design that would allow a user to directly control the face of their avatar through a range of emotions while simultaneously engaging in other forms of socialization.

The authors go on to say that they observed no capacity in any of the systems to recombine and blend the various forms of nonverbal communication, such as can be done in the real world:

As we saw in our consideration of the foundations of NVC in general, and Laban Movement Analysis in particular, much NVC operates by layering together multiple social signals that modify, contextualize, and reinforce other social signals. Consider, for instance, that it is possible to smile regretfully, laugh maliciously, and weep with joy. People are capable of using their posture to indicate excitement, hesitation, protectiveness, and many other emotional states, all while performing more overt discourse acts that inherit meaning from the gestalt of the communicative context.

The conference paper concludes:

As is evident in the scholarly work around social VR, improving the design space for NVC in VR has the potential to facilitate deeper social connection between people in virtual reality. We also argue that certain kinds of participatory entertainment such as virtual performance will benefit greatly from a more robust interaction design space for emotional expression through digital avatars. We’ve identified both common and obscure design strategies for NVC in VR, including design conventions for movement and proxemic spacing, facial control, gesture and posture, and several strategies unique to avatar mediated socialization online. Drawing on previous literature around NVC in virtual worlds, we have identified some significant challenges and opportunities for designers and scholars concerned with the future of socialization in virtual environments. Specifically, we identify facial expression control, and unconscious body posture as two critical social signals that are currently poorly supported within today’s commercial social VR platforms.

It is interesting to note that both papers cite the need to properly convey facial expressions as key to expanding the ability of avatars in social VR to convey non-verbal communication!

An Updated Comparison Chart of Social VR Platforms

Have you joined the RyanSchultz.com Discord yet? You’re invited to be a part of the first ever cross-worlds discussion group, with over 300 people participating from every social VR platform and virtual world! More details here


IMPORTANT NOTE, Feb. 6th, 2023: Thank you to Dr. Fran Babcock, who made some updates to this spreadsheet in 2021. I am currently working on a complete update to this spreadsheet for 2023. Thank you for your patience.

I haven’t published an update to my popular November 2018 comparison chart of twelve social VR platforms for quite some time. There never seems to be a perfect time to update. At first, I wanted to wait until the Oculus Quest was released. And then, I was wondering whether or not I should wait until Facebook releases the Oculus Link update to the Oculus Quest (which means, theoretically, that Oculus Quest users can use a custom cable connected to their VR-ready Windows computer to view content originally intended for the Oculus Rift).

In the end, I decided to go ahead and publish a first draft of the updated comparison chart now, get feedback from my readers, and update the chart as necessary. So here is that first draft.

I removed two of the 12 platforms in last year’s comparison chart: both Facebook Spaces and Oculus Rooms were shut down by Facebook on October 25th, 2019, in preparation for the launch of Facebook Horizon sometime in 2020. I have not added Facebook Horizon to this chart (yet) because we still know so little about this new social VR platform. And I decided to add six more social VR platforms to the chart: Anyland, Cryptovoxels, Engage, JanusVR, Mozilla Hubs, and NeosVR.

Rather than publish the chart as an image to Flickr, as I did last year, I decided to create a spreadsheet using Google Drive, and publish it to the web here:

Comparison Chart of 16 Social VR Platforms (Updated and Expanded Draft © Ryan Schultz, November 13th, 2019).

Please leave me a comment with any suggestions, corrections or edits, and I will update this new comparison chart accordingly. You can also reach me on the RyanSchultz.com Discord server, or any other virtual world Discord that I might belong to (my handle is always the same, RyanSchultz). You can also use the contact form on my blog.

UPDATE 3:48 p.m.: I’ve had a request to add userbase figures to this chart, but I am not going to do that for a very good reason: there’s absolutely no way I can get accurate figures from the various companies, many of whom want to keep that information private. And even ranking them using a scale like low, medium, and high would just be guesses on my part, misleading to a lot of people, and liable to lead to a lot of arguments. Sorry! I will leave it up to you to check Steam statistics for those platforms which are on Steam (which, again, may or may not be an accurate measure of the actual level of usage of any platform).

UPDATE Nov. 13th: I’d like to thank Frooxius (of NeosVR), Artur Sychov (of Somnium Space) and Jin for their corrections and suggestions. Any updates to this table are shown in real-time, which is an unexpected bonus to publishing a spreadsheet directly to the web from Google Drive! I should have thought of doing it this way last year.

And it would appear that there is a great deal of disagreement about what constitutes “in-world building tools”. I am referring to the ability to create complex objects entirely within the platform itself, rather than using external tools such as Blender or Unity and then importing the externally-created objects into the platform. For example, High Fidelity has very rudimentary “prim-building” tools in-world, which are not often used by creators, who prefer to import mesh objects created in tools like Blender, Maya, or 3ds Max instead. To give another example, Somnium Space now offers a completely in-world tool for constructing buildings on your purchased virtual land. Sansar has no such tools for in-world building, although you can assemble premade, externally-created objects into a world by using its Scene Editor (which is something completely different from what I am talking about here).

One reader had suggested adding in a few more columns to this chart to include various technical aspects of these social VR platforms: game engine used, open/closed source, support for scripting, etc. Using the table provided to me by Enrico Speranza (a.k.a. Vytek), I have now added three more columns to the original comparison table: architecture/game engine, open/closed source, and scripting. Thank you for the suggestion, Vytek!

Please keep your suggestions, corrections and edits coming, thanks!

Cas and Chary Cover Five Social VR Platforms (Including Sansar)

Most of the people making YouTube videos about virtual reality hardware and software are men, so it is refreshing to find a new (well, new to me, anyways) channel about VR run by two women, called Cas and Chary VR.

Last week, Cas published a 10-minute YouTube video tour of five less popular social VR platforms, explaining:

So we all know VRChat, Rec Room, [and] AltspaceVR. This video isn’t about these games. It’s about 5 others that you might have missed.

The five platforms covered in this video include:

Videos like this are useful because they give viewers a look at platforms that they might not have had an opportunity to visit themselves. I was surprised to find that Sansar was a sponsor for this video. Cas says:

DISCLAIMER: This video was sponsored by Sansar. Per our guidelines, no review direction was received from them. Our opinions are our own.

I think it’s smart that Linden Lab is reaching out to YouTube influencers like Cas and Chary with sponsorship opportunities. As I have written before, social VR companies will likely have to turn to influencers more often in future to promote their products more effectively.