UPDATED! High Fidelity Invests in Linden Lab, the Makers of Second Life, and Philip Rosedale Rejoins Linden Lab as a Strategic Advisor

The Second Life website (image source)

Today, Linden Lab (more formally known as Linden Research, Inc., the makers of Second Life) dropped a press release:

High Fidelity announced today that it acquired an interest in Linden Research, Inc. (“Linden Lab”), the pioneering developer of the virtual world Second Life. The deal includes a cash investment and distributed computing patents. Members of High Fidelity’s metaverse team are joining the company, and Philip Rosedale, who is a founder of both companies, is also rejoining Second Life as a strategic advisor.

The transaction will help Second Life further scale its operations and strengthen its commitment to growing an innovative, inclusive, and diverse metaverse where its inhabitants’ ingenuity drives real-world value for themselves and others.

“No one has come close to building a virtual world like Second Life,” says Second Life founder and High Fidelity co-founder, Philip Rosedale. “Big Tech giving away VR headsets and building a metaverse on their ad-driven, behavior-modification platforms isn’t going to create a magical, single digital utopia for everyone. Second Life has managed to create both a positive, enriching experience for its residents — with room for millions more to join — and built a thriving subscription-based business at the same time. Virtual worlds don’t need to be dystopias.”

High Fidelity is the company Philip Rosedale founded after leaving Linden Lab. Its first product, an ambitious social VR platform called High Fidelity, failed to catch on and was shut down in early 2020. Its successor product (also called High Fidelity) is a 3D spatialized audio system for use in other metaverse platforms. So, when I’m talking about High Fidelity (HiFi for short), I always make sure to indicate whether I am talking about the company itself, its former social VR product (the old High Fidelity) or the new 3D audio product (the new High Fidelity)!

The website for the new High Fidelity (image source)

Wagner James Au, writer of the long-time virtual worlds blog New World Notes (from whom I first learned about this breaking news), has this to say:

Just got this message from Philip Rosedale, about the future of Second Life:

“I’m not back full-time, but it feels great to get to be talking to Lindens about design! I think the vital thing to focus on is demonstrating that a virtual world can scale to greater capacity while being inclusive and fair and safe for humanity.”

Exciting and welcome news, indeed! I will update this blogpost with more details as I acquire them. Stay tuned!

UPDATE Jan. 14th, 2022: The Wall Street Journal, in an article titled Second Life Founder Returns to Take On the Metaverse (archived version), reported yesterday:

Philip Rosedale in 2003 launched the online game where players using avatars can hang out, socialize with other players and make purchases. Second Life is a forerunner of the virtual worlds that big tech companies are now trying to create and that are often referred to as the metaverse. Mr. Rosedale is returning to the company he left in 2010 to serve as a strategic adviser and shepherd its expansion as the metaverse gains wider traction, he said in an interview…

Mr. Rosedale said that the business models underpinning some of the current tech giants, such as tracking user behavior to target ads, would be potentially harmful in the metaverse, which is more immersive than current digital platforms. “I think that there is a real genuine, existential risk associated with how that gets done,” he said.

Second Life may have had a head start on some of the metaverse companies it aims to compete with, but to some extent is the underdog. Second Life rolled out before Facebook was founded, but has hovered at around one million users since 2008, according to a company spokesperson. Meta’s Facebook, Instagram and other services sported more than 3.5 billion monthly users combined, according to its most recent earnings. Epic Games Inc.’s Fortnite videogame and game company Roblox Corp., which are also making moves in the metaverse, have many times the number of users that Second Life has.

Brad Oberwager, chairman of Second Life parent company Linden Research Inc., said he is working with Mr. Rosedale to inject momentum into the business. Second Life already offers the ability for people to withdraw money from in-game sales into the real world, a feature lacking in some other emerging metaverses, which should attract users, he said. Coming upgrades focused on further improving the social and economic components of the game, such as the avatars and digital marketplace, promise to drive user growth, he added.

The Wall Street Journal article goes on to state that “Mr. Rosedale is bringing with him to Second Life a small cadre of developers, a number of patents and an unspecified financial investment from the company he founded in 2013, High Fidelity Inc.”

In an interview with c|net, titled Second Life founder returns to revamp his original metaverse, Philip goes into a little more detail:

Rosedale is going to be a “strategic adviser” for Second Life, while his company High Fidelity looks to infuse Second Life with some new ideas, simultaneously working on other ideas for future tech, including – at some point – VR again. “We’re announcing that we’ve shifted a group of seven people, some patents, some money. We’re investing in Second Life, to keep working on Second Life,” Rosedale told me. “Two of those patents are moderation in a decentralized environment patents, which is really cool.”

The reason for the shift is that Second Life still makes money and still has a considerably larger community than most VR platforms: It’s had over 73 million accounts created since it launched, and estimates of active users hover around 900,000. Rosedale sees the shift as solving problems while VR hardware still gets thought out. 

Despite the seeming success of the Oculus Quest 2, he still doesn’t think it’s enough. “The headset is so broken that it’s going to actually take, I think, five years to get to something that’s good,” he says, “and we as a startup would neither survive, nor would it make sense for us to sit around for five years.” He sees building up Second Life as a better platform that will be VR-optional until that magically perfect hardware arrives. 

The entire c|net article is well worth a read, by the way. This news has also been covered by publications such as The Verge, TechCrunch, VentureBeat, and CoinTelegraph.

Honestly, the more I read, the better this sounds! I think this is exciting news for both Linden Lab and High Fidelity, and I wish all involved every success in this endeavour.

Editorial: Will Sansar Survive?

Sansar is the reason I started this blog a little over four years ago, and it is with a very heavy heart that I write this blogpost. As many of you know, I found that I had become too emotionally attached to what was going on with Sansar, and I had to step back from my previously comprehensive coverage of the Linden Lab-founded social VR platform, to gain some much-needed perspective and to be able to write about it dispassionately.

While the rumours of Sansar’s impending demise have been circulating for quite a long while now, over the past few months I have been hearing persistent gossip, from various well-placed sources, that Wookey-led Sansar is in serious trouble. I should rush to add that I have zero official confirmation of any of this, but every time I hear a new rumour, it seems to confirm what I have already heard from others. In other words, I am hearing the same thing from many different people.

Most recently, I’ve been told that the Wookey team is missing in action, both on the official Sansar Discord and in-world. I’ve heard that Sansar has lost big-name clients like Lost Horizon and Monstercat (although Sansar is still listed on the Lost Horizon Festival website). I’ve also heard that many people who used to be actively involved in Sansar have left for platforms as diverse as Helios, SapphireXR, and CORE (where I see many Sansar alumni chatting on their Discord servers).

My latest source tells me:

There hasn’t been a product meetup in months…they were all working like crazy on Splendour in the Grass…after that, crickets.

The marketplace for hosting live events has become extremely competitive, with social VR platforms competing with game companies like Fortnite and Minecraft to sign deals with artists and festivals, and to host concerts and other musical events. And if Sansar is struggling to do this during a pandemic, how will it fare when things return to (relative) normalcy, with a resurgence of live, in-person events? Can Sansar compete against better-funded companies to attract the kind of A-list talent which brings in audiences—and more to the point, can they get that audience to stick around and become content creators and community members after the music ends?

I am in a better position than most external observers to play armchair quarterback and try to pinpoint exactly where it all went so wrong, but I must confess that, like so many others (including numerous employees laid off in at least two rounds of wrenching, painful layoffs), I really thought that Sansar would succeed.

But the expensive bet placed by Linden Lab (and Philip Rosedale’s company, High Fidelity, which shut down a similar service in early 2020 and pivoted to a spatial audio product) was that there would be tens or even hundreds of thousands of people using high-end VR headsets like the Oculus Rift, HTC Vive, and Valve Index to access social VR platforms boasting beautiful high-end graphics. It didn’t seem like such a risky bet at the time, but looking back, perhaps it was.

Certainly, part of the problem is that these companies spent millions of dollars and many years building platforms, only to find that the VR hardware market was evolving so quickly that they couldn’t keep up. I mean, the Oculus Rift is no longer being sold by Facebook, which decided to put all its eggs into the standalone Quest, which is selling like hotcakes—and on which Sansar runs only if you tether your Quest to a high-end gaming PC with a cable.

What does it take for a platform to catch fire, like VRChat and Rec Room? Again, I don’t really know the answer (although social media, particularly YouTube and Twitch, certainly played a pivotal role in at least VRChat’s ultimate popularity and success).

At a time when the metaverse has again become a hot buzzword tossed around by many companies, both big and small, who knows what will happen to Sansar. But I must confess that I am very worried.

Nonverbal Communication in Social VR: Recent Academic Research

Gestures (like this peace sign) are an example of nonverbal communication (Photo by Dan Burton on Unsplash)

In the real world, much of our communication is non-verbal: facial expression, gaze, gestures, body movements, even spatial distance (proxemics).

While older, flat-screen virtual worlds such as Second Life are somewhat limited in the forms of nonverbal communication available (most people rely on text or voice chat), modern VR equipment and social VR platforms allow for more options:

  • Hand/finger movement: most VR headsets have hand controllers; the Valve Index has Knuckles hand controllers which allow you to move your fingers as well as your hands;
  • Body movement: the Vive pucks can be attached to your waist, hips, feet, and other parts of your body to track their movement in real time;
  • Eye movements/gaze: for example, the Vive Pro Eye VR headset can track the blinking and movement of the eyes;
  • Facial expression: add-ons such as the Vive Facial Tracker (which attaches to your VR headset) allow you to convey lower face and mouth movements on your avatar.

In addition, many social VR platforms also employ emoticons, which can be pulled up via a menu and displayed over the head of the avatar (e.g. the applause emoji in AltspaceVR), as well as full-body pre-recorded animations (e.g. doing a backflip in VRChat). The use of all these tools, in combination or alone, allows users in social VR to approach the level of non-verbal communication found in real life, provided they have the right equipment and are on a platform which supports that equipment (e.g. NeosVR, where you can combine all these into an avatar which faithfully mimics your facial and body movements).
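As a thought experiment, here is a minimal TypeScript sketch of how a platform might bundle all of these channels (head and hand poses, finger curl, body trackers, eye gaze, facial blendshapes, plus menu-triggered emotes and canned animations) into a single per-frame avatar update. The names and fields are purely illustrative and do not correspond to any real platform's API:

```typescript
// Hypothetical per-frame avatar update combining the nonverbal channels
// described above. All names and fields are illustrative only.

interface Pose {
  position: [number, number, number];           // metres, in platform-local space
  rotation: [number, number, number, number];   // quaternion (x, y, z, w)
}

interface AvatarNonverbalState {
  head: Pose;                               // from the headset
  hands: { left?: Pose; right?: Pose };     // from the hand controllers
  fingerCurl?: number[];                    // per-finger values, e.g. from Knuckles-style controllers
  bodyTrackers?: Record<string, Pose>;      // e.g. waist/feet pucks for full-body tracking
  eyeGaze?: { direction: [number, number, number]; blink: boolean }; // eye-tracking headsets
  faceBlendshapes?: Record<string, number>; // 0..1 weights from a facial tracker
  emote?: string;                           // menu-triggered overlay, e.g. "applause"
  animation?: string;                       // pre-recorded full-body clip, e.g. "backflip"
}

// Channels that are not tracked are simply omitted; a platform would fall back
// to idle or procedural animation for those parts of the avatar.
function buildUpdate(state: AvatarNonverbalState): string {
  return JSON.stringify(state); // in practice, sent to other clients every frame or network tick
}

// Example: a user with only a headset and one controller, triggering the applause emote.
const update = buildUpdate({
  head: { position: [0, 1.7, 0], rotation: [0, 0, 0, 1] },
  hands: { right: { position: [0.3, 1.4, 0.2], rotation: [0, 0, 0, 1] } },
  emote: "applause",
});
```

The point of the sketch is simply that the more of these optional channels a user's equipment can fill in, the closer their avatar gets to real-life nonverbal expressiveness.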

Two recently published research papers investigate nonverbal communication on social VR platforms, adding to the growing academic literature on social VR. (I am happy to see that social VR is starting to become a topic of academic research!)


Maloney, D., Freeman, G., & Wohn, D. Y. (2020). “Talking without a Voice”: Understanding Non-Verbal Communication in Social Virtual Reality. Proceedings of the ACM on Human-Computer Interaction, 4(CSCW2). https://doi.org/10.1145/3415246

Unfortunately, there is no open-access version of this conference proceeding available; you’ll have to obtain a copy from your local academic or public library. This paper, by Divine Maloney and Guo Freeman of Clemson University and Donghee Yvette Wohn of the New Jersey Institute of Technology, consists of two parts:

  • conducting unobtrusive observations of 61 public events held in AltspaceVR over the span of four weeks, to see what non-verbal interactions were being used naturally on the platform; and
  • interviewing 30 users of social VR platforms (of which I was one!), where the paper’s authors read through the transcribed interview data to acquire a picture of how social VR users used, perceived, and experienced non-verbal communication for further analysis.

In the first study of the two, the authors noted the following different kinds of nonverbal communication:

  • the use of movement to indicate that someone was paying attention. These included nodding behaviors and moving the body or head toward the person or object that was the subject of attention;
  • the use of applause to indicate approval;
  • pointing and patting one’s own chest as a form of directing attention either at a remote object/person or oneself;
  • behaviours such as waving, dancing, and kissing, which were mostly used in social grooming contexts (dancing was also used as entertainment);
  • and finally, the behaviour of trolls: interpersonal provocation and social disruptions.

The thirty interviews conducted were then analyzed as follows to answer two research questions:

Using quotes from users’ own accounts, in this section we present our findings as two parts. First, to answer RQ2 (How do people perceive and understand non-verbal communication in social VR?), we identified three common themes that demonstrated how users perceive and understand non-verbal communication in social VR: as more immersive and embodied interactions for body language; as a similar form of communication to offline face-to-face interaction in terms of spatial behavior, hand behavior, and facial expressions; and as a natural way to initiate communication with online strangers.

Second, to answer RQ3 (How, if at all, does non-verbal communication affect interaction outcomes in social VR?), we described the social consequences of interacting through non-verbal communication in social VR for various user groups, including marginalized users such as cis women, trans women, and disabled users. We specially highlighted how non-verbal communication in social VR afforded privacy and social comfort as well as acted as a protection for marginalized users.

Unsurprisingly, the researchers discovered that most participants considered non-verbal communication to be a positive aspect of their social VR experience. Those surveyed highly praised body tracking (either just the hands and head, or in some cases the whole body), as it allowed for a more immersive and embodied form of non-verbal communication than is possible in traditional, flatscreen virtual worlds.

In addition to supporting more immersive and embodied interactions for body language, participants also considered non-verbal communication in social VR similar to offline face-to-face interaction in terms of spatial behavior, hand behavior, and facial expressions. This familiarity and naturalness greatly contributed to their generally positive perceptions.

Participants also viewed non-verbal communication in social VR as positive and effective because it became a less invasive way to start interactions with online strangers (e.g. waving hello at someone you’ve just met). Nonverbal communication also afforded some users a sense of privacy and social comfort, and in some cases became an effective protection for them to avoid unwanted interactions, attention, and behaviors (especially for LGBTQ people and women).

The paper made three design recommendations for improved nonverbal communication in social VR platforms: providing support for facial tracking (which is already on its way with products like the Vive Facial Tracker); supporting more accurate hand and finger tracking (again, already underway with the Knuckles controllers for the Valve Index); and enabling alternative modes of control, especially for users with physical disabilities. While most of the study participants highly praised full body tracking in social VR, disabled users in fact complained about this feature and demanded alternatives.

The conference paper concludes:

Recently, commercial social VR applications have emerged as increasingly popular digital social spaces that afford more naturally embodied interaction. How do these novel systems shape the role of non-verbal communication in our online social lives? Our investigation has yielded three key findings. First, offline non-verbal communication modalities are being used in social VR and can simulate experiences that are similar to offline face-to-face interactions. Second, non-verbal communication in social VR is perceived overall positive. Third, non-verbal interactions affect social interaction consequences in social VR by providing privacy control, social comfort, and protection for marginalized users.


Tanenbaum, T. J., Hartoonian, N., & Bryan, J. (2020). “How do I make this thing smile?”: An Inventory of Expressive Nonverbal Communication in Commercial Social Virtual Reality Platforms. Conference on Human Factors in Computing Systems – Proceedings, 1–13. https://doi.org/10.1145/3313831.3376606

This paper is available free to all via Open Access. In this conference proceeding, Theresa Jean Tanenbaum, Nazely Hartoonian, and Jeffrey Bryan of the Transformative Play Lab at the Department of Informatics at the University of California, Irvine, did a study of ten social VR platforms:

  • VRChat
  • AltspaceVR
  • High Fidelity (which shut down in January of 2020)
  • Sansar
  • TheWave VR (this social VR platform shut down in early 2021)
  • vTime XR
  • Rec Room
  • Facebook Spaces (since shut down and replaced by Facebook Horizon)
  • Anyland
  • EmbodyMe

For each platform, investigators answered the following eight questions:

  1. Can the user control facial expressions, and if so, how? (Pre-baked emotes, puppeteering, etc.)
  2. Can the user control body language, and if so, how? (Pre-baked emotes, puppeteering, postures, etc.)
  3. Can the user control proxemic spacing (avatar position), and if so, how? (Teleport, hotspots, real world positioning, etc.) How is collision handled between avatars? (Do they overlap, push each other, etc.)
  4. How is voice communication handled? Is audio spatialized, do lips move, is there a speaker indicator, etc.
  5. How is eye fixation/gaze handled? (Do avatars lock and maintain gaze, is targeting gaze automatic, or intentional, or some sort of hybrid, do eyes blink, saccade, etc.)
  6. Are different emotions/moods/affects supported, and how are they implemented? (Are different affective states possible, and do they combine with other nonverbal communications, etc.)
  7. Can avatars interact physically, and if so, how? (Hugging, holding hands, dancing, etc.) What degree of negotiation/consent is needed for multi-avatar interactions? (One-party, two-party, none at all?)
  8. Are there any other kinds of nonverbal communication possible in the system that have not been described in the answers to the above questions?
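To make the shape of the resulting inventory a little more concrete, here is one possible way to encode a platform's answers to these eight questions as a structured record. This is my own illustrative TypeScript sketch, not the paper's coding scheme, and the example values are made up:

```typescript
// Illustrative encoding of one row of an NVC inventory; field names and
// value sets are my own shorthand for the eight questions above.

interface NvcInventoryRow {
  platform: string;
  facialExpression: "none" | "pre-baked" | "puppeteered";              // Q1
  bodyLanguage: "none" | "pre-baked" | "puppeteered" | "postures";     // Q2
  proxemics: Array<"teleport" | "hotspots" | "real-world-positioning">; // Q3
  avatarCollision: "overlap" | "push" | "blocked";                     // Q3 (collision handling)
  voice: { spatialized: boolean; lipSync: boolean; speakerIndicator: boolean }; // Q4
  gaze: "none" | "automatic" | "intentional" | "hybrid";               // Q5
  affectStates: boolean;                                               // Q6: distinct moods supported?
  physicalInteraction: { supported: boolean; consent: "none" | "one-party" | "two-party" }; // Q7
  other?: string;                                                      // Q8: anything not covered above
}

// Example entry (values invented purely for illustration):
const exampleRow: NvcInventoryRow = {
  platform: "ExamplePlatform",
  facialExpression: "pre-baked",
  bodyLanguage: "puppeteered",
  proxemics: ["teleport", "real-world-positioning"],
  avatarCollision: "overlap",
  voice: { spatialized: true, lipSync: true, speakerIndicator: false },
  gaze: "automatic",
  affectStates: true,
  physicalInteraction: { supported: true, consent: "none" },
  other: "menu-triggered emotes displayed above the avatar",
};
```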

The results were a rather complete inventory of nonverbal communication in social VR, with the goal of cataloguing common design elements for avatar expression and identifying gaps and opportunities for future design innovation. Here is the table from the paper (which can be viewed in full size at the top of page 6 of the document).

An inventory of non-verbal communication in ten social VR platforms (source)

VR development is proliferating rapidly, but very few interaction design strategies have become standardized…

We view this inventory as a first step towards establishing a more comprehensive guide to the commercial design space of NVC [non-verbal communication] in VR. As a design tool this has two immediate implications for designers. First, it provides a menu of common (and less common) design strategies, and their variations, from which designers may choose when determining how to approach supporting any given kind of NVC within their platform. Second, it calls attention to a set of important social signals and NVC elements that designers must take into consideration when designing for Social VR. By grounding this data in the most commonly used commercial systems, our framework can help designers anticipate the likelihood that a potential user will be acquainted with a given interaction schema, so that they may provide appropriate guidance and support.

Our dataset also highlights some surprising gaps within the current feature space for expressive NVC. While much social signaling relies upon control of facial expression, we found that the designed affordances for this aspect of NVC to be mired in interaction paradigms inherited from virtual worlds. Facial expression control is often hidden within multiple layers of menus (as in the case of vTime), cannot be isolated from more complex emotes (as in the case of VR Chat), hidden behind opaque controller movement (as in Facebook Spaces), or unsupported entirely. In particular, we found that with the exception of dynamic lip-sync, there were no systems with a design that would allow a user to directly control the face of their avatar through a range of emotions while simultaneously engaging in other forms of socialization.

The authors go on to say that they observed no capacity in any of the systems to recombine and blend the various forms of nonverbal communication, as can be done in the real world:

As we saw in our consideration of the foundations of NVC in general, and Laban Movement Analysis in particular, much NVC operates by layering together multiple social signals that modify, contextualize, and reinforce other social signals. Consider, for instance, that it is possible to smile regretfully, laugh maliciously, and weep with joy. People are capable of using their posture to indicate excitement, hesitation, protectiveness, and many other emotional states, all while performing more overt discourse acts that inherit meaning from the gestalt of the communicative context.

The conference paper concludes:

As is evident in the scholarly work around social VR, improving the design space for NVC in VR has the potential to facilitate deeper social connection between people in virtual reality. We also argue that certain kinds of participatory entertainment such as virtual performance will benefit greatly from a more robust interaction design space for emotional expression through digital avatars. We’ve identified both common and obscure design strategies for NVC in VR, including design conventions for movement and proxemic spacing, facial control, gesture and posture, and several strategies unique to avatar mediated socialization online. Drawing on previous literature around NVC in virtual worlds, we have identified some significant challenges and opportunities for designers and scholars concerned with the future of socialization in virtual environments. Specifically, we identify facial expression control, and unconscious body posture as two critical social signals that are currently poorly supported within today’s commercial social VR platforms.

It is interesting to note that both papers cite the need to properly convey facial expressions as key to expanding the ability of avatars in social VR to convey non-verbal communication!

Breakroom Implements High Fidelity’s Three-Dimensional Audio

Photo by Jason Rosewell on Unsplash

Sinewave Entertainment’s Breakroom (the corporate cousin of their social VR/virtual world platform Sinespace) has recently implemented the spatialized, three-dimensional audio API offered by the revamped High Fidelity.

VentureBeat reports:

The deal is a convergence of pioneers who have made their mark on the development of virtual life. Philip Rosedale is the CEO of High Fidelity and launched Second Life in 2003. Sine Wave Entertainment, the creator of Breakroom, got its start as a content brand in Second Life before it spun out to create its own virtual meeting spaces for real-world events.

Adam Frisby, chief product officer and cofounder of Sine Wave, said in an interview conducted inside Breakroom that the High Fidelity spatial audio will help Breakroom create a triple-A quality experience in a virtual world.

“The real benefit of having 3D audio in a virtual world like this is you can have lots of conversations going on simultaneously,” Frisby said. “3D audio is the only way to replicate the real-world experience in an online environment. You can have a 150-person conference and end up with 10 groups of people talking at the same time. That has helped us with engagement.”

Breakroom is among the first group of clients for Philip Rosedale’s company. Adam tells me that they are looking at implementing the same 3D audio in Sinespace at some point in the future.
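For readers curious about why spatialized audio makes simultaneous group conversations workable, here is a rough TypeScript sketch of the underlying idea: each remote speaker's volume falls off with distance from the listener, and left/right panning follows the speaker's direction. This is generic spatial-audio math for illustration only; it is not High Fidelity's actual API, whose internals I have not seen:

```typescript
// Minimal sketch of distance-based attenuation and stereo panning, the two
// ingredients that let several conversations coexist in one virtual room.

type Vec3 = [number, number, number];

function distance(a: Vec3, b: Vec3): number {
  return Math.hypot(a[0] - b[0], a[1] - b[1], a[2] - b[2]);
}

// Inverse-distance rolloff, clamped so voices within 1 m play at full volume.
function gainFor(listener: Vec3, speaker: Vec3): number {
  return 1 / Math.max(1, distance(listener, speaker));
}

// Simple stereo pan from the horizontal angle to the speaker
// (-1 = hard left, +1 = hard right), relative to the listener's facing direction.
function panFor(listener: Vec3, listenerYaw: number, speaker: Vec3): number {
  const angle = Math.atan2(speaker[0] - listener[0], speaker[2] - listener[2]) - listenerYaw;
  return Math.sin(angle);
}

// A speaker 2 m away in your own huddle is heard at roughly half volume,
// while someone 20 m across the room fades to a background murmur.
console.log(gainFor([0, 0, 0], [2, 0, 0]));                    // 0.5
console.log(gainFor([0, 0, 0], [20, 0, 0]));                   // 0.05
console.log(panFor([0, 0, 0], 0, [2, 0, 2]).toFixed(2));       // ≈ 0.71, ahead and to the right
```

With that kind of rolloff and panning, the handful of people standing next to you dominate what you hear, while the other groups in Frisby’s 150-person conference recede into the background, which is exactly the effect he describes.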

Here’s a two-minute YouTube video where Adam Frisby explains and demonstrates the new 3D audio:


This blogpost is sponsored by Sinespace, and was written in my role as an embedded reporter for this virtual world (more details here).