How You Can Prepare for the Upcoming Switch to Avatar 2.0 in Sansar

As you probably already know, Sansar is upgrading its default, dressable human avatars to a new version soon. This update, called Avatar 2.0, means that existing human avatars and any items rigged for them will be discontinued. Earlier I wrote:

Linden Lab is working on the next version of the default human avatar in Sansar, dubbed Avatar 2.0, which should arrive sometime in August or September 2019. Unfortunately, first-edition avatars will be discontinued, and any items rigged for them (such as clothing, shoes, and hair) will break and not be useable by the next-generation avatars. However, clothing made using Marvelous Designer should still work with Avatar 2.0 avatars.

Here’s a FAQ by Linden Lab explaining all the upcoming changes in more detail, as well as an official blogpost. Inara Pey has also blogged extensively about Avatar 2.0 here, here, here, and here.

Today, Linden Lab sent out more information about this upcoming major change to Sansar:

We’re just a month away from the official launch of Avatar 2.0, and we want to make sure creators like you have ample time to prepare. That’s why we’re releasing our Avatar 2.0 skeleton early – to give you all of August to optimize your content… Starting today, you’ll also be able to submit any avatar items you create against the new skeleton to the Sansar team for an official review. These include: full avatars, rigged clothing, hair, MD clothing, emotes, and accessories.

You can download the new avatar reference files here. And here are step-by-step instructions on how to submit your created items to Linden Lab for review.

And, as a sneak peek of what we can expect with Avatar 2.0, here is a YouTube video (taken from a weekly Sansar Product Meetup livestream) that shows how you can adjust the facial features on the new human avatars:

This reminds me strongly of how you adjust your face in The Sims 4. I’m really looking forward to seeing this roll out!

Coming Soon: Explore the Vatican’s Sistine Chapel in Virtual Reality

This is a blog devoted to social VR, virtual worlds, and the metaverse, so I usually don’t cover non-social VR experiences (i.e. those you visit alone). But I’m going to make an exception for an experience showcased at the SIGGRAPH 2019 conference in Los Angeles.

VentureBeat reports:

There’s no shortage of sophisticated mixed reality hardware at Siggraph, but I was most impressed by a piece of software that really demonstrated VR’s educational and experiential potential. Christopher Evans, Paul Huston, Wes Bunn, and Elijah Dixson exhibited Il Divino: Michelangelo’s Sistine Ceiling in VR, an app that recreates the world-famous Sistine Chapel within the Unreal Engine, then lets you experience all of its artwork in ways that are impossible for tourists at the real site.

Il Divino: Michelangelo’s Sistine Ceiling in VR is described by its creators as follows:

The demo was created exclusively for the SIGGRAPH 2019 Immersive Pavilion, by the team behind the previous SIGGRAPH 2017 VR piece: Il Gigante: Michelangelo’s David in VR. Debuting at SIGGRAPH on Valve’s INDEX headset, Il Divino delivers an experience of the highest fidelity – you can see individual cracks and brush strokes in the plaster!

Attendees can step onto Michelangelo’s own scaffold to learn about how he painted the ceiling, or enter a Vatican conservator’s mobile aerial platform to see the ceiling up close, and learn about the controversial cleaning. In all, there are over 100 clickable elements connected to an hour of commentary talking about Michelangelo’s monumental work.

Later this year, it will be released to all as a freely downloadable experience, and it will continue to be added to and improved in the future.

This reminds me of a virtual recreation of the Sistine Chapel in Second Life, which I visited sometime in 2007 or 2008 (unfortunately, it is no longer available to visit):

My angel avatar visiting the Sistine Chapel in Second Life (circa 2007)

If you are curious about how this VR experience was constructed, here is more information on the techniques used. There are even some suggestions as to how you can help improve the project.

Here’s a video of the experience:

And here’s an interview with one of the creators, Chris Evans.

I Had a Dream…

Photo by Randy Tarampi on Unsplash

You know that something has really become an integral part of your life when you start dreaming about it! Well, I am on holidays from work this week and next, so I turned off my alarm clock and slept in late this morning, and I just woke up from the most vivid and detailed dream, where I was at a social VR conference!

Here’s my dream, in as much detail as I can remember, first thing in the morning while many of the details are still fresh in my mind:


I wound up at this conference, held at the University of Western Ontario, quite by accident. I am an alum of UWO (which is true in real life), and I was there visiting my old campus residence when I stumbled across this conference, which was being held on campus. Attending were venture capitalists, designers, and representatives of various companies who were building and selling social VR platforms.

A cameraman had captured a video of me trying out a VR experience, wearing a VR headset and holding a rifle-like gun, and I got to see the video. I was having an ecstatic first-time user experience, and he captured it for all to see! I was so happy that I asked for a copy of the video, and he sold it to me for $60. (I even remember hunting around in my wallet for the right amount of cash!)

I was talking with vendors and having them give me business cards so I would remember their details and could blog about them later. In one case, someone whom I recognized as a fellow computer science student from my time at the University of Manitoba (which is a true detail from my real life) asked me for my phone number so she could call me, and I tried to tell her first my work number, and then my cell number—and each time, I discovered that I had forgotten the last few digits!

One of the events at the conference was Strawberry Singh’s wedding—to a woman named Raspberry! In real life, Strawberry looked exactly like her avatar! (I had absolutely no difficulty recognizing her in real life.) At one of the events afterwards at this conference, I went up to Strawberry and Raspberry to congratulate them—only to discover that I had laryngitis and I couldn’t speak above a whisper!

One of the VR companies showed a promotional video about their new social VR platform, which included pictures taken in real life at various locations of people trying out their product. I immediately recognized some of the locations as pictures taken at the high school I had attended, Transcona Collegiate (which is true in real life), and in the video I also saw a photo mosaic of people trying out their product, which included a picture of me!

Just before I woke up, I was at the lunch for the conference attendees. I had to hunt around a little bit to find it in the building, and in one of the rooms I looked into before I found it, the walls had large displays of various bloggers’ blogposts about social VR—including posts from this blog! I remember standing there looking at this and thinking: Holy shit, I’ve made it. People are talking about me at a conference.

At the lunch, I asked the moderator for permission to come to the podium to speak to all the conference attendees. She said yes, and I went up to the podium, but the microphone was having technical problems and I seemed to have some sort of laryngitis again, both of which were interfering with my attempt to tell everyone how wonderful this whole experience was and to thank them for letting me be there, even though I hadn’t registered for the conference…

And then I woke up.

Image by Stefan Keller from Pixabay

This has got to be one of the more vivid and detailed dreams I have ever had! Several times throughout this dream, I actually thought to myself, “This is a dream, this can’t be real”, only to “wake up” within the dream, while still dreaming, and realize that it wasn’t a dream—it was really happening to me! Crazy. I wonder what my dream means, if anything.

I love how the dream sprinkled in various details that corresponded to my real life, including my old high school. And I do find it interesting that at least three times in my dream, I had difficulty communicating with other people at the conference. I’m not sure if my subconscious is trying to tell me something significant, or not!

P.S. I forgot to mention that in my dream, in one room at the conference, there were a bunch of well-known drag queens from my favourite reality TV show, RuPaul’s Drag Race! I recognized Shangela there. The drag queens tried to warn me about someone who was at the conference, from whom I had obtained detailed information about their soon-to-be-released social VR platform, which was still being kept secret from the public. They warned me that he was not a good person and that I should be careful!

Facebook Demos Highly Realistic Avatar Facial Animation

My Twitter feed keeps delivering news nuggets this week! This is an update to a blogpost I had written earlier this year on this technology.

Facebook Reality Labs has published a research article in the journal ACM Transactions on Graphics, which shows cutting-edge avatar facial animation using multiple cameras attached to a VR headset, and a new multiview image processing technique. (The full paper is free to download from the link above.) The researchers also gave a presentation at the SIGGRAPH 2019 computer graphics conference in Los Angeles.

The results are impressive, giving an avatar human-driven, lifelike animations not only of the lower face but also the upper face, which of course is covered by the VR headset:

This is light years ahead of current avatar facial animation technology, such as the avatar facial driver in Sinespace, which operates using your webcam. Imagine being able to conduct a conversation in VR where you can convey the full gamut of facial expressions while you are talking! This is a potential gamechanger for sure. It’s not clear when we can expect to see this technology actually applied to Oculus VR hardware, however. It might still be many years away. But it is exciting!
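
For readers who want a rough sense of the general shape of such a system, here is a short, purely illustrative sketch in Python (using PyTorch): several headset-mounted cameras each capture a partial, oblique view of the wearer’s face, a small encoder processes each view, and the fused features are regressed to a set of facial animation parameters (blendshape weights, in this toy example) that drive the avatar every frame. The camera count, parameter count, and network layers below are my own assumptions for illustration only, not the architecture described in the Facebook Reality Labs paper.

# A minimal, hypothetical sketch of headset-camera-driven facial animation.
# Not Facebook's actual method; just the general idea of multiview fusion.
import torch
import torch.nn as nn

NUM_CAMERAS = 3          # assumption: e.g. two eye cameras plus one mouth camera
NUM_BLENDSHAPES = 52     # assumption: a typical facial-rig parameter count

class PerViewEncoder(nn.Module):
    """Encodes one camera's grayscale crop into a small feature vector."""
    def __init__(self, feature_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feature_dim),
        )

    def forward(self, x):
        return self.net(x)

class MultiviewExpressionRegressor(nn.Module):
    """Fuses all camera views and regresses facial expression parameters."""
    def __init__(self, feature_dim: int = 128):
        super().__init__()
        self.encoders = nn.ModuleList(
            [PerViewEncoder(feature_dim) for _ in range(NUM_CAMERAS)]
        )
        self.fusion = nn.Sequential(
            nn.Linear(feature_dim * NUM_CAMERAS, 256), nn.ReLU(),
            nn.Linear(256, NUM_BLENDSHAPES), nn.Sigmoid(),  # weights in [0, 1]
        )

    def forward(self, views):
        # views: one tensor of shape (batch, 1, H, W) per headset camera
        features = [enc(v) for enc, v in zip(self.encoders, views)]
        return self.fusion(torch.cat(features, dim=1))

if __name__ == "__main__":
    model = MultiviewExpressionRegressor()
    # One fake 64x64 grayscale frame per headset camera (batch size 1).
    frames = [torch.rand(1, 1, 64, 64) for _ in range(NUM_CAMERAS)]
    weights = model(frames)
    print(weights.shape)  # torch.Size([1, 52])

Obviously the real research system is far more sophisticated, but the basic framing of the problem is the same: partial camera views of the face go in, and a full set of facial animation parameters comes out to drive the avatar.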