EDITORIAL: 24 Hours in Second Life—A Wedding, a Light Show, and Some Choreography! (What the Newer Social VR Platforms Can Learn from SL)

As I have often said before, Second Life is the perfect example of a mature, fully-evolved metaverse, which the newer social VR platforms would be wise to study and learn lessons from. Just because it doesn’t support virtual reality does not mean that you can’t learn something from its 18-year history.

One of the ways in which I keep my finger on the pulse of Second Life is to head to YouTube, do a search for “second life”, then sort the results in reverse chronological order by the time they were uploaded (i.e. most recent videos first). Usually, I scroll back 24 to 48 hours to see what the latest SL videos are, and I always find something to delight and surprise me.

So I did this yesterday evening, and today I wanted to share with you three videos which perfectly illustrate all the wild and wonderful ways in which people are using Second Life. Platforms need to attract content creators, and in this SL has succeeded beyond anyone’s wildest expectations! See also this blogpost.

First, the newer social VR platforms need to ask themselves: Can you host a wedding? (In VRChat at least, the answer is “yes”.) Second Life weddings are big business, and you might be surprised to learn that there are dozens of wedding venues, many stores selling wedding dresses and bridesmaid outfits—and even businesses which specialize in making professionally-edited wedding videos!

Another question the newer social VR platforms need to ask themselves: Do you support particle effects? You can see Second Life's particle effects at work in the fireworks in the previous wedding video, and here is another example of the creativity which can be unleashed with the proper particle and light systems!

Finally, I ask the social VR platforms: Can you dance? Second Life boasts what may well be the single biggest selection of avatar animations on any platform (a quick search of the SL Marketplace pulls up tens of thousands of dances), which lets you unleash your inner musical director and create stage shows like the following:

So don’t be so quick to dismiss Second Life as antiquated and outdated! There’s life in the old girl yet! 😉 The creators of the newer social VR platforms might just want to spend a bit of time investigating, and get some ideas for new features or as-yet-unexplored new market niches in the metaverse!

Who knows? You just might yet entice a future wedding videographer, lightshow mastermind, or musical choreographer to your new platform!

VRChat The Movie: Coming Soon!

Machinima (/məˈʃiːnɪmə, -ˈʃɪn-/) is the use of real-time computer graphics engines to create a cinematic production…Machinima-based artists, sometimes called machinimists or machinimators, are often fan laborers, by virtue of their re-use of copyrighted materials…

Machinima (Wikipedia article)

Well, you knew this was going to happen sooner or later: a full-length feature movie shot entirely in VRChat, complete with in-jokes that only diehard VRChatters would get! (Actually, I’m sure it’s not the first movie shot in VRChat, but I’m pretty certain it’s the most ambitious machinima project in VRChat to date.)

Here’s the three-minute teaser trailer for VRChat The Movie:

The cast and crew behind this cinematic masterpiece participated in a live discussion panel held at the recently concluded VRCon 2021 (for some strange reason, the actual content starts at the 26:45 mark in this livestream of the event):

The project even has a page on the Internet Movie Database (IMDb)!

Attend the VRCon Film Festival in VRChat on September 12th, 2021

One event which somehow escaped my notice is VRCon, but I did see a message that they were holding a film festival on the final day, Sunday, September 12th, 2021, starting at 11:00 a.m. EST. Various short films, machinima, and music videos shot in VRChat will be shown.

Here’s the lineup (you can click on the following image to see it at a more readable size on Flickr):

VRCon Film Festival

See you there! You can obtain more information about the VRCon festival from their official website, including a complete schedule of events.

UPDATED! Academic Research in Social VR: Crowdsourcing Virtual Reality Experiments Using VRChat at Northeastern University

I am still working away on my presentation on the various uses of social VR in higher education, which I am to deliver on Sept. 8th, 2021 to my university’s senate committee on academic computing. Over the summer I have highlighted a number of interesting and innovative projects at various universities and colleges (you can find a number of them here, all tagged with the tag “Higher Education”). And I am especially heartened to see more and more published academic research on virtual reality, triggered by the increasing uptake of consumer-market VR headsets!

Conducting experiments in VR can be difficult, involving the purchase and setup of sometimes-expensive hardware (particularly if multiple headsets need to be bought). University budgets can only go so far, even at the best of times. One way to get around this is to use existing commercial social VR platforms, recruiting their users (who, of course, already own their own equipment) as volunteers.

This is a form of what is called crowdsourcing: dividing up a task among a larger group of volunteers. In this case, researchers at Northeastern University in Boston, Massachusetts ran a small demonstration experiment to show that recruiting study volunteers via VRChat is feasible, publishing a paper at a computer science conference held last year. The following research paper is unfortunately not free to access and read, but you can always use your friendly local public or academic library to obtain a copy of it! Here’s the citation:


Saffo, D., Yildirim, C., Di Bartolomeo, S., & Dunne, C. (2020). Crowdsourcing virtual reality experiments using VRChat. Conference on Human Factors in Computing Systems – Proceedings, 1–8. https://doi.org/10.1145/3334480.3382829


According to the conference paper’s abstract:

Research involving Virtual Reality (VR) headsets is becoming more and more popular. However, scaling VR experiments is challenging as researchers are often limited to using one or a small number of headsets for in-lab studies. One general way to scale experiments is through crowdsourcing so as to have access to a large pool of diverse participants with relatively little expense of time and money. Unfortunately, there is no easy way to crowdsource VR experiments. We demonstrate that it is possible to implement and run crowdsourced VR experiments using a preexisting massively multiplayer online VR social platform—VRChat. Our small (n = 10) demonstration experiment required participants to navigate a maze in VR. Participants searched for two targets then returned to the exit while we captured completion time and position over time. While there are some limitations with using VRChat, overall we have demonstrated a promising approach for running crowdsourced VR experiments.
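The abstract mentions capturing each participant's completion time and position over time. As a rough illustration of what can be derived from that kind of log, here is a minimal Python sketch (entirely my own, not from the paper; the timestamps and coordinates are invented) computing completion time and total path length from position samples:

```python
import math

# Hypothetical log for one participant: (timestamp_seconds, x, y, z) samples.
# The real study captured comparable data inside VRChat; these numbers are made up.
samples = [
    (0.0, 0.0, 0.0, 0.0),
    (1.0, 1.0, 0.0, 0.0),
    (2.0, 1.0, 0.0, 2.0),
    (3.5, 4.0, 0.0, 2.0),
]

def completion_time(samples):
    """Elapsed time from the first to the last logged sample."""
    return samples[-1][0] - samples[0][0]

def path_length(samples):
    """Total distance traveled, summing straight-line segments between samples."""
    total = 0.0
    for (_, x1, y1, z1), (_, x2, y2, z2) in zip(samples, samples[1:]):
        total += math.dist((x1, y1, z1), (x2, y2, z2))
    return total

print(completion_time(samples))  # 3.5
print(path_length(samples))      # 6.0 (segments of length 1 + 2 + 3)
```

Summing straight-line segments slightly underestimates a curved walking path, but with frequent enough sampling it is a reasonable approximation.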

One of many delightful images illustrating this research paper!

One of the features which attracted the researchers to VRChat was the ability to build custom virtual worlds or rooms:

VRChat also has a special feature that sparked our interest: it allows users to upload custom rooms built with Unity by using a proprietary VRChat SDK. The SDK contains special triggers and event handlers that can be triggered by users, in addition to giving the possibility to upload rooms made of and containing any kind of 3D models made by a creator. We started asking ourselves if we could leverage the vast amount of VRChat users who already own VR equipment and use them as experiment participants by building a custom room that contained the implementation of our experiment, in order to run crowdsourced experiments in VRChat.

And so they built a maze and ran a simple experiment:

The participants in the experiment were asked to run through a VR maze, find two targets inside the maze, and go back to the exit. The experiment was run using two point of views, immersive and non-immersive, and compared the timing between a group of self-declared gamers and non-gamers. Our reasoning for choosing this experiment over others was that it was simple enough to avoid having too many variables influencing the results, and it would give us a quick way to evaluate the process of conducting a user study on the platform.
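To make the comparison in that design concrete, here is a small Python sketch (my own illustration; the timing values are invented, not taken from the paper) of how mean completion times might be compared across the self-declared gamer and non-gamer groups under the two viewpoints:

```python
from statistics import mean

# Hypothetical completion times in seconds, keyed by (group, viewpoint).
# The real study's measurements are in the paper; these values are made up.
times = {
    ("gamer", "immersive"):         [42.0, 38.5, 45.0],
    ("gamer", "non-immersive"):     [50.0, 47.5],
    ("non-gamer", "immersive"):     [61.0, 66.5, 58.5],
    ("non-gamer", "non-immersive"): [70.0, 74.0],
}

def mean_by_condition(times):
    """Average completion time for each (group, viewpoint) condition."""
    return {condition: mean(values) for condition, values in times.items()}

for condition, avg in mean_by_condition(times).items():
    print(condition, round(avg, 2))
```

A real analysis would of course use a significance test rather than eyeballing the means, but the bookkeeping is the same: one list of timings per experimental condition.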

A researcher would then visit a public world in VRChat, asking the users present whether they would be willing to run the maze.

After joining a public world, we began by looking for users using HMDs. We did this by asking users directly if they were using VR, or by observing their in-game movements as VR users have full head and sometimes hand tracking. We found that most users we approached were willing and eager to participate. After users had joined our world, they would spawn in a waiting room where we could give them further instructions. At this stage researchers conducting a user study may also present digital consent forms for participants to read and sign.

The researchers noted that, at the time of the proof-of-concept experiment, they were somewhat limited by the relatively narrow scope of what they could build using the then-available version of the VRChat SDK (software development kit). However, they noted that the next-generation graphical SDK (called Udon) offered the ability to build more complex interactive worlds, thereby expanding the possible uses for VR experiments.

The researchers also noted the relative ease and cost effectiveness with which VRChat could be used for academic research into the growing field of social or collaborative virtual reality:

It is particularly exciting to note that VRChat can also be used to implement collaborative VR studies. Previously, such studies would require custom multiplayer platform development. VRChat not only provides an SDK to create worlds but also all the network capabilities to have several concurrent users all in the same virtual space.

UPDATE 2:02 p.m.: I’ve just discovered a recent five-minute YouTube video featuring the Northeastern researchers, explaining the concept of using existing social VR platforms for their experiments:

This video mentions and summarizes a second, follow-up research paper, which I have not yet read (again, you will have to pay to access this conference paper; you should be able to obtain a copy via your local public or academic library). Here’s the citation for you:


Saffo, D., Di Bartolomeo, S., Yildirim, C., & Dunne, C. (2021). Remote and collaborative virtual reality experiments via social VR platforms. Conference on Human Factors in Computing Systems – Proceedings. https://doi.org/10.1145/3411764.3445426


I’m quite eager to read this second research paper! According to the description of the YouTube video, a preprint of this conference paper and all supplemental materials are available at the following URL: osf.io/c2amz (so you might not need to pay for a copy via interlibrary loan/document delivery from your local library, after all).

Of course, it’s not just VRChat that could be repurposed as an academic testbed. Any number of commercially available social VR platforms can be used as cost-effective platforms to conduct VR experiments! The researchers at Northeastern University are to be commended for their proof-of-concept work, and I very much look forward to seeing other uses of social VR platforms in various areas of academic virtual reality research.