Today at 11:00 a.m. CST, Philip Rosedale (the founder and former CEO of Linden Lab, the makers of Second Life, and the current CEO of High Fidelity) hosted a discussion titled Second Life Stories, and Designing the Metaverse, where people had an opportunity to ask him questions. Dr. Fran Babcock and Dr. Hayman Buwaneswaran Buwan from the MetaWhat? The Metaverse Show were key organizers. Philip is always an engaged, articulate, and informed speaker, and if you missed this event, I will update this blogpost with links to an archived version which you can listen to via Twitter Spaces, Clubhouse, and Callin. UPDATE 7:14 p.m.: Links are at the end of this blogpost.
Philip was on Twitter Spaces, with well over 100 listeners in the room, but the conversation was also extended to the social audio apps Clubhouse and Callin, plus there was a virtual auditorium set up in Second Life, with almost 50 avatars present! Participants in all four spaces could both hear and ask questions. To my knowledge, this is the first time something like this set-up had been attempted.
Philip shared a couple of “first stories” from his experience with Second Life, real stories from the early years of the company, both pre- and post-launch in 2003, e.g. Steller Sunshine’s beanstalk. He talked about the challenge of maintaining backwards compatibility, and how it shaped the design of SL over time (for example, changing the friction parameters would affect how people could climb the beanstalk). He also talked about how he was able to drop a virtual pebble into the virtual water to create ripples (something which was later taken out because it was so computationally expensive!).
When asked why Second Life did not create mobile apps, Philip said that SL, when launched in 2003, predated mobile devices like the iPhone (introduced in 2007) and apps like Facebook (launched in 2004). While Philip is an advisor to Linden Lab, he is not a member of the executive team running the company day-to-day. He said that running SL on a mobile app is a “hard problem” to solve (I agree).
I asked Philip about his opinions regarding Meta’s surveillance system to enforce good behaviour, which includes constantly recording what happens in Horizon Worlds in case someone wants to send an abuse report to the moderators to act upon. Philip talked about his misgivings about AI-based surveillance and targeting systems in the metaverse, and how they could be used to gather information about us in new and disturbing ways, such as using how we are feeling to decide what ads to show us.
Philip has grave concerns about a metaverse business model designed around advertising and surveillance. On the topic of moderation, Philip wants the metaverse to be governed largely by the actions of the (human) people who are there, rather than by an automated behavioural surveillance and reporting system.
In answering a follow-up question, Philip said he felt that it is indeed possible to have a metaverse with consequences for trolls and griefers, while still building strong social connections between people. As an example, he cited banning a misbehaving person from a public place such as a restaurant.
Philip mentioned, in an interview he gave to a media outlet earlier today, that Second Life still has a higher revenue per person per year than YouTube does, with most of that income coming from fees: fees on sales and fees for virtual land (tier). He feels that a business based on fees (as opposed to surveillance advertising) is most definitely scalable, citing the approximately one million users in Second Life.
Philip talked about how presence can change communication dynamics: walking up to another avatar, and being physically near another avatar, triggers a response where people tend to be more civil than they might be in a text-only environment like a chatroom, and such presence can quickly help defuse potentially negative communications.
Among the speakers present was Avi Bar-Zeev, the person who created the primitive (prim) system, the digital atoms used for building anything and everything in the early days of Second Life! In fact, many content creators in the metaverse got their start by prim-building in SL, later moving on to creating and texturing mesh models using tools like Blender. (One SL historian remarked that today was the 20th anniversary of the first-ever prim created in Second Life, made on January 25th, 2002!) Philip talked about how Second Life’s prim permission system could be seen as a forerunner of newer digital asset systems being considered for the metaverse. Avi talked about the artificial scarcity of virtual worlds, and the necessity of designing the metaverse as human spaces: places that rehumanize, rather than dehumanize, those who participate.
Philip talked about how VR headsets are still not affordable, accessible, or comfortable enough (i.e. wearable all day) to support the kind of social community that we experience in virtual worlds like Second Life. He said (and I was transcribing madly while he spoke, so this is a paraphrase!):
It’s difficult to get people to communicate normally in a virtual world. It’s easy to forget that this is an experience that most people would not be comfortable with, yet. We’re not there yet, and the way we get there is to make avatars more visually expressive, which is a tough problem to solve.—Philip Rosedale
Philip talked about spatialized audio products such as High Fidelity’s 3D audio as an aid to community-building, but added that we still need to work on nonverbal communications (the listener leaning in to the speaker to indicate engagement, etc.).
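To give a sense of what spatialized audio involves, here is a minimal sketch of its two most basic ingredients: distance-based attenuation (farther avatars sound quieter) and stereo panning (avatars to your right sound louder in your right ear). This is an illustrative toy, not High Fidelity’s actual algorithm; the function name, positions, and pan law are all my own assumptions.

```python
import math

def spatialize(listener, source, ref_dist=1.0):
    """Return (left_gain, right_gain) for a mono source.

    Positions are (x, y) tuples; the listener is assumed to face +y.
    Uses inverse-distance rolloff and a constant-power pan law.
    (Illustrative only -- not any real product's implementation.)
    """
    dx = source[0] - listener[0]
    dy = source[1] - listener[1]
    dist = max(math.hypot(dx, dy), ref_dist)
    gain = ref_dist / dist                  # inverse-distance attenuation
    # Pan by the sine of the azimuth: -1 = hard left, +1 = hard right.
    pan = math.sin(math.atan2(dx, dy))
    left = gain * math.sqrt((1 - pan) / 2)  # constant-power pan law
    right = gain * math.sqrt((1 + pan) / 2)
    return left, right

# An avatar 2 m ahead and 2 m to the right sounds quieter overall,
# and louder in the right ear than in the left:
l, r = spatialize((0, 0), (2, 2))
```

Real systems (including High Fidelity’s) go well beyond this, adding head-related filtering, reverberation, and per-ear delay, but even this simple level-and-pan model is enough to let a listener locate a speaker in a crowd.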
There was a lot more discussed, including Philip Rosedale’s thoughts about virtual economies and NFT real estate, which unfortunately I did not have a chance to transcribe. Philip is always an articulate and informative speaker, so you will want to listen to the recording if you missed this event.
I will, however, provide a link to an archive of this wide-ranging and fascinating discussion on Clubhouse, Twitter Spaces, and Callin, once Dr. Hayman posts it! He is to be thanked for juggling everything in order to make this multicast such a success.
UPDATE 7:14 p.m.: Here, as promised, are links to the recordings made:
Twitter Spaces recording 1:43:44 (Dr. Hayman tells me, “this recording has less of the interruptions from Second Life, as I muted the mic when feedback and keyboard noises were present in SL”)
Callin recording 1:40:08
Enjoy! I know I will be relistening to portions of this.