The social VR/virtual world platform Sinespace is throwing a zoo-themed party on Saturday evening, September 3rd, 2022, at 5:00 p.m. Pacific Time/8:00 p.m. Eastern Time. Residents are encouraged to dress up as animals, and there are some serious prizes for the top two winners of the costume contest!
Visit The Zoo, with its amazing show of dozens of animals! Best-dressed animal costume contest: first place 2,000 Gold, second place 500 Gold (see the contest board). Plus live music, with DJ Les spinning the tunes!
…
24-Hour Scavenger Hunt! Four frogs have escaped the Amphibian House, and the zookeeper needs help finding them. Go to The Zoo between 8:00 p.m. EDT Sept. 3rd and 8:00 p.m. EDT Sept. 4th, find a frog, and take your picture with it. Post the picture to the Sinespace Discord events channel, the fansite, or the Welcome Centre Flickr board. Tag Mimi, send help, and receive 100 Gold! One frog per person, please!
The location of the party will be the Zoo world, but it will not be open until just before the event starts. Just click on the Explore button in the bottom row of blue buttons on your Sinespace client, search for “Zoo”, and you’ll easily find it!
Plug-and-play refers to hardware you can install simply by plugging it into one of the ports on your personal computer (usually USB); it automatically sets itself up and just works, right out of the box, without any fuss or futzing about. (I am old enough to remember the pre-USB days. Hell, I still remember having to stick stacks of 80-character punchcards into card readers in my high school days to submit programs! Yes, Auntie Ryan is as old as dirt, sweetheart!)
Over two days this week, I set up two new pieces of hardware in my office at the University of Manitoba Libraries: a brand new desktop personal computer with a high-end graphics card, and a new virtual reality headset tethered to it.*
Yes, I finally cut my very last tie to Facebook/Meta, gleefully packing up my old Oculus Rift headset, and uninstalling all traces of the Oculus software from my former PC before it goes on to its next owner! I doubt anyone will want the now-antiquated Rift, but at least my old PC should gladden the heart of whoever receives it!
And it struck me (as I was relaxing on the sofa today after a busy, sweaty, sweary Thursday and Friday) that over the past six years, I have set up no fewer than four different models of virtual reality headset:
An original Oculus Rift, bought in January 2017 (followed by a second Rift for my work computer later that same year);
An Oculus Quest (followed later by an Oculus Quest 2);
A Valve Index, which I still use at home; and
An HTC Vive Pro 2 headset, bought last month to replace my work Oculus Rift.
My brand new Vive Pro 2: PCVR setup is still a pain in the ass
Of these, only the Quest was a wireless VR headset; the Oculus Rift, Valve Index, and HTC Vive Pro 2 are all what are collectively termed PCVR: virtual reality headsets that require a cable connection to a high-end gaming computer in order to work. Of course, even the Quest could be turned into a PCVR headset with the addition of a cable and some extra software, something I eagerly tested out myself as soon as I could! However, the primary purpose of the Oculus Quest, both version 1 and version 2, was to serve as a standalone device sold at a cheaper price, to entice more of the general public to dip their toes into the VR waters and get hooked! (I have been reliably informed that Meta sells the Quest itself at a loss, expecting to recoup that loss and earn its real profits through the sale of games and apps via the Oculus Store.)
However, PCVR is—still, six years after the first consumer models arrived on the marketplace—an absolute pain in the ass to get set up! Allow me to recount my experience of installing, configuring, and troubleshooting my PCVR setup this week.
The box containing my HTC Vive Pro 2 office kit held a large paper document listing the dozens of cables and other parts, along with a website address from which I could download a setup program that was supposed to install all the software I needed and walk me step by step through the setup of my VR headset and controllers. Despite attempt after attempt, the setup program kept hanging at the five-sixths mark, leaving me to piece everything together on my own.
I ended up spending over an hour in text chat with a support person on the Vive customer support portal, who talked me through a complete reinstall of all the software components (I never did get the step-by-step walk-through of device setup that I was expecting, which was disappointing).
I was supremely grateful for the friendly, reassuring, and professional tech support person I was chatting with, however, and I commend Vive for making it quite easy to reach out for immediate help when I got stuck (quite unlike my previous horror show of tech support when my Valve Index headset at home broke earlier this year). Don’t get me wrong; I still love my Valve Index, but my customer support experience in March 2022 was so horrible that I would hesitate to purchase another VR headset from Valve in the future. Valve could learn a lot from Vive!
Valve Index: a wonderful product, but customer support needs improvement
Finally, I left work on Thursday evening with a fully working system, after a full day of frustration, fussing, and futzing! On Friday I returned to face a brand new set of challenges: installing various social VR platforms and getting them to work properly with my new Vive Pro 2 setup. By the end of Friday, I finally had working access to VRChat, Neos, and Sansar, and in each I encountered my fair share of bugs and problems (partly because the Vive wand hand controllers take some getting used to). It was frustrating and exhausting.
Which brings me to the point of this editorial rant: why, six years into the age of consumer virtual reality, is it still such a daunting task to set up a tethered virtual reality headset? How is it that you basically need knowledge and expertise akin to someone at NASA Mission Control to put a PCVR system together and get it working right the first time? It’s like asking people who want to drive to buy the car frame from one manufacturer, the interior seats and steering wheel from a second company, and the engine and transmission from yet another firm, and then handing them a set of IKEA instructions and a hex wrench and telling them: good luck, buddy!
I mean, if even I, with all my previous virtual reality and computer assembly experience over the decades (and an undergraduate degree in computer science, to boot!), had trouble pulling everything together, what does that say about the average, non-technical consumer who just wants everything to work? Virtual reality in general, and PCVR in particular, is still far too far away from plug-and-play consumer friendliness, and the VR industry needs to clear that hurdle before it can see more widespread adoption. If you want to throw money at a problem, throw some at this!
The one thing that the Quest still has going for it, despite its association with Meta’s sketchy embrace of surveillance capitalism, is this: out of all the VR setup experiences I have had to date, it was easily the closest to plug-and-play! (All I needed was a cellphone.)
Don’t get me wrong; I know that Steam, Vive, and Valve also collect customer data. It’s just a question of how much data, and how much you trust the companies collecting it. That’s why I have zero trust in Meta, and it’s also why so many people are watching carefully to see how and when Apple enters the VR/AR marketplace. (Apple is not perfect, but at least I trust them with my privacy. They also have a reputation for creating beautifully designed, plug-and-play, consumer-friendly devices!)
Things are, as always, going to be interesting to watch over the next couple of years!
Wireless VR headsets are still the closest to the Holy Grail of plug-and-play (Image by dlohner from Pixabay)
*For those of you who are interested in the specifications of my new work setup, here they are: a Dell Optiplex 7000 running Windows 10, with an Intel Core i7-12700 CPU, 32GB of RAM, and an NVIDIA GeForce RTX 3070 GPU, plus an HTC Vive Pro 2 office kit (VR headset, two base stations, and Vive wand hand controllers).
This afternoon, Linden Lab (the makers of virtual world Second Life) made an announcement:
Wouldn’t it be cool if you could animate your avatar in real time? What if you could wave your arm and your avatar could mimic your motions? Or imagine if your avatar could reach out and touch something in-world or perform animations? Linden Lab is exploring these possibilities with an experimental feature called “Puppetry.”
We have been working on this feature for some time and now we are ready to open it up to the Second Life community for further development and to find out what amazing things our creators will do with this new technology.
The code base is alpha level and does contain its share of rough edges that need refinement; however, the project is functionally complete, and it is possible for the scripters and creators of Second Life to start to try it out.
The animated GIF I copied from the Linden Lab announcement didn’t work in my blogpost, so I downloaded the video from their tweet below:
Now, Second Life is not the first flatscreen virtual world to announce such a feature (that would be Sinespace; I wrote about their Avatar Facial Driver back in 2018). At that time, Sinespace said that facial coverings such as glasses might interfere with the tracking. However, four years have passed and I have zero doubt that the technology has improved!
Linden Lab goes on to explain how the Puppetry technology works:
Puppetry accepts target transforms for avatar skeleton bones and uses inverse kinematics (IK) to place the connecting bones in order for the specified bones to reach their targets. For example the position and orientation “goal” of the hand could be specified and IK would be used to compute how the forearm, elbow, upper arm, and shoulder should be positioned to achieve it. The IK calculation can be tricky to get right and is a work in progress.
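The arm example above can be sketched numerically. Below is a minimal, self-contained two-bone IK solver in 2D, using the law of cosines; all names here are my own invention for illustration, and this is emphatically not the viewer’s actual (3D, work-in-progress) IK code:

```python
import math

def two_bone_ik(shoulder, target, l1, l2):
    """Solve two-bone IK in 2D: find the shoulder angle and the elbow bend
    (both in radians) that place the end of an upper-arm/forearm chain of
    lengths l1 and l2 on `target`. Illustrative sketch only."""
    dx, dy = target[0] - shoulder[0], target[1] - shoulder[1]
    dist = math.hypot(dx, dy)
    # Clamp to the reachable range, so an out-of-reach target just
    # stretches the arm toward it instead of producing math errors.
    dist = max(abs(l1 - l2), min(l1 + l2, dist))
    dist = max(dist, 1e-9)  # guard against target == shoulder
    # The law of cosines gives the interior angle at the elbow...
    interior = math.acos(max(-1.0, min(1.0, (l1**2 + l2**2 - dist**2) / (2 * l1 * l2))))
    elbow_bend = math.pi - interior  # ...and hence the forearm's bend.
    # The shoulder aims at the target, corrected for the elbow bend.
    inner = math.acos(max(-1.0, min(1.0, (l1**2 + dist**2 - l2**2) / (2 * l1 * dist))))
    shoulder_angle = math.atan2(dy, dx) - inner
    return shoulder_angle, elbow_bend
```

Given a shoulder position and a hand “goal”, this returns the joint angles that put the hand on the goal; the real calculation works in 3D with many more bones, which is exactly why Linden Lab calls it tricky to get right.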
The target data is supplied by a plug-in that runs as a separate process and communicates with the viewer through the LLSD Event API Plug-in (LEAP) system. This is a lesser known functionality of the Viewer which has been around for a while but has, until now, only been used for automated test and update purposes.
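To give a flavour of what such a plug-in is, structurally: just a separate process that writes framed messages to the host over a pipe. The sketch below is hypothetical, uses JSON, and invents a message shape purely for illustration; the real LEAP system exchanges LLSD, and its actual wire format and Puppetry message schema are documented on the Second Life wiki:

```python
import json
import sys

def frame(message: dict) -> bytes:
    """Encode one message with a decimal length prefix, in the spirit of
    LEAP-style framing. (Illustrative only: the real system uses LLSD,
    not JSON, and differs in detail.)"""
    payload = json.dumps(message).encode("utf-8")
    return str(len(payload)).encode("ascii") + b":" + payload

def send_hand_target(x, y, z, stream=sys.stdout.buffer):
    """Stream one hypothetical Puppetry-style target transform for the
    left wrist to the host process."""
    msg = {"command": "move", "data": {"mWristLeft": {"position": [x, y, z]}}}
    stream.write(frame(msg))
    stream.flush()
```

The point is simply that the viewer never links the plug-in into its own process; it reads a stream of framed target updates from a child process, which is what makes the system scriptable in plain Python.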
The Viewer transmits the Puppetry data to the region server, which broadcasts it to other Puppetry capable Viewers nearby. The receiving Viewers use the same IK calculations to animate avatars in view.
For more details about the Puppetry technology, take a look at the Knowledge Base article Puppetry: How It Works.
To my knowledge, this marks a major change in how avatars move in Second Life. One of the things which the newer generation of metaverse platform users (much more used to social VR platforms like VRChat) have found odd is that SL avatars rely so much on the playback of pre-recorded animations. (Keep in mind that SL does not support users in VR headsets, as it cannot reach the necessary frame rates to avoid VR sickness! There have been valiant attempts made over the years, however.)
If you are intrigued by this development and want to test it out for yourself, here are the details (it does sound as though you will need to be a bit of a computer geek to participate, at least in this open beta test period!):
The Puppetry feature requires a project viewer and can only be used on supporting Regions. Download the project Viewer at the Alternate Viewers page. Regions with Puppetry support exist on the Second Life Preview Grid and are named: Bunraku, Marionette, and Castelet.
When using the Puppetry Viewer in one of those regions, if someone there is sending Puppetry data you should see their avatar animated accordingly. To control your own avatar with Puppetry it’s a bit more work to set up the system. You need: a working Python3 installation, a plug-in script to run, and any Python modules it requires. If you are interested and adventurous: please give it a try. More detailed instructions can be found on the Puppetry Development page.
We look forward to seeing what our creators do with the new Puppetry technology. Compared to other features we have introduced, it’s quite experimental and rough around the edges, so please be patient! We will keep refining it, but before we go further we wanted to get our residents’ thoughts.
We will be hosting an open discussion inworld on Thursday, Sept 8 1:00PM SLT at the Bunraku, Marionette, and Castelet regions on the Preview Grid. We’re also happy to talk about this at the upcoming Server User Group or Content Creator meetings. Come by, let us know what you think, and hear about our future plans!
I for one will be quite excited to test this new feature out!