At yesterday’s Meta Connect 2023 event, after Mark Zuckerberg and a parade of Meta employees extolled the wonders of Meta products and services, there was a post-keynote conversation between Michael Abrash (the Chief Scientist of Meta’s Reality Labs) and Andrew “Boz” Bosworth (Meta’s CTO and Head of Reality Labs). I was too busy to watch it yesterday, but I found some time today. Here are my notes.
First off, I was pissed off that the only way I could watch this conversation was via Facebook. This is so typical of the gatekeeping Meta engages in: forcing you to connect to a Facebook website, where you aren’t given the option of disabling any of the cookies it sets. At least I was able to use the Firefox browser’s Facebook Container plug-in to mitigate things somewhat, but yes, I was irritated. Believe me, if I could have found this conversation on YouTube or elsewhere, I would have watched it there!
To give another example, Meta has the gall to say that they’re embracing “open source”, yet they pull stunts like making several of their newly announced games for the Meta Quest 2 and 3 exclusive to the Quest ecosystem, and unavailable to, say, Steam players. Not cool. But I digress; let’s get back to the topic at hand: the conversation between Boz and Michael.
Michael Abrash talked about codec avatars and how they’re not yet at the point where your brain is fooled into believing you’re looking at a real person (in the same way that a well-designed virtual space can feel “real” and immersive to your brain, rather than just an image you are looking at, something many of us have experienced). Here’s a recent example to give you a sense of just how quickly the technology is evolving (this is, to my knowledge, at the research stage only, and not yet commercially available):
Michael considers codec avatars to be something that will help the concept of the metaverse reach its full potential: a way to put people together in a virtual space that feels fully real.
When Boz asked Michael to reflect on what he’s working on that’s most inspiring to him, he first gave a beauty-pageant-contestant answer: everything they’ve been working on is important. He then added:
If I had to pick one thing, I would say that the personalized, contextualized, ultra-low-friction, AI interface is the thing that I find most exciting, and the reason is…the way that humans interact with the digital world has only changed once ever, and that really was Doug Engelbart, Xerox PARC, the Mac, and since then, we’ve been living in that world. And as we move into this world of mixing the real and virtual freely, we need a new way of interacting. And so I feel that that has to be this contextualized AI approach, and getting that to happen is the thing that I find most exciting. It is a once-in-a-lifetime opportunity to change the way that everybody lives.
Meta is playing a long game here, betting that the research work they are doing now will lead to their dominating the virtual reality/augmented reality/mixed reality/extended reality marketplace, and it’s clear that they see their AI work as a key part of that. How that will play out remains to be seen, but it is fascinating to watch two people discuss this in a public forum (even if it is on Facebook!).
Let’s just hope and pray that this “once-in-a-lifetime opportunity to change the way that everybody lives” does not become some corporate-run, surveillance-capitalism dystopia!
Following this was a segment called the “Developer State of the Union,” a promised deeper dive into tools, programs, and features for Meta ecosystem developers. Funny how “ecosystem” sounds so much friendlier than “walled garden.” 😉
But I am going to pause my cranky snark, hit publish on this blogpost, and call it a day.