NeosVR Demonstrates Full Mouth Tracking on Their Avatars

Tomáš Mariančík (a.k.a. Frooxius), the creative and talented lead software developer who is building the NeosVR social VR platform, recently shared the following video via Twitter, saying:

The HTC Vive lip tracker dev-kit I integrated into NeosVR really adds a whole new level of expressivity to social VR. Never before [have] I had other people in VR telling me they like to see me smile when there’s something funny, or to go to sleep when I’m yawning!

This is so cool! Of course, most people in VR headsets use hand controllers of some sort to animate their avatar’s arms and hands. Valve Index hand controllers can even animate individual fingers (although not all platforms can take advantage of this feature).

Many people have also been experimenting for years with using the HTC Vive “pucks” (on platforms such as VRChat, the old High Fidelity, and Sansar) to animate their avatars’ full bodies, attaching them to shoulders, waists, hips, knees, and feet. But adding mouth, lip, and even tongue movements pushes the envelope even further, and these non-verbal expressions can add so much to conversations in social VR!

It should be noted here that the quest to animate your avatar’s face has been going on for quite some time now, with a variety of different solutions. For example, Sinespace sells a product called the Avatar Facial Driver, which uses your webcam to capture your facial expressions and play them back on your avatar’s face (this is for non-VR, desktop users, though). I blogged about this back in May of 2018.

Congratulations to Tomáš and the entire team at NeosVR for pushing the boundaries in avatar expression!

Liked it? Then please consider supporting Ryan Schultz on Patreon! Even as little as US$1 a month unlocks exclusive patron benefits. Thank you!