Image: Meta & Nanyang Technological University
People identify with their hands, and Meta is researching how to render them as accurately and realistically as possible in virtual reality.
Researchers from Meta’s Codec Avatars Lab and Nanyang Technological University present URHand in a new research paper. The name refers to “Universal Relightable Hands” and “Your Hand”.
URHand is a hand model that can be adapted to a user’s individual hands and rendered realistically under changing illumination.
The researchers refer to URHand as the “first universal relightable hand model that generalizes across viewpoints, poses, illuminations, and identities.”
Another highlight of this research is that the hand model can be personalized from a series of smartphone pictures, so it may one day be possible to bring one’s own hands into virtual reality with nothing more than a phone. This alone would be a major technological breakthrough.
It is currently unclear whether and when this research will find its way into products.
Hands are essential for realistic VR avatars
Meta has been researching Codec Avatars for many years in the hope of achieving photorealistic telepresence. Mark Zuckerberg and Lex Fridman recently demonstrated the current state of the technology in a podcast. Meta’s goal is to one day bring Codec Avatars to standalone headsets, but there are still many hurdles to overcome.
A realistic alter ego includes not only the head and face, but also the rest of the body, the physical behavior of clothing, and detailed, personalized hands. The latter are especially important because users identify with their hands: anyone who has ever had hands in virtual reality that look nothing like their own will know how disconcerting that feels.