Meta files patent for a feature inspired by the Apple Vision Pro
Meta has filed a patent application for technology that would equip its mixed reality headsets with functionality inspired by a specific Apple Vision Pro feature. Apple’s first spatial computer has an external display that, using sensors inside the headset, can show an indicator of what the wearer is doing in immersive mode, or a version of their facial expressions. Meta could deliver a similar feature that takes advantage of its avatar capability, the patent document says.
An application titled ‘Embedded Sensors in Immersive Reality Headsets to Enable Social Presence’ was recently filed by Meta Platforms with the European Patent Office (EPO), as reported via Patently Apple. The application was also published two months ago by the United States Patent and Trademark Office (USPTO).
In the application filed with both patent offices, Meta describes sensors (including electrocardiogram and electroencephalogram sensors) placed in the mixed reality headset. These sensors observe a user’s facial expressions in real time and drive a virtual version of the user (which the company calls an avatar) that is displayed elsewhere.
The document also includes a flowchart (Figure 4) that describes the entire process. When a person wears the headset, an internal sensor tracks movements in the facial muscles. The device then determines the user’s facial expression by mapping it to the movement of individual muscles; 20 of these muscles are shown in another diagram (Figure 3), along with two anatomical drawings.
After the device determines the user’s current facial expression, it will automatically “customize” the avatar for the user, according to the patent application. Finally, it will deliver the customized avatar to an “immersive reality application” hosted on a remote server.
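The flowchart’s steps can be sketched in code. The following Python is purely illustrative: the filing does not publish an implementation, and every name, the muscle-to-expression mapping, and the sensor and delivery stand-ins here are invented assumptions, not Meta’s actual design.

```python
# Hypothetical sketch of the pipeline described in the patent's flowchart
# (Figure 4): read sensor data -> determine expression -> customize avatar
# -> deliver to a remote immersive reality application.
from dataclasses import dataclass

# Toy mapping from the dominant facial muscle to an expression label
# (the real system reportedly learns this association; see below).
EXPRESSION_MAP = {
    "zygomaticus_major": "smile",      # cheek-raising muscle
    "corrugator_supercilii": "frown",  # brow-lowering muscle
    "orbicularis_oculi": "squint",     # eye-closing muscle
}

@dataclass
class Avatar:
    user_id: str
    expression: str = "neutral"

def read_sensor_activity() -> dict[str, float]:
    """Stand-in for the headset's embedded sensors (fixed toy values)."""
    return {"zygomaticus_major": 0.8, "corrugator_supercilii": 0.1}

def determine_expression(activity: dict[str, float]) -> str:
    """Map the most active tracked muscle to an expression label."""
    dominant = max(activity, key=activity.get)
    return EXPRESSION_MAP.get(dominant, "neutral")

def customize_avatar(avatar: Avatar, activity: dict[str, float]) -> Avatar:
    """'Customize' the avatar to reflect the user's current expression."""
    avatar.expression = determine_expression(activity)
    return avatar

def deliver(avatar: Avatar) -> dict:
    """Stand-in for sending the avatar to a remote immersive reality app."""
    return {"user_id": avatar.user_id, "expression": avatar.expression}

payload = deliver(customize_avatar(Avatar("user-123"), read_sensor_activity()))
print(payload)  # the sketch ends where the remote service would take over
```

The point of the sketch is only the ordering of the four stages; in the actual system the expression-mapping step would be a learned model rather than a lookup table.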
While the Apple Vision Pro displays a user’s facial expressions on its outer display, Meta’s capture of expressions appears intended to enable detailed avatar renderings in other services. The application also describes the use of machine learning to associate facial muscle movements with the user’s expression.
It’s important to note that the patent application gives no indication of when, or whether, the feature will appear in a Meta product. If it does, it could lead to more responsive avatars across Meta’s other products and services.