Meta AI's experimental new smart glasses can see everything you do and even gauge how you feel about it
- Meta is developing its Aria Gen 2 smart glasses, which are packed with sensors and AI features
- The smart glasses can track your gaze, your movement, and even your heartbeat to gauge what is happening around you and how you feel about it
- The smart glasses are currently used to help researchers train robots and to build better AI systems that could eventually end up in consumer smart glasses
The Ray-Ban Meta smart glasses are still relatively new, but Meta is already hard at work on its new Aria Gen 2 smart glasses. Unlike the Ray-Bans, these glasses are strictly for research, but they are packed with enough sensors, cameras, and processing power that it seems inevitable that what Meta learns from them will be included in future wearables.
Project Aria’s research-grade tools, including the new smart glasses, are meant for people working in computer vision, robotics, or the hybrid of contextual AI and neuroscience that has caught Meta’s attention. The idea is for developers to use the glasses to devise more effective ways of teaching machines to navigate, contextualize, and interact with the world.
The first Aria smart glasses came out in 2020. The Aria Gen 2 is far more advanced in both hardware and software. The glasses are lighter, more accurate, pack more power, and look much more like glasses people wear in everyday life, though you still wouldn't mistake them for standard eyewear.
The four computer vision cameras can see an 80° arc around you and measure depth and relative distance, so the glasses can tell how far your coffee mug is from your keyboard, or where a drone's landing gear will touch down. And that's just the start of the glasses' sensor suite, which also includes an ambient light sensor with an ultraviolet mode, a contact microphone that can pick up your voice even in noisy environments, and a pulse detector embedded in the nose pad that can estimate your heart rate.
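Meta hasn't published details of the Gen 2's depth pipeline, but estimating distance from a pair of overlapping cameras is a classic stereo-vision problem. Here's a minimal sketch using OpenCV's block-matching stereo matcher; the focal length, camera baseline, and file names are illustrative assumptions, not Aria Gen 2 specs.

```python
# Minimal sketch: estimating depth from two overlapping cameras.
# The camera parameters below are illustrative, not Aria Gen 2 specs.
import cv2
import numpy as np

FOCAL_LENGTH_PX = 700.0  # hypothetical focal length in pixels
BASELINE_M = 0.10        # hypothetical distance between the two cameras, in meters

# Load a left/right image pair (grayscale, as StereoBM requires)
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Classic block matching: finds corresponding patches along epipolar lines
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0  # fixed-point -> pixels

# Depth is inversely proportional to disparity: Z = f * B / d
valid = disparity > 0
depth_m = np.zeros_like(disparity)
depth_m[valid] = FOCAL_LENGTH_PX * BASELINE_M / disparity[valid]

print(f"Median scene depth: {np.median(depth_m[valid]):.2f} m")
```

The same geometry is why a wider camera baseline and higher resolution improve distance estimates: both make the disparity between the two views easier to measure.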
Future face
There's also a lot of eye-tracking technology on board, able to tell where you're looking, when you blink, how your pupils change, and what you're concentrating on. The glasses can even track your hands, measuring joint movement in a way that could help train robots or teach gesture recognition. Put together, the glasses can work out what you're looking at, how you're holding an object, and whether what you're seeing is raising your heart rate with an emotional reaction. If you're holding an egg and spot your sworn enemy, the AI might figure out that you want to throw the egg at them and help you aim it accurately.
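To make the idea concrete, here's a toy sketch of what fusing those sensor streams might look like. Everything in it, from the class and field names to the thresholds, is invented for illustration; Meta hasn't described how its software actually combines these signals.

```python
# Illustrative sketch: fusing hypothetical sensor streams into a simple
# context signal. Names and thresholds are invented; this is not Meta's pipeline.
from dataclasses import dataclass

@dataclass
class SensorFrame:
    gaze_target: str       # object the eye tracker says you're looking at
    held_object: str       # object the hand tracker says you're holding
    heart_rate_bpm: float  # estimate from the nose-pad pulse sensor

def infer_context(frame: SensorFrame, resting_bpm: float = 65.0) -> str:
    """Combine gaze, grip, and pulse into a rough contextual guess."""
    aroused = frame.heart_rate_bpm > resting_bpm * 1.25  # noticeably elevated pulse
    if frame.held_object and aroused:
        return f"emotionally charged interaction involving {frame.held_object}"
    if frame.gaze_target:
        return f"calmly focused on {frame.gaze_target}"
    return "idle"

# The egg-throwing scenario from above, in toy form
print(infer_context(SensorFrame("sworn enemy", "egg", 92.0)))
```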
As mentioned, these are research tools. They aren't for sale to consumers, and Meta hasn't said whether they ever will be. Researchers must apply for access, and the company is expected to start accepting applications later this year.
But the implications are much bigger. Meta's plans for smart glasses go well beyond checking messages. The company wants to capture how humans interact with the real world and teach machines to do the same. In theory, robots trained this way could see, listen, and interpret the world around them the way people do.
It won't happen tomorrow, but the Aria Gen 2 smart glasses prove it's a lot closer than you might think. And it's probably only a matter of time before a version of the Aria Gen 2 goes on sale to the average person. You'll wear that powerful AI brain on your face, where it can remember where you left your keys and send a robot to pick them up.