Ray-Ban Meta Smart Glasses get even more AI features, such as live language translation and Meta AI for live video
The Ray-Ban Meta Smart Glasses are arguably the most popular smart glasses out there, and that’s likely due to their feature set and housing. Meta has partnered with Ray-Ban to bring the iconic Wayfarers into the tech age with slightly thicker frames, dual cameras, some speaker tech, microphones, and connectivity.
These smart glasses started out as a unique way to take photos or videos, sometimes even more ‘in the moment’, since you didn’t have to pick up your phone and open the camera. In recent months, Meta has equipped these smart glasses with Meta AI, where you can look at something and just say, “Hey Meta, what’s this?” and have it look at it, analyze it, and then give you an answer. It’s pretty cool.
But now, at Meta Connect 2024, the team working on Smart Glasses wants to make them even smarter. And if you thought they were going to do it with AI, you’d be right.
Language translations and visual reminders
Perhaps the most useful new feature coming to the Ray-Ban Meta Smart Glasses is live translation, arriving later this year. Similar to what Samsung has done with the Galaxy Buds Pro 2 or Google with the Pixel Buds Pro 2, the Ray-Ban Metas can translate languages almost instantly, initially between English and Spanish, Italian, or French.
This can be very convenient and more natural than trying to do it with earbuds in, as it’s built into smart glasses you may already wear every day, especially if you’ve opted to have prescription lenses fitted.
Additionally, beyond verbally asking Meta AI to set a reminder, you can now set reminders based on things the glasses see. That might be grabbing milk from the fridge and realizing it’s running low, or spotting a package on your doorstep that you absolutely have to pick up. This feature should be rolling out sooner rather than later.
Likewise, you can now scan QR codes for events, phone numbers, and even full contact information. If a QR code is visible, you can ask Meta to scan it via the Ray-Ban Meta Smart Glasses. We believe the information will then appear in the Android or iOS companion app.
An ambitious video step
Probably the most ambitious future feature, also coming later this year, is Meta AI for video, which means Meta can see what you’re looking at in real time, not just a snapshot, and provide clarity or answer questions. This could be useful for navigating a city, cooking a meal, solving a math problem, or helping you finish a Lego set.
This is a big step and will likely raise some privacy concerns, as it involves a live feed from your glasses being processed in real time. You’ll also need the Ray-Ban Meta Smart Glasses to be connected to the internet via an iPhone or Android phone for this to work, since that processing happens in the cloud rather than on the glasses themselves.
Still, it probably gives us an idea of where Meta is heading in the smart glasses category, so it’s great to see Meta continue to roll out new features for its smart glasses. And that’s the good news here – these updates, as far as we know, do not require any new hardware. These will be delivered as over-the-air updates to people who already own or purchase Ray-Ban Meta Smart Glasses.
Another update on the way is integration with Amazon Music, Audible, iHeart, and Spotify, making it easier to access your favorite songs, artists, podcasts, and books hands-free. You’ll also see new Transitions lens options coming from EssilorLuxottica, the eyewear giant behind brands ranging from Dolce & Gabbana to Oakley.
So if the available looks haven’t been enough to convince you to buy a pair, or you just want to freshen up your style, it’s a good time to reconsider when the new options hit the market. We’ll be testing out these new features, from language translation to Meta AI for video, as soon as we can, so stay tuned to TechRadar.