Snap AR glasses get a lighter design, a new name, and a consumer launch in 2026
The race to put augmented reality smart glasses on your face is heating up. Snap's Spectacles are transforming into "Specs" and will launch as lighter, more powerful AR wearables in 2026.
CEO Evan Spiegel announced the all-new Specs on stage at the XR event AWE, promising smart glasses that are smaller, considerably lighter, and packed "with a ton more capabilities."
The company has not given a specific time frame or price, but the 2026 launch window puts it ahead of Meta, which is busy preparing its exciting Orion AR glasses for 2027. It also looks like Snap's Specs will go head-to-head with the Samsung/Google Android XR-based glasses, which are likewise expected sometime in 2026.
As for what consumers can expect from Specs, Snap is building on the same Snap OS used in its fifth-generation Spectacles (and probably still a Qualcomm Snapdragon XR chip). That means the interface and interaction paradigms, such as hand-based controls, will remain. But a considerable number of new features and integrations will arrive this year, long before Specs does, including AI.
Upgrading the platform
Spiegel framed the updates by first revealing that Snap started working on glasses "before Snapchat" was even a thing, and that the company's overarching goal is to "make computers more human." He added that "with advances in AI, computers are behaving more like people than ever."
Snap's plan with these updates to Snap OS is to bring AI platforms into the real world. It is bringing Gemini and OpenAI models into Snap OS, which means multi-model AI capabilities will soon be part of the fifth-generation Spectacles and, ultimately, Specs. These tools can be used for things like on-the-fly text translation and currency conversion.
The updated platform also adds tools for Snap Lens builders that integrate with the waveguide-based display capabilities of Spectacles and Specs.
With the new Snap3D API, for example, developers can use generative AI to create 3D objects in Lenses.
The updates also include an AI depth module, which can interpret 2D information to build 3D maps that anchor virtual objects in the 3D world.
Businesses deploying Spectacles (and eventually Specs) may appreciate the new fleet management app, which lets developers manage and control multiple pairs at once, as well as the ability to use the glasses for guided navigation in, say, a museum.
Later, Snap OS will add WebXR support for building AR and VR experiences within web browsers.
Let’s make it interesting
Spiegel claimed that Snap has the largest AR platform in the world thanks to Lenses in Snapchat: "People use our AR Lenses in our camera 8 billion times a day."
That is a lot, but almost all of it happens via smartphones. Right now, only developers are using the full-fledged glasses and their Lens capabilities.
The consumer release of Specs could change that. When I tried the glasses last year, I was impressed by the experience, which, although not as good as Meta's Orion glasses (the lack of gaze tracking stood out to me), was full of potential.
With a lighter form factor that approaches or surpasses what I found with Orion and have seen in some Samsung Android XR glasses, Snap's Specs could well take the lead in AR glasses. That is, provided they don't cost $2,000.