Meta has announced its second-generation smart glasses, developed with Ray-Ban owner EssilorLuxottica, two years after the first-generation Stories launched. Photos and videos taken by the glasses’ camera are stored on a tiny flash chip inside the temple arms.
These Ray-Ban Meta Smart Glasses tether wirelessly to an iOS or Android smartphone and respond to “Hey Meta” voice commands, on-frame touches, and capture-button clicks to take photos, livestream video, or play audio. They follow on from the 2021 gen 1 Stories smart glasses, and are lighter, with better camera and audio features, longer battery life, and voice access to the Meta AI assistant.
At the Connect conference, which also saw the reveal of the Quest 3 VR headset and Meta AI, founder and CEO Mark Zuckerberg said: “Smart glasses are the ideal form factor for you to let AI assistants see what you’re seeing and hear what you’re hearing.”
An in-frame battery powers the Qualcomm Snapdragon AR1 Gen 1 system-on-chip for up to four hours, better than the gen 1 Stories’ three. The system draws less than 1 watt and supports Wi-Fi 6 and Bluetooth 5.2.
There is a charging case, recharged via a USB-C cable, that can power the glasses for up to 36 hours. Camera photos and video clips are stored in an e.MMC (embedded MultiMediaCard) NAND-plus-controller chip with 32GB capacity, enough for up to 100 30-second videos and 500 three-frame burst photos. The gen 1 Stories NAND card had a mere 4GB capacity. Photos and videos can be transferred to a tethered phone to free up capacity in the glasses.
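As a rough sanity check on those capacity figures, the 32GB can be divided between the quoted photo and video counts. The per-photo file size below is an illustrative assumption, not a Meta specification:

```python
# Rough sanity check of the 32GB storage figure against the quoted
# capture counts. The per-photo size is an illustrative assumption,
# not a Meta specification.
TOTAL_GB = 32
NUM_VIDEOS = 100          # 30-second clips
NUM_BURSTS = 500          # three-frame burst photos
FRAMES_PER_BURST = 3
ASSUMED_PHOTO_MB = 4      # assumed size of one 12MP still

photo_budget_gb = NUM_BURSTS * FRAMES_PER_BURST * ASSUMED_PHOTO_MB / 1024
video_budget_gb = TOTAL_GB - photo_budget_gb
mb_per_clip = video_budget_gb * 1024 / NUM_VIDEOS

print(f"Photos: ~{photo_budget_gb:.1f}GB; videos: ~{mb_per_clip:.0f}MB per 30-sec clip")
```

Under these assumptions the photos consume only around 6GB, leaving a generous allowance per video clip, so the quoted counts fit comfortably within 32GB.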
Unlike the Stories’ dual 5MP cameras, this gen 2 design has a single 12MP ultra-wide camera with 3,024 x 4,032 pixel resolution for stills and 1,440 x 1,920 pixel video at 30 fps. An LED on the frame front glows when a video is being recorded.
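Multiplying out the quoted still resolution shows where the marketed 12MP rating comes from:

```python
# Confirm the megapixel rating implied by the quoted still resolution.
width_px, height_px = 3024, 4032
megapixels = width_px * height_px / 1_000_000
print(f"{width_px} x {height_px} = {megapixels:.1f}MP")  # rounds to the marketed 12MP
```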
Audio is played through open-ear speakers embedded in the side arms, with a directional capability to enhance hearing and reduce leakage. This audio is said to have improved bass and to be 50 percent louder overall than the Stories speakers. Voice commands are picked up by a five-microphone array in the front of the glasses.
The Meta AI assistant is based on Llama 2 tech and accesses real-time information through a Bing Search partnership. It includes photo-realistic image generation through Emu.
These water-resistant specs will be available from October 17 and can be pre-ordered on meta.com and ray-ban.com at $299 for the Wayfarer style and $329 for the Headliner. The glasses are compatible with prescription lenses. At launch, Meta AI features will be available in beta in the US only.
A free update is scheduled for 2024 to enable the smart glasses to recognize what the wearer is seeing via the onboard camera, such as a building, and provide information about it. We think augmented reality could be coming as well, with virtual objects or diagrams displayed in the glasses’ lenses.
Stratechery’s Ben Thompson writes: “I think that smart glasses are going to be an important platform for the future, not only because they’re the natural way to put holograms in the world, so we can put digital objects in our physical space, but also — if you think about it, smart glasses are the ideal form factor for you to let an AI assistant see what you’re seeing and hear what you’re hearing.” He reckons it is more natural and quicker to talk to a ChatGPT-like AI and hear an answer than to type text and read a response.
It will be interesting to see if Apple agrees.