In December 2023, Meta announced a set of AI-based updates coming to the second generation of the connected glasses it makes with EssilorLuxottica.
Now the ability to "read" the world through the American giant's artificial intelligence is rolling out as a preview to anyone who wants to try it on the Ray-Ban Meta 2. As Meta executive Andrew Bosworth wrote in a post on Threads, users can ask the multimodal AI, which runs on the glasses via the internet connection shared with the smartphone, to identify landmarks in the surrounding environment and provide relevant answers.
In this way, the glasses become a sort of tour guide for travellers.
While walking around a city, you can ask the AI which monument you are standing in front of and receive text and voice explanations, the latter played through the speakers at the ends of the glasses' two temples.
Bosworth showed a couple of example images in which the AI explains why the Golden Gate Bridge in San Francisco is orange and gives a brief description of the Coit Memorial Tower, also in that city.
The AI's responses arrive both as voice output on the glasses and in the Ray-Ban companion app on the smartphone; in the latter case, you take a photo of your surroundings and have the software analyze it.
Mark Zuckerberg used Instagram to show off the new features via some videos shot in Montana.
In those clips, the Ray-Bans gave verbal descriptions of Big Sky Mountain, including an explanation of how snow forms, and recounted the history of the Roosevelt Arch.
Meta previewed the AI innovations at last year's Connect event, as part of new "multimodal" capabilities that allow the company's models to answer questions in context and in real time.
“For those who don't have access to the beta yet, you can add yourself to the waitlist as we work to make it available to more people,” Bosworth said in the post.
Reproduction reserved © Copyright ANSA