Last year, the Ray-Ban Meta smart glasses introduced AI-powered visual search, a capability that stirred both admiration and concern. Now the latest beta adds a feature that seems particularly useful: it can identify landmarks and share details about them, effectively acting as a digital tour guide for travelers, according to Meta Chief Technology Officer Andrew Bosworth in a post on Threads.
Bosworth illustrated the feature with examples, such as the glasses explaining the Golden Gate Bridge's distinctive orange color (chosen to improve visibility in fog) and offering a brief history of San Francisco's iconic "Painted Ladies" houses. The explanations appeared as text beneath the images.
Mark Zuckerberg also demonstrated the feature in a series of Instagram videos recorded in Montana. In them, the glasses narrate information aloud about sites such as Big Sky Mountain and the Roosevelt Arch, and at one point playfully explain how snow forms in simple terms.
Meta first showed off the feature at its Connect event as part of the glasses' new "multimodal" capabilities, which let the device answer questions about the wearer's surroundings. The improvement was made possible by giving the assistant access to real-time information through Bing Search, rather than limiting it to knowledge from 2022 and earlier.
The capability works much like Google Lens, letting users visually query the AI about what they see through the glasses, whether that's identifying a piece of fruit or translating foreign-language text. For now, the feature is limited to participants in Meta's early access program, which remains fairly small. Bosworth said that anyone eager to try the beta can join a waitlist while Meta works to make it available to more people.