Meta has rolled out a significant AI upgrade to its Ray-Ban smart glasses, introducing multimodal AI capabilities that allow the glasses to see and interpret the world around the wearer. This update, initially available in beta for users in the US and Canada, marks a major step forward in wearable AI technology.
The update introduces a range of AI-powered features that let users interact with their surroundings in new ways. By saying “Hey Meta,” you can ask questions about objects in your field of view, request translations, and even get suggestions based on what the glasses’ camera sees.
The new AI assistant can identify objects, translate text, and provide detailed descriptions of your surroundings. For example, you can ask about items in a store, get help reading menus in foreign languages, or receive clothing suggestions based on items you’re looking at.
This update builds on the existing features of the Ray-Ban smart glasses, which include a 12-megapixel ultra-wide camera for photos and videos, five microphones for audio recording, and the ability to livestream directly to Facebook or Instagram.
Meta CEO Mark Zuckerberg demonstrated the new capabilities in an Instagram reel, showcasing the glasses’ ability to suggest pants that would match a shirt he was holding.
The features are powered by Meta AI, the company’s conversational assistant, optimized here for a hands-free, on-the-go experience. While the technology is impressive, early tests have shown that the AI can occasionally “hallucinate,” confidently providing inaccurate information.
Meta is also expanding its smart glasses lineup with new styles, including a cat-eye design called Skyler and a low bridge option for the Headliner frames.
As Meta continues to refine and expand the AI capabilities of its Ray-Ban smart glasses, the company is positioning itself at the forefront of wearable AI technology. The update moves Meta closer to its stated vision of seamlessly integrating AI assistants into daily life, potentially reshaping how we interact with the digital world.