Meta AI just turned Ray-Ban smart glasses into a business accessory

Meta’s Ray-Ban smart glasses just got a lot more useful, as you can now ask questions about what you’re looking at and have AI give you the answer.

A new version of these high-tech Ray-Ban sunglasses was introduced in September with improved cameras, microphones and processor, which let you do things like take a photo and post it to Instagram without taking out your phone. Now, wearers of the glasses can use generative artificial intelligence to find out information about whatever the camera is pointing at.

Or at least, they can if they’re in a beta trial of new features powered by Meta AI.

Showing off the new multimodal AI feature on Instagram, Meta CEO Mark Zuckerberg asked Meta AI to suggest a funny caption for a photo and to identify a fruit he was holding. He’s also seen picking out a shirt and asking his glasses to suggest what to wear with it.

Outfit suggestions may appeal to fashion retailers and Clueless fans, but the feature hints at enormous business potential beyond this kind of everyday use.


Related reading: Can these AR glasses replace screens?


Business case for Meta AI on Ray-Ban smart glasses

We don’t yet know the full abilities of Meta AI, but extrapolate from Zuckerberg’s examples and you can easily imagine business use cases for this combination of AI with augmented reality.

Customers could interact with adverts. Instead of identifying fruit, an engineer or contractor could identify tools or components just by looking at them. In a warehouse or factory, AI could read the labels on boxes and point out which one you need, or guide you to where you need to be.

In an office, you could ask questions about the presentation you’re watching and be discreetly given a summary of the topic in question, straight into your ear.

Or maybe you’re meeting someone. Have you met them before? Can you work out their name, what they do? All delivered to you like a private prompter.

And because they’re glasses, all this would be hands-free so you can continue to work, type or carry things with both hands.


Related reading: Does Apple’s Vision Pro have a business case?


How Meta AI works on Ray-Bans

To use the AI feature, the wearer looks at something and says: “Hey Meta, take a look at this”, then asks a question about what’s in view.

The glasses take a photo, which the AI then analyses.

The speaker in the glasses says the answer out loud, and the whole conversation is saved in text form with the photo in the Meta View phone app so you can look back on it later.

Journalists trying the new feature noted that while it’s pretty clever, the smart shades are still prone to “hallucinations”: confidently incorrect answers that undermine the reliability of this current generation of AI.

Like several other generative AI systems, Meta AI previously struggled with questions about recent events or data, as it could only access information up to December 2022. That limitation is now gone, and the AI can answer your questions with up-to-date information from Microsoft’s search engine Bing.

If you want to be part of the trial, go into the settings of the Meta View app, find the Early Access option and tap Join Early Access. 

Richard Trenholm

Richard is a former CNET writer who had a ringside seat at the very first iPhone announcement, but soon found himself steeped in the world of cinema. He's now part of a two-person content agency, Rockstar Copy, and covers technology with a cinematic angle for TechFinitive.com
