The latest Ray-Bans already offer some limited Meta AI capabilities, such as snapping a photo by voice. But now Meta is significantly expanding what its new bot can do: On Tuesday, the company announced that it's testing a new "multimodal" artificial intelligence feature that can recognize objects seen through the smart glasses, hear requests and answer relevant questions about them, from identifying foods to offering style guidance.

The key to this "early access" experience is the outward-facing camera system in the eyeglass frames. The Ray-Bans don't pack the beefy processor chips found in, say, the latest iPhones and Pixel smartphones, so they can't do onboard processing. Meta AI must send the requests and images to the company's servers for processing before the glasses can respond. This can lead to a few seconds of lag, though the engineering team is actively working to shorten the delay.

When the user speaks an inquiry or command out loud, the device captures and transmits images of what the person sees to the company’s servers, so that Meta AI can understand what the wearer is looking at and give a response relevant to the subject. It’s somewhat akin to stuffing Google Lens into a pair of Amazon’s Echo Frames.
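To make that flow concrete, here is a minimal sketch of the round trip in Python. Everything in it is an assumption for illustration: Meta has not published an API for the glasses, so the endpoint URL, the payload fields and the camera helper are all invented.

```python
import requests

# Hypothetical endpoint; Meta has not published a public API for the glasses.
ASSISTANT_ENDPOINT = "https://example.com/meta-ai/multimodal"

def handle_look_command(transcript: str, camera) -> str:
    """Offload a 'Hey Meta, look and ...' request to the server, as the glasses do."""
    # 1. The outward-facing camera captures what the wearer is looking at.
    #    `camera.capture_jpeg()` is an invented helper for this sketch.
    image_bytes = camera.capture_jpeg()

    # 2. The spoken request and the image travel to the company's servers,
    #    since the frames lack the silicon to run the model on-device.
    response = requests.post(
        ASSISTANT_ENDPOINT,
        files={"image": ("frame.jpg", image_bytes, "image/jpeg")},
        data={"query": transcript},
        timeout=10,  # this round trip is where the few seconds of lag come from
    )
    response.raise_for_status()

    # 3. The answer comes back for text-to-speech playback through the glasses.
    return response.json()["answer"]
```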

In a demo for WWD, the glasses were pointed at a multi-colored patterned top, and the bot was asked, "Hey Meta, look and tell me what goes with this top." Meta AI suggested dark pants to set off the print. For black leather ankle boots with studs, the tech recommended pairing them with jeans. But it can't specify a particular pair of jeans or search for shops that sell those particular boots, as Google Lens can. At least not yet.

“Sometimes, you’ll hear that in an error message; when you try that in early access, you’ll hear, ‘I’m sorry, I can’t answer product questions, but I’m working on getting that capability soon’,” Anant Narayanan, Meta’s engineering director of smart glasses, told WWD. “We know that there’s more work specifically that we have to do to get this right. But for now, in the early access stage, it’s more generic ‘what goes well with that’–type questions.”

According to Narayanan, the wait may not be long, as the team aims to deliver product-driven abilities next year. This should surprise no one. Meta AI has been spreading across the company's device and social media portfolio, and social commerce is on track to become a billion-dollar business: Statista estimates that 2023 global sales across Instagram, Facebook and other platforms will total roughly $1.3 billion by year's end. That could balloon further in 2024, thanks to Meta's latest partnership with Amazon. The e-commerce giant has been striking deals to more directly integrate its shopping platform with social networks.

In the meantime, testers have a variety of Meta AI inquiries and commands to check out. All initial trigger phrases begin the same way, with "Hey Meta, look and …," as in "Hey Meta, look and tell me what recipes I can make with these ingredients" or "Hey Meta, look and summarize this article for me." Once activated, the bot remembers context, so it can understand shorter follow-up commands without the need to repeat "look" each time. It can also respond to inquiries when the user points to a specific word or phrase in a text-based document, restaurant menu or WWD article, and offer language translations on the go.
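Functionally, that behavior resembles a session object that holds onto the conversation history and the most recent camera frame, so follow-ups are interpreted in context. The Python below is a purely illustrative sketch; every name in it is invented, not taken from Meta's software.

```python
class AssistantSession:
    """Illustrative model of a 'Hey Meta, look and ...' conversation."""

    TRIGGER = "hey meta, look and"

    def __init__(self):
        self.history = []        # prior turns, kept so follow-ups stay in context
        self.last_image = None   # the most recently captured camera frame

    def ask(self, utterance: str, camera) -> str:
        text = utterance.lower().strip()
        if text.startswith(self.TRIGGER):
            # A fresh "look" command grabs a new frame from the camera.
            self.last_image = camera.capture_jpeg()
            text = text[len(self.TRIGGER):].strip()
        # Follow-ups reuse the stored frame and history, so "look" isn't repeated.
        answer = query_server(text, self.last_image, self.history)
        self.history.append((text, answer))
        return answer

    def clear(self):
        # Erasing the history also discards stale context, which is what
        # resolved the dragonfruit mix-up described below.
        self.history.clear()
        self.last_image = None

def query_server(text, image, history):
    """Hypothetical stand-in for the server round trip sketched earlier."""
    raise NotImplementedError
```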

In the demo, the bot identified food products, recommended recipes that use particular ingredients, read restaurant menus to highlight spicy or vegetarian dishes and even created a humorous description of artwork installed in the lobby on demand. Some commands worked better than others. In one instance, Meta AI mistook a dragonfruit for a pomegranate, likely because of the bot's tendency to hold onto context. The quirk vanished once the history was erased.

The user’s ability to delete the AI’s history and images is part of the company’s stated privacy push — which makes sense, considering the longstanding privacy criticisms and legal complications that have dogged Meta. It’s also a priority in how the AI was developed. The tool was trained on a blend of data specifically collected by Meta or culled from its family of apps, but, in the latter case, only when users granted permission for their data to be used this way, Narayanan told WWD.

Naturally, as a test feature, the new Meta AI functionality isn't perfect. But it's still exciting for the developers behind the scenes. They're eager to see how testers use the bot and are parsing feedback for improvements. This user input comes in a very Facebook-like form, with a thumbs-up or -down rating for each interaction, saved in the Meta View app.
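As a rough picture of what one saved, rated exchange might hold, here is an illustrative record; the field names are guesses for this sketch, not Meta's actual schema.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class InteractionFeedback:
    """One saved exchange in the companion app, with the tester's rating."""
    query: str          # e.g. "Hey Meta, look and tell me what goes with this top"
    answer: str         # the assistant's reply
    thumbs_up: bool     # the Facebook-style up-or-down rating
    timestamp: datetime

feedback = InteractionFeedback(
    query="Hey Meta, look and tell me what goes with this top",
    answer="Dark pants would set off the print.",
    thumbs_up=True,
    timestamp=datetime.now(timezone.utc),
)
```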

Meta AI was first introduced at Meta Connect in September, and the bot has since spread to the Quest 3 headset and the Instagram, WhatsApp and Messenger apps, as well as the newest Ray-Ban glasses. Along the way, Mark Zuckerberg, chief executive officer, has been giving the public glimpses of what's in store. In an October Instagram post, "Zuck" marveled at his newfound ability to braid his daughter's hair, sharing a video captioned, "Finally learned to braid. Thanks, Meta AI."

Early access to this latest feature will open to testers selected from a pool of U.S. users who register their interest online. Andrew “Boz” Bosworth, Meta’s chief technology officer, shared the news on Instagram and other social media on Tuesday: “We’re testing multimodal AI in beta on Ray-Ban Meta glasses via an opt-in early access program (US only). It’s early but I’m excited at how this will enable Meta AI to be increasingly useful especially in a glasses form factor.” Meta is also rolling out the ability for Ray-Ban users in the U.S. to ask Meta AI for real-time information, with searches powered in part by Bing.

Zuckerberg’s video Instagram post highlighted the styling aspect. The CEO, presumably wearing the glasses, held up a brown shirt with multi-color stripes and said, “Hey Meta, look and tell me what pants to wear with this shirt.”

Meta AI responded, “Based on the image, it appears to be a striped shirt. A pair of dark-washed jeans or solid-color trousers would complement this shirt well.”