
Meta’s Ray-Ban Smart Glasses Can Now Offer Fashion Advice

By Elliefrost @adikt_blog


The latest Ray-Bans already offer some limited Meta AI capabilities, such as taking a photo by voice command. But now Meta is significantly expanding what its bot can do: On Tuesday, the company announced it is testing a new "multimodal" artificial intelligence feature that can recognize objects seen through the smart glasses and answer relevant questions about them, from identifying foods to offering style advice.

The key to this early access experience is the outward-facing camera system in the glasses' frames. The Ray-Bans don't contain the powerful processor chips found in the latest iPhones and Pixel smartphones, so they can't do any onboard computation. Meta AI must send the requests and images to the company's servers for processing before the glasses can respond. This can cause a delay of several seconds, although the technical team is actively working to reduce it.


When the user speaks a question or command out loud, the device captures images of what the person sees and sends them to the company's servers, so Meta AI can understand what the wearer is looking at and provide a response relevant to the subject. It's somewhat like cramming Google Lens into a pair of Amazon's Echo Frames.
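The round trip described above — spoken command, captured frame, remote inference, spoken reply — can be sketched roughly as follows. This is purely an illustrative mock-up, assuming a generic image-upload flow; Meta has not published a client API, and every name and data shape here is hypothetical:

```python
import base64
from dataclasses import dataclass

# Hypothetical sketch of the glasses-to-server flow; not Meta's actual API.

@dataclass
class GlassesRequest:
    prompt: str          # the spoken command, transcribed on-device
    image_bytes: bytes   # frame captured by the outward-facing camera

def build_payload(req: GlassesRequest) -> dict:
    """Package the prompt and image for server-side processing.
    The glasses lack onboard inference hardware, so everything heavy
    happens remotely; the image travels as base64 text."""
    return {
        "prompt": req.prompt,
        "image_b64": base64.b64encode(req.image_bytes).decode("ascii"),
    }

def mock_server_inference(payload: dict) -> str:
    """Stand-in for the remote multimodal model: decodes the image and
    answers the prompt. A real round trip would add the multi-second
    delay the article describes."""
    image = base64.b64decode(payload["image_b64"])
    subject = "a striped shirt" if image else "nothing"
    return f"Based on the image, it appears to be {subject}."

request = GlassesRequest(
    prompt="Hey Meta, look and tell me what pants to wear with this shirt.",
    image_bytes=b"\x89PNG...fake-frame-data",
)
reply = mock_server_inference(build_payload(request))
print(reply)  # Based on the image, it appears to be a striped shirt.
```

The point of the sketch is the division of labor: the device only captures and transmits, while recognition and language generation run server-side, which is where the latency comes from.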

In a demo for WWD, the glasses were pointed at a multicolored patterned top and the bot was asked, "Hey Meta, take a look and tell me what goes with this top." Meta AI suggested dark pants to accentuate the print. For black leather ankle boots with studs, the tech recommended pairing them with jeans. But it can't name a specific pair of jeans or search for stores that sell that exact boot, as Google Lens can. At least not yet.

"Sometimes you'll hear that in an error message; if you try that in early access, you'll hear, 'I'm sorry, I can't answer product questions, but I'm working on getting that capability soon,'" Anant Narayanan, technical director of Meta smart glasses, told WWD. "We know there is specific work we still need to do to get this right. But for now, in the early access phase, these are more general 'what goes well with this' types of questions."


According to Narayanan, the wait may not be long: the team aims to deliver product-driven capabilities next year. This shouldn't surprise anyone. Meta AI has spread across the company's device and social media portfolio, and social commerce is on track to become a billion-dollar business this year. Statista estimates that global sales on Instagram, Facebook and other platforms will reach approximately $1.3 billion by the end of 2023. That could rise further in 2024, thanks to Meta's latest partnership with Amazon. The e-commerce giant has struck deals to integrate its shopping platform more directly with social networks.


In the meantime, testers have a variety of Meta AI questions and commands to check out. All initial trigger phrases start the same way, with "Hey Meta, look and..." - as in "Hey Meta, look and tell me what recipes I can make with these ingredients" or "Hey Meta, look and summarize this article for me." Once activated, the bot remembers context, so it can understand shorter follow-up commands without the user having to repeat "look" each time. It can also respond to queries about a specific word or phrase in a text document, restaurant menu or WWD article, offering translations along the way.

In the demo, the bot identified food items, recommended recipes that use certain ingredients, read restaurant menus to highlight spicy or vegetarian dishes, and even created a humorous description of artwork installed in the lobby upon request. Some commands worked better than others. In one case, Meta AI mistook a dragon fruit for a pomegranate, likely because of the context the bot had retained. The quirk disappeared once the history was cleared.

The user's ability to delete the AI's history and images is part of the company's privacy push, which makes sense given the longstanding privacy criticism and legal complications that have dogged Meta. Privacy was also a priority in how the AI was developed. The tool is trained on a combination of data specifically collected by Meta or pulled from its family of apps, but in the latter case only when users have given permission for their data to be used this way, Narayanan told WWD.

As a testing feature, the new Meta AI functionality is obviously not perfect. But it's still exciting for the developers behind the scenes. They are eager to see how testers use the bot and analyze feedback for improvements. This user input comes in a very Facebook-like form, with a thumbs up or down rating system for each interaction, as stored in the Meta View app.

Meta AI was first introduced at Meta Connect in September, and the bot has since spread to the Quest 3 headset and the Instagram, WhatsApp and Messenger apps, as well as the latest Ray-Ban glasses. Along the way, CEO Mark Zuckerberg has given the public a glimpse of what lies ahead. In an October Instagram post, "Zuck" marveled at his newfound ability to braid his daughter's hair, with a video captioned, "Finally learned to braid. Thank you, Meta AI."

Early access to this latest feature is open to testers selected from a group of US users who register their interest online. Meta Chief Technology Officer Andrew "Boz" Bosworth shared the news on Instagram and other social media on Tuesday: "We are beta testing multimodal AI on Ray-Ban Meta glasses through an opt-in early access program (only in the United States). It's early days, but I'm excited about how this will make Meta AI increasingly useful, especially in the form of glasses." Meta is also introducing the ability for Ray-Ban users in the US to ask Meta AI for real-time information, with searches powered in part by Bing.

Zuckerberg's video Instagram post highlighted the styling aspect. The CEO, who was believed to be wearing the glasses, held up a brown shirt with multicolored stripes and said, "Hey Meta, look and tell me what pants to wear with this shirt."

Meta AI responded: "Based on the image, it appears to be a striped shirt. Dark washed jeans or plain trousers would complement this shirt well."

