Meta’s Ray-Ban AI Sunglasses

The Ray-Ban Meta sunglasses are the first artificial-intelligence wearable success story. 

The onboard AI agent can answer questions and even identify what you’re looking at using the embedded cameras. People also love using voice commands to capture photos and videos of whatever is right in front of them without whipping out their phone. Oh, and they’ve got that chic Ray-Ban styling.

And now Meta’s smart glasses are getting even more AI-powered voice features. Meta CEO Mark Zuckerberg announced the newest updates to the smart glasses’ software at his company’s recent Meta Connect event.

Speaking at Connect, Zuckerberg said: “So, they’re great glasses. We keep updating the software and building out the ecosystem, and they keep on getting smarter and capable of more things.”

You can already ask Meta AI a question and hear its responses directly from the speakers embedded in the frames’ temple pieces. Now there are a few new things you can ask or command it to do.

One of the most impressive features is the ability to set reminders. You can look at something while wearing the glasses and say, “Hey, remind me to buy this book next week,” and the glasses will understand what the book is, then set a reminder. In a week’s time, Meta AI will tell you it’s time to buy that book.

Meta notes that live translation is coming to the glasses soon, meaning people encountering other languages could hear speech translated in the moment through the frames’ speakers.

Meta is also adding new frame and lens colours, and customers now have the option of transition lenses that darken or lighten depending on the level of sunlight.

Meta hasn’t said exactly when these additional AI features will arrive on its Ray-Bans, only that they’ll land sometime this year. And with just two months of 2024 left, that means very soon.