Ray-Ban Meta smart glasses just got three big upgrades — here are the new features


We were rather impressed with the 2nd gen Ray-Ban Meta smart glasses when we reviewed them, and thanks to Meta AI they've gotten steadily better over time. Even something as mundane as grocery shopping can be transformed by the AI if you're wearing them.

But the latest update is the most significant yet. Three new features, teased at Meta Connect back in September, have arrived at once, making the glasses all the more useful. However, two of them are only available to those on the Early Access program, and all three are limited to the U.S. and Canada for now.

As detailed on the Meta blog, the three upgrades arrive with the v11 software update. The one available outside of Early Access is Shazam integration.

If you hear a track you'd like to hear more of amid all the Christmas muzak currently on repeat, you can simply say "Hey Meta, what is this song?" and the glasses will use their microphones to listen and identify it for you to stream at your leisure. Again, this is only available in North America, so if you're elsewhere, you'll just have to rely on the Android and iOS apps.

Then there are the two extras that require you to be a member of the Early Access program. Live AI is the first of these, adding video to Meta AI on your glasses. When activated, Meta AI can "see" what you're looking at and converse naturally about what's going on before your very eyes.

Meta thinks this will be hugely useful for activities when your hands are busy (think cooking or gardening), or just when out and about. You'll be able to ask Meta AI questions about what you're looking at — for example, how to make a meal out of the ingredients in front of you. There will be battery drain, though, with Meta suggesting you'll get around half an hour of Live AI use on a full charge.

Still, it’s an exciting development, and one that Meta teases will improve over time: “Eventually live AI will, at the right moment, give useful suggestions even before you ask,” the post reads.


Live translation


Finally, and to me the most exciting, is live translation, which promises to let you understand foreign languages without ever attempting to learn them. When enabled, if someone is talking to (or at) you in French, Italian or Spanish, you'll get a real-time translation through the glasses' open-ear speakers or as text on your phone.

We've seen this kind of thing done before, of course. The first-generation Pixel Buds attempted it seven years ago, and Samsung's Galaxy AI does something similar with live phone calls. But this feels a bit more natural than either, given that Meta's Ray-Bans are designed to keep the tech largely invisible.

Again, these last two AI features require you to be part of the Early Access program. If you're in North America, you can sign up here, though the process is described as joining a waitlist, implying acceptance isn't guaranteed.

Still, these will roll out to all users eventually, and Meta has hinted that more will be coming soon. “We’ll be back with more software updates—and maybe some surprises—in 2025,” the post concludes, cryptically.


Freelance contributor Alan has been writing about tech for over a decade, covering phones, drones and everything in between. Previously Deputy Editor of tech site Alphr, his words are found all over the web and in the occasional magazine too. When not weighing up the pros and cons of the latest smartwatch, you'll probably find him tackling his ever-growing games backlog. Or, more likely, playing Spelunky for the millionth time.
