Imagine striking up a conversation in a bustling French café or navigating a street market in Mexico City without knowing a single word of the local language. Thanks to a new update from Meta AI, you can now use your Ray-Ban smart glasses as a live translator.
The update does a lot to make the experience feel seamless, though there are still some hold-ups. The biggest is that, for the smoothest experience, the person you’re talking to will either need their own pair of Meta’s Ray-Ban smart glasses or have the Meta app open to receive the translation.
The live translator included in the new update currently supports only four languages: English, Spanish, French, and Italian. To start using it, you just ask Meta to start translating, and the AI takes over from there.
You can also download the languages ahead of time, so the feature still works when you’re somewhere without service. Despite the limitations, though, the convenience of having a live translator built directly into everyday eyewear can’t be overstated.
Meta’s smart glasses aren’t stopping at language, either. The company is doing everything it can to make its AI-powered Ray-Bans as enticing as possible by expanding features across the board. These new features include music streaming options, access to Instagram messaging, and object recognition.
Sure, the smart glasses’ live translator still has room to grow. But there’s no denying it will make communication easier by removing one of the biggest barriers to traveling in countries where you don’t speak the language well enough to get by without help.
A previous update also improved the vision capabilities and other functionality on the glasses.