Real-Time Translation with Ray-Ban Meta Glasses: Innovation or Hype?
Arthur "Art" Fridrich
Founding Partner | IT Management, Project Management, Technology Design
When I initially purchased the Ray-Ban Meta glasses, my primary goal was to capture photos and live-stream to Facebook during my travels. Battery life proved average, and the need to recharge through the case limited prolonged use. Despite this, I occasionally used the glasses for podcasts and music.
The introduction of live translation caught my attention, as it could prove invaluable when navigating the Latino community in Richmond with my limited Spanish skills. As soon as this feature became available, I jumped at the chance to test it, exploring both real-time conversation translation and text translation from articles on my phone. Here’s what I discovered.
Setup Experience
Upon release, I eagerly attempted to activate the live translation feature, but it initially failed. Meta's initial support response wasn’t particularly helpful. After several attempts, I turned to online resources, eventually locating the correct setup instructions.
The setup involved selecting Spanish-to-English translation from the four available languages (English, Spanish, Italian, and French). This required navigating to the "Translate speech in real time" option within the "Learn and Explore" section, scrolling to the second screen, clicking "Continue," and then selecting "Try it Now." While the process worked for adding Spanish, subsequent attempts to add additional languages produced the same error messages I had encountered initially.
Setup Grade: C-
Invoking Live Translation
Live Translation can be initiated through the Meta app or by voice command.
Via the app, you revisit the same setup process, but upon clicking "Continue," it directly launches the Live Translation screen, where you select "Start." Alternatively, the voice command “Hey Meta, Live Translation” initiates the feature hands-free.
Invocation Grade: B+
Translation Speed and Accuracy
I tested translation speed and accuracy in two scenarios: a conversation with my wife and a Telemundo news video on YouTube.
In conversation, the translation was quick and relatively accurate, although my wife spoke slower and in shorter sentences than she typically would with native speakers. The video test showed slower initial translations, but performance improved over time with reduced lag.
Translations appeared visually on the app in text blocks, first displaying the original spoken words, followed by the translation. A moment later, the translation was delivered audibly through the glasses. While generally accurate, occasional errors occurred.
Translation Performance Grade: B+
Visual Translation
I also tested the visual translation feature using the "Ask Meta AI about what you see" option on articles and headlines from Spanish newspapers. Instead of providing an exact translation of the text, the glasses offered a summary of the content, which limited the feature's usefulness for reading.
Final Thoughts
For an early-access feature, Meta's Live Translation tool is promising and enhances communication between speakers of the four supported languages. With a few improvements, it could become an essential feature of the Meta glasses.
Suggested Enhancements:

Based on my testing, the improvements I would most like to see are expanded language support beyond the current four, more reliable setup when adding languages, improved translation accuracy, and exact text translation (rather than summaries) for the visual translation feature. I look forward to seeing how Meta refines this tool.
Overall Grade: B