Skift Take
We haven't been able to test Meta's live translation, so we can't say how seamless it will be. But development of these products is moving fast.
The Ray-Ban Meta AI-powered glasses are getting a live voice translation feature.
Mark Zuckerberg, CEO and founder of Meta, demonstrated the new feature and a slew of other updates during the keynote speech of the Meta Connect developer conference, streamed from the company’s main campus in California.
When talking to someone who speaks Spanish, French, or Italian, a user wearing the Ray-Bans should be able to hear a real-time English translation through a speaker in the glasses, he said. When the user replies in English, a mic in the glasses picks up the voice and sends it to the user’s connected mobile app, which speaks the translation aloud in the other language for the other person to hear.
Meta confirmed to Skift that the glasses will only support translations between English and one of those three languages, in either direction: English-Spanish, Spanish-English, and so on, as long as one of the languages involved is English. It won’t yet support combinations that leave out English, such as Spanish-French.
If two people are both wearing the glasses, say an English speaker and a Spanish speaker, each would hear a translation in their own language through the built-in speaker.
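To make the routing Meta described concrete, here is a minimal sketch in Python of the language-pair restriction and the audio routing. Everything in it (the function names, language codes, and device labels) is illustrative only, based on the company’s description; it is not Meta’s actual implementation or API.

```python
# Hypothetical sketch only: these names and rules are inferred from
# Meta's public description, not from any real Meta API.

SUPPORTED = {"es", "fr", "it"}  # Spanish, French, Italian

def pair_supported(lang_a: str, lang_b: str) -> bool:
    """Per Meta, every pair must be English plus one supported language."""
    pair = {lang_a, lang_b}
    return "en" in pair and len(pair) == 2 and pair <= {"en"} | SUPPORTED

def output_device(listener_is_wearer: bool) -> str:
    """Translations for the wearer play on the glasses' speaker;
    translations for the other person play from the connected phone app."""
    return "glasses speaker" if listener_is_wearer else "phone app"

# Example: an English-speaking wearer talking with a Spanish speaker.
assert pair_supported("en", "es")      # supported pair
assert not pair_supported("es", "fr")  # no non-English pairs yet
print(output_device(True))   # wearer hears English via the glasses speaker
print(output_device(False))  # other person hears Spanish via the phone app
```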
The implications for travel are obvious, and a press release from Meta about the upgrade said as much: “Not only is this great for traveling, it should help break down language barriers and bring people closer together.”
Meta did not say exactly when the update is coming, only that it would be soon. The company plans to add more languages in the future.
Earlier this year, Meta said it was integrating voice-activated AI into the glasses, enabling them to translate menus and answer questions about landmarks seen through the lenses.
The Ray-Bans are available starting at $299.
Live Demo Between Spanish and English
Zuckerberg demonstrated the new translation feature with Brandon Moreno, former two-time UFC Flyweight Champion. Moreno spoke in Spanish to Zuckerberg, and Zuckerberg responded in English to Moreno.
The demo was brief, but the chatbot was able to translate in real time despite some slang and pauses.
OpenAI earlier this year demonstrated an upgrade for live voice translation through its app, which would understand non-verbal cues like exhalation and tone of voice. The feature has not become widely available yet.
Upgraded AI Model
The new features for the glasses are powered by Meta’s latest AI model, Llama 3.2, revealed during the presentation on Wednesday.
The Meta AI chatbot, accessible through WhatsApp, Instagram, Messenger, and Facebook, is getting an upgrade with the new model, which includes “multi-modal” capabilities: users can interact with the chatbot using text, photos, and speech.
Another new AI translation feature: Automatic video dubbing for Reels, starting with English and Spanish, with automatic lip syncing so it appears as if the subject is speaking the language.