With its latest update, Meta has enhanced its Ray-Ban glasses, positioning them among the smartest glasses on the market by adding features like live AI, translation, and Shazam.
The live AI feature works similarly to Google’s Project Astra, which is powered by Gemini 2.0. Shazam support is available to all users in the US and Canada, while the live AI and translation features are limited to members of Meta’s early access program, as reported by Silicon UK.
The glasses can answer questions in real time. They also support real-time translation between languages including English, Spanish, French, and Italian. This means that when the other party is speaking one of those languages, the wearer hears it in English if they are speaking English, and vice versa. This will be especially helpful for travelers, for whom language can be a barrier.
The smart glasses can now also recognize music playing in the background, thanks to the Shazam feature. All the user needs to do is ask, “Hey Meta, what is this song?” and Meta will come back with the answer in just a few seconds.
This is a great feature for people who love discovering local music. It is also more immediate than pulling out a phone and asking Shazam the same question, since the glasses typically stay on the wearer’s face.