Ray-Ban Meta smart glasses recently received two AI-driven features: Live AI, which enables real-time video processing, and Live Translation, which offers instant speech translation. CEO Mark Zuckerberg demonstrated the new capabilities at Connect 2024, and they are currently available to Early Access Programme users in the US and Canada.
Key features of the v11 software update
The latest v11 software update for Ray-Ban Meta smart glasses introduces:
1. Live AI for enhanced interaction
- Real-time visual processing: Live AI allows the glasses’ cameras to stream a continuous video feed to Meta AI, providing instant visual analysis of the user’s surroundings.
- Seamless conversations: Users can interact naturally with Meta AI, ask follow-up questions, revisit previous topics, and even switch between subjects smoothly.
- Proactive suggestions: Future updates may enable Live AI to offer suggestions without prompts, creating an intuitive user experience.
2. Live Translation for multilingual conversations
- Instant speech translation: Live Translation supports real-time translation of conversations between English and Spanish, French, or Italian.
- Audio and visual outputs: Translations are delivered audibly through the glasses’ open-ear speakers, with an option to view transcripts on the connected smartphone.
Limitations and future rollout
Meta has emphasized that these features are in their early stages and may not always deliver perfect results. Feedback from users will play a key role in refining the technology.
Global availability awaited
- While these features are currently limited to select markets, Meta has not disclosed a timeline for a global rollout. Notably, Indian users are yet to access any of these AI-driven enhancements.
- Meta’s continuous upgrades aim to redefine the smart glasses segment, blending AI-powered features with real-world practicality.