In this week’s episode of the Untangling Web3 podcast, we dive into Alec’s hands-on review of the new Meta AI Glasses v2 — Meta’s latest step toward merging AI and AR into everyday tech.
Fresh from attending Meta’s launch event in California, Alec shares his firsthand impressions of how these next-generation smart glasses blend voice-controlled AI, contextual awareness, and seamless design to create what Mark Zuckerberg calls “the next mobile interface.”
From hands-free recording to real-time translation, this episode explores where these glasses succeed — and where they still fall short.
Key Points Discussed:
- First Impressions and Use Cases: Alec shares his early experiences using the Meta AI Glasses, highlighting their intuitive voice commands, sleek design, and surprisingly good photo and video quality. While the AI assistant can perform simple contextual tasks — like identifying objects or fetching quick answers — it still falls short of full autonomy. The glasses excel in recording and hands-free content creation, making them ideal for creators, travelers, and anyone wanting to stay present without constantly using their phone.
- AR, Audio, and Intelligent Context: The hosts explore how Meta’s AI and AR features bring subtle layers of digital intelligence into the real world. Built-in microphones and bone-conduction speakers allow for private listening, live translation, and contextual responses from Meta’s Llama AI model. Alec demonstrates how the glasses can “see” and describe surroundings, respond to voice commands, and even play music, though full contextual awareness and live streaming of visual data are still in development.
- Meta’s Vision and the Future of Wearable Tech: Beyond the gadget itself, Alec and Jack unpack Zuckerberg’s long-term vision for a post-smartphone future powered by wearable AI. They discuss how Meta’s hardware strategy of combining smart glasses, neural input devices, and the company’s AI ecosystem could one day replace traditional screens. With an SDK for third-party developers on the way, apps for note-taking, live translation, and real-time guidance could transform how we interact with both technology and the world around us.
The Meta AI Glasses v2 are a bold step toward the fusion of AI, AR, and everyday life — a glimpse into the future of personal computing.
While the current generation remains limited in scope, its design, integration, and potential applications make it one of the most exciting tech innovations of the year.
--
This episode is sponsored by the VeChain Foundation. Learn more about VeBetterDAO here:
https://vebetterdao.org/
--
Learn more about Web3 at:
https://untanglingweb3.com/
--
Untangling Web3 is brought to you by hosts Jack Davies and Alec Burns, with music by Daniel Paigge. Got a question or topic suggestion? Send us an email at theuntanglingweb3podcast@gmail.com.
Love what you're hearing? Show your support by becoming a subscriber and don't forget to leave us a stellar review.
The views we express here are our own and do not represent the views of our employers. Nothing discussed or stated in the show should be considered advice.