Zuckerberg on the New Ray-Ban: Will Glasses Become the Next-Generation Computing Platform? The Neural Band Is the Truly "Crazy" Part
Glasses and AI Are Meant to Come Together
At 8 a.m. Beijing time on September 18th, at Meta Connect 2025, Meta's biggest annual event, Mark Zuckerberg unveiled three new pairs of smart glasses and a neural wristband.
The most eye-catching is the Ray-Ban Display, smart glasses jointly launched by Meta and Ray-Ban that offer map navigation, real-time translated subtitles, photography, and voice calls. The glasses pair with the Meta Neural Band, a wristband that lets users control the display through subtle hand movements.
According to the press conference, the starting price of these glasses is $799 (approximately RMB 5,683) and they will be launched in the United States on September 30th.
●Meta's official website
However, despite the grand vision, the live demonstrations kept going wrong. First, a presenter wearing the glasses tried to make Korean steak sauce under the guidance of the AI assistant. When he repeatedly asked "What should I do first?", the AI gave irrelevant answers and never provided the right one, drawing laughter from the audience and forcing the demo to be cut short.
●Screenshot of the live broadcast of Meta Connect 2025
Zuckerberg quickly stepped in to smooth things over, blaming the network signal: "The irony of the whole thing is that you spend years developing technology, and then one day, WiFi gets you."
Second, during a video-call demonstration, Zuckerberg, wearing the Neural Band, tried repeatedly to answer a video call from Meta's CTO using gesture controls, but failed. The CTO had to hurry onstage to salvage the situation, likewise saying that "the WiFi is terrible."
But even with the on-stage glitches, the industry's assessment of the Ray-Ban Display has been largely positive. Although The Verge expressed some "regret," its final verdict was: "Meta's new smart glasses are the best I've ever tried."
Zuckerberg said that glasses are "the ideal form for AI." In the future, AI will no longer be a tool that needs to be actively launched, but an ever-present intelligent companion that provides help at critical moments.
After the conference, he also gave an exclusive interview to TBPN, discussing further the future of smart glasses, the rollout of AI, and Meta's ultimate vision. The following transcript was compiled and translated by the Future Human Laboratory.
●Screenshot of the live broadcast of Meta Connect 2025
Host: Let's talk about the recent announcements at the Connect conference. How do you see future cooperation with developers? I can imagine a lot of cool ideas for the Ray-Ban Display smart glasses, but operating on such a small device comes with huge constraints. What will this look like over the next few years?
Zuckerberg: I think there are two interesting platforms here. One is the display glasses, and the other is the Neural Band. Both could evolve into independent, important platforms in the future.
Let's start with the glasses. For example, in terms of navigation, we will first work with some partners to solidify and polish the most commonly used application scenarios and truly integrate them into the user experience. Then, over time, we hope to gradually open up the ecosystem, but the specific approach still needs to be explored.
As for the Neural Band, we originally designed it as an input device for the glasses; that was the initial purpose. But it doesn't necessarily have to serve only the glasses, and that is a direction worth exploring further. Imagine using it while watching TV at home. That would be really cool. So the future direction remains open.
The glasses will attract most of the attention, but the Neural Band is the truly "crazy" part. I can't wait for everyone to try it. Just think: you'll be able to buy one in a few weeks. That's insane.
●Screenshot of the TBPN interview
Host: Let's talk about your team's foresight at the intersection of "glasses + AI." Today it seems like an almost natural combination, like an always-on AI, but not long ago many people thought the two belonged to completely different technological paths.
Zuckerberg: Yes, every important new technology needs a new type of device to carry it in order to deliver a first-class experience. I think glasses have three unique advantages that make them the best candidate for the next major computing platform.
First, glasses can help maintain the "sense of presence" between people. The moment you take out your phone, you're actually disengaged, but glasses can keep that sense of presence intact.
Second, glasses are the most suitable form for AI. They are the only devices that let AI "see what you see, hear what you hear, and interact with you throughout the day." Soon they will even be able to generate a UI in your field of vision in real time.
Third, glasses are the only form that can combine the physical world around you with realistic holographic images.
There is a strange contradiction in today's world: we have an extremely rich digital world online, but we mainly access it through a 5-inch screen. It's only a matter of time before the two are fully integrated, and glasses are the key carrier.
This has actually been our plan all along. As early as 2014, after we went public and started making profits, we launched these long-term exploration projects. At that time, I founded FAIR (Facebook AI Research), which is also the predecessor of Reality Labs. The technological paths of glasses and AI are actually meant to come together.
●Meta's official website
Host: Let's talk about the long-term vision of "personal superintelligence." For example, could I use the glasses to take a picture of your watch, say "this would make a good gift for my partner," and have the system find it, place the order, and arrange delivery?
Zuckerberg: Well, I think what "personal superintelligence + glasses" ultimately aims to achieve is what I mentioned before: the "live AI" vision.
Now, when you wear the glasses, you can summon Meta AI by saying "Hey Meta," or call it up with a "G gesture" on the Neural Band, and ask it questions. In the future, this AI will be always on, though you can still control it and turn it off at any time. It will be an agentic AI with context awareness, always knowing what you're talking about and proactively helping you find information.
For example, if something you need to know comes up in a conversation, it will automatically look it up for you and pop up the result in the corner of your vision. If there is something you should be reminded of after the conversation, it will handle it first and then give you feedback. This form of AI, which has context, can act autonomously, and seamlessly bring back the results to you, will be very powerful.
Host: Just like when you're having a conversation, it can sense that you're struggling to think of a word and prompt it directly.
Zuckerberg: Right. It's like a thought experiment I keep running: in every conversation I think, "If only I could know certain information right away." The most annoying thing is having to stop the conversation, ask someone else, and then come back to it. In the future, you can simply send a message with the Neural Band and get an answer immediately without interrupting the conversation. It's seamless multitasking.
●Meta's official website
Host: Yes, I think we can already see some prototypes now. After people discover a product, they'll see it on Instagram, search in the comments, or ask their favorite creators what they think. It makes perfect sense for the superintelligence to gather all this information for you in the future.
Now, let's talk about the future of virtual reality. In the long run, how many pairs of glasses will people own? On one hand, we want to condense everything into one pair of glasses, but on the other hand, humans love variety, just like no one wants to wear the same watch every day.
Zuckerberg: I think people will indeed have a lot of immersive interactive experiences with VR headsets in the future. But a more apt analogy is that augmented reality (AR) is the future of the mobile phone, carried everywhere and mobile-first, while virtual reality (VR) is the future of the television, more focused on immersion.
In fact, Americans spend about as much time on TV as on their phones each day. They're just different use cases: one is more immersive and interactive, the other more portable. Both will be important. The difference in experience ultimately comes down to computing power.
AR glasses are small, so they can only fit a limited battery and compute unit, and you can't walk around with a cable attached; VR headsets have more room to pack in computing power. It's like the difference between mobile games and console/PC games today: you can play all kinds of games and watch videos on your phone, but for the most immersive experience you need a dedicated device. The same will be true here.
Host: Imagine you're on your way to work, wearing the Meta Ray-Ban Display to read emails. Then you enter the office, sit down for a meeting, and start your day.
Zuckerberg: Haha, thank you all.
Editor | Du Xueying, Ba Rui
Cover image source | X
This article is from the WeChat official account "Future Human Laboratory." The authors are Du Xueying and Ba Rui. It is published by 36Kr with permission.