
I tried Meta's AI glasses with a screen: for the first time, I felt AI glasses have a future.

ifanr, 2025-09-18 19:22
Meta Ray-Ban Display: Waveguide Screen + AI, Potentially a Turning Point for Smart Glasses

Wow, the feeling of wearing this pair of glasses is so sci-fi!

This was my first reaction after putting on the Meta Ray-Ban Display. To be honest, after seeing the performance of last year's Meta Orion prototype, my expectations weren't particularly high. But when I actually wore the Display on my head, I suddenly felt that there's hope for this industry.

At the Meta Connect 2025 keynote, these $799 "smart glasses with a screen" finally made their debut. They are Meta's first smart glasses with a display: through a color waveguide HUD in front of the right eye and the companion Neural Band wristband, the concept of "AI on your face" has become a reality.

Similar to the Orion prototype, the Display uses color waveguide display technology. This approach, which guides light through the lens by repeated internal reflection and couples the image out in front of the eye, is currently the mainstream display method for AI glasses. But Meta has achieved one very unusual spec: light leakage as low as roughly 2%.

That means that when the HUD is lit, the person facing you can't see the waveguide's reflections or light leakage at all, unlike the XR glasses prototype Google showed earlier this year, which had an obvious reflective area:

The Display worn by @Myra doesn't show obvious screen light leakage at a similar angle

Even when my friend stood directly in front of me with the screen fully lit, they couldn't notice it at all. Only from a side angle could you catch a faint reflection of a few rows of the waveguide's grating lines.

However, according to hands-on reporting from The Verge, Meta's extremely transparent screen has a drawback of its own: since the person talking to you can't see what you're looking at, they may assume you're a bit absent-minded and not paying attention to them.

As for the actual display quality, the Display's waveguide screen reaches a brightness of 5,000 nits. Paired with the default photochromic lenses, which tint automatically in strong outdoor light, it remains easily readable even in sunlight.

Monocular Display and the Magical Neural Band

I had worried that a monocular display would feel strange, but the hands-on experience completely dispelled that concern. After using it continuously for over an hour, my strongest impression was how seamlessly it blends into the real world.

Although the Display has a screen only on the right side, the monocular image blends well with the environment. Because the screen sits low in your field of view and its FOV is small, when you look straight ahead the display area falls around the other person's chest, almost exactly like movie subtitles.

Since Zuckerberg is shorter than the other person, he needs to look down when talking

Although this 600×600 pixel screen isn't large, it can handle many of the things you'd normally pull out a phone for. In other words, the design and logic of the entire interface are well developed, and you rarely feel lost when using it.

For example, when I used it to view WhatsApp and Messenger messages, the entire conversation thread could be displayed in full. Scrolling by swiping a finger up and down and replying via voice-to-text were both very smooth. I even felt I could leave my phone in my pocket for long stretches.

As the Display's main control method, the companion Neural Band wristband changes the game.

Although it can be controlled with finger gestures like the Apple Vision Pro, the Display doesn't rely on a camera. Instead, the wristband reads electromyographic (EMG) signals from your wrist. That means you can keep your hands at your sides, behind your back, or even in your pockets, and your gestures are still recognized accurately.

The Neural Band's main controls include double-tapping the middle finger to wake the screen, single-tapping the middle finger to go back, tapping the index finger to confirm, rotating the wrist to zoom, and swiping the thumb along the index finger to scroll through pages. It takes a little time to learn, but it isn't hard to master, and the recognition is impressively responsive.
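To make the control scheme concrete, here is a minimal sketch of how an app might map these gestures to UI actions. It is purely illustrative Python: Meta has not published a Neural Band API, so every name here (the Gesture enum, the DisplayStub class) is invented.

```python
from enum import Enum, auto

class Gesture(Enum):
    # The five controls described above (names are hypothetical).
    MIDDLE_DOUBLE_TAP = auto()  # wake the screen
    MIDDLE_TAP = auto()         # go back / exit
    INDEX_TAP = auto()          # confirm / select
    WRIST_ROTATE = auto()       # zoom
    THUMB_SWIPE = auto()        # scroll / turn pages

class DisplayStub:
    """Stand-in for the glasses' UI; just prints what would happen."""
    def wake(self):    print("screen on")
    def back(self):    print("back")
    def confirm(self): print("confirm")
    def zoom(self):    print("zoom")
    def scroll(self):  print("scroll")

# Dispatch table mirroring the gesture-to-action mapping in the text.
ACTIONS = {
    Gesture.MIDDLE_DOUBLE_TAP: DisplayStub.wake,
    Gesture.MIDDLE_TAP:        DisplayStub.back,
    Gesture.INDEX_TAP:         DisplayStub.confirm,
    Gesture.WRIST_ROTATE:      DisplayStub.zoom,
    Gesture.THUMB_SWIPE:       DisplayStub.scroll,
}

ui = DisplayStub()
ACTIONS[Gesture.INDEX_TAP](ui)  # -> "confirm"
```

The appeal of a scheme like this is that each gesture maps to exactly one unambiguous action, which matches how quickly the controls become muscle memory in practice.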

The way the Display handles input is even more interesting. Besides voice input through the glasses' six-microphone array, you can actually write directly on a flat surface while wearing the Neural Band. The wristband recognizes the handwritten letters by combining EMG signals with its gyroscope, just like what Zuckerberg demonstrated at the press conference.
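Meta hasn't disclosed how the handwriting recognizer works, but the description (EMG plus gyroscope) suggests a classic sensor-fusion pipeline: extract features from both streams, then classify. The toy sketch below, with deliberately simplified features and a nearest-centroid classifier standing in for whatever model Meta actually uses, only shows the general shape of such a pipeline.

```python
import numpy as np

def fuse_features(emg: np.ndarray, gyro: np.ndarray) -> np.ndarray:
    """Combine EMG and gyroscope streams into one feature vector.

    emg:  (channels, samples) wrist EMG for one written letter
    gyro: (3, samples) angular velocity over the same window
    """
    emg_env = np.abs(emg).mean(axis=1)  # muscle-activity envelope per channel
    stroke = gyro.sum(axis=1)           # net rotation ~ rough stroke shape
    return np.concatenate([emg_env, stroke])

class NearestCentroid:
    """Toy letter classifier: one averaged feature vector per letter."""
    def fit(self, X: np.ndarray, labels: list[str]):
        self.letters = sorted(set(labels))
        self.centroids = np.stack(
            [X[np.array(labels) == c].mean(axis=0) for c in self.letters])
        return self

    def predict(self, feats: np.ndarray) -> str:
        dists = np.linalg.norm(self.centroids - feats, axis=1)
        return self.letters[int(np.argmin(dists))]
```

A production system would use a far richer model trained on enormous amounts of wrist data, but the fusion idea, combining what the muscles are doing with how the wrist is moving, is the part the public demos make visible.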

Zuckerberg revealed an interesting detail in an interview: he can already type at about 30 words per minute with this system and uses the glasses to send and receive messages for much of the day. His executives can even tell which messages come from the glasses: the shorter ones that arrive almost instantly.

More importantly, Zuckerberg believes that the potential of the Neural Band goes far beyond this:

Although we invented the Neural Band to be used with the glasses, I think it may eventually become an independent platform. In the future it could not only enable personalized auto-completion through thought, but also become a way to control other devices, even smart homes.

Features That Really Impressed Me

If writing in the air with the Neural Band makes you feel like Iron Man, the glasses' real-time caption feature made me feel like I'd gained a superpower.

Activating it is simple: just say "Hey Meta, turn on live captions," and the English spoken by the person in front of you appears on the screen in real time. Even in a noisy environment, the Display can accurately pick out the voice of the person you're looking at, and it doesn't get confused when several people speak at once.
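Meta hasn't said how this speaker isolation is implemented, but with a multi-microphone array the textbook building block is beamforming: align and sum the channels so that sound from the direction you're facing adds up coherently while off-axis voices partially cancel. Here is a minimal delay-and-sum sketch in numpy; the array geometry, sample rate, and function name are placeholders, not Meta's actual pipeline.

```python
import numpy as np

def delay_and_sum(signals, mic_positions, look_dir, fs, c=343.0):
    """Steer a microphone array toward the unit vector `look_dir`.

    signals:       (n_mics, n_samples) synchronized recordings
    mic_positions: (n_mics, 3) microphone coordinates in meters
    """
    n = signals.shape[1]
    # Far-field arrival delay per mic: mics further along look_dir
    # hear the wavefront earlier (negative delay).
    delays = -(mic_positions @ look_dir) / c
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    spectra = np.fft.rfft(signals, axis=1)
    # Undo each channel's delay with a phase shift, then sum coherently.
    aligned = spectra * np.exp(2j * np.pi * freqs * delays[:, None])
    return np.fft.irfft(aligned.mean(axis=0), n=n)
```

Real systems layer much more on top (adaptive beamforming, learned voice-separation models), but the geometric idea of focusing on where you look is the same.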

Thanks to the screen position mentioned earlier, viewing real-time captions on the Display feels like turning the real world in front of you into a subtitled movie.

Although the core software still relies on a phone connection, this is far more than a notification mirror. You can send texts, answer audio and video calls, see what music is playing, get turn-by-turn navigation, view the camera viewfinder, and run Meta AI to identify objects in front of you.

The Live AI real-time visual recognition scene demonstrated at the press conference

Meta CTO Andrew Bosworth emphasized that clear text rendering is key to making AI useful: if AI can only reply by voice, the information you get is limited, but when it can display the answer right after you ask, it's not only more intuitive but also more private.

This sense of certainty, of being able to see the result, is something the purely voice-driven regular Meta Ray-Ban smart glasses can't provide. In real life it's often inconvenient to wake a voice assistant out loud, but the Display's mix of voice, gesture, and visual interaction doesn't have that problem.

During my demo, I tried using Meta AI to identify a painting. The Display not only answered by voice but also showed relevant information cards on screen, a much higher information density than pure voice interaction, where you wait for the AI to slowly read out its answer.

This small screen can even be used to preview camera shots and watch videos: although the FOV is small, the image size is just right and all the details are visible. If a friend sends you a video link, you can open it directly on the glasses and reply quickly without taking out your phone.

Overall, after experiencing the Meta Ray-Ban Display, I think the core value of this pair of glasses lies in three points:

Increased information density: even just one or two lines of text on screen convey far more than audio alone

A sense of certainty: seeing concrete interfaces, animations, and menu options makes you more confident operating the glasses

A gateway to AI: the AI sees what you see, and you see its feedback, letting AI blend more seamlessly into daily life

This Might Really Be the Next iPhone Moment

Zuckerberg's judgment in the interview was clear:

Glasses will be the next computing platform. This is the only device that allows AI to see what you see, hear what you hear, and talk to you all day long.

One to two billion people around the world wear glasses daily. Will most of those glasses become some form of AI glasses within five to seven years? Zuckerberg said: I think it's like when the iPhone first appeared and everyone was still using flip phones; it was only a matter of time before they became smartphones.

Meta is targeting "optimists" and "people who pursue productivity." Although initial production is only in the hundreds of thousands of units, Bosworth predicted that "we can sell as many as we produce." Zuckerberg also hinted that the real profit won't come from the hardware itself but from "people's long-term use of AI and other services."

After the experience, my biggest feeling is that the Display is no longer a concept product or a developer's toy but a tool that can really be used in daily life.

Although $799 isn't cheap (about the same as an iPhone 17), and the 20-degree FOV and 600×600 screen aren't large, none of that stops it from being the smart glasses closest to my ideal.

I think the goal of the Meta Ray-Ban Display is very clear: to reduce the number of times you take out your phone by letting the glasses handle all those quick glance-and-reply operations. From that perspective, it has indeed achieved its goal.

Perhaps, as Zuckerberg says, this is like the moment smartphones replaced feature phones. The technological inflection point for smart glasses has arrived, even if mass adoption hasn't. And the Meta Ray-Ban Display might be the start of that revolution, the next iPhone moment.

This article is from the WeChat official account "ifanr" (ID: ifanr), written by Xiao Qinpeng and Ma Fuyao, and published by 36Kr with permission.