Will Gyges Labs be the future of AI + AR glasses?
The essence of venture capital is betting on uncertainty. If the risk is very low and the certainty extremely high, where would the excess return come from?
So when a top-tier investment institution that prizes controllable risk picks an AI + AR glasses product offering “certainty and a sense of security,” what form does that product take?
Halliday Glasses.
Pessimists are always right, while optimists win the future. But when technical problems, especially key obstacles that have stood unsolved for decades, are suddenly bypassed or solved by an “innovative solution” from an unknown startup, very careful scrutiny and vigilance are warranted.
AR glasses are nothing new. Over the decades, numerous solutions claiming more natural near-eye display, integrated lightweight design, higher light efficiency, and lower power consumption have been disproven one by one: Microsoft's HoloLens, Magic Leap, which came close to outright fraud, and Apple's Vision Pro, which has gradually faded in the consumer market. They made efforts, achieved progress, glimpsed hope, and were then mercilessly beaten down by the market.
Zhu Xiaohu, a well-known investor who probably doesn't know much about hardware or AR, gave the following evaluation of his first AI-hardware investment, an AI glasses project:
“Compared with other solutions, Gyges Labs' DigiWindow technology has a very important advantage: its display module is invisible, and it can project content directly into the eyes at a very low cost. This technology can greatly expand the usage scenarios of information-prompting smart glasses and also help glasses manufacturers with intelligent upgrades.”
Halliday Glasses, which integrate Gyges Labs' DigiWindow solution, use neither the mainstream diffractive waveguides nor geometric-optics combiner paths (BirdBath, free-form surfaces, etc.). Instead, a monocular micro-projector beams the image directly into the eye.
Even before you understand how it works, the “intuitive effect” of this solution is nothing short of a bombshell for the industry, because Gyges Labs' DigiWindow appears to sidestep many fundamental problems of AR near-eye display.
Image from Gyges Labs
Currently, almost all AR glasses use diffractive or arrayed waveguide technology, with a clearly visible waveguide plate “embedded” in each lens, carrying two-dimensional pupil-expanding square, strip-shaped, or even dotted dark patterns. Everyone in the industry is racking their brains to “visually hide” the waveguide plate, so that wearers can make natural eye contact and not look out of place in social situations.
The DigiWindow solution needs no waveguide plate, nor a holographic film on the lens (as in Google's North Focals) to focus and reflect the micro-projector's light into the eye. When someone wears Halliday Glasses, outsiders cannot see the “private images” projected through the lenses; chatting face to face, it is very hard to tell that the person opposite is wearing AR glasses rather than ordinary ones.
In today's overwhelmingly mainstream diffractive-waveguide displays, rainbow artifacts are very hard to eliminate, and the bright colored streaks flashing across the image badly degrade the visual experience. Neither wearers nor onlookers will tolerate glasses that flash strange rainbow patterns for long. The industry keeps redoing diffractive-waveguide layouts and endlessly iterating on materials and processes just to reduce the rainbows a little. DigiWindow removes the waveguide plate at the root, so rainbow artifacts simply cannot occur.
For the AR + AI glasses on the market today, whether they use Micro LED, Micro OLED, LCoS, or laser-scanning microdisplays, monocular or binocular, the light engine squeezed into the frame's extremely limited space looks like a lump. On one hand, designers keep “slimming down” the engine's volume while sacrificing as little brightness and color as possible; on the other, they look for ways to integrate it seamlessly, for example hiding it at the bend of the temple.
By contrast, the “light engine” of Halliday Glasses is smaller than a soybean and sits directly inside the frame. The complete, integrated glasses, with computing, sensing, and battery on board, weigh only 35 g, and the battery far outlasts mainstream solutions' 1-2 hours, reaching 6-12 hours on a single charge per day.
Photo by ZDNET
Then there is the thorny issue of prescription lenses. Most people today are myopic, and adding corrective lenses to AR glasses makes for a cumbersome product. The usual “prescription” approaches are a front-fitted lens or a rear clip-on, and the corrective lenses generally must be custom-made. A fully front-fitted lens cannot have its power adjusted later; a rear clip-on effectively thickens the optical module, deviating from the original design: the exit-pupil distance (eye relief) must be lengthened, which shifts a chain of related optical parameters, and the ergonomics must be adjusted accordingly.
By contrast, with DigiWindow's “micro-projector,” you adjust the focus to match your own vision simply by rotating the front ring of the lens barrel.
Each advantage of DigiWindow hits a key pain point:
Efficiency and brightness: the micro-projector beams light directly into the eye, with far higher light efficiency than any waveguide or combiner path, so the image stays visible outdoors while saving power.
Compatible with ordinary lenses of any prescription: since the light enters the eye without passing through the prescription lens, any ordinary lens can be used.
Private, with no forward leakage: no light escapes for bystanders to capture, and there are no diffractive-waveguide rainbow artifacts; crucially, it does not block the forward field of view.
All-day lightweight wear: Halliday Glasses are lighter and thinner than waveguide AR glasses and last longer, and the highly integrated design gets through a day on a single charge.
Why didn't others come up with such a magical solution?
DigiWindow is indeed ingenious to the point of being “unexpected.” That soybean-sized micro-projector, the light engine every other pair of glasses also needs, is essentially a “reverse telescope”: the familiar telescope optical path run backwards. The MicroLED microdisplay sits at the eyepiece end, and the image exits from the objective end.
Schematic diagram of the optical principle of the Cassegrain telescope
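The reverse-telescope geometry can be sketched with first-order optics. A minimal sketch, with hypothetical numbers (DigiWindow's actual display size and focal length are not public): a microdisplay placed at the focal plane of a small collimating optic produces a virtual image at infinity whose angular width depends only on the display width and the focal length.

```python
import math

def virtual_image_fov_deg(display_width_mm: float, focal_length_mm: float) -> float:
    """Angular width of the virtual image from a collimating projector:
    the display sits at the lens's focal plane, so a point at lateral
    offset x maps to a collimated beam at angle atan(x / f)."""
    return 2 * math.degrees(math.atan(display_width_mm / (2 * focal_length_mm)))

# Hypothetical: a 2 mm-wide MicroLED panel behind a 10 mm focal-length optic.
fov = virtual_image_fov_deg(2.0, 10.0)
print(f"{fov:.1f} degrees")  # ~11.4 degrees
```

The takeaway matches the text: with millimeter-scale optics, the achievable field of view stays narrow unless the display or the optics grow.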
To use it, wearers must roll their eyes upward and look at a specific area near the top of the frame to see the image. Its eyebox is much smaller than a waveguide's, and the wearer has to adjust the slider and the tilt so that the eye's axis lines up with the projector.
At the start of this year, Halliday Glasses drew plenty of media attention at CES for this very unusual AR/AI-glasses optical design. A Reddit post titled “Halliday Glasses – Smart Glasses with AI Assistant” pointed out that its optical principle resembles that of Mojo Vision's contact-lens display.
Picture of Mojo
Halliday Glasses use Gyges Labs' DigiWindow technology, and DigiWindow shares Mojo Vision's optical principle: both are “reverse telescopes,” except DigiWindow is larger and sits farther from the eye than Mojo's lens.
Picture of the wearing effect of Mojo
Jia Jieyang, who earned his doctorate at Stanford, is the founder and CEO of Gyges Labs. From 2016 to 2018, as an early employee, he helped develop the first smart-contact-lens prototype at Mojo Vision (the first company to push AR contact lenses toward the consumer market). That experience planted the idea of building an ultra-miniature near-eye display system and applying it to smart glasses.
Jia Jieyang took the “inspiration” for the optical principle from his former employer Mojo Vision, made second-generation improvements and innovations, and created a micro-projector that shines directly onto the eyeball: Gyges Labs' DigiWindow. Its volume and power consumption are indeed small; it focuses the virtual image, projects straight into the eye, and can even correct for myopia. Most importantly, it bypasses all the troublesome root causes at once: the non-removable dark patterns of the light-carrying waveguide plate, the rainbow artifacts from color dispersion in diffractive waveguides, the brightness lost as the image light propagates by total internal reflection, and the scattered reflections that leak private content.
DigiWindow seems perfect. Its even more “minimalist early form,” the Mojo contact lens, could sit directly on the eyeball, physically achieving eye tracking and real-time “synchronous following” of the AR image. So what are the drawbacks? The eyebox of this approach is very small: you have to strain your eye muscles to see content at the edge of the image.
External photo of Halliday Glasses
The eyebox is the three-dimensional region in front of the eye into which the light engine projects the image; only inside it can you see a clear AR picture. DigiWindow's eyebox is so small that you must keep rolling your eyes upward and staring intently at a spot near the top of the frame to see the projected virtual image, and you often have to fiddle with the micro-projection module's left-right position and tilt, and with the nose-pad distance, to align the angle for a barely acceptable view.
Photo by ZDNET
The limited eyebox means that if the micro-projector drifts slightly out of alignment, or the glasses get bumped a little in daily use, the wearer simply cannot see the image.
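That alignment sensitivity can be made concrete with a toy one-dimensional overlap model; the eyebox and pupil diameters below are hypothetical illustrations, not measured DigiWindow or waveguide figures.

```python
def lateral_tolerance_mm(eyebox_mm: float, pupil_mm: float) -> float:
    """Maximum lateral shift of the glasses (or eye) before the pupil
    leaves the eyebox, modelling both as 1-D apertures that must
    overlap fully for an unclipped image."""
    return max((eyebox_mm - pupil_mm) / 2, 0.0)

# Hypothetical: a 10 mm waveguide-style eyebox vs. a 4 mm
# direct-projection eyebox, with a 3 mm daylight pupil.
print(lateral_tolerance_mm(10.0, 3.0))  # 3.5 mm of slack
print(lateral_tolerance_mm(4.0, 3.0))   # 0.5 mm -- a small bump misaligns it
```

Under these toy numbers the small eyebox leaves well under a millimeter of slip before the image clips, which is why a nudge to the frame can make the picture vanish.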
There is another “intuitive” problem. According to Mojo Vision's patent drawings, the LED emits approximately Lambertian (diffuse) light. Even with a micro-lens on the MicroLED, a large amount of light misses the absorbing sidewalls and the secondary mirror and becomes stray light, flooding the whole image with a glow, that is, a bright ring around the picture. DigiWindow most likely inherits this built-in defect.
Screenshot from the analysis of AR optical expert Karl Guttag
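The scale of the stray-light problem follows from a standard result: for a Lambertian emitter, the fraction of total flux inside a cone of half-angle θ is sin²θ. A minimal sketch, where the 20° collection half-angle is an assumed, illustrative value rather than a DigiWindow spec:

```python
import math

def lambertian_capture_fraction(half_angle_deg: float) -> float:
    """Fraction of a Lambertian emitter's total flux within a cone of
    the given half-angle. Integrating I0*cos(theta) over solid angle up
    to theta_max gives pi*I0*sin^2(theta_max), vs. pi*I0 in total."""
    return math.sin(math.radians(half_angle_deg)) ** 2

# Hypothetical 20-degree collection cone for a tiny projection optic:
print(f"{lambertian_capture_fraction(20):.0%}")  # ~12% captured; the rest risks becoming stray light
```

Even generous micro-optics capture only a modest fraction of a diffuse emitter's output, so whatever escapes the intended path has to be absorbed somewhere, or it shows up as the glow described above.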
After CES at the start of this year, Halliday released a video titled “Halliday AI Glasses: CEO Shares Insights Behind the Vision and Concept of Our Product!”. It contains a segment shot through the lens with a phone camera, and frame captures show stray light flooding around the text, with the outer ring growing brighter as more text appears.
Screenshot from the analysis of AR optical expert Karl Guttag
Then there is resolution. As the micro-projector shrinks, the microdisplay must shrink with it, and raising resolution becomes extremely difficult without advances in basic physics and materials; otherwise the optics simply will not fit in the frame. Pixel size has a hard physical floor, so higher resolution demands a larger display and mirror, a contradiction that is hard to reconcile.
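The pixel-size floor can be quantified: at a fixed pixel pitch, more pixels force a physically larger panel, which in turn forces larger optics. A minimal sketch, with an assumed 3 µm pitch (real MicroLED microdisplay pitches vary, and DigiWindow's is not public):

```python
def min_panel_size_mm(h_pixels: int, v_pixels: int, pitch_um: float):
    """Minimum panel width and height (mm) for a given resolution at a
    given pixel pitch -- the hard floor described in the text: more
    pixels at a fixed pitch means a physically larger display."""
    return (h_pixels * pitch_um / 1000, v_pixels * pitch_um / 1000)

# Hypothetical 3 um pitch:
print(min_panel_size_mm(640, 480, 3.0))    # (1.92, 1.44) -- plausibly soybean-scale
print(min_panel_size_mm(1920, 1080, 3.0))  # (5.76, 3.24) -- much harder to hide in a frame
```

Under this assumption, going from VGA-class to full-HD resolution roughly triples each panel dimension, which illustrates why resolution and miniaturization pull in opposite directions.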
Since DigiWindow is essentially a reverse-telescope micro-projector, its eyebox is far smaller than a waveguide's and its field of view is extremely limited; on top of that, AR spatial positioning, rendering, and computation become an enormous, almost unsolvable problem.
What have consumer-grade AR products been busy with all these years? Arrayed waveguides, diffractive waveguides, BirdBath optics, Micro LED, Micro OLED, and LCoS light engines. The technical direction everyone has been grinding on is to deliver a high-resolution, full-color AR virtual image across a larger eyebox in front of the eye, with sufficient brightness and as low power consumption as possible.