An era in which people are surrounded by cameras is coming.
When Code Gets a Physical Body
OpenAI is quietly making big news, the kind of news whose full significance perhaps only three people in the world currently grasp: OpenAI CEO Sam Altman, former Apple chief design officer Jony Ive, and Laurene Powell Jobs, Steve Jobs's widow.
Shortly after the opening keynote of Google I/O 2025, two other major pieces of news followed. First, Apple reportedly plans to release a pair of smart glasses next year and has canceled a watch project with a built-in camera. Then, OpenAI announced the acquisition of io Products, a company founded by former iPhone designer Jony Ive, with the intention of launching AI hardware products in the future. Laurene Powell Jobs is said to strongly approve of this new hardware.
The three tech giants, Apple, Google, and OpenAI, have revealed or announced their AI product strategies in turn, as if in perfect coordination, making this rainy summer even more exciting.
Apple: Taking a Steady Route, Continuing to Expand the AR + AI Ecosystem
According to Bloomberg reporter Mark Gurman, citing informed sources, Apple plans to launch a smart glasses device "with a camera, speaker, and microphone" in 2026. Through the glasses, users will be able to interact with Siri and issue commands to analyze the surrounding environment, make calls, control music, navigate with maps, and perform real-time translation.
Apple's smart glasses are expected to run on Apple's own silicon, as the Vision Pro does. The final hardware may resemble the smart glasses Meta previously launched with Ray-Ban, but with "better craftsmanship". The product is also being developed by Apple's Vision Products Group, on the same development track as an improved and more affordable Vision Pro.
Based on this reporting and scattered patent filings since 2024, we can roughly guess the form Apple's smart glasses will take.
For example, one patent document shows that Apple has designed temples with a double-hinge structure for its smart glasses, allowing them to fit the head more closely when worn and improving security, comfort, and speaker performance. Some analysts even suggest that the length-adjustable rear sections of the temples could help the glasses offer an "adjustable diopter" function, which would be welcome news for users who already wear prescription glasses.
Image | patentlyapple.com
In another patent, Apple marks the general placement of components along the temples. Because of the double-hinge design, the battery sits at the rear of the glasses, while the front space is reserved mainly for the display assembly. Speculating from the markings in this patent, the display method used in Apple's smart glasses may be reflective projection, a waveguide, or a combination of both.
Image | patentlyapple.com
Interestingly, the lower part of the patent mentions a transmission interface at the end of the temple that can be used to "connect various external components". Judging from the locking structure inside the interface and the exposed contacts, Apple's smart glasses may support wired communication with an iPhone or Mac while worn, enabling low-latency scenarios such as gaming or high-quality video playback.
Alongside the shift to smart glasses, internal sources also indicated that Apple has canceled the previously planned camera-equipped Apple Watch, retaining only the camera-equipped AirPods, in order to devote more resources to the glasses. According to Gurman's report in March this year, Apple had planned to add cameras under the screen of a future Apple Watch and at the digital crown of the Apple Watch Ultra for environment-recognition features related to Visual Intelligence.
However, even with Apple's own chips on board, the augmented reality (AR) functions that Apple emphasized in the Vision Pro may not materialize in the rumored smart glasses. Bloomberg believes Apple still needs a few years to achieve the AR effects it envisions.
Bloomberg also pointed out that before the smart glasses leak, Apple's stock price had fallen about 19% this year, suggesting that Apple's performance in AI, currently the focus of the technology industry, is not well regarded. We can therefore speculate that at WWDC, which opens as early as next week, Apple will officially share some information about the rumored smart products. There may be no finished product to demonstrate, but there will probably be a more specific timeline.
Google: Taking a Symbiotic Route, Gemini Fully Integrating into Google's Product Line
As another giant that was slow off the mark in the AI race, Google has performed remarkably well since 2025. Its Gemini model has gone from barely keeping pace with ChatGPT to being woven throughout Google's product ecosystem. Google is now the only vendor on the market offering full-stack AI services across its entire product line, and its progress is hard for any competitor to ignore.
It's no exaggeration to say that in combining software and hardware for AI services, Google has left Apple far behind. At the just-concluded Google I/O conference, in addition to demonstrating updates to the Gemini model, Google also showed Project Aura, a hybrid ecosystem product built on the Android XR platform in cooperation with the Chinese company Xreal.
Project Aura bears many similarities to Google Glass from a decade ago. But unlike Google Glass, which lasted only two years, if the functions demonstrated at I/O can be fully realized in the mass-produced product, Google's claim that "Android XR is an Android platform built for the Gemini era" will be no exaggeration. Android XR may well become the benchmark for all future extended reality products (spanning virtual reality VR, augmented reality AR, and mixed reality MR), just as Android did for mobile phones.
Since the I/O keynote, more and more media outlets have had hands-on time with Project Aura, and the outline of the product has gradually become clear. Unlike Apple's Vision Pro, the glasses in Project Aura are just a medium for user interaction: the Gemini Live model's computation and networking still run on the phone, which leaves considerable room for future model and capability upgrades.
There is more good news. Google has carried its Nexus-phone tradition into Android XR, opening up cooperation to third-party manufacturers with a very open attitude. For example, all the Project Aura glasses used for technology demonstrations are currently developed jointly by Google and Xreal. Google later also announced partnerships with Gentle Monster, the Korean fashion eyewear brand, and Warby Parker, the emerging direct-to-consumer eyewear brand, to bring Android XR to market as a fashion product.
Judging from the Meta Ray-Ban glasses selling more than a million pairs in 2024, Google's path of "learning from Meta" looks very promising.
OpenAI: Taking an Exploratory Route, Trying to Incorporate AI Jewelry into Daily Life
What's even more thought-provoking is that shortly before sources revealed Apple's smart glasses plans, Sam Altman, CEO of OpenAI, the artificial intelligence giant behind ChatGPT, announced the acquisition of a startup called io Products for $6.5 billion, the very company co-founded by Altman and Jony Ive after Ive left Apple.
On OpenAI's official website, Altman and Ive announced in a joint letter that after the acquisition, several of io's co-founders and the team's engineers will join OpenAI, while Ive and his earlier-founded design firm LoveFrom will continue to operate independently but "will take on deep design and creative responsibilities for OpenAI and io".
But OpenAI has always been a pure software company, so why did Altman invest so many resources to bring Ive on board? According to well-known TF International Securities analyst Ming-Chi Kuo, OpenAI's purpose in joining hands with Ive is to provide a "new form of AI hardware device" for its artificial intelligence software. This device for hosting AI breaks from convention: it is not a familiar phone or speaker, but something "as small and delicate in appearance design as an iPod Shuffle".
On configuration, Kuo speculates that this OpenAI hardware will carry a camera and microphone for environmental perception but will have no display; it may need to connect to a phone or PC and borrow that device's screen and computing power. On usage, Kuo suggests one way to wear the new product may be around the neck, somewhat like the Pendant, the AI recording device launched by Limitless (formerly Rewind):
One way to wear and use the Limitless Pendant
This is a very interesting product form.