The second-largest acquisition in Apple's history, and the target isn't phones.
Whenever we talk about Apple, we tend to picture it as a pure product company.
In reality, today's Apple is far more than a product company. It is also an investor with a distinctive eye, one that puts its money not into financial products but into future technologies.
▲ Image | TheVerge
Just last night, reports emerged that Apple has spent a large sum to acquire an Israeli human-computer interaction company called "Q.ai", whose main research focus is technology for reading facial movements and understanding silent communication.
Although neither party has disclosed the terms of the deal, Q.ai had previously been backed by venture capital heavyweights such as Kleiner Perkins, Google Ventures (GV), Spark Capital, and EXOR Group, and outside estimates put the acquisition price at roughly $1.6-2 billion.
If that estimate is accurate, this would be the second-largest acquisition in Apple's history, behind only the $3 billion purchase of Beats in 2014, and possibly Apple's largest single bet on AI.
So who exactly is Q.ai, a research company whose official website reveals no technical details, and why is Apple willing to spend so much to acquire it?
A new face, but an old friend
Although Q.ai has "AI" in its name, it is not an "AI company" in the conventional sense of providing model services, the way OpenAI or xAI does.
Q.ai's core technology uses machine learning to analyze facial muscle movements and micro-expressions as people speak, interpreting silent communication and converting it into specific input or control commands.
In other words, Q.ai's "product" is really a new kind of ML-based human-computer interaction. Technically it falls under AI, just not the generative AI that dominates today's headlines.
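Q.ai has said almost nothing publicly about how this works, but the general shape of such a silent-speech interface is easy to imagine: track facial landmarks frame by frame, reduce each frame to a handful of muscle-movement features, and classify the resulting sequence into a command. The sketch below is purely illustrative, built on a hypothetical landmark input and a toy template-matching classifier; it is not Q.ai's or Apple's actual pipeline.

```python
# Illustrative sketch only: a toy "silent command" classifier.
# Assumes some upstream face tracker supplies (x, y) landmark points
# per video frame; none of this reflects Q.ai's or Apple's real method.
from math import dist
from typing import Dict, List, Sequence, Tuple

Landmarks = Dict[str, Tuple[float, float]]  # e.g. {"lip_top": (x, y), ...}

def mouth_features(frame: Landmarks) -> Tuple[float, float]:
    """Reduce one frame to two crude 'muscle movement' features:
    vertical mouth opening and horizontal mouth stretch."""
    opening = dist(frame["lip_top"], frame["lip_bottom"])
    stretch = dist(frame["lip_left"], frame["lip_right"])
    return opening, stretch

def sequence_signature(frames: Sequence[Landmarks], n: int = 8) -> List[Tuple[float, float]]:
    """Resample a variable-length silent utterance to n feature frames."""
    feats = [mouth_features(f) for f in frames]
    idx = [round(i * (len(feats) - 1) / (n - 1)) for i in range(n)]
    return [feats[i] for i in idx]

def classify(frames: Sequence[Landmarks],
             templates: Dict[str, List[Tuple[float, float]]]) -> str:
    """Nearest-template matching: return the command whose stored
    signature is closest to this utterance's signature."""
    sig = sequence_signature(frames)
    def distance(t): return sum(dist(a, b) for a, b in zip(sig, t))
    return min(templates, key=lambda cmd: distance(templates[cmd]))
```

A production system would of course use a learned sequence model rather than template matching, but the overall shape of the pipeline, landmarks to features to sequence classification to command, stays the same.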
Any discussion of Q.ai also has to mention its principal founder: Aviad Maizels.
▲ Image | CTech
You may not know this Israeli engineer, but you have almost certainly used his products.
In 2005, Aviad Maizels founded PrimeSense with several engineering colleagues to work on 3D perception and spatial interaction. Building on the principle of static structured light, they developed a dynamic structured-light technology called "Light Coding".
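For readers unfamiliar with the idea: a structured-light system projects a known infrared pattern onto the scene and recovers depth from how far each pattern element appears shifted between projector and camera, essentially stereo triangulation. A deliberately simplified version of that depth calculation might look like the snippet below; the baseline and focal values are made up for illustration and are not PrimeSense's actual parameters, and real systems measure disparity relative to a reference pattern captured at a known depth.

```python
# Toy structured-light depth calculation via triangulation.
# Illustrative only; the baseline and focal length below are assumed values.
def depth_from_disparity(disparity_px: float,
                         baseline_m: float = 0.075,  # projector-camera separation (assumed)
                         focal_px: float = 580.0) -> float:
    """Depth in meters for a pattern element shifted by `disparity_px` pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return baseline_m * focal_px / disparity_px

# Example: a projected speckle shifted by 29 pixels -> roughly 1.5 m away.
print(round(depth_from_disparity(29.0), 2))
```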
At the Game Developers Conference (GDC) in San Jose the following year, PrimeSense demonstrated a prototype of its 3D sensing system, and it was quickly spotted by Microsoft, which was looking for a breakthrough for its game consoles.
The product of that collaboration was the Kinect motion-sensing accessory for the Xbox 360. It sold 8 million units in its first two months and nearly displaced the Wii as the byword for "motion gaming":
▲ Image | CNET
The collaboration did not last long, however. In 2013, with the Xbox One, Microsoft dropped PrimeSense's dynamic structured-light solution and switched the second-generation Kinect to its own 3D time-of-flight (ToF) approach.
That, in turn, gave Apple an opening.
That same year, Apple confirmed it had acquired PrimeSense for roughly $350 million. At the time it was one of Apple's largest technology acquisitions, behind only the purchases of fingerprint-sensor maker AuthenTec and chip designer PA Semi.
The products that followed need little introduction: PrimeSense's dynamic structured-light technology ultimately became the cornerstone of Apple's TrueDepth camera and ARKit, and the foundation of the Face ID we use today.
▲ Image | Apple
What Apple wants from Aviad Maizels this time is, once again, a revolutionary way of interacting with machines.
As mentioned earlier, Q.ai's core technology is identifying the precise commands a user issues without making a sound.
This is fundamentally different from the camera-based gesture recognition used by the current Vision Pro, and it naturally brings to mind something else: lip reading.
In other words, in Apple's vision, future wearables could accept commands or text input from facial movements alone, even when no voice command can be heard:
▲ Image | OneEarPod
As for why Apple is investing in this technology now, here is a bold guess:
Apple needs a more sensible interaction method for the smart glasses on its 2027 product roadmap.
After all, when wearing smart glasses you hardly want passers-by on the street to hear everything you say to them; the scene would be awkward, and it would clash with Apple's usual elegant, understated style of interaction.
Q.ai's approach solves this neatly: you can issue commands without making a sound, simply by mouthing the words. It is private, elegant, and very much in line with how we imagine next-generation AI glasses:
▲ Image | Marvel Cinematic Universe Wiki
An even bolder guess is that Apple could build the long-rumored under-display Face ID on Q.ai's recognition principles.
By shifting the underlying principle to high-precision recognition of muscle features and instantaneous muscle movement, Face ID could reach a level of security comparable to structured light while using fewer components, making it easier to hide beneath the display.
▲ Image | 9to5Mac
The second acquisition may have been planned all along
Aviad Maizels stayed on at Apple after the PrimeSense acquisition in 2013, but he never stopped moving.
Besides continuing his research on human-computer interaction, in 2016 he co-founded Bionaut Labs with two of his PrimeSense co-founders, aiming to build precision medical micro-robots that can cross the blood-brain barrier.
▲ Bionaut Labs showcases its micro-surgical robot | Daily Sabah
In 2022, Aviad Maizels, by then a senior director in Apple's hardware technology group, announced his departure and founded the aforementioned Q.ai to continue working on machine learning and human-computer interaction.
The rest of the story we already know: Q.ai quietly kept at its research until 2026, when it was acquired by Apple and Aviad Maizels returned to the company once more.
▲ Aviad Maizels during his studies at the Weizmann Institute of Science | WeizmannCompass
Looking at the bigger picture, though, Aviad Maizels' arc from "leaving to start a company" to "being acquired a second time" may well have been carefully planned.
That is not a conspiracy theory; it is a common business move.
At the time, Apple was at a critical juncture, with big things on the horizon: it was preparing for Apple Intelligence and Vision Pro simultaneously, and its research priorities and organizational structure were in constant flux.
In that turbulent context, spinning less urgent technology tracks out of the parent company, so that R&D is shielded from management churn while serving as an external "strategic reserve", is a classic corporate spin-off strategy.
▲ Image | MacRumors
There is a precedent in Apple's own history: the story of the Mac and NeXT.
Steve Jobs was pushed out of the company back then, hardly by his own choice, yet he and NeXT conveniently sat out the chaotic years under then-Apple CEO Gil Amelio.
The Unix lineage NeXT stuck with eventually took over the Mac through the 1996 acquisition and became the foundation of the later macOS and iOS.
Seen in that light, Aviad Maizels and the "silent communication" and "micro-expression recognition" technology stack he leads may well be a technology reserve Apple spun off over the past few years.
Why Apple needs a new way to interact
Either way, it hardly matters whether Q.ai was Aviad Maizels' own venture or an Apple "detachment" meant to keep the research insulated.
What deserves more attention now is the impact Q.ai's technology will have once it is back inside Apple.
Apple has just posted the strongest earnings report in its history: revenue of $143.756 billion in fiscal 2026, an astonishing 48% gross margin, and net profit of $42.097 billion:
▲ Image | App Economy Insights
Yet even with such a strong catalyst, the stock barely reacted, closing up just 0.72% that day, within the range of ordinary fluctuation.
Strong results paired with a flat share price reflect not only Apple's lack of the AI narrative the market currently craves, but also signs of imbalance in the structure of its results:
Of the $143.8 billion in revenue, the iPhone contributed $85.27 billion, nearly 60% of the total and a 23% increase year over year.
Strong as the figures are, the market's attention has shifted to the future, and the reason is simple: after the frenzied run-up in flash memory prices in the second half of 2025, investors came to appreciate how fragile the smartphone business can be.
Related reading: "Apple's financial report is excellent! But a price increase for the iPhone 18 is inevitable."
In other words, the iPhone, long the main engine of Apple's revenue, is starting to struggle to carry the company on its own.
▲ Image | Tom's Guide
Against this backdrop, Apple urgently needs new categories of products to break the deadlock.
The route Apple is clearly betting on is Vision Pro and smart-glasses-style wearables, and the biggest obstacle to making good smart glasses is the interaction model.
If, building on Q.ai's technical concept, a new Vision Pro or a pair of Apple glasses can deliver facial-expression interaction and input in complete silence, its significance for wearables will be no less than that of the capacitive touchscreen on the original iPhone.