Quantum computing is triggering a wave of IPOs, and Jensen Huang's ambitions can no longer be hidden.
A few years ago, quantum mechanics was often the punchline of a joke: when in doubt, blame quantum mechanics.
But now, the joke has turned into a prospectus.
In the past few months, three quantum computing companies (Infleqtion, Xanadu, and Horizon Quantum) have gone public in quick succession, and several more are queuing up to list on Nasdaq.
A project that once belonged only to laboratories and science-fiction movies has suddenly been thrust in front of the public market.
The question is: has quantum computing really reached the eve of commercialization?
I don't think so.
The most interesting thing about this wave of listings is not that it proves the maturity of quantum computing, but that it exposes the real situation of this industry.
Although they are all called quantum computing, the technical routes are diverse.
Moreover, if you study these companies' financial reports carefully, you will find that very few general-purpose quantum computers have actually been sold. It is the peripheral products of quantum computing that keep these companies running.
In addition, although this business is still in its early stage, NVIDIA has already entered the field.
As early as 2021, NVIDIA used GPUs to help researchers simulate quantum circuits on classical computers.
Later, it went on to invest in several quantum computing startups. At GTC 2025, Jensen Huang went further and announced the NVIDIA Accelerated Quantum Research Center (NVAQC) in Boston.
However, what Jensen Huang wants to do is not the quantum computer itself. He wants to turn NVIDIA into the underlying entry point in the era of quantum computing.
Just like in the AI era, NVIDIA doesn't sell models. It sells the computing power needed for training and inference.
Whether NVIDIA can replicate that success remains to be seen. But first, let's look at where quantum computing actually stands today.
01 Technical Routes
Although they are all called quantum computing, the technologies are very different. There are four mainstream routes, and each route is based on completely different physical principles.
Superconducting quantum computing is currently the fastest-industrializing route.
Large companies such as IBM, Google, and Rigetti are all on this route.
Its technical principle is to use Josephson junctions to build artificial qubits, which requires an extremely low-temperature environment, down to the millikelvin level.
This is literally a cold fact: the operating temperature required by superconducting quantum computing is colder than outer space, which sits at about 2.7 kelvin.
The advantage of the superconducting route is that its fabrication process is close to traditional semiconductor manufacturing, so qubit counts scale well. However, its coherence times are short and its noise is high.
This route has attracted the largest financing, but its heavy dependence on refrigeration keeps costs high: a dilution refrigerator can cost several million dollars.
IBM's "Goldeneye" dilution refrigerator cost more than $800,000, with an annual electricity bill of over $100,000.
For a larger system, such as Rigetti's refrigeration equipment supporting 500 qubits, the cost can exceed $2 million. The refrigeration system accounts for more than 90% of the total cost of a superconducting quantum computer.
Ion-trap quantum computing is another route.
IonQ and Quantinuum are currently working on it. They use charged ions as qubits and perform quantum gate operations through laser manipulation. This route has the highest quantum-gate fidelity.
It's like a big abacus: the charged ions are the beads, and each laser pulse is like moving a bead. High fidelity means the operations are performed more accurately, with a lower error rate.
In October 2025, IonQ announced a two-qubit gate fidelity of 99.99%, a world record. Quantinuum had already exceeded 99.9% fidelity in 2024. Ion traps also have the longest coherence times, ranging from 0.2 seconds to 600 seconds, far beyond the tens of microseconds typical of the superconducting route.
However, the problem with ion-trap quantum computing is that the qubit count is hard to scale.
The more ions there are, the harder they are to control. You can't simply add computing power by "adding more ions"; you need an increasingly complex control system to manage them. As a result, ion-trap machines hit their computing-power ceiling relatively quickly.
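To see why those fidelity numbers matter so much, consider a toy model (an illustrative assumption, not a vendor benchmark): if each gate succeeds independently with probability equal to its fidelity, an N-gate circuit runs error-free with probability roughly fidelity to the power N.

```python
# Toy independent-error model: the probability that an N-gate circuit runs
# error-free is roughly (gate fidelity) ** N.
def circuit_success(fidelity: float, n_gates: int) -> float:
    return fidelity ** n_gates

# Over a 10,000-gate circuit, the gap between 99.99% and 99.9% is dramatic:
print(circuit_success(0.9999, 10_000))  # ≈ 0.37: succeeds about a third of the time
print(circuit_success(0.999, 10_000))   # ≈ 0.00005: essentially never succeeds
```

Under this simplified model, the extra "9" in IonQ's 99.99% is the difference between a circuit that can be salvaged with a few repetitions and one that almost never finishes correctly.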
Neutral-atom quantum computing has emerged only in the past two years, but it is currently the hottest route. Infleqtion, Pasqal, and QuEra are all working on it.
Its principle is to trap an array of neutral atoms in an optical lattice and hold each atom in place with optical tweezers, that is, tightly focused laser beams. Its greatest advantage is that qubit counts can easily reach the thousands, and coherence times are relatively long.
Infleqtion has already demonstrated an array of 1,600 physical qubits, the current record, with an entanglement fidelity of 99.73%, the highest among neutral-atom companies.
Infleqtion went public in February 2026. CEO Matthew Kinsella said, "Neutral atoms are moving from scientific progress to commercial relevance."
Finally, there is photonic quantum computing, which is also the easiest to understand.
Xanadu, mentioned earlier, is on this route.
Its technical principle is to use photons as information carriers. Its greatest advantage is that it operates at room temperature, with no need for a vacuum or refrigeration system, and it is a natural fit for integrating quantum communication with quantum computing.
Xanadu became the first photonic quantum company to go public, in March 2026. Its Aurora system is billed as the first modular, networked photonic quantum computer with real-time error correction, and is planned to reach 500 logical qubits between 2029 and 2030.
Aurora consists of four independent server racks interconnected by optical fiber, containing 12 qubits, 35 photonic chips, and 13 kilometers of fiber. It operates at room temperature; only the photon detectors need a low-temperature environment.
This is the natural advantage of photonic quantum computing.
However, the gate fidelity of photonic quantum computing is far lower than that of the superconducting and ion-trap routes.
Photons do not naturally interact with each other: two photons can pass straight through each other without any disturbance, which makes deterministic two-qubit gates very difficult to implement. Light is also attenuated as it propagates, and the information it carries is lost along the way.
In other words, to reach the same computing power, a photonic quantum computer faces a significantly higher degree of difficulty than the other routes.
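Photon loss can be put into rough numbers. A sketch, assuming a typical telecom-fiber attenuation of about 0.2 dB/km (an assumed figure for illustration; Xanadu has not published Aurora's loss budget):

```python
# Fiber transmission: dB attenuation converts to a surviving fraction via
# T = 10 ** (-total_loss_dB / 10).
def transmission(loss_db_per_km: float, length_km: float) -> float:
    return 10 ** (-loss_db_per_km * length_km / 10)

# Over 13 km of fiber, the length quoted for a system like Aurora:
print(transmission(0.2, 13))  # ≈ 0.55: nearly half of the photons are lost
```

Even at telecom-grade loss rates, kilometers of fiber exact a heavy toll, which is why photon loss dominates the engineering difficulty of this route.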
Which one is more reliable? From the perspective of technological maturity, superconducting and ion - trap quantum computing are the closest to commercialization, while neutral - atom and photonic quantum computing are still in the "very promising" stage.
The real question now, however, is which route offers the best cost-performance ratio, weighing performance, cost, and deployment together.
The essence of this wave of listings is that the capital market is forced to vote on different technical routes for the first time. Investors are no longer satisfied with the grand narrative of "quantum computing is very important." They want to see costs and revenues.
On its first day of trading, Xanadu's stock rose 15%, then fell more than 10% after hours. Horizon Quantum fell 18% after hours. When Infleqtion went public in February, it was valued at $1.8 billion, and its market value peaked at $3.8 billion; since April, it has fallen to about $2.374 billion.
02 NVIDIA's Quantum Ambitions
When it comes to computing, we have to mention NVIDIA.
NVIDIA's quantum strategy is very clear: replicate the success of CUDA with CUDA-Q, the quantum version of CUDA.
But before that, let me explain one concept: fault-tolerant quantum computing.
The qubits mentioned earlier are very fragile. Temperature, vibration, electromagnetic noise, photon loss, even a single imperfect operation can knock the quantum state off course.
Fault-tolerant quantum computing adds a whole set of protective mechanisms to this fragile stack of building blocks.
It combines many unreliable physical qubits into one more reliable "logical qubit." Even if some physical qubits make errors, the system can detect them, correct them, and keep computing.
It's like telling the same message to 100 people and asking them to pass it on: even if some forget it or garble it, someone will remember it correctly.
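That redundancy intuition can be sketched with a classical repetition code. (This is a simplification: real quantum error correction uses syndrome measurements rather than copying, which the no-cloning theorem forbids, but the majority-vote idea is the same.)

```python
import random

def noisy_copy(bit: int, flip_prob: float) -> int:
    """Return the bit, flipped with probability flip_prob."""
    return bit ^ (random.random() < flip_prob)

def logical_readout(bit: int, n_copies: int, flip_prob: float) -> int:
    """Encode one 'logical' bit as n noisy physical copies; decode by majority vote."""
    copies = [noisy_copy(bit, flip_prob) for _ in range(n_copies)]
    return int(sum(copies) > n_copies / 2)

random.seed(0)
# Each physical copy is wrong 10% of the time, yet 101 copies essentially
# never yield a wrong majority:
wrong = sum(logical_readout(1, 101, 0.10) != 1 for _ in range(1_000))
print(wrong)  # 0 wrong readouts out of 1,000 trials
```

Many unreliable parts, one reliable whole: that is the entire bet behind the logical qubit.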
At the hardware level, NVIDIA has developed the NVQLink platform architecture, which uses RDMA over Ethernet to connect GPUs and quantum processors at microsecond-level latency, under 4 microseconds. That latency level is the key to quantum error correction.
For the most advanced quantum processors, the decoding window for each round of error correction is only a few microseconds. NVQLink lets GPUs complete error-correction decoding within the QPU's clock cycle, a necessary condition for fault-tolerant quantum computing.
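The arithmetic behind that constraint is simple. A back-of-envelope sketch with assumed window sizes (the article only says "a few microseconds"; the real window is QPU-specific):

```python
# If the GPU<->QPU round trip eats up to 4 µs of each error-correction window,
# the time left for the actual decoding computation shrinks quickly:
LINK_ROUND_TRIP_US = 4.0  # NVQLink figure cited above

for window_us in (10.0, 8.0, 6.0):  # assumed per-round syndrome windows
    decode_budget_us = window_us - LINK_ROUND_TRIP_US
    print(f"{window_us:.0f} µs window -> {decode_budget_us:.0f} µs left to decode")
```

This is why sub-4-microsecond transport matters: any slower, and the decoding budget disappears entirely for tight windows.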
At the software level, NVIDIA has developed the CUDA-Q platform and the CUDA-Q QEC library, providing a unified programming interface.
Developers can write hybrid quantum-classical applications in the same environment without worrying about differences in the underlying hardware. CUDA-Q QEC 0.6, released in April 2026, is deeply integrated with NVQLink and supports real-time GPU decoding.
At the ecosystem level, NVIDIA has partnered with more than a dozen supercomputing centers around the world, including Japan's G-QuAT and Singapore's National Quantum Computing Centre, to integrate quantum processors into existing HPC infrastructure.
Quantinuum has announced that its latest Helios QPU and all future processors will connect to NVIDIA GPUs through NVQLink. The Helios QPU uses NVIDIA's GH200 Grace Hopper as a real-time host for quantum error correction.
Quantum computing is now at a turning point, moving from "laboratory prototype" to a stage that "requires large-scale classical computing support." Quantum error correction, calibration, and hybrid algorithms all demand powerful classical compute in real time, and that is NVIDIA's home field.
But there is a problem here, that is, quantum computing is not AI.
The explosion of AI happened because deep learning turned out to be a killer application for GPUs: GPUs do it well and CPUs can't. That is why NVIDIA became so successful.
So far, there has been no killer application in quantum computing.
It is still unclear which application scenarios will actually make enterprises willing to pay for quantum computing time.
As for when fault-tolerant quantum computers will arrive, the industry currently predicts another 5 to 10 years. NVIDIA, already betting on both physical AI and digital twins, may not have the time and energy to invest much more in quantum computing.
In September 2025, NVIDIA invested in Quantinuum, QuEra, and PsiQuantum in quick succession, covering the ion-trap, neutral-atom, and photonic routes. This shows NVIDIA is casting a wide net, but it also shows that it is not sure which route will ultimately win.
And if the coherence time of quantum processors improves dramatically, or a new architecture emerges that does not rely on real-time error correction, the investment in NVQLink will have been wasted.
NVIDIA is betting on the idea that "quantum computing will inevitably move toward fault tolerance, and fault tolerance will inevitably require powerful classical computing support."
This assumption seems reasonable for now, but it is not the only possible technical path.
It took about 10 years for AI to move from the laboratory to commercialization, from AlexNet in 2012 to ChatGPT in 2022.
But quantum computing is still in an earlier stage. If it takes 10 years to be commercialized, can NVIDIA wait that long?
03 What is the Truth of the Industry?
If you follow the quantum computing industry, you will notice that very few people buy general-purpose quantum computers. Right now, quantum computing makes its money entirely from peripheral products.
This is also the issue most worth paying attention to in this wave of listings.
For most quantum computing companies, the real source of revenue is not the general-purpose quantum computers they promote the hardest, but quantum sensors, quantum clocks, control chips, software stacks, and HPC integration services.
General-purpose quantum computers have not yet formed a mature commercial market that can be scaled and replicated.
To put it bluntly, the industry is using revenue from peripheral products to sustain a long-term main business.
Infleqtion's main revenue comes from optical atomic clocks, quantum radio-frequency receivers, and inertial sensors, which are used in fields such as energy and space.
As of June 2025, Infleqtion had sold three quantum computers and hundreds of quantum sensors. Its trailing-12-month revenue was about $29 million, with a compound growth rate of about 80% over the past two years; its 2026 revenue is expected to reach $40 million.
Quantum sensors are priced from tens of thousands to hundreds of thousands of dollars; research-grade atomic clocks and gravimeters can cost more than $500,000.
As manufacturing scales up, costs are expected to fall by an order of magnitude over the next decade, much like solid-state lidar, which once cost tens of thousands of dollars per unit and now costs about $2,000.
Xanadu's situation is the same: its revenue comes mainly from quantum computing peripheral products, concentrated among its top three customers.
In addition, almost all listed quantum companies have received a large amount of government funding.
Xanadu has received support from DARPA projects and funding from Canada's "Quantum Champions" program. Infleqtion, IonQ, and Rigetti all hold contracts with the U.S. Department of Defense and Department of Energy.
The key question is, how long can this peripheral - income model last?
The market for quantum sensing is limited.
Atomic clocks, inertial sensors, and similar products sell mainly into defense, aerospace, and scientific research. That is not a mass market capable of supporting valuations in the tens of billions of dollars. And even government contracts cannot grow forever; the government's resources are limited too.
Until quantum computers' performance reaches "quantum supremacy," cloud services will also struggle to scale. After all, the cost-performance of today's quantum computers is far below that of classical computers.
You may say that SpaceX also used launch services to support its Mars project in the early days, and Tesla used carbon credits to subsidize the R & D of electric vehicles.
But don't forget that SpaceX's launch services are themselves a huge market, and rocket technology is general-purpose: the same rockets that launch satellites can go to Mars. Tesla's electric vehicles lost money in the early days, but at least the products could be sold to consumers, and the market demand was real.
Quantum computing is different. No matter how many quantum sensors it sells, that revenue can hardly sustain the long-term operation of a company valued in the billions of dollars.
The quantum computing industry is in a somewhat awkward position. The technology is genuinely progressing, but real commercialization is still a long way off, and even the entrepreneurs themselves can't say exactly how far.
How far this model can go depends on two factors. One is the speed of technological breakthroughs. If some route suddenly achieves a major breakthrough, such as a tenfold increase in coherence time or a significant improvement in error-correction efficiency, the entire industry's commercialization will accelerate.
The other is the patience of the capital market. Those who dared to invest in AI 10 years