
Google's hundred-qubit curse has been broken by this company.

新智元 2025-12-11 16:13
Quantum computing breaks through to the 10,000-qubit scale, and NVIDIA takes up position as the ecosystem's entry point.

[Introduction] Has the quantum ceiling that Google and IBM failed to break through in a decade been lifted by a 10,000-qubit chip? While the tech giants remain stuck at hundreds of qubits, quantum computing has suddenly entered an era of practical applications. Ironically, it is NVIDIA, long lurking at the gateway to computing power, that has truly embraced this revolution.

Looking at the long-term timeline, it is easy to see that quantum computing has been stagnant for a long time.

In 2019, Google announced in Nature that their Sycamore processor with 53 qubits had achieved "quantum supremacy".

The entire tech circle was shocked, and many people thought that the quantum era had really arrived.

But six years later, the latest number of qubits publicly announced by Google is only 105.

IBM, for its part, had set a roadmap goal of over 1,000 qubits. Yet its 2023 flagship chip, Heron, had only 133 qubits, and the company had to postpone its expansion plans.

IBM's roadmap makes the industry's predicament plain.

[Figure: IBM's published roadmap, 2019-2029, showing quantum processors remaining at the scale of hundreds to low thousands of qubits through the mid-2020s.]

From Google to IBM, all mainstream approaches have been stuck at the scale of hundreds of qubits from 2019 to 2023.

IBM's roadmap doesn't even include qubit expansion in its short-term plans, indicating that the industry has been stuck for an entire decade.

Rigetti, IonQ, Quantinuum... None of the global quantum players have been able to break through the "hundred-qubit ceiling".

Ironically, quantum computing has long been depicted as the infrastructure for all future industries. However, the actual number of qubits has remained stagnant.

Why is this the case?

The Birth of 10,000 Qubits

Quantum computing's decade of stagnation is not because Google and IBM aren't strong enough, but because "scaling up" itself had become a dead end.

Each additional qubit demands a rapidly growing amount of engineering effort.

Control lines multiply, wiring density becomes suffocating, quantum states collapse at the slightest disturbance, and error rates climb to unacceptable levels.

At the scale of hundreds of qubits, the whole system is like a bow drawn to its limit: a little more force and it snaps.

So the industry resorted to a stopgap: networking multiple small QPUs together to form a "large system".

This may look like scaling, but it merely outsources the problem to the system level: higher cost, higher complexity, lower reliability, and no true increase in qubit count.
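A back-of-the-envelope estimate shows why the networking stopgap never truly scaled. The sketch below is purely illustrative: the on-chip gate fidelity and the inter-QPU link fidelity are assumed values, not figures from any vendor, and total circuit fidelity is crudely modeled as the product of per-gate fidelities.

    # Illustrative sketch: why stitching small QPUs over a network does not
    # truly scale. All fidelity numbers are assumptions, not vendor data.
    ON_CHIP_FIDELITY = 0.999  # assumed fidelity of an on-chip two-qubit gate
    LINK_FIDELITY = 0.95      # assumed fidelity of a gate over an inter-QPU link

    def circuit_fidelity(on_chip_gates: int, link_gates: int) -> float:
        """Crude model: total fidelity = product of per-gate fidelities."""
        return ON_CHIP_FIDELITY ** on_chip_gates * LINK_FIDELITY ** link_gates

    # A 1,000-gate circuit on one large chip vs. split across networked QPUs
    # where 5% of the gates must cross a link:
    print(circuit_fidelity(1000, 0))  # ~0.37 on a single chip
    print(circuit_fidelity(950, 50))  # ~0.03 networked: link errors dominate

Under these assumptions, a handful of cross-link gates wipes out an order of magnitude of circuit fidelity, which is exactly the "outsourced problem" described above.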

This deadlock was truly broken for the first time this year.

The VIO-40K architecture launched by QuantWare takes the qubit count straight from the industry's recognized scale of hundreds to 10,000, lifting the ceiling.

This all but announces that scaling qubits is no longer quantum computing's ultimate problem!

If the past decade was a well with everyone stuck against its walls, what QuantWare has done is carve a new way out.

The difference between hundreds of qubits and ten thousand is not one of quantity but of paradigm.

Hundreds of qubits are for proof of concept; ten thousand is the scale that can begin to reshape the real-world computing landscape.

QuantWare's VIO-40K architecture is now accepting pre-orders, with shipments planned for 2028.

The quantum world, which has been stagnant for a decade, has finally stepped from theory into the era of "scalable hardware".

The Ceiling Has Been Broken

At first glance, the jump from 100 to 10,000 qubits looks like nothing more than "a bigger number".

But for quantum computing, what it truly changes is not the scale, but the direction.

For the past decade, the quantum industry has been on a narrowing path: forcing qubit counts upward only damaged the whole system.

Hundreds of qubits are not the starting point, but the bottleneck.

Keep adding qubits, and noise and errors rise to unusable levels.

That is why, for the past decade, every company has had to wire multiple small chips together and use a network to "pretend to scale".

This is where the significance of 10,000 qubits changes qualitatively.

QuantWare uses 3D scaling plus a chiplet architecture to rework every piece that previously refused to scale: I/O, wiring density, signal interference, and inter-module connectivity.

As a result, scalability has been reopened for the first time, and the future trajectory of the entire industry has also been rewritten.

Quantum Computing Transforms from "Runnable" to "Scalable"

The failure to scale qubits meant that quantum computing never had the means to enter the real world.

Drug simulation, material design, and optimization problems all demand thousands of qubits of working space.

Hundreds of qubits are only good for demos; ten thousand can handle real-world calculations.

Previously, people could only console themselves by saying, "We'll be able to do it when future quantum processors are developed."

But after a decade without scalability, that future never really existed.

The emergence of 10,000 qubits has enabled quantum computing to have "scalability" for the first time.

This is much more crucial than performance itself.

Breaking Through Engineering Bottlenecks: A Dead End Becomes a Highway

The industry's biggest problem has never been "too few qubits", but that adding qubits damages the system.

This is why Google added only about 50 qubits in six years.

That's also why Google and IBM have been constantly adjusting their roadmaps and even postponing their expansion plans.

However, the VIO-40K architecture uses 3D scaling and chiplet design to overcome the five major challenges of I/O, wiring density, noise, interconnection, and calibration all at once.

The engineering path for expanding qubits has been reopened, and the industry has transformed from a trapped dead - end into a highway for continuous progress.

Quantum Computing Gains "Economic Significance" for the First Time

In the past decade, all expectations for quantum computing, such as code-breaking, accelerating drug and material reaction simulations, and optimizing energy systems, have remained at the theoretical level because hundreds of qubits are simply not enough.

The emergence of 10,000 qubits has made these applications achievable for the first time.

Once scalability is unlocked, the industry no longer has to rely on a huge leap in the future. Instead, it can grow along a visible and repeatable engineering path.

This is also the core meaning of what the CEO of QuantWare said:

"100 qubits only let us talk about the future; 10,000 qubits make the future truly begin."

The Computing Power Growth Curve Is Re-Ignited

In the past decade or so, Moore's Law has almost stalled, and the growth of GPUs has approached the physical limit.

The unlocking of quantum scalability means that computing power has a "second growth curve".

This has opened up a new dimension for human computing power, especially at a time when computing power is becoming more expensive and difficult to expand.

The significance of 10,000 qubits has never been "more qubits", but rather a major breakthrough for quantum computing from being "stuck" to being "scalable".

This is the foundation for all quantum roadmaps and industrial plans in the next decade.

Rewriting the Spatial Structure of Quantum Processors

It's impossible to increase the number of qubits in a processor from 100 to 10,000 by simply "stacking".

QuantWare's breakthrough lies in the fact that it doesn't just increase the number of qubits, but completely changes the "spatial structure" of the quantum processor.

3D Scaling: Unleashing the Third Dimension of Quantum Processors

The problem with traditional quantum chips is that control lines must be routed in from the chip's edge while the qubits sit in the middle, which means longer wiring and stronger electromagnetic interference.

The academic community calls this the "fan-out limit", and there has never been a clean solution.

QuantWare's 3D scaling allows control lines to enter the quantum chip from multiple layers and directions, effectively expanding the wiring from a two-dimensional plane into a three-dimensional structure.

This has three direct results:

  • The connections between qubits are shorter, and the noise is lower.
  • The control lines are no longer crowded at the edge of the chip.
  • The expansion space is no longer limited by the planar area.

In short, they have added an additional dimension of space to the quantum processor.
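The fan-out limit can be made concrete with a little arithmetic: in a 2D edge-routed chip, the number of control lines needed grows with the number of qubits (the chip's area), while the room for lines grows only with the chip's perimeter, so demand inevitably outruns supply. A minimal sketch, where the per-qubit line count and both pitches are illustrative assumptions, not real device parameters:

    # Illustrative fan-out arithmetic (all constants are assumptions).
    LINES_PER_QUBIT = 3     # e.g. drive + flux + readout per qubit (assumed)
    LINE_PITCH_UM = 20.0    # minimum spacing of edge-routed lines (assumed)
    QUBIT_PITCH_UM = 100.0  # qubit spacing in a square grid (assumed)

    def max_qubits_edge_routed() -> int:
        """Largest square grid whose control lines still fit along the edge.

        An n x n grid needs 3 * n^2 lines, but the perimeter only holds
        4 * n * (QUBIT_PITCH_UM / LINE_PITCH_UM) lines, so demand (O(n^2))
        outgrows supply (O(n)) and there is a hard ceiling.
        """
        n = 1
        while True:
            needed = LINES_PER_QUBIT * n * n
            available = 4 * n * (QUBIT_PITCH_UM / LINE_PITCH_UM)
            if needed > available:
                return (n - 1) ** 2
            n += 1

    print(max_qubits_edge_routed())  # 36: a few dozen under these assumptions

Routing in 3D removes the perimeter term entirely: line supply then grows with area, the same way demand does.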

Chiplet Architecture: Building "Small Blocks" Instead of Large Chips

Chiplets are not a concept unique to quantum computing; AMD and Intel have already used them for performance breakthroughs in classical chips.

AMD's Zen architecture combines multiple small dies into a large CPU.

QuantWare brings this mature chiplet concept to the quantum field: split a large QPU into multiple modules, keep each at high fidelity, then combine them into a complete system through high-quality interconnects.

This sidesteps the problems of traditional monolithic chips, where manufacturing gets harder, errors multiply, and yield falls as size grows.

The chiplet architecture lets quantum processors be manufactured, calibrated, and repaired module by module, and expanded just as flexibly.

Most importantly, the connections between modules are no longer a source of noise amplification.

QuantWare emphasizes "high-fidelity chip-to-chip connections". In the past, combining QPUs degraded system quality; now, combining QPUs lets the system keep scaling.
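How clean a chip-to-chip link has to be before modularity stops hurting can be estimated with the same crude product-of-fidelities model used in the networking example above; all numbers are assumptions for illustration, not QuantWare specifications.

    # Illustrative: when does a modular (chiplet) QPU match a monolithic one?
    def modular_fidelity(gates: int, cross_fraction: float,
                         gate_f: float = 0.999, link_f: float = 0.999) -> float:
        """Product-of-fidelities estimate for a circuit on a chiplet QPU.

        cross_fraction: share of gates that cross a module boundary.
        """
        cross = int(gates * cross_fraction)
        return gate_f ** (gates - cross) * link_f ** cross

    # If links are as clean as on-chip gates, the modularity penalty vanishes:
    print(modular_fidelity(1000, 0.05, link_f=0.999))  # ~0.37, like one big chip
    # With noisy links, the old "networked QPU" penalty returns:
    print(modular_fidelity(1000, 0.05, link_f=0.95))   # ~0.03

In this toy model, high-fidelity interconnects are precisely what separates a chiplet QPU from the old "pretend to scale" networks.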

The industry has obtained a replicable expansion path for the first time.

40,000 I/Os: Breaking Through the Qubit Expansion Problem

The number of I/Os has always been one of the fundamental reasons why quantum chips cannot be expanded.

QuantWare's architecture supports 40,000 I/O control lines, an orders-of-magnitude leap and the industry's first case where "expanding I/Os does not immediately lead to a noise-induced collapse".

If 3D scaling provides space and the chiplet architecture provides modularity, then 40,000 I/Os are the infrastructure.

Without this number, 10,000 qubits simply won't work.
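The link between the I/O count and the qubit count is simple division. Assuming each superconducting qubit needs a few dedicated lines (the per-qubit counts below are generic assumptions, not QuantWare's published wiring scheme):

    # Illustrative I/O budget: how many qubits 40,000 lines can support.
    TOTAL_IOS = 40_000
    for lines_per_qubit in (2, 3, 4):  # assumed lines per qubit (drive/flux/readout)
        print(f"{lines_per_qubit} lines/qubit -> {TOTAL_IOS // lines_per_qubit} qubits")
    # 2 lines/qubit -> 20000 qubits
    # 3 lines/qubit -> 13333 qubits
    # 4 lines/qubit -> 10000 qubits

At roughly four lines per qubit, 40,000 I/Os is exactly the budget a 10,000-qubit processor needs.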

Achieving 100× in the Same Volume: A True Engineering Singularity

When QuantWare announced the product, it emphasized that the 10,000-qubit chip is "smaller" than existing systems.

In engineering, this is very rare: increasing the scale while reducing the volume indicates that the architecture has entered the "scalable region".

Every time Google and IBM expand the number of qubits, their systems become larger, more difficult to manage, more fragile, and more expensive.

In contrast, QuantWare achieves a larger scale, smaller volume, higher yield, and lower noise.
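A quick check of what "larger scale, smaller volume" implies for density; the volume ratio here is an assumption, since QuantWare only says the system is smaller, not by how much:

    # Illustrative density arithmetic (volume ratio is assumed).
    qubit_ratio = 10_000 / 100   # 100x more qubits than a 100-qubit system
    volume_ratio = 0.5           # assumption: half the volume of today's systems
    print(qubit_ratio / volume_ratio)  # 200.0 -> a ~200x jump in qubit density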

QuantWare's real breakthrough, then, is turning an architecture in which qubit scaling was bound to fail into a hardware system in which it can continue.

This is the first time in a decade that a "scalable hardware system" has emerged in quantum computing, and it is the first engineering path that allows the industry to see a real future.

The Door Has Been Opened, and NVIDIA Is Standing at It

After the scale barrier of quantum computing is broken, the industry immediately faces a more practical problem:

10,000 qubits can perform calculations, but how can those calculations be integrated with the classical computing-power system in the real world?