
The quantum computing revolution is on the horizon: IBM's new chip could solve a millennium-old problem

Friends of 36Kr · 2025-11-13 14:05
IBM unveils the quantum chips Loon and Nighthawk, breaking through in fault-tolerant quantum computing and moving toward engineering reliability.

An IBM researcher holds a wafer of the company's Loon experimental quantum processor.

On November 12 local time, while artificial intelligence continues to dominate the technological narrative, another, deeper computing revolution is approaching. It aims to solve an age-old "millennium problem" that humanity has long pursued: enabling computers to think in parallel the way nature does.

The philosophical root of this "millennium problem" can be traced back to antiquity: when humans first pondered whether reason could simulate nature, the concept of computation quietly took shape. In modern times it has become a scientific proposition: how can machines reproduce the endless variety of the complex world within limited time and energy? The logic of traditional computers is built on determinism and binary states, while the real world follows the uncertain laws of quantum superposition and probability. The gap between the two is the boundary that computational science has failed to cross for hundreds of years.

On November 12, IBM launched two experimental quantum chips, the Loon processor and the Nighthawk chip, in Albany, New York. Researchers said the breakthrough may enable machines to perform calculations based on the laws of quantum physics for the first time, addressing complex problems that traditional computing cannot solve. This is not only a technological innovation; it may also mark that the distance between humanity and the "millennium computing problem" has shrunk to a visible scale.

Jay Gambetta, the director of IBM Research, said at the press conference, "We are trying to do something that no one has ever done before, which is to make the quantum system operate continuously despite its imperfections." The statement describes more than a technical goal; it reads like a formal reply to the millennium computing problem: to make machines compute nature in nature's own way.

01. IBM Chip Breakthrough: From Theory to Feasibility

IBM's Nanoscale Science and Technology Laboratory in Albany, New York.

The history of quantum computing is almost parallel to that of modern physics. From the Schrödinger equation to the Turing machine model, scientists have been searching for a way to transform the complexity of nature into computable logic. However, it was not until the 21st century, with the rapid rise of quantum information science, that this ideal found a real-world path.

The core goal of IBM's newly launched Loon and Nighthawk chips is to pave the way for a "fault-tolerant quantum computer." "Fault-tolerant" means that even when computational errors occur in the system, it can preserve the accuracy of the results through self-correction. This has always been the biggest obstacle keeping quantum computing from moving out of the laboratory and into the real world.

The Loon processor is IBM's experimental verification platform in this direction. It demonstrates the core components required for large-scale quantum computing and achieves control and isolation of errors at the hardware level. The Nighthawk chip further optimizes the structure of quantum gates, enabling the system to perform more complex computing tasks. It is considered a key node on IBM's roadmap toward a "universal quantum computer."

"Previous quantum chips could only operate briefly in extremely low - temperature environments," Gambetta explained. "But the Loon and Nighthawk prove that we can now make the system perform continuous calculations in the presence of errors and noise - which means that the fault - tolerant architecture is no longer just a theory."

IBM said that this achievement indicates that quantum computing is entering a new stage from "physical feasibility" to "engineering reliability." Arvind Krishna, the CEO of the company, pointed out in a statement, "If artificial intelligence has taught computing to think, then quantum computing will teach it to deduce the future."

02. Solving the Millennium Puzzle: The Principles and Potential of Quantum Computing

The "millennium problem" does not refer to a literal millennium but symbolizes humanity's exploration of the essence of computing since ancient times. From the astronomical algorithms of ancient Babylon to Leibniz's binary system, from the Turing machine to supercomputing, humans have always relied on deterministic logic to describe the world. However, the operating laws of the quantum world are different: particles can exist in multiple states simultaneously, can be "entangled" as a whole, and can collapse upon observation.

What quantum computing is trying to answer is the question that has puzzled scientists for hundreds of years: how to make machines calculate certainty in an uncertain environment?

Unlike traditional computers, which rely on the binary logic of bits (0 and 1), a quantum computer's core unit is the qubit. A qubit can exist in a superposition of "0" and "1," meaning it can represent multiple possibilities simultaneously. This property gives quantum computers exponential parallel processing capability.

Anand Natarajan, a professor at the Massachusetts Institute of Technology, explained vividly, "Imagine a spinning coin. A classical bit is the heads or tails when the coin lands, while a qubit is in the spinning state - it is both heads and tails."
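
To make the analogy concrete, a qubit's state can be described by a pair of complex amplitudes (α, β) with |α|² + |β|² = 1; measuring it returns 0 with probability |α|² and 1 with probability |β|², and describing n qubits classically requires 2^n such amplitudes, which is where the claim of exponential parallelism comes from. The snippet below is a minimal illustrative sketch of that bookkeeping in plain Python, not IBM's software or any real quantum SDK.

    import random
    from math import sqrt

    def measure(alpha: complex, beta: complex) -> int:
        """Collapse the state alpha|0> + beta|1> to a classical bit."""
        p0 = abs(alpha) ** 2                      # probability of reading 0
        return 0 if random.random() < p0 else 1   # otherwise read 1

    # An equal superposition: the "spinning coin" in Natarajan's analogy.
    alpha = beta = 1 / sqrt(2)
    samples = [measure(alpha, beta) for _ in range(1000)]
    print(samples.count(0), samples.count(1))     # roughly 500 / 500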

This characteristic enables quantum computing to complete, in a very short time, tasks that would take traditional computers thousands of years. For example, the pharmaceutical industry can use quantum computing to simulate drug reactions at the molecular level, significantly shortening the R&D cycle of new drugs; materials science can design lightweight alloys or high-temperature superconducting materials at the atomic scale; financial institutions can use quantum algorithms to compute the pricing or risk models of complex derivatives in real time; climate research can simulate changes in the Earth system to predict extreme weather patterns.

A McKinsey report found that 72% of technology executives and investors believe fault-tolerant quantum computing will be commercialized by 2035. Once realized, it could become humanity's most disruptive technological breakthrough since AI.

However, the potential of quantum computing coexists with challenges. The "superposition state" of qubits is extremely fragile. Temperature changes, magnetic field fluctuations, or even a weak ray of light can cause it to "collapse," leading to computational failure. As an IBM scientist said, "Shaking a table is enough to render an entire quantum computer useless."

This is the background for the birth of the Loon processor. It does not pursue absolute perfection but uses mathematical and engineering means to enable the system to "survive in the presence of errors." This shift from "pursuing perfection" to "tolerating imperfection" is the essential leap in quantum engineering.
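
The spirit of "surviving in the presence of errors" can be shown with the oldest trick in error correction: store information redundantly and vote out occasional faults. Actual fault-tolerant quantum codes, including whatever runs on chips like Loon, are far more sophisticated, because quantum states cannot simply be copied and read out; the sketch below is a purely classical, illustrative analogue of the principle.

    import random

    def encode(bit: int) -> list[int]:
        return [bit, bit, bit]                       # store the logical bit three times

    def noisy_channel(code: list[int], p_flip: float) -> list[int]:
        return [b ^ (random.random() < p_flip) for b in code]   # each copy flips with prob p_flip

    def decode(code: list[int]) -> int:
        return 1 if sum(code) >= 2 else 0            # majority vote fixes any single flip

    trials, p = 100_000, 0.05
    failures = sum(decode(noisy_channel(encode(0), p)) != 0 for _ in range(trials))
    print(failures / trials)   # about 0.7%, versus a 5% raw error rate per copy

The logical failure rate drops from p to roughly 3p²: the same "spend redundancy, buy reliability" trade that fault-tolerant quantum architectures carry much further.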

03. The Intensifying Global Quantum Race

Researchers check the cryostat used to cool quantum computing chips at Google's Quantum AI Laboratory.

IBM's breakthrough has undoubtedly intensified this global "quantum arms race." In the past two years, Google, Microsoft, Intel, and research institutions in Europe and China have successively announced new progress in quantum projects, striving to gain the upper hand in the next computing era.

· Google (Google Quantum AI)

In late 2024, Google launched the Willow quantum chip, claiming that it completed in five minutes a benchmark computation that would take a leading supercomputer on the order of 10^24 years. The chip's core design tackles errors at scale: as more qubits are added to the error-correcting code, the error rate is suppressed rather than amplified.

· Microsoft (Microsoft Quantum)

In early 2025, Microsoft announced the development of the Majorana 1 chip, which uses topological quantum materials and can theoretically produce more stable qubits. Microsoft said that this is the result of "creating a new form of matter," which can significantly extend the lifespan of quantum information.

· Emerging Forces

The startup Quantinuum is collaborating with BMW Group and Airbus to use quantum algorithms to improve fuel cell efficiency; 1QBit is working with Accenture Labs and Biogen to explore the application of quantum simulation in drug design.

The common goal of these projects is to bring quantum computing from the laboratory into real-world industry. As Professor Natarajan of MIT put it, "Future laboratories may no longer need test tubes and microscopes but simulation environments running on quantum platforms."

Meanwhile, governments around the world are also getting involved. Reports suggest that some quantum companies are discussing a cooperation model with the US Department of Commerce in which they would exchange equity for government funding. Although officials have not confirmed this, it shows that quantum technology has been elevated to a national strategic asset.

04. From Breakthrough to Application: The Critical Point of the Quantum Era

Although IBM's Loon and Nighthawk represent significant progress in quantum computing, reaching a true "quantum era" still requires crossing three thresholds: technological, economic, and ethical.

Firstly, the technological threshold. Quantum computers need to operate in an environment close to absolute zero (-273°C), which makes their large-scale deployment extremely expensive. In addition, indicators such as system stability, quantum gate depth, and qubit connectivity are still limited. IBM plans to launch a commercial-grade system with thousands of qubits before 2030 and to build an open cloud platform allowing research institutions and enterprises to remotely access quantum computing power.

Secondly, the economic threshold. McKinsey estimates that annual investment in the global quantum computing field has exceeded $7 billion, but a stable profit model has yet to emerge in the short term. Companies such as IBM and Google are adopting a "Quantum-as-a-Service" business model, opening algorithm platforms early to provide experimental computing power for pharmaceutical, financial, and materials companies.

Thirdly, the security and ethical threshold. Once quantum computing reaches a critical capability, existing encryption schemes such as RSA and elliptic-curve cryptography could be broken in minutes, because algorithms like Shor's allow a sufficiently large quantum computer to solve the underlying factoring and discrete-logarithm problems efficiently. The US National Institute of Standards and Technology (NIST) has launched a program to develop "post-quantum encryption standards" to head off future risks. "Whoever first has quantum computing capabilities will have the power to redefine network security," Natarajan warned.
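
For orientation, the threat can be stated asymptotically: for an RSA modulus N, the best known classical attack (the general number field sieve) is sub-exponential in the size of N, while Shor's quantum algorithm is polynomial. The standard textbook estimates are, roughly,

    T_{\mathrm{GNFS}}(N) \approx \exp\!\Big( \big(\tfrac{64}{9}\big)^{1/3} (\ln N)^{1/3} (\ln\ln N)^{2/3} \Big),
    \qquad
    T_{\mathrm{Shor}}(N) = O\big( (\log N)^{3} \big),

so the advantage grows without bound as key sizes increase, which is why post-quantum schemes change the underlying hard problem rather than simply lengthening keys.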

IBM's researchers may well see Loon and Nighthawk not merely as the birth of individual products but as a restart of human computing civilization. From the abacus to the transistor, from AI to the qubit, humanity has repeatedly pushed back the limits of computation. Now, quantum computing lets us touch for the first time a future in which "time and complexity are no longer obstacles."

Sridhar Tayur, a professor at Carnegie Mellon University, said, "Today we are still using spoons and forks for brain surgery, but the quantum computer will be the real scalpel." Perhaps, what will truly change the world is not AI that makes machines more like humans but quantum intelligence that allows machines to break free from human thinking constraints.

This article is from "Tencent Technology", author: Wuji. It is published by 36Kr with authorization.