
Jensen Huang's latest speech: In the next 10 years, the way you work will be completely changed.

Notesman (笔记侠) · 2025-10-01 16:56
What are the future trends in intelligence?

Notesman's note:

When asked about his prediction a year ago that "inference computing power would increase by a factor of one billion," Jensen Huang, the CEO of NVIDIA, candidly said in a recent interview, "I really underestimated it."

Behind this self-reflection is the fact that the AI revolution is surging forward at a pace far exceeding the most optimistic predictions.

In this interview, he laid out systematically, for the first time, the compounding effect of the three scaling laws driving AI development, and thoroughly explained the underlying logic behind the "double exponential growth" in computing power demand.

From revealing the strategic significance of the partnership with OpenAI's trillion-dollar-scale "Stargate" project, to declaring that "the era of general-purpose computing is over," to discussing the global "sovereign AI competition," Jensen Huang sketched a brand-new blueprint for how society and the economy will run over the next 10 years.

He believes that we are not experiencing a technological upgrade but facing a real industrial revolution. In the future, there will be billions of "AI colleagues" around the world.

Can we understand and adapt to the trend and board this accelerating train?

I believe that after reading today's content, you will gain a lot of inspiration.

I. What did Jensen Huang mainly talk about?

In this interview on September 25, 2025, Jensen Huang essentially said three things: the AI industrial revolution has arrived, we are prepared for it, and from here on, it comes down to each person's capabilities.

You may think the term "AI industrial revolution" is too grand, but Jensen Huang made it very clear in the interview: We are not waiting for the future; we are already standing at the starting line of a new industrial revolution.

The "steam engine" of this revolution is not an iron lump burning coal but an AI factory that can work autonomously.

His core view, to put it simply, is "one logic, two strategies, and one overall situation." Let's break it down.

1. One core logic: AI progress doesn't rely on "brute force" but on a "triple engine"

Previously, we always assumed that making AI smarter was just a matter of "piling up resources": with more data and enough computing power, the model would naturally get stronger. But Jensen Huang said that this idea is outdated. AI is now powered by three "engines" simultaneously, and the demand for computing power is skyrocketing, not like an ordinary rocket but like a "two-stage rocket."

The first engine: Pre-training (laying the foundation)

This is like a child going from primary school through university, reading every book in the library to understand "what the world is like." AI "swallows" all the information on the Internet, learns to speak, acquires knowledge, and picks up logic. This is the stage we are most familiar with, where "brute force works miracles."

The second engine: Post-training (practicing skills)

Here is the new part: it's not enough for AI to "graduate"; it also needs to attend a "practical training course."

Pre-training is like learning "stepping on the accelerator, turning the steering wheel, and looking at the rear-view mirror" when getting a driver's license. These are the basic rules of driving. But after getting the license, you have to practice repeatedly at rush-hour intersections on "how to change lanes without causing traffic jams," on rainy days on "how to brake without skidding," and even in narrow alleys on "how to reverse into a parking space in one go."

This kind of targeted practice is "post-training." Only when these responses become second nature can you be considered an experienced driver capable of handling complex road conditions.

The same goes for AI's post-training. After learning to "speak" in pre-training, the model practices specific tasks such as "writing code" and "doing design" over and over until it can complete them accurately. This step consumes a huge amount of computing power, because thousands of attempts are needed to find the optimal solution.
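To see why this practice loop eats so much compute, here is a minimal toy sketch: for each practice task, many candidate solutions are generated and only the best-scored one is kept as a training signal. Everything in it (the function names, the 1,000-attempts-per-task figure) is an illustrative assumption, not the actual pipeline of NVIDIA or any lab.

```python
import random

def generate_candidate(task):
    """Stand-in for the model proposing one solution (e.g., one version of the code)."""
    return {"task": task, "quality": random.random()}

def score(candidate):
    """Stand-in for a grader: unit tests for code, a rubric for a design, etc."""
    return candidate["quality"]

tasks = ["write code", "do design"]
ATTEMPTS_PER_TASK = 1000   # "thousands of methods need to be tried" (assumed figure)

total_attempts = 0
for task in tasks:
    candidates = [generate_candidate(task) for _ in range(ATTEMPTS_PER_TASK)]
    best = max(candidates, key=score)          # keep the best attempt as the learning signal
    total_attempts += len(candidates)
    print(f"{task}: best score {score(best):.2f} after {ATTEMPTS_PER_TASK} attempts")

print(f"Total model calls spent just on practice: {total_attempts}")
```

The point of the sketch is simply that almost all of the compute goes into the discarded attempts, which is why this stage is so expensive.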

 

The third engine: Inference (using the brain)

Previously, AI was like a "Baidu search," directly retrieving answers from memory when asked a question. Now, AI is like a "meeting of experts." When you ask a question, it will first "search for information," "verify facts," "ponder the logic," and even learn new things before giving you a reliable answer. And the longer it thinks, the higher the quality of the answer.

In the future, AI will not work alone but as a "smart team" composed of multiple models.

These three engines don't take turns; they accelerate simultaneously. On one hand, more and more people are using AI. On the other hand, the computing power consumed by each individual use of AI is also increasing. Multiply the two factors together, and the demand for computing power shoots up almost vertically.
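As a rough illustration of that multiplication, here is a back-of-the-envelope sketch. The starting values and growth rates are invented for illustration only; they are not figures from the interview.

```python
# Two factors, each growing on its own, multiplied together.
users = 100_000_000            # people using AI today (assumed)
compute_per_query = 1.0        # relative compute cost of one answer today (assumed)

user_growth = 1.5              # users grow 50% per year (assumed)
per_query_growth = 2.0         # "thinking" models double per-answer compute each year (assumed)

for year in range(1, 6):
    users *= user_growth
    compute_per_query *= per_query_growth
    total = users * compute_per_query
    print(f"Year {year}: total compute demand ≈ {total:,.0f} (relative units)")

# Each factor alone grows exponentially; their product grows even faster,
# which is the "double exponential" effect described above.
```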

This also explains why OpenAI is investing hundreds of billions to build the "Stargate" computing infrastructure. There simply isn't enough computing power!

2. Two strategic pivots: NVIDIA is not just selling chips; it's "building a racetrack"

Facing such explosive demand for computing power, NVIDIA didn't just focus on "making chips faster"; it made two "game-changing moves."

Pivot one: From "selling race cars" to "creating an entire racing ecosystem"

Jensen Huang said that NVIDIA is no longer just a chip company; it has become an "AI infrastructure contractor."

The platforms they update every year, such as Blackwell and Rubin, are not just chip upgrades. They rebuild models, algorithms, software, CPUs, GPUs, networks, and data centers together, and make sure all of these components fit seamlessly. He calls this "extreme co-design."

A bit hard to understand, right? Let's use an analogy.

Previously, NVIDIA made faster race cars (GPUs). Now, it designs everything: how to lay the racetrack, how to set the traffic rules, where to build the fuel stations, the entire racing ecosystem.

As a result, from Hopper to Blackwell (Note from Notesman: two important technical architectures in NVIDIA's GPU products), the performance has tripled in just one year!

Pivot two: What makes us better than competitors is "long-term cost savings"

Jensen Huang made a bold statement: "Even if our competitors give away their chips for free, customers will still choose us." This is not an empty boast. The core lies in the "total cost of ownership" (Note from Notesman: the sum of all costs associated with a product, service, or asset throughout its entire lifecycle. It includes not only the initial purchase price but also subsequent costs for use, maintenance, operation, and final disposal).

In the AI era, what data centers lack most is not money but electricity and space. Although NVIDIA's solution may be a bit more expensive, it can generate extremely high performance per watt of electricity.

With the same electricity bill, NVIDIA's solution can generate several or even dozens of times more AI results, and thus earn more money.

Let's do a simple calculation: If your electricity budget is fixed, using free chips can earn you $1 million, while using NVIDIA's paid chips can earn you $10 million. Which would you choose?

The answer is obvious. Free chips may seem cost-saving, but in fact, you'll lose $9 million in potential revenue. The opportunity cost is too high. So, customers would rather spend a bit more money to choose a more efficient solution.
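Here is the same "total cost of ownership" argument as a small sketch under a fixed power budget. All the numbers (the 10 MW budget, the prices, the performance ratio) are illustrative assumptions chosen to roughly match the $1 million vs. $10 million example above, not figures from NVIDIA.

```python
POWER_BUDGET_MW = 10          # the data center can draw at most this much power (assumed)

def yearly_profit(chip_price_m, perf_per_mw, revenue_per_perf_m, power_mw=POWER_BUDGET_MW):
    """Revenue is capped by the power budget, not by how cheap the chips are."""
    output = power_mw * perf_per_mw          # total AI output the site can produce
    revenue = output * revenue_per_perf_m    # dollars (in millions) earned from that output
    return revenue - chip_price_m

# Free chips: low performance per watt, so the fixed power budget yields $1M of revenue.
free_chips = yearly_profit(chip_price_m=0, perf_per_mw=1, revenue_per_perf_m=0.1)
# Paid chips: 10x performance per watt yields $10M of revenue; minus $2M for the chips.
paid_chips = yearly_profit(chip_price_m=2, perf_per_mw=10, revenue_per_perf_m=0.1)

print(f"Free chips: ${free_chips:.0f}M profit per year")   # $1M
print(f"Paid chips: ${paid_chips:.0f}M profit per year")   # $8M
```

Because electricity, not chip price, is the binding constraint, the more efficient (but more expensive) system still comes out far ahead.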

3. A global perspective: The AI market is about to explode, and every country wants to be in control

Based on the above logic, Jensen Huang sketched out an enormous market and pointed to a new trend.

Market size: First, "replace the old," then "create the new"

Jensen Huang said that "the era of general-purpose computing is over." Trillions of dollars' worth of traditional computing equipment around the world needs to migrate from CPUs to AI-driven, accelerated systems.

Just replacing the "recommendation engines" (the systems behind the short videos you watch and the ads you see) of companies like Google, Meta, and ByteDance, moving them from CPUs to GPUs, already represents a market worth hundreds of billions of dollars.

 

In addition, AI will layer new value on top of the roughly $50 trillion of global GDP generated by human work. For example, if a company spends $10,000 on an AI tool that doubles the productivity of an employee with an annual salary of $100,000, the return on investment is enormous.
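The arithmetic behind that example is straightforward; here it is spelled out, using the interview's own illustrative figures and a standard ROI formula.

```python
salary = 100_000        # employee's annual salary (from the example above)
tool_cost = 10_000      # annual cost of the AI tool (from the example above)
productivity_gain = 1.0 # "doubles efficiency" = +100% output (from the example above)

extra_output_value = salary * productivity_gain   # roughly one extra employee's worth of work
roi = (extra_output_value - tool_cost) / tool_cost

print(f"Extra value created per year: ${extra_output_value:,.0f}")  # $100,000
print(f"Return on the $10,000 tool: {roi:.0%}")                     # 900%
```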

Sovereign AI: Every country wants to "build its own AI base"

Jensen Huang sees it clearly: AI is now the "key" to a country's economy and security. Every country wants to develop "sovereign AI": building its own AI infrastructure and embedding its own culture, history, and data rules into it.

Just as every country needs its own power grid, communication network, and Internet, AI infrastructure has now become a "necessity." For NVIDIA, this is another trillion-dollar-scale new market, driven by governments around the world.

Attitude towards China: Don't underestimate China; cooperation is the smart choice

This interview also touched on the Sino-US technological competition. Jensen Huang was very straightforward: NVIDIA should be allowed to compete fairly in China, which benefits both China and the United States. He also warned against underestimating the innovative capacity of Chinese companies: China has some of the world's top entrepreneurs and engineers.

If the United States pursues "decoupling" and cedes the market to local Chinese companies, it's like throwing away its weapons before the battle even starts. It's a self-defeating move.

You see, Jensen Huang's logic is actually very simple: the three "engines" have made computing power scarce; NVIDIA seizes the opportunity through its "full-stack transformation" and its "cost-saving" edge; and now countries around the world are building "sovereign AI," which expands the market even further.

In this interview, he wasn't just talking about "how good our chips are"; he was talking about "how to play the future game."

II. Five trends in the next 10 years that we should remember

Okay, Jensen Huang has helped us understand the "new rules of the game." Now, here is the "roadmap for the next decade" he laid out directly. These are not wild science-fiction ideas but real trends that are already emerging.

1. Computing power becomes the "new electricity." The bigger the electricity bill you can afford, the more competitive you are

Previously, we judged a country's strength by how much coal it mined and how much steel it produced. Later, in the digital age, we judged by the extent of network coverage.

In the AI era, the criteria have changed: Now, it depends on how much "usable computing power" you have, especially the intelligent computing power that can run AI.

Jensen Huang once said a very crucial sentence: "NVIDIA's revenue is almost directly related to power consumption." This sentence is a bit difficult to understand and requires careful consideration.

Why "power consumption"? Because what costs the most in an AI data center is not buying equipment but paying the electricity bill.

You can think of a data center as a "factory that turns electricity into intelligence": the size of the electricity bill you can afford determines how much intelligence you can "produce."

Recently, Wu Yongming, the CEO of Alibaba, also said that by 2030, the total power consumption of Alibaba's data centers will increase tenfold. This is not just Alibaba's situation; it's the path every player that wants to develop AI is taking.

What does this mean? In the future, the competitiveness of a country or a company will not only depend on how many mines it has or how many people it has but on two points:

One is whether it can obtain cheap and stable electricity, and the other is whether it has AI facilities that can efficiently convert electricity into computing power.

Now, the global race for computing power and energy has already begun.

2. AI is no longer a "tool"; it will become your colleague

Previously, we thought of AI as just a "smarter calculator" that would do whatever you told it to. But the AI Jensen Huang talked about will have a "partnership" with you, which is a huge change.

First, the way AI works has changed: from "following orders" to "actively finding solutions." Now, AI is not a single model working alone but a group of models working together. It will search for information, conduct research, and handle complex tasks on its own. For example, if you ask it to write a proposal, it won't just string words together at random. It will first search for industry data, analyze competitors, and then come up with ideas for you.

This is like a calculator that could only do addition, subtraction, multiplication, and division in the past but has now become a researcher who can help you analyze problems.

The most practical example is NVIDIA itself. Jensen Huang said that every software engineer and every chip designer in the company works with AI, a 100% coverage rate. As a result, chips are designed better, shipped in greater volume, and delivered faster.

Let's imagine: in the next 10 years, an "AI colleague" for white-collar workers will go from a "perk" to "standard equipment." Think about it. For an employee with an annual salary of $200,000, if the company spends $20,000 to equip him or her with an AI assistant, efficiency doubles. Which boss would refuse such a deal?

In the future, the way we work will also change: previously, we stayed up late digging for information ourselves; in the future, AI will organize it for you. Previously, we agonized over how to write a proposal; in the future, AI will hand you three drafts first. Making good use of AI will therefore be how employees survive and grow in the workplace of the future.