
Eric Schmidt: In this wave of AI, Google's past cannot be ignored.

AI Deep Researcher | 2026-03-26 09:54
Eric Schmidt on AI Transformation: Google's Past and the Reconstruction of Future Work

The best way to foresee the technological trends in the next decade is to listen to the person who once "dealt the cards" at the main table.

As the former CEO of Google who helmed the company for a decade (2001-2011), Eric Schmidt witnessed numerous moments of technological breakthrough. He was present for the entire process by which cutting-edge technologies like the Transformer, the TPU, and AlphaGo were incubated in the lab and gradually became the foundation of Google's business system.

In this conversation, Schmidt revealed many little-known past events.

Investments once dismissed by outsiders as "burning money" eventually proved to be crucial strategic moves. Experimental projects once on the margins later reshaped the landscape of the entire technology industry.

Schmidt's recollection of the past is not merely a review of history.

He wants to make one thing clear: to understand the endgame of today's AI revolution, one must first understand the path Google took back then.

Section 1 | Google's Past: A Puzzle No One Understood at the Time

Looking back more than a decade, Google too went through a period of obscurity when it was not understood.

What the outside world saw at the time were breakthroughs that seemed cutting-edge but lacked clear application scenarios. In 2014, Google spent $600 million to acquire DeepMind. The skepticism was extremely harsh: why pour so much money into an AI team with zero revenue that only knew how to play Go?

It was only a few years later that people realized that merely optimizing the air-conditioning systems of Google's data centers with DeepMind's AI had already recouped the cost of the acquisition.

In 2016, Eric Schmidt flew to South Korea to watch AlphaGo's man-versus-machine "match of the century". Backstage at the venue, the South Korean camp's room was full of confidence, vowing to defeat Google. The Google control room Schmidt entered, by contrast, was unusually quiet. A single screen showed real-time win-rate predictions: 50%, 51%, 52%... The system architect, David, told him: "Our plan is to make this number approach infinity."

When the match ended, the South Korean Go player shed tears on the spot. However, the DeepMind team's reaction was extremely calm: "AlphaGo just did what it was supposed to do."

Eric later recalled, "Welcome to the new era."

But Go was far from the end. After conquering Go, the same team got bored and turned to protein folding. At the time, some researchers joked that no one would have associated playing Go with solving biological problems. The result? With the birth of AlphaFold, research that once took human scientists four years was compressed into a single hour.

These things seemed independent of each other at that time. Looking back now, we can see that they were gradually forming a complete set of capabilities.

The Transformer architecture completely changed the way machines process language and information. The TPU provided larger-scale, more efficient computing power, while DeepMind continuously explored the boundaries of machine reasoning and self-optimization. These threads evolved independently at first but eventually converged, forming different dimensions of the same underlying capability. Nor did these capabilities stay in the lab: after protein folding, they entered search, advertising, and cloud computing, becoming the foundation of Google's entire business system.

Eric Schmidt recalled this experience and said: When these things were happening, most people actually didn't know where they would lead.

This is exactly the defining feature of that incubation period. On the surface, it looked like a series of single-point technical leaps; in fact, the way the industry operates was being quietly rewritten.

Looking at the current AI wave from this perspective, you can understand his current judgment: Many innovations may only seem like "useful" tools for now, but once they are combined with each other and take one more step forward, they will bring about a systematic reconstruction of the enterprise operation model.

Section 2 | This Round of Qualitative Change: The Full Transfer of Execution Power

If Google's past period of obscurity was about laying the foundation for the technological base, then this round of change has directly subverted our "workflow".

Eric Schmidt gave a very representative example: Now, a programmer only needs to clearly define the requirements and design the test cases, and then can fully hand over the task to the system. He doesn't need to monitor the whole process and can go for a meal or take a rest. When he turns on the computer the next morning, the complete code result is already waiting for him.

Schmidt marveled that the same amount of work once took 10 Google programmers six months of hard effort; now, while everyone sleeps, the system finishes the whole job by 4 a.m.
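The hand-off Schmidt describes, where the human writes only the requirement and the acceptance tests and the system produces the implementation, can be sketched in miniature. Everything below is hypothetical: `generate_implementation` stands in for whatever coding agent is used, and the toy "slugify" task is invented for illustration.

```python
import re

def acceptance_tests(impl) -> bool:
    """Human-authored acceptance criteria for a hypothetical 'slugify' task."""
    return (
        impl("Hello World") == "hello-world"
        and impl("  spaces  ") == "spaces"
        and impl("A--B") == "a-b"
    )

def generate_implementation():
    """Stand-in for a coding agent. A real agent would iterate,
    regenerating code until acceptance_tests() passes; here one
    candidate is hand-coded to keep the sketch self-contained."""
    def slugify(text: str) -> str:
        # Lowercase, trim, and collapse runs of spaces/hyphens to one hyphen.
        return re.sub(r"[\s-]+", "-", text.strip().lower())
    return slugify

candidate = generate_implementation()
# The human's remaining role: check the verdict, not the code.
print("accepted" if acceptance_tests(candidate) else "rejected")
```

The design point matches the article's claim: once the goal and the evaluation system are explicit, the human touches only the two ends of the pipeline.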

The daily work of programmers in the Bay Area is also experiencing this reversal. In the past, it was "humans write 80% of the code and machines complete the remaining 20%"; now it has become "humans write 20% and the remaining 80% is handed over to machines".

Facing the evolution speed shown by these systems, Schmidt himself admitted that he had a strong sense of crisis: "Oh my god, I'm going to be out of a job. It seems that there's nothing I can do that it can't." He pointed out that even those who develop these systems have noticed the depreciation of their traditional skills and are being forced to transform from "code writers" to "AI system managers".

Taken together, these phenomena go far beyond simple "cost reduction and efficiency improvement".

The old human-machine cooperation model was: you write a part, it fills in a part; you ask a question, it answers. Humans drove the whole process and had to be involved at every step.

Now the way of working has changed: people set the goals and standards, and the intermediate process completes automatically. The work splits into two parts, defining the problem and executing the process. The former remains in human hands; the latter is being handed over.

This paradigm shift is not limited to software development.

When a task can be clearly described and has a clear acceptance standard, it may be completed by AI. Whether it is interface design, content generation, or data analysis and testing, as long as the goals and evaluation systems are clear enough, the process no longer requires repeated human intervention.

Schmidt even described a more radical scenario: Without restrictions, a team can wake up thousands of "AI Agents" at the same time. They don't need to rest, have no communication costs, and won't slow down the progress due to scheduling. At this time, the only threshold becomes: whether you have the ability to issue clear instructions and whether you have enough resources to keep them running continuously.

When the "execution power" is transferred, people's positions in the workplace naturally change. The focus of work has changed from "doing things personally" to "clearly stating what is needed and checking if it is done correctly".

So the core impact of this round of change is not whether a specific skill will be eliminated, but that the entire workplace workflow is being redefined and re-divided.

Section 3 | Next: Three Definite Evolution Directions

If one only sees that "AI can complete things on its own", it is easy to understand this as just a tool upgrade. However, looking at Google's path in the past, after the capabilities start to be stably released, what really changes the industry is often not the technology itself, but the entire business ecosystem reconstructed around it.

The first change is that computing power resources have become a new threshold

Currently, a standard data center usually requires 400 megawatts of electricity and can stretch up to half a mile long. In Schmidt's words: "These data centers are essentially huge air-circulation systems. They suck in air, expel air, and cool with air in between. There is also a water-circulation loop inside to keep the chips cool. Each chip draws 2 kilowatts of power. Working in such an extreme environment can even be fatal."

Such physical scale corresponds to astronomical investment. Building 1 GW (gigawatt) of computing power requires tens of billions of dollars. By 2030, the computing power gap in the United States may reach 92 GW. More importantly, the more advanced the technology and the richer the application scenarios, the greater the demand for computing power. In the future, whoever can own and reliably dispatch these heavy-asset infrastructures will hold the initiative at the table.
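The figures quoted here can be sanity-checked with back-of-the-envelope arithmetic. The per-gigawatt cost below is a hypothetical midpoint of the article's "tens of billions of dollars"; the 92 GW gap is the article's own projection, not an independently verified number.

```python
# Back-of-the-envelope check on the article's computing-power figures.
COST_PER_GW_USD = 30e9    # hypothetical midpoint of "tens of billions" per GW
US_COMPUTE_GAP_GW = 92    # projected US compute gap by 2030 (from the article)

implied_capex = COST_PER_GW_USD * US_COMPUTE_GAP_GW
print(f"Implied build-out cost: ${implied_capex / 1e12:.2f} trillion")
# prints: Implied build-out cost: $2.76 trillion
```

Even at a lower-bound $20B per GW the total stays in the trillions, which is the scale behind Schmidt's point about who can afford a seat at the table.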

The second change is the polarization of the industry landscape

Tech giants with abundant funds and resources will build ultra-large-scale computing power through long-term investment and eventually transform into underlying platforms providing services externally. At the same time, more and more small and micro teams, even "super individuals", can use these platforms to complete business that used to require an entire team.

The hardest squeeze falls on the mid-sized enterprises in between. They lack the resources to build foundational capabilities and cannot match small teams on efficiency; their room to survive will be steadily compressed until they are eliminated. The process closely echoes Google's past: once the infrastructure is in place, the industry quickly concentrates at the two ends.

The third change is the re - evaluation of personal value

Schmidt mentioned that in Silicon Valley, in the past, the value of the top programmers was always 10 times that of ordinary programmers. Now, this multiple may be even larger. It's not because they write code faster, but because they know better how to set goals, how to verify results, and how to break down complex tasks.

"These very few people will become extremely expensive," Schmidt concluded. "Because these powerful systems still ultimately need top - notch human brains to operate."

Looking at these three changes together, the future survival rule is clear: The resources you have determine the size of your business, your ecological position in the industrial chain determines how long you can survive, and your personal ability determines how many AIs can work for you at the same time.

Why does Schmidt always mention Google?

The core of the Internet wave was to solve the problem of "finding information"; while the core of the current AI wave is to reconstruct "how to work".

Conclusion | Not Just a Quantitative Change in Efficiency

When Google spent a huge amount of money to acquire DeepMind back then, the outside world generally couldn't understand it.

When AlphaGo defeated the world's top human Go players, most of the public regarded it as little more than a spectacle for geeks.

It was only when these capabilities fully penetrated into search, advertising, and cloud computing that people suddenly realized: Those were not just a few isolated experimental projects, but Google was quietly reshaping its business core.

Now, the same scenario is playing out again.

Only this time, what is being completely subverted is no longer "how to obtain information", but "how to complete work".

At this moment, most people are still unaware and are still talking about so - called "efficiency improvement".

📮 Original Article Links:

https://www.youtube.com/watch?v=DpwmmXmzvfo

https://www.youtube.com/live/eYUYdpG4UT8

https://fortune.com/2026/03/06/eric-schmidt-former-google-ceo-big-tech-data-centers-grid-ai-utility-bills/

https://www.amnh.org/explore/videos/isaac-asimov-memorial-debate/2026

This article is from the WeChat official account "AI Deep Researcher". Author: AI Deep Researcher. Published by 36Kr with authorization.