Altman and Nadella's First Conversation After a Difficult Restructuring: "We're a Perfect Match"
Sitting in front of the camera, 40-year-old Altman couldn't hide his smile, the kind that usually appears only when he talks about his newborn son. But at this moment, another "child" was making him just as delighted: computing power.
Microsoft CEO Satya Nadella joked, "Whenever Altman talks about his newborn baby or about computing power, a smile appears on his face."
This scene took place during a highly anticipated conversation.
Altman and Nadella, two tech leaders, took the stage together once more to celebrate a milestone agreement finalized this week: a deep partnership aimed at reshaping the future of AI.
Who could have imagined that a seemingly bold bet six years ago would grow into one of the most eye-catching alliances in today's tech industry?
It all dates back to 2019. At that time, OpenAI was just a non-profit organization focused on AI research, in urgent need of a large amount of capital and computing power to train increasingly large models.
Microsoft extended an olive branch in a timely manner: a $1 billion investment and Azure cloud computing resources.
It was a groundbreaking bet at the time, and even within Microsoft's board there were doubts for a while.
Nadella, however, had a discerning eye: he saw a new dawn in natural language processing, carrying forward Bill Gates' long-standing obsession with natural language interfaces.
This investment allowed OpenAI to go all out in the field of deep learning.
As Altman later said, "Without Satya's determination back then, we wouldn't be where we are today."
Six years later, both sides joke that this cooperation "has no script and is all based on belief," yet it has yielded amazing results.
An Unprecedented Structure
A Win-Win for Profit and Public Good
The new agreement announced this week marks the next stage in the unique relationship between Microsoft and OpenAI.
According to the agreement, OpenAI has officially completed its restructuring: under the non-profit foundation, OpenAI Group Public Benefit Corporation (PBC) was established as a for-profit entity.
The non-profit OpenAI Foundation holds OpenAI shares worth up to $130 billion, becoming one of the largest charitable funds in the world.
This design aims to solve the core contradiction in AI development: on one hand, continuous investment from commercial capital is needed to push the technological limits; on the other hand, it is necessary to ensure that the final results benefit all of humanity rather than just a few.
The OpenAI Foundation announced that it will initially allocate $25 billion in funds to two areas: healthcare and AI security and resilience.
This huge sum of money will be used to tackle diseases and diagnostic challenges, as well as to address the social challenges brought about by AI development (such as cybersecurity and preventing the misuse of AI).
Altman is very enthusiastic about this.
He pointed out that market forces alone will not drive research investment in these areas, yet advanced AI has the potential to accelerate breakthroughs in the life sciences and strengthen society's resilience to risk, so someone has to take on the task.
He said, "If we can use AI to discover new drugs, cure diseases, and make these discoveries available to all of humanity, it will be a remarkable thing."
OpenAI's innovative "two-tier structure" offers a path: the public-benefit foundation on top controls the direction and safeguards the greater good, while the for-profit company below can draw "ammunition" from the capital markets to drive AI R&D and product deployment at full speed.
Even the California Attorney General's Office, after reviewing the arrangement, said it did not object to the structure, a testament to how novel it is.
What does this mean for Microsoft?
As OpenAI's closest strategic partner and largest external shareholder, Microsoft's rewards are no longer just the book appreciation of its financial investment.
Microsoft currently holds approximately 32.5% of OpenAI's shares (worth about $135 billion).
A stake of that size effectively binds the fates of the two companies together.
Nadella said excitedly:
Since 2019, we have shared a vision with OpenAI—to develop AI in a responsible manner and make it benefit the public.
This relationship, which started with research investment, has now grown into one of the most successful partnerships in the industry.
Altman even said bluntly, "I really hope Satya can earn a trillion dollars from this investment!"
The half-joking, astronomical figure conveys both gratitude for Microsoft's early support and full confidence in OpenAI's future value.
Exclusive on the Cloud
Azure Becomes the New Engine of AI
This agreement also clarifies the division of labor and understanding between the two sides in terms of products and the market.
According to the new agreement, the APIs of OpenAI's current and future most powerful models, including any artificial general intelligence (AGI) models, will be deployed exclusively on the Microsoft Azure cloud platform for the next seven years, serving compute to outside customers.
If other large technology companies (such as Google and Amazon) want direct access to OpenAI's top-tier model capabilities, they can get it only through Azure; they cannot host these models on their own clouds.
This exclusive arrangement will last until 2032, unless OpenAI achieves AGI first and it is verified by an independent expert committee.
Such an exclusive strategy undoubtedly makes Azure a "holy land" for AI developers.
In the past year, countless startups and large enterprises have deployed their businesses on Azure in order to use the powerful capabilities of ChatGPT.
This has directly promoted the rapid development of Microsoft's cloud service business.
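For developers, "access through Azure" has a concrete meaning: API calls to OpenAI models go to an Azure OpenAI endpoint rather than to another cloud. Below is a minimal sketch, assuming the official openai Python SDK; the endpoint, key, and deployment name are hypothetical placeholders, not real values.

```python
import os
from openai import AzureOpenAI  # Azure-flavored client from the official OpenAI SDK

# Hypothetical Azure OpenAI resource; endpoint, key, and deployment name are placeholders.
client = AzureOpenAI(
    azure_endpoint="https://my-company.openai.azure.com",
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

response = client.chat.completions.create(
    model="gpt-4o",  # the name of the Azure *deployment*, configured in the Azure portal
    messages=[{"role": "user", "content": "Draft a one-paragraph product update for our customers."}],
)
print(response.choices[0].message.content)
```

The call shape is the same as OpenAI's own API; what the exclusivity clause governs is where that endpoint is allowed to live.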
In the first quarter of fiscal year 2026, revenue from Microsoft's Intelligent Cloud segment grew 27% year-on-year, with Azure the standout contributor and its growth rate far outpacing that of its main competitors, AWS and Google Cloud.
More notably, Microsoft disclosed that its commercial cloud business is sitting on $392 billion in unfulfilled contracted orders, up 51% year-on-year, with an average contract duration of about two years.
A large part of these orders are huge computing power contracts related to AI.
Among them is the $250 billion Azure pre-purchase contract signed between OpenAI and Microsoft.
This means that OpenAI has committed to purchasing a huge amount of Azure cloud resources in the next few years to ensure that it has sufficient computing power for training and deploying AI models.
Such an astonishing contract amount has even raised questions from Wall Street analysts:
How can a company with a current revenue far lower than this amount digest hundreds of billions of dollars in cloud service expenditures?
After all, OpenAI's 2023 revenue was reportedly only on the order of a few billion dollars.
However, Microsoft CFO Amy Hood is not worried about this.
She explained on the earnings call that these large orders are typically delivered in stages, and that Azure's flexible "elastic clusters" can serve any customer: "Hardware such as GPUs and CPUs will only be deployed when the contract is actually initiated and delivered." Microsoft, she said, is fully confident in its ability to manage the risk.
Moreover, OpenAI's performance has grown astonishingly over the past year. ChatGPT reached 100 million monthly active users within two months of its November 2022 launch, making it the fastest-growing consumer application in history.
Its huge user base and monetization prospects (such as the $20-per-month subscription service) have sent OpenAI's revenue curve sharply upward.
It is reported that at the end of 2023, OpenAI was raising funds on the secondary market at a valuation of $86 billion, almost tripling its valuation at the beginning of the year.
This has undoubtedly strengthened Microsoft's confidence—Hood said bluntly that if it weren't for the limited computing power supply, Azure's revenue could have been even higher.
The Battle for Computing Power
From High Demand to Possible Oversupply
When the subject turned to computing power supply, both partners wore a mix of excitement and exasperation.
When the host asked about the growth bottleneck, Altman and Nadella said in unison, "We're extremely short of computing power."
Altman sighed, "The extent to which we are held back by the lack of computing power is beyond the imagination of the outside world."
Over the past year, OpenAI has repeatedly had to throttle access for new users and trim model context lengths, all because of constraints on GPU and server capacity.
Nadella also revealed that if there were more GPUs, Microsoft Azure's cloud business could have had higher growth.
This is not an exaggeration.
Microsoft's capital expenditure last quarter reportedly reached $34.9 billion, a sharp increase of about 40% from the previous quarter, with a large share going into new data centers and AI chips such as Nvidia's H100.
Even so, it is far from meeting the surging demand.
Hood admitted helplessly, "We've been in short supply for several consecutive quarters. We thought we could catch up, but we didn't—the demand is still rising."
Microsoft found that the bottleneck lies not only in chip production capacity but also in the most traditional infrastructure such as electricity and rack space.
Nadella offered a vivid analogy: "We're building a planetary-scale cloud and AI factory."
This "factory" requires not only silicon wafers but also a comprehensive supporting system of steel, cement, power grids, and cooling systems.
The two CEOs judge that AI computing power will still be in severe short supply in the next two or three years.
As Nvidia's Jensen Huang, who supplies the chips, has said, the probability of a "computing power oversupply" in the short term is almost zero.
Still, both men have lived through the boom-and-bust cycles of the internet and mobile eras, and they are not blind to the possibility that supply and demand could flip over the long term.
Altman even entertained a bold scenario: if a technological breakthrough one day improves model efficiency by several orders of magnitude and everyone can run a personal "AGI" on a laptop, today's all-out arms race for computing power would suddenly change in tone, much as the fiber-optic networks overbuilt during the dot-com bubble two decades ago sat idle for years.
But then he changed the subject, "We'd rather prepare for the future in advance than regret being too slow later."
That is why OpenAI would rather lock in large-scale computing contracts in advance than find itself squeezed at a critical moment.
This mix of ambition and clear-eyed awareness of risk is a defining trait of how Microsoft and OpenAI work together: forward-looking but not reckless, optimistic yet prudent.
The Great Change in Software Paradigm
From App to Agent
The upsurge at the hardware level is only part of the AI revolution.
Both Altman and Nadella know well that AI is reshaping the form of software and the paradigm of human - computer interaction.
The emergence of ChatGPT let people experience, for the first time, skipping the cumbersome application interface and getting the answers or solutions they need simply by talking to an AI.
Nadella called this the "magic moment" when the UI is combined with AI capabilities.
As the model capabilities continue to increase, this conversational interaction is evolving into a more powerful Agent.
It can not only answer questions but also "take action": performing tasks, writing code, organizing schedules, and even autonomously calling other software to complete entire workflows.
Nadella predicted last year that traditional business software (those "thin shells over databases") will gradually disintegrate with the rise of Agents.
In the past, enterprise software often encapsulated data, business logic, and interfaces in one system, and the high licensing fees were for this fixed process.
But now, a general - purpose Agent can understand natural language instructions and flexibly call various back - end services to meet users' specific needs.
This is equivalent to replacing the "business logic layer" of software with AI.
This is disruptive for the SaaS industry: single-function, expensive systems may be replaced by versatile AI assistants.
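To make concrete what "replacing the business logic layer with AI" can look like, here is a minimal sketch of the tool-calling pattern behind such agents. It assumes the openai Python SDK and a purely hypothetical invoice-lookup function; the point is that the model, not a hard-coded workflow, decides which back-end service to invoke.

```python
import json
import os
from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

# A hypothetical back-end service the agent can call instead of a fixed UI workflow.
def get_invoice_status(invoice_id: str) -> str:
    return json.dumps({"invoice_id": invoice_id, "status": "paid"})

tools = [{
    "type": "function",
    "function": {
        "name": "get_invoice_status",
        "description": "Look up the payment status of an invoice.",
        "parameters": {
            "type": "object",
            "properties": {"invoice_id": {"type": "string"}},
            "required": ["invoice_id"],
        },
    },
}]

messages = [{"role": "user", "content": "Has invoice INV-1042 been paid?"}]
reply = client.chat.completions.create(model="gpt-4o", messages=messages, tools=tools)
msg = reply.choices[0].message

# If the model chose to call the back-end function, run it and feed the result back.
if msg.tool_calls:
    call = msg.tool_calls[0]
    result = get_invoice_status(**json.loads(call.function.arguments))
    messages += [msg, {"role": "tool", "tool_call_id": call.id, "content": result}]
    final = client.chat.completions.create(model="gpt-4o", messages=messages, tools=tools)
    print(final.choices[0].message.content)
```

In this pattern the "application" shrinks to a set of callable services, and the agent supplies the logic that used to be hard-coded in the interface and middle tier.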
However, Microsoft is not worried that its products such as Office 365 will be subverted.
On the contrary, they have actively injected the "soul" of AI into Office, launching Microsoft 365 Copilot and adding conversational assistants to applications such as Word, Excel, and Outlook that can draft documents, analyze spreadsheets, and summarize emails according to users' intentions.
Enterprises are paying $30 per user per month for it, more than the Office suite itself costs, a measure of the value it delivers.
Just a few months after its release, the deployment momentum of Copilot has exceeded that of any of Microsoft's previous office products.
Similarly, in software development, GitHub Copilot empowers programmers: with AI completing code alongside them, code output has grown exponentially.