OpenAI courts the "trillion-dollar club" to kick off the 2.0 era of AI investment
OpenAI is orchestrating a high-stakes game involving the world's top technology companies, with trillions of dollars on the table.
In the space of just a few months in 2025, OpenAI has signed a string of enormous orders with giants including NVIDIA, AMD, Oracle, and Broadcom, and plans to launch a series of large-scale AI infrastructure projects. On October 13, OpenAI announced a deal with Broadcom to deploy AI chips.
As OpenAI places order after order, the market values of its partner companies have gone into overdrive.
Oracle's gain has been the most dramatic: on the back of a five-year, $300 billion deal with OpenAI, its share price jumped as much as 36% and its market value briefly approached $1 trillion. AMD's market value likewise swelled by nearly $70 billion in a single day after its partnership was announced.
Across these successive deals, some observers have noticed that OpenAI is assembling a closed-loop value chain through its orders for chips, data-center construction, and AI services, and that the companies partnering with OpenAI are organizing into a systematic AI build-out alliance.
Others, however, see enormous risk hidden behind these orders. The expected returns on these AI investments hinge almost entirely on OpenAI's long-term earnings; if the revenue and profit generated by AI fall short, the result will be serious "value destruction".
"We are going to spend a lot of money on infrastructure. We are going to place this bet, a company - level high - stakes bet, betting that now is the right time to do this."
As OpenAI CEO Sam Altman emphasized in an October 9 interview, the recent flurry of cooperation between technology giants and OpenAI may reflect a new consensus taking shape in the AI industry:
The era of "small-scale tinkering" is over. It is time to go all in on AI with everything we've got.
The 2.0 stage of AI investment has arrived
If the significance of the OpenAI-Oracle order had to be summarized in one sentence, it is this: the deal has pushed technology giants out of the comfort zone of their previously "rational" AI investment.
Until now, AI investment has essentially been a contest of how much spare cash each technology giant could deploy.
Take Alibaba and Google as examples: so far, the giants' AI investments have been relatively conservative, and their overall investment style quite rational.
As a reference, consider the ratio of Capex (capital expenditure) to EBITDA (earnings before interest, taxes, depreciation, and amortization), which indicates how much of a company's operating cash earnings is being poured into capital investment.
According to Alibaba's latest quarterly report (for the quarter ended June 30, 2025), the company's Capex was RMB 38.676 billion and its adjusted EBITDA was RMB 45.735 billion, for a Capex/EBITDA ratio of about 0.846.
According to the latest quarterly report of Google's parent company Alphabet (for the quarter ended June 30, 2025), Capex (purchases of property and equipment) was $22.446 billion. Alphabet does not disclose EBITDA, but it can be approximated as operating income plus depreciation of property and equipment, roughly $36.269 billion ($31.271 billion + $4.998 billion). That puts Alphabet's Capex/EBITDA ratio at about 0.619.
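As a quick sanity check, the two ratios can be reproduced directly from the figures quoted above; this minimal sketch simply applies the operating-income-plus-depreciation approximation described for Alphabet and divides Capex by EBITDA.

```python
# Reproduce the Capex/EBITDA ratios cited above from the quarterly figures
# quoted in this article (amounts in billions of local currency).

# Alibaba, quarter ended June 30, 2025 (RMB billions)
alibaba_capex = 38.676
alibaba_adjusted_ebitda = 45.735
print(f"Alibaba Capex/EBITDA:  {alibaba_capex / alibaba_adjusted_ebitda:.3f}")   # ~0.846

# Alphabet, quarter ended June 30, 2025 (USD billions).
# EBITDA is not disclosed, so approximate it as operating income plus
# depreciation of property and equipment, as described above.
alphabet_capex = 22.446
alphabet_operating_income = 31.271
alphabet_depreciation = 4.998
alphabet_ebitda_approx = alphabet_operating_income + alphabet_depreciation   # ~36.269
print(f"Alphabet Capex/EBITDA: {alphabet_capex / alphabet_ebitda_approx:.3f}")  # ~0.619
```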
Broadly speaking, both giants' Capex/EBITDA ratios are healthy: they only need to devote a portion of each year's earnings to AI investment. Alibaba is even fighting a "food-delivery war" while investing in AI.
This kind of spending can be regarded as income-statement-based investment: the company channels a certain share of its earnings into AI. The OpenAI-Oracle order, by contrast, is essentially debt-financed AI investment, a far more aggressive approach than Alibaba's or Google's.
Under the terms of the order, OpenAI has committed to purchasing computing power from Oracle for five consecutive years starting in 2027, for a total of $300 billion.
On the surface, the order means OpenAI will spend $60 billion a year on cloud services. In practice, there is a precondition: Oracle must first expand its data centers before it can deliver the full service, and to do so it plans to issue roughly $18 billion in new debt. Since neither OpenAI nor Oracle has the cash on hand for an investment of this size, the order can essentially only be fulfilled through debt financing.
AI investment has shifted from income-statement spending to a high-stakes bet made on the balance sheet. AI competition has formally entered the stage of competing on financing, and compared with the earlier, steadier phase, companies are now willing to pay a higher risk premium for AI.
Viewed through the lens of the weighted average cost of capital (WACC), corporate AI investment can be roughly divided into two stages, from low cost to high.
In the first stage, the low-cost funds available to a company are mainly cash and free cash flow. These are "internal funds" that would otherwise sit idle on the books; as long as the expected return on investment exceeds the interest the idle money would earn, deploying them is reasonable.
This is the stage most technology giants were in until recently. Shareholders were happy for companies to spend their idle funds on GPUs; when they judged that a company was under-investing in AI, they complained about its lack of commitment, and the result could even be an "AI Capex too low" penalty in the form of a lower valuation.
In the second stage, companies begin funding AI through debt, equity, and more creative financing structures. These methods raise large sums quickly, but the cost of capital rises sharply, so companies must believe that AI investment will deliver correspondingly higher returns.
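To see why the second stage carries a higher hurdle, a minimal sketch of the standard WACC formula is enough; the rates and capital-structure weights below are hypothetical assumptions chosen for illustration, not drawn from any company's filings.

```python
# Illustrative (hypothetical) cost-of-capital comparison between the two stages.
# WACC = (E/V) * cost_of_equity + (D/V) * cost_of_debt * (1 - tax_rate)

def wacc(equity, debt, cost_of_equity, cost_of_debt, tax_rate):
    total = equity + debt
    return (equity / total) * cost_of_equity + (debt / total) * cost_of_debt * (1 - tax_rate)

# Stage 1: Capex funded from internal cash; the hurdle is roughly the return
# given up by leaving the cash idle (assume ~4% here).
stage1_hurdle = 0.04

# Stage 2: a debt- and equity-funded build-out (all weights and rates hypothetical).
stage2_hurdle = wacc(equity=60, debt=40, cost_of_equity=0.12, cost_of_debt=0.07, tax_rate=0.21)

print(f"Stage 1 hurdle: ~{stage1_hurdle:.1%}")   # 4.0%
print(f"Stage 2 hurdle: ~{stage2_hurdle:.1%}")   # ~9.4% -- AI returns must clear a higher bar
```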
This is the stage the technology giants are in now. In the AMD-OpenAI deal, for instance, AMD appears on the surface to be "selling cards" to OpenAI, but it has in effect given up close to 10% of its shares, with OpenAI's purchases bound up with that equity. AMD is essentially using equity to lock in orders, a form of equity financing. Likewise, the cooperation between Musk's xAI and NVIDIA is in essence a piece of structured financing: xAI borrows from a special-purpose vehicle (SPV) and sells a portion of its equity to NVIDIA, then uses both sources of funds to buy NVIDIA cards.
This change in how investment is funded is like a Texas Hold'em player going all in on borrowed chips: players who cannot match the bet may walk away with nothing.
The 2.0 era of AI investment has arrived.
OpenAI is selling "tickets" to the AI era
Behind these developments, some questions are hard to avoid.
Why are these technology giants willing to take risks? Why are they increasing their AI investment at this time?
On the surface, OpenAI's "Midas touch" has lifted its partners' valuations. The orders cannot be converted into cash immediately, but by applying assumptions about expected revenue and profit margins, the market can translate them into implied earnings and a P/E (price-to-earnings) multiple. And as the financing discussion above suggests, a higher market value makes subsequent fundraising easier, further paving the way for AI construction.
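The mechanism can be sketched with a toy calculation. The $300 billion over five years mirrors the Oracle order cited above, but the margin and earnings multiple are hypothetical assumptions, so the output only illustrates how an announced order can be capitalized into market value on paper, not any company's actual move.

```python
# Toy illustration of how an announced order gets capitalized into market value.
# The contract size mirrors the Oracle order above; the margin and multiple are
# hypothetical assumptions, not disclosed figures.

contract_total_usd = 300e9          # five-year commitment cited in the article
contract_years = 5
assumed_operating_margin = 0.30     # assumption
assumed_earnings_multiple = 30      # assumption (a P/E-style multiple)

annual_revenue = contract_total_usd / contract_years                 # $60B per year
implied_annual_earnings = annual_revenue * assumed_operating_margin  # $18B per year
implied_market_value_added = implied_annual_earnings * assumed_earnings_multiple

print(f"Implied market value added: ${implied_market_value_added / 1e9:.0f}B")
# ~$540B on paper -- before a single dollar of the contract has actually been billed.
```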
Seen from another angle, OpenAI's orders form a kind of ecosystem.
Among OpenAI, NVIDIA, and Oracle, for example, OpenAI provides the AI, NVIDIA provides the computing infrastructure, and Oracle provides the data centers, forming a relatively complete ecosystem. Looking at OpenAI's cooperation with NVIDIA, AMD, and Broadcom, NVIDIA provides mature GPUs, AMD provides supplementary GPU computing power, and Broadcom provides custom AI chips and the associated networking equipment.
Pulling these threads together, a hidden pattern emerges: OpenAI is positioning itself as the platform that coordinates these resources, selling "tickets" to board the "AI ark".
On this ark, some handle chip development (NVIDIA and AMD), some handle user-facing products (ChatGPT and Sora, which integrate third-party apps), and some handle infrastructure construction (the data centers involving Oracle and Broadcom). All of them are aiming at the era of ASI (artificial superintelligence).
The entrance to this ark, whether it is software or hardware, may be more unified than in the Internet era.
In on-device intelligence, for example, the first AI trend Qualcomm proposed this year is that "AI is the new UI (user interface)": AI will replace the icon- and app-centered interaction model of smart devices, so that users no longer tap specific icons but instead interact through AI entry points arranged around them.
"ChatGPT aims to become a unified entrance, connecting users with various services." As Sam said in an interview, OpenAI has become the world's largest AI application. In line with this trend, Google's Gemini and Anthropic's Claude are both vying for the position of the general user entrance this year.
According to data from research firm Menlo, a quarter of users in the United States choose ChatGPT as their primary entry point. The competition is fierce: OpenAI leads, but not by a wide margin. On why the battle for the entry point matters, Menlo's report notes that "consumers almost always try their default tools first."
What an AI that serves users continuously can actually do, and how much it can change their lives, may not be the story the AI industry cares about most right now.
Because without sufficiently large AI infrastructure and an AI hardware ecosystem, the situation resembles trying to run an Internet business before the year 2000: without network infrastructure, billions of smart devices, and the users to go with them, even ventures that earn steady returns today, such as food delivery, online shopping, or cloud services, would have had little chance of success.
"You have to generate electricity, build all the physical infrastructure, sort out all the power equipment and everything outside the data center. You have to ensure the production capacity of chip manufacturing, set up the racks, and you also need consumer demand and a business model to pay for all this. A whole bunch of things need to be done simultaneously."
As Altman put it, joining the "OpenAI alliance" means working with the technology giants to "blow up" the scale of the AI industry together. Until AI is monetized at scale, the risk of the "bubble" bursting never goes away; yet if the players fail to reach a consensus and each play their part, the mass adoption of AI will struggle to happen at all.
Currently, the marginal cost of AI may be much higher than that in the Internet era.
Every token an AI generates carries real compute and electricity costs. Most AI founders today note in their business reasoning that they must "figure out how to monetize from day one" because the "marginal cost is obvious". Without complete infrastructure, AI can hardly reach its full technical potential; as Altman has said, "insufficient computing power has already affected the launch of new features and products."
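A back-of-the-envelope sketch shows why that marginal cost is hard to ignore; every figure below is a hypothetical assumption chosen only to make the structure of the calculation visible.

```python
# Back-of-the-envelope marginal cost per million output tokens.
# Every number below is a hypothetical assumption used only for illustration.

gpu_hour_cost = 3.00                # USD per GPU-hour (hardware amortization + hosting), assumed
power_hour_cost = 0.10              # USD of electricity per GPU-hour, assumed
tokens_per_second_per_gpu = 500     # sustained output throughput, assumed

tokens_per_hour = tokens_per_second_per_gpu * 3600
cost_per_million_tokens = (gpu_hour_cost + power_hour_cost) / tokens_per_hour * 1_000_000

print(f"~${cost_per_million_tokens:.2f} per million output tokens")
# Unlike serving a cached web page, every generated token carries a real compute and power bill.
```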
In fact, users also find it difficult to experience the best AI capabilities.
Take Sora 2, the video-generation model that went viral during this year's National Day holiday: its downloads surged and peaked within three days of launch, and once users flooded in, the strained computing power could hardly guarantee that every user saw results as good as the official demo videos.
The same is true of Chinese AI products. Doubao, for example, claims its model supports multi-turn conversations totaling hundreds of thousands of words of input and output. In practice, if a user gives Doubao a name, it will forget that name after only a few dozen turns in the same conversation, and may even misread the request as the user asking to be addressed by that name.
Perhaps the players racing to accelerate their AI build-out are all too aware of the frustration of an AI with "great potential but lacking follow-through".
Conclusion
The AI era is accelerating in a frenzied direction. But at this stage, perhaps we need not be overly anxious about the bubble just yet.
In the era of large models, we are watching AI transform industry after industry and witnessing another human consensus form around building out a technology.
"During the bubble period, optimistic analysts would justify high price - to - earnings ratios by saying that technology would increase productivity. They might have been wrong about specific companies, but the underlying principle was correct." (During the Bubble, optimistic analysts used to justify high price to earnings ratios by saying that technology was going to increase productivity dramatically. They were wrong about the specific companies, but not so wrong about the underlying principle.)
As Paul Graham, the "godfather of Silicon Valley startups", summarized the Internet era in 2004.
With the popularization of AI, it is becoming an important productivity tool: AI-assisted office work means meeting minutes no longer need to be typed up by hand, and AI agents can deliver the latest industry briefings every day. "Technology is a lever. It doesn't add; it multiplies. If the present range of productivity is 0 to 100, introducing a multiple of 10 increases the range from 0 to 1000," Graham wrote.
On the eve of a new technological era, a market worth more than a trillion dollars is very hard to estimate. Perhaps the history of the Internet era can serve as a rough preview of the AI era.
Since the basic unit of data in the AI era is the token, Internet-era IP traffic makes a natural point of comparison. Imagine standing in 1990, on the eve of the Internet era, and trying to forecast today's IP traffic: Grok's conclusion is that actual Internet IP traffic in 2024 turned out to be more than four times the "optimistic estimate" made back then.