Two new judgments from Wu Yongming, and Alibaba Cloud's doubling down on aggressive investment
Text by | Deng Yongyi
Edited by | Su Jianxun
In late September, Hangzhou was shrouded in a light drizzle, but the AI fever in Yunqi Town made it feel as if the summer heat had not subsided at all.
On September 24, the 2025 Yunqi Conference was held as scheduled. At the press conference, Wu Yongming, CEO of Alibaba Group and Chairman and CEO of Alibaba Cloud Intelligence Group, delivered a speech titled "The Road to Super Artificial Intelligence."
The 2024 Yunqi Conference was Wu Yongming's first public appearance after more than a year at the helm of Alibaba Cloud. At the time, he said the greatest promise of generative AI lies "by no means in creating one or two new super apps on the phone screen, but in taking over the digital world and changing the physical world."
If that statement was more of a vision a year ago, a year later it has turned into a concrete roadmap and aggressive action.
At this Yunqi Conference, Alibaba Cloud unveiled a dizzying array of new products. The newly released flagship model, Qwen3-Max, is currently the most powerful model in Alibaba's Tongyi large model family: its performance surpasses GPT-5 and Claude Opus 4, placing it among the top three globally on LMArena.
In addition to the flagship model, Alibaba also released six new models: the next-generation foundation model architecture Qwen3-Next and its model series, the coding model Qwen3-Coder, the visual understanding model Qwen3-VL, the omni-modal model Qwen3-Omni, the visual foundation model Wan2.5-preview, and the speech model family Tongyi Bailing.
△Source: Alibaba Cloud
What's even more noteworthy are two rather radical new judgments made by Wu Yongming.
The first definitive judgment he put forward is that large models are the next-generation operating system. Large models will swallow software, allowing anyone to create countless applications with natural language. In the future, nearly all software that interacts with the computing world may be Agents generated by large models, rather than today's commercial software.
That is why, over the past few years, Alibaba Cloud has been rebuilding its entire stack, from the underlying computing power and the middle infrastructure layer to the upper cloud layer, to adapt to the changes that large models bring to the underlying technology stack.
The second judgment follows the same logic: the super AI cloud is the next-generation computer. By analogy with the stages of the computer's development, natural language is the programming language of the AI era, Agents are the new software, context is the new memory, and the LLM will be the intermediate layer that handles interaction and scheduling among users, software, and AI computing resources, becoming the OS of the AI era.
Alibaba Cloud's goal is to build a "Super AI Cloud" to provide a global intelligent computing power network.
In February this year, Alibaba announced a three-year, 380-billion-yuan AI infrastructure construction plan. Today, Wu Yongming added a new target: to prepare for the arrival of the ASI era, the energy consumption of Alibaba Cloud's global data centers will grow tenfold by 2032 compared with 2022.
Alibaba Cloud also stated its AI development strategy and goal for the first time: not the much-discussed AGI (artificial general intelligence), but the more advanced ASI (artificial superintelligence).
Wu Yongming specifically explained the three stages of the road to super artificial intelligence:
1. "Intelligent Emergence": AI acquires the ability of generalized intelligence by learning from humanity, such as the global knowledge collection, and gradually develops reasoning ability;
2. "Autonomous Action": AI masters tool - using and programming abilities to "assist humans," which is the current stage of the industry;
3. "Self - Iteration": AI achieves autonomous learning by connecting to the full - scale raw data of the physical world and can ultimately "surpass humans."
In 2025, the global large model field moved forward amid uncertainty. After OpenAI launched GPT-5, its performance fell far short of market expectations, and commentary about stagnation and setbacks in model innovation kept coming. At the same time, Meta and OpenAI made ever more aggressive capital investments; no one wants to miss this wave of the technological revolution.
Now, Alibaba Cloud has shown with concrete actions that it not only intends to invest but to double down aggressively.
The market responded enthusiastically to the new strategy. Today, Alibaba's Hong Kong-listed shares continued to rise, surging more than 9% intraday and hitting their highest level since October 2021.
Seven Model Launches, Saturated Investment
Before the Yunqi Conference, Lin Junyang, head of Alibaba's Qwen large model team, announced on Twitter that the team would release more than six new products, and none of them were "small things."
When the models were officially unveiled, there were even more than promised. It was a generous release: Zhou Jingren, CTO of Alibaba Cloud, flipped through his slides at speed during his Yunqi Conference talk, rushing through the material and still running over time.
Alibaba Cloud launched seven new models in total, each substantial enough in scale and performance gains to count as a major release in its own right:
Qwen3-Max: the flagship model, pre-trained on 36T tokens with more than one trillion total parameters, bringing significantly improved coding and Agent tool-calling capabilities (a minimal API call sketch appears below this list);
Qwen3-Next: the next-generation model architecture and its model series. It has 80B total parameters but activates only 3B, yet matches the performance of the 235B-parameter Qwen3 flagship, and its training cost is more than 90% lower than that of the dense Qwen3-32B;
Qwen3-VL (visual understanding): beyond accurately interpreting images and charts, its breakthrough is an innovative "visual programming" ability: it can turn visual design drafts directly into front-end code and operate phones and computers, advancing from "seeing" to understanding and execution;
Qwen3-Coder (code model): significantly faster generation, higher code quality, and better security, making complex tasks such as code completion, bug fixing, and one-click generation of complete projects easier;
Qwen3-Omni: a native omni-modal model. Simply put, it can "listen, speak, see, and write" in one package, converse as naturally as a human, understand audio and video, and retain its text and image capabilities, making it well suited as a personal AI in cars, glasses, and phones;
Tongyi Wanxiang Wan2.5-preview: a brand-new visual foundation model covering text-to-video, image-to-video, text-to-image, and image editing; it can generate voices, sound effects, and background music that match the visuals;
Tongyi Bailing: a brand-new speech model family with sub-models for speech recognition, speech synthesis, and more. Fun-CosyVoice, for example, offers hundreds of preset voices for scenarios such as customer service, sales, live-streaming e-commerce, consumer electronics, audiobooks, and children's entertainment.
△Source: Alibaba Cloud
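For developers, the most direct way to test these claims is to call the models. Below is a minimal sketch of invoking the flagship model through an OpenAI-compatible chat endpoint in Python; the base URL, the environment variable name, and the `qwen3-max` model id are assumptions and should be checked against Alibaba Cloud Model Studio's current documentation.

```python
# Minimal sketch: calling the flagship Qwen model through an OpenAI-compatible
# chat endpoint. The base_url, env var name, and model id are assumptions;
# verify them against Alibaba Cloud Model Studio's current documentation.
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["DASHSCOPE_API_KEY"],  # assumed environment variable
    base_url="https://dashscope.aliyuncs.com/compatible-mode/v1",  # assumed endpoint
)

response = client.chat.completions.create(
    model="qwen3-max",  # assumed model id for the flagship release
    messages=[
        {"role": "system", "content": "You are a careful coding assistant."},
        {"role": "user", "content": "Write a Python function that reverses a singly linked list."},
    ],
)
print(response.choices[0].message.content)
```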
Alibaba Cloud is not relying on static benchmarks alone to prove its models' capabilities. On blind-tested, widely watched leaderboards such as LMArena, the preview version of the flagship Qwen3-Max has ranked third on the Chatbot Arena leaderboard.
After DeepSeek set off a storm across the global AI industry, it also sparked a battle among domestic open-source models, a sharp contrast to last year, when each company worked in isolation.
At home and abroad, this year has seen a round of open-source model battles: almost every vendor still investing in models has stepped up its open-source efforts. Among the domestic giants, Alibaba has taken the most aggressive open-source route.
This is partly because Alibaba was one of the first domestic companies to open-source models and build a model ecosystem. Those investments have now yielded tangible returns, giving Alibaba the motivation to invest even more aggressively.
DeepSeek and Qwen are among the few models that have truly gained global recognition. After DeepSeek triggered the open - source trend, Qwen has once again caught the attention of the global AI community and is experiencing a new round of growth.
To date, Alibaba's Tongyi has open-sourced more than 300 models, covering "all sizes" and "all modalities": LLMs and coding, image, speech, and video models.
Globally, the Tongyi large models are also the number one open-source family, with downloads exceeding 600 million and more than 170,000 derivative models worldwide.
△Source: Alibaba Cloud
In addition to models, Alibaba Cloud also released a brand-new Agent development framework, ModelStudio-ADK, this year. Agents can autonomously plan and call models, which drives greater computing power consumption. Alibaba Cloud also disclosed a figure: as model capabilities keep improving and Agent applications explode, the average daily model call volume on Alibaba Cloud's Bailian platform has grown 15-fold over the past year.
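The mechanics behind that growth are easy to illustrate: an agent asks the model to plan, the model requests tool calls, the agent executes them and feeds results back, and every loop iteration is another model call. The sketch below is a generic tool-calling loop over an OpenAI-compatible endpoint, not the ModelStudio-ADK interface itself; the endpoint, model id, and `get_weather` tool are all assumed for illustration.

```python
# Generic sketch of an agent loop: the model plans, requests tool calls, and the
# loop feeds results back until a final answer is produced. This is NOT the
# ModelStudio-ADK API; it only illustrates why agents multiply model calls.
import json
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["DASHSCOPE_API_KEY"],  # assumed environment variable
    base_url="https://dashscope.aliyuncs.com/compatible-mode/v1",  # assumed endpoint
)

def get_weather(city: str) -> str:
    """Stand-in tool; a real agent would call an actual service here."""
    return json.dumps({"city": city, "forecast": "light rain, 22C"})

TOOLS = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the weather forecast for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

messages = [{"role": "user", "content": "Do I need an umbrella in Hangzhou today?"}]
for _ in range(5):  # cap the loop; each iteration is another model call
    resp = client.chat.completions.create(
        model="qwen3-max",  # assumed model id
        messages=messages,
        tools=TOOLS,
    )
    msg = resp.choices[0].message
    if not msg.tool_calls:  # model produced a final answer
        print(msg.content)
        break
    messages.append(msg)  # keep the assistant turn that requested the tools
    for call in msg.tool_calls:  # execute each requested tool and return results
        args = json.loads(call.function.arguments)
        result = get_weather(**args)
        messages.append({"role": "tool", "tool_call_id": call.id, "content": result})
```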
The investment in open-sourcing models has not only accelerated model iteration but also translated into cloud revenue, and Alibaba has begun to close a commercial loop for the AI era. The latest quarterly financial report shows Alibaba Cloud's revenue up 26% year-on-year, with AI-related revenue posting triple-digit growth for eight consecutive quarters.
According to a report by the market research firm Omdia, China's AI cloud market reached 22.3 billion yuan in the first half of 2025, with Alibaba Cloud ranking first at a 35.8% share, more than the second- through fourth-ranked players combined.
"Be the Android in the LLM Era"
In 2024, the release of OpenAI's Sora, the stall in GPT-5 development, and debates over the technical route briefly pushed the global large model field into a period of low morale.
That sentiment has now largely dissipated. Just a few days before the Yunqi Conference, NVIDIA announced a $100 billion investment in OpenAI, and at the conference Wu Yongming predicted that cumulative global AI investment will exceed $4 trillion over the next five years.
In a media interview with "Intelligent Emergence" after the Yunqi Conference, Zhou Jingren, CTO of Alibaba Cloud, acknowledged that there are now few major disagreements across the industry over the general direction of the technical route: nearly every company worldwide is investing aggressively in the AI race and shipping models quickly. The real question is how each vendor executes.
"The current model competition is a competition between systems," Zhou Jingren said. "There is no such thing as 'holding back a big move' in the development and innovation of models. It is complementary to the underlying infrastructure and the cloud."
How should this "system" be understood? It points more to a strategic choice about AI.
After DeepSeek changed the global AI narrative, all major companies have increased their investment in AI, from the underlying computing power to cloud computing and open source.
The differing AI routes of the major companies form an interesting contrast. Take the recent Tencent Ecosystem Conference: Tencent talks more about scenarios and deployment on the B and C sides, applying AI to its own businesses first and expanding outward later. ByteDance is more like iOS, running a legion-style operation from models to applications; the difference is that ByteDance tends to build a better-performing version behind closed doors first and has slowed its pace of open-sourcing.
2023 was a crucial turning point for Alibaba Cloud. After Wu Yongming took over as its CEO, he proposed the strategy of "AI-driven, public cloud first."
Since then, Alibaba Cloud has mainly done a few things: it refocused on public cloud and cut projects with high delivery costs and thin margins, then poured budget into AI, not only investing externally in startups such as the six AI unicorns but also going all in on in-house model development, open-sourcing, and infrastructure reconstruction.
Alibaba Cloud's current route is closest to Google's: from the underlying computing infrastructure and the middle cloud computing layer to the upper model layer, both companies pursue a full-stack, self-developed, independently built strategy and strive to keep every layer globally leading.
The ASI Alibaba is now proposing is not a new term. In March this year, Google DeepMind laid out a six-level roadmap for AGI that corresponds in many respects to Alibaba's three-step ASI plan; the third ASI stage, "surpassing humans," closely resembles the highest level DeepMind defines.
△Source: DeepMind
The radical investment in AI also stems from the inseparable relationship between AI and cloud computing. Alibaba Cloud even announced a new positioning today, becoming a "full-stack artificial intelligence service provider." "Tokens are the electricity in the future AI world," Wu Yongming said.
There is no doubt that we are still in the very early days of the AI era. Model calls currently account for a very small share of enterprises' cloud spending, but the shift in the trend is what matters.
In an interview after the conference, Xu Dong, General Manager of Alibaba Cloud's Tongyi large model business, told the media that a year ago most large model call volume came from data labeling for offline tasks; a year later, call volume for online tasks has grown dozens of times. Enterprises across industries are integrating large models into their production processes, which shows that large models are rapidly bringing incremental growth to the cloud market.
For most of the past 16 years, Alibaba Cloud explained its value to the market as providing the "water and electricity" of the digital world. That is consistent with the ecosystem position it now champions: being the "Android of the LLM era."