Lin Junyang, head of the Qianwen model, offers to resign; Alibaba executives urgently field questions | Exclusive from Intelligence Emergence
Author | Deng Yongyi
Editors | Su Jianxun, Yang Xuan
"I should have known this earlier."
That was what Wu Yongming, CEO of Alibaba Group, candidly told Qianwen employees at an all-hands meeting that Tongyi Lab urgently convened at around 1:00 PM Beijing time on March 4.
Twelve hours earlier, at 00:11 AM Beijing time on March 4, Lin Junyang, technical lead of Alibaba's Qianwen large model, abruptly announced his departure on X. Lin Junyang was a core driver of Alibaba's open-source AI models and one of Alibaba's youngest P10s. As the industry erupted, many members of the Qwen team could not accept the sudden departure of the team's soul figure.
"Given far fewer resources than competitors, Junyang's leadership is one of the core reasons we achieved today's results," more than one Qianwen member told 36Kr.
At the meeting, Qwen members, with Liu Dayiheng (Qwen pretrain lead) as their representative, put multiple questions to Alibaba's senior management on topics including the team split, new hire Zhou Hao, model route selection, and resource investment.
Attendees included several Alibaba senior executives, the Qwen team, and other Tongyi Lab teams. On key issues such as the team adjustment and strategic direction, Wu Yongming, CEO of Alibaba Group; Jiang Fang, Alibaba's Chief Talent Officer; and Zhou Jingren, CTO of Alibaba Cloud, responded repeatedly.
Alibaba senior management's core framing of the adjustment: Qwen is not shrinking. This is a team expansion, unrelated to any political infighting, and more resources will in fact be invested.
"We are developing rapidly. This round of adjustment is about recruiting more talent and providing more resources," said Jiang Fang, Alibaba's Chief Talent Officer, who also admitted there had been a communication gap: "We didn't communicate this organizational form well. Bringing in new people inevitably changes the team structure; these things are unavoidable during expansion, and we may not have handled them well."
There are rumors that Zhou Hao will directly lead Lin Junyang and his related teams. However, according to Intelligence Emergence, details such as Zhou Hao's position and reporting line are still under discussion.
At the meeting, Alibaba senior management stressed several times that the Qianwen base model is currently the group's top priority. The large-model competition is not just the Qwen team's fight but the entire Alibaba Group's; whether it is base-model R&D or underlying infra construction, it will be coordinated and driven at the group level. "We must surpass."
Zhou Jingren, CTO of Alibaba Cloud, fielded pointed questions about recruitment quotas and compute shortages, such as: why do external customers (like large-model startups) buy Alibaba Cloud compute smoothly, while internal teams struggle for compute and headcount?
Zhou Jingren said the team has long been in a state of "perpetual resource shortage," and that there are many historical reasons for the gap between internal and external treatment. An overall plan for the future is being drawn up, but he did not elaborate.
The meeting produced no new conclusion about Lin Junyang's whereabouts. But around 2 PM, Lin Junyang posted again on his WeChat Moments, saying, "Guys in Qwen, keep working as originally planned. It'll be okay," without saying clearly whether he would return.
Source: Lin Junyang's personal WeChat Moments
Just days earlier, Alibaba had completed a round of updates to its AI strategy: internally, the umbrella term and core brand for AI were unified as "Qianwen," and the organization underwent a new round of adjustment.
Intelligence Emergence learned that Qwen previously had its own pre-training, post-training, and infra teams, and, in terms of modalities, multiple directions including language models, multimodal models, and code.
Training single-modality models used to be the industry mainstream. But as demand for visual understanding grew, vision-language models emerged, making deep integration across modalities a major development trend.
A person familiar with the matter told Intelligence Emergence that since 2025, Lin Junyang had been pushing to have employees across language, image, video, and code work together to improve model-training efficiency. The Qwen team once proposed merging with the Wanxiang team but failed, so it began developing its own qwen-image model.
In this round of adjustment, however, Tongyi Lab wants to split the Qwen team along dimensions such as pre-training, post-training, visual understanding, and image, and merge the pieces with other Tongyi Lab teams (such as Tongyi Wanxiang and Tongyi Bailing) to work together. Without sufficient communication, conflict broke out.
"Lin Junyang is worth one hundred million US dollars on his own."
On the evening of March 2nd, Qianwen announced on X the open-sourcing of four small-sized Qwen 3.5 models. Elon Musk liked the post and said, "Amazing density of intelligence."
The abrupt departure of Lin Junyang, the soul figure of the Qianwen model, left the Qwen team reeling.
After Zhou Chang, the former technical lead of Alibaba's Qianwen, left, Lin Junyang (born in 1993) took over Alibaba's Qwen team in 2022, responsible for its overall technical work.
In recent years, Alibaba's Qwen model family has grown rapidly: from the initial Tongyi model family to the Qwen 2.5, Qwen 3.5, and other series, it has become a world-class model team, and by many measures Qwen is the undisputed number one family of open-source models.
Several former members of Alibaba's model teams told Intelligence Emergence that when China's large-model wave was just starting in 2023, domestic tech giants differed on whether, and how far, to open-source. But Alibaba implemented its open-source strategy early and with real strategic resolve, thanks in large part to the active push by Zhou Chang, Lin Junyang, and others.
Along with Lin Junyang, several Qwen members also announced their resignations, among them core leads of various Qwen model sub-directions, such as:
- Hui Binyuan: lead of Qwen's code direction and of the Qwen-Coder model series, responsible for full-pipeline agent training from pre-training to post-training, and recently also involved in embodied-intelligence (robotics) research.
- Yu Bowen: lead of Qwen post-training research, a graduate of the University of Chinese Academy of Sciences, who led development of the Qwen-Instruct model series.
- Kaixin Li: a core contributor to Qwen 3.5/VL/Coder, who holds a doctorate from the National University of Singapore.
Beyond those named above, several young researchers submitted resignations the same day.
More than one Qwen researcher posted despondent messages on Twitter and Xiaohongshu. "Qwen is nothing without its people." The line echoed what OpenAI employees posted on Twitter during the company's CEO-replacement turmoil in 2024.
Source: X
Lin Junyang's departure announcement caused a huge stir in the AI community. Much of the reaction came from overseas developers, who thanked Lin Junyang for driving Qwen's open-source work. "The end of an era," said Yuchen Jin, founder and CTO of Hyperbolic Labs.
Source: X
"If this group really leaves, the Qwen model will be delayed by at least half a year to a year, and the team will need to be rebuilt and retrained," an investor told Intelligence Emergence. Another AI industry figure put it this way: "Lin Junyang alone is worth over one hundred million US dollars."
Many rumors claim Lin Junyang's departure was "involuntary." But according to Intelligence Emergence, Lin Junyang submitted his resignation on March 3rd, the details have not been finalized with Alibaba, and Qwen team members learned the news on the afternoon of March 4th.
The latest news obtained by Intelligence Emergence is that Alibaba senior management remains in close communication with Lin Junyang; whether he will ultimately leave Alibaba is still unknown.
The new hire widely expected to take over post-training for Alibaba's Qwen after Lin Junyang's departure is Zhou Hao, from Google DeepMind. A Qwen team member told Intelligence Emergence that Zhou Hao briefly joined Quark in January 2026 before being transferred to Tongyi Lab, and that he reports directly to Zhou Jingren.
Zhou Hao holds a bachelor's degree from the University of Science and Technology of China and a doctorate from the University of Wisconsin-Madison (UW-Madison). According to his LinkedIn profile, he spent 3 years at Meta and about 4 years at Google DeepMind, where he was a core contributor to the Gemini 3.0 model, led the multi-step RL direction with tool use and chains of thought, and was deeply involved in projects such as Gemini 1.0, AI Mode, and Deep Research.
Alibaba has earned a strong reputation in open source, but it wants more
On March 3rd, Lin Junyang had just released several small open-source models on X (Twitter). The models are suited to phones and other devices, and follow Qwen's long-standing open-source route.
Many people casually dismiss open-sourcing models as "charity," but that is unfair.
Good open-sourcing serves, first of all, Alibaba Cloud's entire developer ecosystem. Because it open-sourced early, the Qwen model family could gather community feedback quickly in the early stages of development, grow fast, and in turn feed that back into model training.
Qwen's full-size, full-modality model lineup lets technical practitioners in companies and universities quickly pick the right model for themselves, which built a strong reputation. Once those models go into production, many business owners also tend to buy Qwen's model services, which indirectly converts into Alibaba Cloud's commercial revenue.
But the commercial logic behind this is hard to prove, a long-standing problem for open source at home and abroad. Meta spent billions of dollars training Llama and gave it away for free; the outside world still argues over how to account for that cost, and it has never shown up in Meta's financial reports.
Although Alibaba still enjoys an excellent reputation in open source, on the closed-source flagship side, the Qwen 3 and Qwen 3.5 flagship series Alibaba released in 2025, while still in the first tier, are showing signs of strain.
The departure of core members such as Lin Junyang stems largely from the rapid shift in Alibaba's AI strategy, which opened a gap with the base-model team's goals.
Catching up on flagship models and keeping the lead in open source both matter, yet the training resources of Alibaba's base-model team are relatively limited.
Since 2023, the Qwen family has cumulatively open-sourced more than 400 models across parameter scales from 0.5B to 235B. It is hard to imagine that the Qwen team, the main force behind these releases, numbers just over 100 people; even counting the other Tongyi Lab teams, the total is only a few hundred.
By contrast, ByteDance's Seed team, responsible only for base-model training, already has nearly 1,000 people. On every dimension, Alibaba's absolute headcount is a fraction of its competitors'. Many Qwen members have told 36Kr that Qwen has long lacked resources and support in compute and infra construction, slowing the model's iteration.
This is a snapshot of the fierce pace of Alibaba's AI push. In November 2025, the Qianwen App launched, kicking off a major Spring Festival battle that was only the opening round of the AI-to-C war: ByteDance's Doubao is already nearing 200 million daily active users, and Tencent has yet to go all in. At the same time, Alibaba cannot fall behind on flagship models, which bear on Alibaba Cloud's commercial loop and the future of the entire Alibaba Group.
(36Kr author Zhou Xinyu has contributed to this article)