
The most expensive Chinese AI hire in Silicon Valley has emerged: Shanghai Jiao Tong University alumnus Pang Ruoming's pay package soars past $200 million, eclipsing Jiahui Yu and Tim Cook.

新智元 · 2025-07-10 15:33
Another Chinese AI superstar has emerged in Silicon Valley! He is Pang Ruoming, the former head of Apple's foundation model team. His compensation package worth $200 million has far outstripped Tim Cook's annual salary.

After Jiahui Yu, another Chinese AI superstar has emerged overnight in Silicon Valley.

This time, the highest salary record has been broken, soaring directly from $100 million to over $200 million!

He is Pang Ruoming, the former head of Apple's AI/ML foundation model team.

This is a salary Apple simply could not match. After all, Apple CEO Tim Cook's annual compensation is only $74.6 million.

What is it about this Chinese scholar that made Mark Zuckerberg recruit him personally and offer such an enormous package?

Graduated from Shanghai Jiao Tong University and Led Apple's Large Model Team

Public information shows that Pang Ruoming studied computer science at Shanghai Jiao Tong University, then earned a master's degree at the University of Southern California and a doctorate at Princeton University.

Notably, he attended Xiangming High School, one of Shanghai's key high schools.

After graduating, Pang Ruoming joined Google, where he spent 15 years and rose to principal software engineer.

During his time at Google, he worked on Bigtable's indexed structured search and the ZipIt project; the results were ultimately adopted by more than 1,000 internal Google projects.

Then, in 2012, he co-created Zanzibar, Google's globally consistent authorization system, together with Abhishek Parmar and Zhifeng Chen.

From 2014 to 2017, as its sole lead, Pang Ruoming drove the team to raise the system's reliability to 99.999%.
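Zanzibar's core idea, per Google's public description, is expressing permissions as relation tuples of the form "object#relation@user". A minimal sketch of that model (the class and method names here are illustrative, not Zanzibar's actual API):

```python
# Illustrative sketch of Zanzibar-style relation tuples.
# Names are hypothetical; the real system also evaluates userset-rewrite
# rules (e.g. "editor implies viewer") and provides consistency guarantees.

class TupleStore:
    def __init__(self):
        self.tuples = set()

    def write(self, obj, relation, user):
        # Record that `user` holds `relation` on `obj`.
        self.tuples.add((obj, relation, user))

    def check(self, obj, relation, user):
        # Direct membership check only, for illustration.
        return (obj, relation, user) in self.tuples

store = TupleStore()
store.write("doc:readme", "viewer", "user:alice")
store.check("doc:readme", "viewer", "user:alice")  # True
store.check("doc:readme", "editor", "user:alice")  # False
```

The tuple format makes authorization queries a lookup problem, which is what allowed the real system to be scaled to the reliability figures cited above.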

After 2017, he moved into Brain applied research, leading speech recognition research and productization for the Google Brain team.

Together with Yonghui Wu and Zhifeng Chen, he led the development of the Babelfish/Lingvo framework, which became the most widely used deep learning framework on Google's TPUs, surpassing AdBrain and DeepMind in usage.

In addition, Pang Ruoming is also a core contributor to the Tacotron 2 system.

Tacotron 2 is an advanced end-to-end neural text-to-speech (TTS) system that generates highly natural, human-like speech directly from text.

Paper link: https://arxiv.org/pdf/1712.05884
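As the paper describes, Tacotron 2 works in two stages: a sequence-to-sequence network predicts a mel spectrogram from characters, and a WaveNet-style vocoder converts the spectrogram to waveform samples. A toy sketch of that pipeline shape (every function below is a stand-in, not the real model):

```python
# Toy illustration of Tacotron 2's two-stage pipeline.
# Both functions are dummies standing in for the real neural networks.

def text_to_mel(text):
    # Stand-in for the attention-based seq2seq network:
    # emit one dummy 80-band mel frame per input character.
    return [[0.0] * 80 for _ in text]

def mel_to_audio(mel, hop_length=256):
    # Stand-in for the WaveNet-style vocoder:
    # hop_length waveform samples per spectrogram frame.
    return [0.0] * (len(mel) * hop_length)

mel = text_to_mel("hello")   # 5 frames of 80 mel bands
audio = mel_to_audio(mel)    # 5 * 256 = 1280 samples
```

The separation matters: the spectrogram is a compact intermediate that the seq2seq model can predict reliably, while the vocoder handles the fine-grained audio detail.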

In 2021, Pang Ruoming officially joined Apple as a distinguished engineer and led its foundation model (AFM) research and development team.

Bloomberg previously reported that Pang Ruoming led an internal team of about 100 people building large models to power Apple's AI features and the next-generation Siri.

At Apple, his work covered the full large-model development pipeline: pre-training architecture design, post-training tuning, and inference-efficiency improvements, as well as building multimodal capabilities, i.e., the core technology for simultaneously understanding and generating content across text, images, and other modalities.

It is no exaggeration to say that Apple now finds itself embattled, rushed, and deeply worried.

Strict reviews of the AFM team by Daphne Luong, part of Apple's new leadership, have put unprecedented pressure on its engineers.

In addition, the company has been discussing the introduction of third-party models from OpenAI and Anthropic to support the new version of Siri, which has seriously demoralized Pang Ruoming's team.

During his time at Apple, Pang Ruoming led his team to achieve many remarkable results, such as the multimodal large model MM1 and Apple's AI foundation model.

He also contributed to AXLearn, an open-source deep learning training framework designed for efficiently training large-scale AI models; it has earned 2.1k stars on GitHub.

Pang Ruoming's Google Scholar profile lists more than 100 influential research papers, cited more than 46,000 times in total.

Moreover, he once co-authored a paper with Jiahui Yu.

Liangliang Cao, the current head of Google's Gemini, once praised Pang Ruoming:

Ruoming is a rare engineer who is highly respected and loved by everyone. He is an outstanding machine learning expert who not only has a deep understanding of ML but also has a profound understanding of infrastructure.

At the same time, he is humble and helpful. I believe he has helped colleagues in many teams at Google, including the Brain, Speech, and Advertising teams (and perhaps many other areas I'm not aware of).

Ruoming has excellent software skills and machine learning expertise. Every time I read his code, I can't help but be amazed. I feel very honored to have worked with Ruoming.

Clearly, by recruiting Pang Ruoming from Apple, Mark Zuckerberg has added a formidable general to his Superintelligence Labs team.

He has both a deep understanding of machine learning algorithms and a command of the underlying infrastructure: a top expert who can "both train the models and build the systems."

A $200 Million Salary, Outshining Cook?

One person, $200 million.

It can be said that Pang Ruoming's strength fully justifies this sky-high salary.

Foreign media report that Meta's pay packages consist mainly of three parts: base salary, signing bonus, and Meta stock.

· Signing bonus: for hires who give up part of their startup equity, the bonus is increased accordingly to compensate for the gains they forgo.

· Stock: the largest component of the package, with its payout tied to specific performance targets.

The base salary and signing bonus are both paid directly in cash.
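Under this structure, a package can be tallied as cash components paid in full plus stock discounted by how much of it actually vests. A sketch with entirely invented figures (Meta's real numbers are not public):

```python
# Hypothetical breakdown of a Meta-style pay package.
# All numbers below are invented for illustration only.

def total_package(base_salary, signing_bonus, stock_grant, vested_fraction):
    """Cash components pay out in full; the stock component is tied to
    performance, modeled here as a simple vested fraction."""
    cash = base_salary + signing_bonus
    equity = stock_grant * vested_fraction
    return cash + equity

# e.g. $2M base + $20M signing bonus + $250M stock grant at 80% vesting
total_package(2e6, 20e6, 250e6, 0.8)  # 222,000,000.0
```

The point of the structure is visible in the arithmetic: even a large cash component is dwarfed by the stock grant, which is why the stock's performance conditions dominate the headline figure.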

In order to build an AI system that can match or even surpass humans, Meta has spared no expense in recruiting talent and has assembled a "dream team."

As of now, most of the members of Mark Zuckerberg's AI team are Chinese scholars. The majority of them come from OpenAI, including:

· Jiahui Yu: Graduated from Tsinghua University with a bachelor's degree and obtained a doctorate from UIUC. He was the head of perception research at OpenAI and led the development of GPT-4.1.

· Shengjia Zhao: Graduated from Tsinghua University with a bachelor's degree and obtained a doctorate from Stanford University. He participated in the development of core projects such as GPT-4 and o1.

· Shuchao Bi: Graduated from Zhejiang University with a bachelor's degree and obtained a doctorate from UC Berkeley. He is in charge of the multimodal post-training team.

· Hongyu Ren: Graduated from Peking University with a bachelor's degree and obtained a doctorate from Stanford University. He is the creator and core contributor of models such as o3-mini and o1-mini.

· Ji Lin: Graduated from Tsinghua University with a bachelor's degree and obtained a doctorate from MIT. He participated in the construction of o3/o4-mini, GPT-4o, GPT-4.1, GPT-4.5, 4o-imagegen, and the Operator inference stack.

· Huiwen Chang: Graduated from Tsinghua University with a bachelor's degree and obtained a doctorate from Princeton University. She is the co-creator of GPT-4o image generation.

The salary levels of these core members are among the top in the industry.

Their pay packages are reported to far exceed those of the CEOs of the world's major banks, ranking among the highest in the entire business world.