
Andrew Ng explains AI's sky-high salaries: the $100 million piled up by capital is not just a passing fad.

Big Data Digest · 2025-08-08 15:14
Meta has pushed the ceiling a little higher.

A new salary earthquake has hit the artificial intelligence community, and Andrew Ng has weighed in.

Andrew Ng said on Twitter that Meta has offered AI large-model developers compensation packages exceeding $100 million, which has shocked the entire technology industry. Although most of that compensation will be paid out over several years, the figure is still rare enough to make headlines.

Andrew Ng pointed out that this move is not impulsive but based on precise capital logic.

Meta plans to invest $66 billion to $72 billion in capital expenditures this year, a large part of which will be used to build AI infrastructure, such as data centers and GPU clusters. In the face of such huge investments, spending hundreds of millions of dollars to "recruit talent" is just a small part of the cost structure.

He said this is in sharp contrast to traditional software startups: ordinary startups spend about 70% to 80% of their budgets on employee salaries, while at companies training large AI models, salaries are one of the smallest items of expenditure.

If a company has already decided to spend billions of dollars on GPUs, it is entirely reasonable to spend hundreds of millions of dollars to hire the people who can make good use of that hardware.
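As a rough back-of-envelope illustration of that proportion (the capital-expenditure figures are the ones cited in the article; the number of $100 million packages is an assumption made purely for illustration), even a handful of headline packages amounts to well under 2% of the planned hardware spend:

```python
# Back-of-envelope comparison, using the figures cited in the article.
# The count of ten $100M packages is a hypothetical assumption for illustration.
capex_low, capex_high = 66e9, 72e9      # Meta's planned capital expenditure this year (USD)
package = 100e6                         # one headline compensation package (USD)
assumed_packages = 10                   # illustrative assumption, not a reported number

talent_spend = package * assumed_packages
for capex in (capex_low, capex_high):
    print(f"${talent_spend / 1e9:.1f}B on talent vs ${capex / 1e9:.0f}B capex "
          f"-> {talent_spend / capex:.2%} of capex")
```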

High salaries are not an emotional gesture; they are an allocation of investment.

Andrew Ng observed that a company building AI foundation models has relatively few employees but extremely high capital investment. This structure of "fewer people, more money" provides a natural environment for super-high salaries.

Before Meta's move, the salaries of some AI model trainers had reached as high as $5 million to $10 million per year. Now, Meta has pushed this figure to a new height.

This is not just Meta's strategy.

Netflix follows a similar model. It plans to invest $18 billion in content production this year while employing only about 14,000 people. With labor accounting for such a small share of costs, the company can consistently pay above-market salaries and sustain a distinctive cultural identity, summed up in the line "We are a sports team, not a family."

In contrast, Foxconn, a manufacturing giant, has more than 1 million employees globally. In this labor-intensive structure, every increase in salary will multiply the overall cost pressure.

Andrew Ng pointed out that the salary logic of AI companies and that of traditional companies have gone down two completely different paths.

Capital-intensive companies can trade capital for talent, leveraging enormous computing power with a small headcount; labor-intensive companies can only trade labor for output, sustaining operations with a large workforce.

Andrew Ng noted that although Meta's businesses are complex, spanning Facebook, Instagram, WhatsApp, Oculus, and more, its AI training operation is becoming the most costly and most critical part.

He specifically mentioned that platforms such as Meta have long relied on user-generated content (UGC) to attract users' attention and then generate revenue through advertising. Now, AI-generated content (AIGC) is quietly approaching.

AI can automatically produce text, images, videos, and even interactive content. When AI content begins to compete with human content for attention, the logic on which UGC depends for survival is shaken.

Meta is not the only one feeling anxious.

Similar platforms such as TikTok and YouTube have also realized that AIGC may completely reshape the social ecosystem, so they are racing to invest in and deploy AI strategies, and "hiring talent at high salaries" has become the most visible front in this war.

Andrew Ng analyzed that when Meta hires AI talent at a high price, beyond valuing their future contributions, it may also be aiming at "technical reconnaissance": gaining insight into competitors' technical roadmaps by bringing in key people.

This is a common business game in Silicon Valley, called "recruiting talent to obtain technical intelligence." As long as this practice does not damage the corporate culture, it is a reasonable strategic expenditure.

Andrew Ng also mentioned that as early as ten years ago, he had built a budget model for an AI team to evaluate how many employees to hire and how many GPUs to purchase with a given amount of funds to achieve optimal productivity.

Today, such a model is hardly needed anymore. The answer is already written into the industry's logic: everything tilts toward hardware, and everything prioritizes scaling up.
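For readers curious what such a budget model might look like, here is a minimal sketch in Python of the kind of people-versus-GPUs trade-off Ng describes. The cost figures, the budget, and the Cobb-Douglas-style production function are all illustrative assumptions, not details of his actual spreadsheet:

```python
# Minimal sketch of a people-vs-GPUs budget split, in the spirit of the
# spreadsheet model Ng describes. All prices, the budget, and the production
# function below are illustrative assumptions, not his actual model.
def output(people, gpus, alpha=0.3):
    """Assumed Cobb-Douglas-style productivity: people^alpha * gpus^(1 - alpha)."""
    return (people ** alpha) * (gpus ** (1 - alpha))

def best_split(budget, cost_per_person=500_000, cost_per_gpu=40_000):
    """Brute-force search over headcount N; the remaining budget buys M GPUs."""
    best = None
    for n in range(1, int(budget // cost_per_person) + 1):
        m = int((budget - n * cost_per_person) // cost_per_gpu)
        if m < 1:
            break
        candidate = (output(n, m), n, m)
        if best is None or candidate > best:
            best = candidate
    return best  # (estimated output, headcount, GPU count)

prod, n, m = best_split(budget=50_000_000)
print(f"~{n} people and ~{m} GPUs maximize the assumed output ({prod:.0f} units)")
```

With these assumed numbers the search lands at roughly 30 people, with the rest of the budget going to GPUs; change the prices or the exponent and the optimum shifts, which is exactly the question the original spreadsheet was built to answer.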

But even so, talent is still the key.

He is happy for the AI practitioners receiving these high salaries, and he emphasizes that everyone working in the AI industry deserves respect: their work is part of a far-reaching technological revolution.

The following is the original Twitter post, translated and edited by DeepSeek:

Recently, Meta has made headlines for offering "off-the-charts" compensation packages to large-model R&D talent: the total package for a single person often exceeds $100 million (usually paid out over several years). The company plans to invest $66 billion to $72 billion in capital expenditures such as data centers this year, a significant portion of which is directed toward AI. From a purely financial perspective, it is not irrational to spend billions more to retain top talent in order to maximize the utility of this hardware.

For typical application-oriented startups that do not participate in training foundation models, the expenditure structure is usually as follows: 70%-80% is spent on human resources, 5%-10% on rent, and another 10%-25% on various operating costs (cloud services, software licenses, marketing, legal/financial, etc.). However, scaling up models is a highly capital-intensive task, and salaries only account for a small fraction of the total cost. So, even though the teams are not large, they have the ability to offer extremely generous compensation: if you're already spending tens of billions on GPU hardware, why not spend one-tenth of that on people? Before Meta recently raised the bar, salaries for large-model training positions were already high, with many people earning $5 million to $10 million annually; and Meta has pushed the ceiling even higher.

Meta has many business lines, including Facebook, Instagram, WhatsApp, Oculus, etc., but the Llama/large-model training line is particularly capital-intensive. Many of Meta's products rely on UGC (user-generated content) to attract attention and then monetize through advertising. AI is both a threat and an opportunity to this model: if AIGC (AI-generated content) starts to replace UGC as the main "attention-grabbing" supply and carry advertising, the landscape of social media will be reshaped.

This is why Meta, like platforms such as TikTok and YouTube, is highly focused on AIGC, and it makes sense for them to invest heavily in AI. Furthermore, recruiting key talent not only gains their future output but may also provide insights into competitors' technologies; as long as it doesn't harm the company culture, paying high salaries is a rational business choice.

Paying employees highly in capital-intensive industries is not a new story. For example, Netflix plans to invest $18 billion in content this year, and the salaries of its 14,000 employees are only a small part of the total cost, allowing it to offer above-market pay over the long term. This way of spending money has also shaped its distinctive culture of "We're a team, not a family" (which works for Netflix but may not be applicable to all companies). By contrast, labor-intensive manufacturers like Foxconn, with over a million employees globally, have to be more frugal with salary spending.

As early as ten years ago, when I led an AI scaling-up project, I created a spreadsheet model: how much of the budget should go to people and how much to GPUs? We used a custom "people (N) + machines (M) → output" function to estimate productivity and optimize the combination of N and M within the budget constraints. Since then, the spending structure for AI scaling has clearly tilted toward GPUs.

I'm genuinely happy for those who are getting large compensation packages. Beyond the individual numbers, I also want to thank every colleague in the AI field. Everyone working in AI deserves a decent return. Although the salary gap is widening, I think it reflects a broader historical context: right now, people working on AI are at a historical juncture where they can have a huge impact and drive world-changing innovation.

This article is from the WeChat official account "Big Data Digest", and is published by 36Kr with authorization.