
Does anyone still not know about the "MCP" that Robin Li mentioned?

Yongyi | 2025-04-28 09:23
MCP has become extremely popular, but the "beautiful traps" of AI applications still exist.

Text by | Deng Yongyi

Edited by | Su Jianxun

The large-model wave has now swept a new term into the spotlight: MCP.

There is no shortage of new things in the AI world, but this time is different. The internet seems to have returned to the spring of more than a decade ago. "Developing agents on MCP today is like developing mobile apps in 2010," Robin Li, chairman of Baidu, said at the Baidu Create conference on April 25th.

If you haven't heard of MCP, you must have heard of the previous hot term: Agent. At the beginning of 2025, the sudden popularity of Chinese startup Manus instantly pushed this term into the public eye.

"AI that can actually get things done" is the key to the Agent's popularity. Before this, large models could answer questions, but only through a simple chat window. Because a model's knowledge comes from its training data, the information inside a large model was often out of date, and with the model alone, invoking external tools required a very cumbersome process.

The concept of MCP is inseparable from the Agent. MCP is an important path toward realizing the Agent vision: large models can freely invoke any external tool that supports the MCP protocol to complete more specific tasks.

Now, applications including Gaode Maps and WeChat Reading have launched official MCP Servers. This means any developer can, like assembling building blocks, first choose which large model to use, then invoke the MCP servers of Gaode Maps or WeChat Reading, and the model can complete tasks such as map queries.

Since February, an MCP wave has been sweeping the world in full swing.

Almost all major tech companies, from OpenAI, Google, and Meta to domestic players like Alibaba, Tencent, ByteDance, and Baidu, have announced support for the MCP protocol and launched their own MCP platforms, inviting developers and application service providers of all kinds to join.

If we review the hottest terms in China's AI field in 2024, "super app" was surely one of them. People generally believed 2024 would bring a great boom in AI applications, but things didn't develop as quickly as expected, and the AI innovation ecosystem remained fragmented.

For this reason, the significance of MCP's rise is comparable to Qin Shi Huang unifying the six states at the end of the Warring States period: standardizing the script, axle widths, and weights and measures across the states greatly facilitated economic and commercial exchange.

Many market evaluations believe that as protocols like MCP gradually become a consensus and trend, 2025 will witness a real explosion of AI applications.

MCP, the "Super Add-on" for AI

In fact, MCP is not a new thing: Anthropic announced it as early as November 2024.

MCP stands for "Model Context Protocol", and it is an open standard. If an application built on a large model supports MCP, it is as if the model has learned a standardized "language" for interacting with external data sources and tools.

If that still sounds complicated, think of the data ports on phones and computers: MCP is like fitting large models with a "universal socket", defining a standard "USB port".

With this interface, developers can develop applications in an orderly manner under a more standardized framework and agreement, and connect to different data sources and workflows.
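To make the "universal socket" metaphor concrete: MCP messages are JSON-RPC 2.0, and the protocol defines standard methods such as `tools/call` for asking a server to run one of its tools. The sketch below builds such a request by hand with only the standard library; the tool name `get_weather` and its arguments are hypothetical, chosen purely for illustration.

```python
import json

# MCP traffic is JSON-RPC 2.0. A client asking a server to run one of its
# tools sends a "tools/call" request naming the tool and its arguments.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_weather",             # hypothetical tool name
        "arguments": {"city": "Beijing"},  # hypothetical arguments
    },
}

wire = json.dumps(request)  # what actually travels over stdio or HTTP
print(wire)

# Any MCP-speaking server can parse this without caring which model sent it.
parsed = json.loads(wire)
assert parsed["method"] == "tools/call"
```

Because every request and response follows this one shape, a developer can swap the model or the tool provider without rewriting the plumbing in between.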

Before MCP became a trend, the threshold for developing AI applications remained high.

If a developer wants to develop an AI travel assistant, they need to make the large model complete at least several tasks: view maps, search for travel guides on the internet, and write a new travel plan based on user needs.

Then, to enable the large model to query maps and search for existing travel guides in the browser, the developer has to go through roughly the following:

Each AI provider (OpenAI, Anthropic, etc.) implements Function Calling slightly differently. If the developer switches between two large models midway, they must rewrite the adaptation code for the new model, which amounts to writing a fresh "user manual" teaching the large model how to use its external tools; only then can the model invoke those tools with well-crafted prompts. Otherwise, the accuracy of the model's output drops sharply.
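The adaptation burden described above can be sketched in a few lines. These dictionaries are simplified, illustrative shapes modeled on the kind of divergence between providers' function-calling formats, not their exact current schemas; the `get_weather` tool is hypothetical.

```python
# Two providers expect the *same* tool declared in *different* envelopes,
# so an app supporting both must write one adapter per provider.
def to_provider_a_style(name, description, params):
    # Simplified OpenAI-like shape: nested under a "function" key.
    return {"type": "function",
            "function": {"name": name, "description": description,
                         "parameters": params}}

def to_provider_b_style(name, description, params):
    # Simplified Anthropic-like shape: flat, schema under "input_schema".
    return {"name": name, "description": description,
            "input_schema": params}

params = {"type": "object",
          "properties": {"city": {"type": "string"}},
          "required": ["city"]}

a = to_provider_a_style("get_weather", "Query current weather", params)
b = to_provider_b_style("get_weather", "Query current weather", params)

# The underlying schema is identical; only the wrapping differs,
# yet each wrapping must be written, tested, and maintained separately.
assert a["function"]["parameters"] == b["input_schema"]
```

Multiply this by every tool and every model an application supports, and the "too low code reusability" problem the article describes follows directly.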

In short, when large models interact with the outside world, the lack of a unified standard keeps code reusability far too low, so the AI application ecosystem naturally lags behind.

"For any large-model application developer, before MCP appeared, you not only needed to understand large models but also had to do secondary development yourself to embed external tools into your application. And if a tool performed poorly, you had to work out whether the problem lay in the application or in the tool," Chen Ziqian, an algorithm technology expert from Alibaba Cloud's Model Zoo community, told "Intelligent Emergence" and other media.

Manus is a typical example. Not long ago, "Intelligent Emergence" reviewed Manus: even writing a simple news article could easily require invoking more than a dozen tools, such as opening a browser, browsing and scraping web pages, writing, verifying, and delivering the final result.

At each step where Manus chose to invoke an external tool, it had to write a "function" orchestrating how that tool would run. As a result, Manus often aborted tasks due to overload, partly because a single task consumed too many tokens.

After MCP, the core change is that developers are no longer responsible for the performance of external tools. They only need to maintain and debug the application itself, greatly reducing the development workload.

Correspondingly, each service in the ecosystem maintains its own MCP offering: application providers such as Alipay and Gaode Maps run their own MCP servers, keep them updated to the latest version, and wait for developers to connect.
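The division of labor above can be sketched as a toy provider-side server. This is not the official MCP SDK, only a stdlib illustration of the idea: the provider registers its tools once and answers the two core JSON-RPC methods, so every connecting developer gets the same interface. The tool name `search_route` is hypothetical.

```python
import json

# Toy registry standing in for a provider-maintained MCP server.
TOOLS = {}

def tool(name, description):
    """Decorator that registers a function as a callable tool."""
    def register(fn):
        TOOLS[name] = {"description": description, "fn": fn}
        return fn
    return register

@tool("search_route", "Plan a driving route between two places")  # hypothetical
def search_route(start, end):
    return f"route from {start} to {end}"

def handle(message):
    """Answer the two core MCP-style JSON-RPC methods."""
    req = json.loads(message)
    if req["method"] == "tools/list":
        result = [{"name": n, "description": t["description"]}
                  for n, t in TOOLS.items()]
    elif req["method"] == "tools/call":
        p = req["params"]
        result = TOOLS[p["name"]]["fn"](**p["arguments"])
    else:
        result = {"error": "unknown method"}
    return {"jsonrpc": "2.0", "id": req["id"], "result": result}

resp = handle(json.dumps({
    "jsonrpc": "2.0", "id": 2, "method": "tools/call",
    "params": {"name": "search_route",
               "arguments": {"start": "A", "end": "B"}}}))
print(resp["result"])  # prints "route from A to B"
```

The point is that updating `search_route` is the provider's job alone; every connected application picks up the improvement with no code changes of its own.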


MCP can fairly be called the first tool-invocation protocol to become truly popular, and its effects are showing quickly. According to statistics from the MCP community PulseMCP, more than 4,000 MCP servers have been launched globally, and the number is still growing rapidly.

It sounds ideal, but the MCP ecosystem is still in its early stage, and it is far from a perfect solution.

Many developers have already said that MCP seems to exist just for the sake of establishing a standard: APIs were already a more concise solution, and large models could already invoke APIs through many existing protocols, making MCP look a bit redundant.

The MCP services released by large companies so far are basically defined by the vendors themselves: they decide which functions the LLM may invoke and how they are scheduled. And the same old problem recurs with MCP: the big companies are very likely not to hand you their most core, real-time information.

And if an MCP server is not officially launched and carefully maintained, the security and service stability of connecting to it are not promising either.

Independent developer Tang Shuang shared his experience: a certain map service's MCP Server offers fewer than 20 tools. Five of them require longitude and latitude as input, and another, for checking the weather, requires the user to supply an administrative-division ID, yet no method or documentation for obtaining these IDs is provided. The only way out is to go back into the service provider's own ecosystem and obtain the information and permissions step by step.

The popularity of MCP, it seems, is only skin-deep, and the game behind it is far from over. Large-model vendors are willing to offer MCP services, but the initiative remains in their hands, and none of them wants to toil merely to enrich Anthropic's ecosystem. If vendors have no real intention of providing good services, developers end up doing double the work, and the logic of the ecosystem falls apart.

Another Victory for the Open Route

However, why has MCP only become popular now?

When Anthropic first launched the MCP protocol, few people paid attention. At the time, only a handful of applications supported MCP, such as Anthropic's own Claude Desktop, and developers had not yet formed a unified AI development ecosystem; everyone was basically working in isolation.

It was acceptance by the developer community that gradually brought MCP to center stage. Since February 2025, a group of star applications in AI programming, including Cursor, VSCode, and Cline, have announced support for the MCP protocol, making it widely known.

After the stir in the developer community, what really ignited the MCP protocol was the entry of the large-model vendors.

The crucial step was undoubtedly on March 27th, when OpenAI announced its support for the MCP protocol, followed closely by Google.

Google CEO Sundar Pichai once expressed his hesitation about MCP. On March 31st, he posted a tweet saying, "To connect to MCP or not to connect to MCP, that is the question." But just four days after posting this tweet, Google also announced its connection to MCP.

Source: X (Twitter)

That the big companies ultimately chose to embrace MCP resembles the story of DeepSeek's impact on Silicon Valley: connecting to MCP is essentially a strategic shift in the ecosystem playbook of large-model vendors. Rather than each working in isolation, it is better to seek common ground while reserving differences, embrace a more open protocol, and grow the market together.

In the past two years, large-model vendors' AI strategies often focused on staking out their own territory. The logic is no different from the history of the internet: Apple, for example, became a platform company and built a strong developer ecosystem instead of merely offering point products, gradually establishing a monopolistic advantage.

OpenAI tried to follow Apple's playbook.

Before connecting to the MCP protocol, OpenAI's developer ecosystem had generally drifted from open toward closed. ChatGPT Plugins launched in March 2023; at the time, Plugins let third-party developers add specific functions to ChatGPT and supported using multiple plugins simultaneously, a relatively open extension ecosystem.

But after launching GPTs and its store in January 2024, OpenAI quickly shut down Plugins. GPTs was designed as a more closed store model: GPTs run only on the OpenAI platform, everything from model to application is controlled by OpenAI, which profits through platform commissions, and developers must build specifically for the ChatGPT platform, with a rather limited degree of openness.

So far, the results of OpenAI's GPTs ecosystem have been underwhelming. The store is full of low-quality, thinly wrapped applications, and a commercial closed loop is nowhere in sight.

Many ideas in Anthropic's MCP protocol are not industry firsts. OpenAI's Function Calling is likewise a mainstream standard for letting large models invoke external tools, and many of MCP's technical ideas align with it.

But MCP is friendlier at the product level. With Function Calling, developers still have to do secondary programming and a great deal of adaptation work; MCP lets service providers package these capabilities into individual "Lego bricks", greatly lowering the threshold for building AI applications.

Moreover, MCP has a core advantage: it is more open and sits at a higher level of abstraction. MCP is just a protocol and does not constrain the underlying model; any AI model or platform can interact over MCP, and it applies equally to cloud and local deployments.

MCP is open enough that no single big company can dominate it, which lets the big companies find a relatively comfortable position, and mindset, from which to adopt the protocol.

To some extent, this is also Anthropic's attempt to win back the developer ecosystem with a more open posture. OpenAI's closed-door strategy has once again proven a strategic misjudgment; for a frontier industry still in its early days, everything from DeepSeek to today's MCP tells us that the open, open-source route remains the best solution for now.


This article is from the WeChat official account "Intelligent Emergence", author: Deng Yongyi, published by 36Kr with authorization.