
Musk open-sources Grok 2.5: Chinese companies are xAI's biggest rivals.

QbitAI · 2025-08-25 07:18
The full download is 500GB.

Elon Musk's open-source move has drawn a great deal of attention.

xAI has now officially open-sourced Grok 2.5, and Grok 3 will be open-sourced in about six months.

Actually, as early as the beginning of this month, Elon Musk publicly stated:

It's time to open-source Grok, and it will be done next week.

Although the release slipped past the date he promised, as netizens put it:

Better late than never.

42 files, 500GB

Currently, Grok can be downloaded on HuggingFace (the link is attached at the end of the article):

xAI officially recommends running Grok 2 with SGLang. The specific steps are as follows.

Step 1: Download the weight files.

You can replace /local/grok-2 with any folder path you like:

  • hf download xai-org/grok-2 --local-dir /local/grok-2

xAI notes that errors may occur during the download; if one does, simply retry until it succeeds.

After the download is successful, there should be 42 files in the folder, with a size of approximately 500GB.
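As a quick sanity check, the figures above (42 files, roughly 500GB) can be verified with a short script. `check_download` is a hypothetical helper of our own, not part of xAI's or SGLang's tooling:

```python
from pathlib import Path

def check_download(folder: str) -> tuple[int, float]:
    """Count the files in the download folder and sum their size in GB."""
    files = [p for p in Path(folder).iterdir() if p.is_file()]
    total_gb = sum(p.stat().st_size for p in files) / 1e9
    return len(files), total_gb

# For a complete Grok 2 download we would expect roughly (42, ~500):
# n_files, size_gb = check_download("/local/grok-2")
```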

Step 2: Start the server.

xAI officially recommends installing the latest version of the SGLang inference engine (version >= v0.5.1, address: https://github.com/sgl-project/sglang/).

Then use the following command to start the inference server:

  • python3 -m sglang.launch_server --model /local/grok-2 --tokenizer-path /local/grok-2/tokenizer.tok.json --tp 8 --quantization fp8 --attention-backend triton

It's worth noting that running the model requires 8 GPUs, each with more than 40GB of memory.
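The 8×40GB requirement is roughly consistent with back-of-the-envelope arithmetic. This is our own estimate, not an official figure, and it assumes the ~500GB checkpoint is stored in bf16, that fp8 quantization halves the weight footprint, and that `--tp 8` shards weights evenly:

```python
# Rough memory estimate (assumptions: 500 GB checkpoint in bf16,
# fp8 quantization halving weight size, --tp 8 sharding evenly;
# activations and KV cache need extra headroom on top of this).
weights_gb_bf16 = 500.0
weights_gb_fp8 = weights_gb_bf16 / 2  # fp8: 1 byte/param vs 2 for bf16
per_gpu_gb = weights_gb_fp8 / 8       # tensor parallelism over 8 GPUs
print(f"~{per_gpu_gb:.1f} GB of weights per GPU")
```

At ~31GB of weights per GPU, the stated 40GB floor leaves some headroom for activations and the KV cache.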

The last step is to send a request.

This is a pretrained (base) model, so we need to make sure the correct chat template is used:

  • python3 -m sglang.test.send_one --prompt "Human: What is your name?<|separator|>\n\nAssistant:"

After sending the request, the model should reply with its name: Grok.
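Since the base model expects this exact chat template, building the prompt programmatically avoids typos. `build_prompt` is a hypothetical helper of our own that mirrors the template in the command above:

```python
def build_prompt(question: str) -> str:
    # Grok 2's pretrained checkpoint expects a Human/Assistant template
    # joined by the special <|separator|> token, per the send_one command.
    return f"Human: {question}<|separator|>\n\nAssistant:"

print(build_prompt("What is your name?"))
```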

So, what's the level of Grok 2, the latest open-sourced model by xAI?

Although its capabilities certainly lag today's state-of-the-art mainstream models, the technical blog xAI published last year on the Grok 2 family gives a rough picture.

At the time, its overall Elo score on the LMSYS leaderboard exceeded those of Claude and GPT-4.

Moreover, across a series of academic benchmarks, the Grok 2 series matched other frontier models in areas such as graduate-level scientific knowledge (GPQA), general knowledge (MMLU, MMLU-Pro), and math competition problems (MATH).

To be honest, although netizens think Elon Musk's open-source move is a good one, there has also been plenty of criticism.

For example, on HuggingFace, xAI did not clearly state the parameter count of the open-sourced model.

So netizens can only guess, based on past information, that it is an MoE model with about 269 billion parameters.

Second, there is the issue of the open-source license, given xAI's statement on HuggingFace:

In the words of netizens, this is basically a non-commercial license:

Mistral, Qwen, DeepSeek, Microsoft, and even OpenAI open-source their models under the MIT or Apache 2.0 license.

And, most importantly, there is the hardware needed to run this open-source model:

Thanks, now all I need is 8 GPUs with more than 40GB of memory each...

Two More Things:

In addition to the open-source release, Elon Musk also rolled out some new features in the Grok app.

This update (v1.1.58) mainly focuses on AI video generation. The specific effects are as follows:

Interested readers can try it out in the app.

And Elon Musk also made an interesting remark:

xAI will soon surpass Google, but Chinese companies are the biggest competitors.

Reference Links

[1]https://x.com/elonmusk/status/1959379349322313920

[2]https://x.com/HuggingPapers/status/1959345658361475564

[3]https://x.com/elonmusk/status/1959384678768447976

[4]https://x.com/elonmusk/status/1959388879888302363

This article is from the WeChat official account “QbitAI”, author: Jin Lei. It is published by 36Kr with authorization.