Elon Musk's Grok 2 is open source, but it seems not quite so.
According to ZDONGXI on August 25th, Elon Musk announced yesterday on the social media platform X that Grok 2 has officially been open-sourced, and that Grok 3 will be open-sourced in about half a year.
It is unclear whether it was a typo, but Musk wrote "Grok 2.5" in his tweet, while the version actually open-sourced is Grok 2.
▲Musk's official announcement tweet (Source: X)
Grok 2's model weight files, totaling about 500GB, have been open-sourced and are now available on Hugging Face:
https://huggingface.co/xai-org/grok-2
▲Grok 2's Hugging Face homepage (Source: Hugging Face)
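For developers who want to pull the weights locally, here is a minimal sketch using the huggingface_hub library; the repository ID comes from the link above, the target directory is hypothetical, and roughly 500GB of free disk space is needed.

```python
# Minimal sketch: fetch the Grok 2 weights from Hugging Face.
# Requires: pip install huggingface_hub, plus ~500GB of free disk space.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="xai-org/grok-2",      # repository from the link above
    local_dir="./grok-2-weights",  # hypothetical local target directory
)
print(f"Weights saved to {local_dir}")
```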
At the time of writing, Grok 2 ranks 3rd on the Hugging Face trending list, behind two Chinese open-source models, Qwen Image Edit and DeepSeek V3.1.
▲Hugging Face trending list (Source: Hugging Face)
Grok 2 is a large model released by xAI on August 13th last year. Compared with its predecessor, it improved across the board in responsiveness, instruction following, and multi-task adaptability. At the time, Musk promised that whenever a new version of Grok was released, the previous version would be open-sourced.
The latest model, Grok 4, was officially released in July this year, and the open-sourcing of Grok 2, though overdue, has finally arrived.
01. The open-sourced Grok 2 cannot be used to train other AI models
Unlike Grok 1, which was released under the Apache 2.0 license, Grok 2 uses a license called the Grok 2 Community License Agreement, which is not a commonly used open-source license in the industry.
Under the Grok 2 Community License Agreement, the model may be used only for non-commercial and research purposes; commercial use is permitted only if the developer agrees to xAI's acceptable commercial policy.
Moreover, developers may not use the model to train, create, or improve any foundation, large language, or general-purpose AI models, though they may modify and fine-tune it within the terms of the license.
If developers distribute the materials, derivatives, or products and services containing them, they must prominently display "Powered by xAI" on the relevant materials or interfaces.
02. Elo ranking drops to 68th, Arena overall ranking to 75th
Can a model that has been out for a year still hold its own on performance?
When Grok 2 launched, data from the LMSYS leaderboard showed it surpassing the Claude and GPT-4 series models in overall Elo rating.
▲The Elo rating ranking announced when Grok 2 was launched last year (Source: xAI)
Now, on the latest overall Elo leaderboard, Grok 2 has dropped to 68th place with 1306 points, while the newly launched Grok 4 ranks 2nd with 1433 points, behind only Gemini 2.5 Pro.
▲The latest comprehensive Elo rating list as of August 21st (Source: OpenLM)
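To give a sense of what the 127-point gap between Grok 4 (1433) and Grok 2 (1306) means, here is a small sketch that assumes the leaderboard follows the standard Elo expected-score formula (an assumption about the ranking's methodology, not something stated in the article):

```python
# Sketch: expected head-to-head win rate implied by an Elo gap,
# assuming the standard Elo formula (an assumption, not confirmed here).
def elo_win_probability(rating_a: float, rating_b: float) -> float:
    """Expected probability that model A beats model B under Elo."""
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400))

print(f"{elo_win_probability(1433, 1306):.1%}")  # ~67.5% for Grok 4 vs Grok 2
```

Under that assumption, the ratings imply Grok 4 would be expected to win roughly two out of three head-to-head comparisons against Grok 2.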
On the Arena overall leaderboard, Grok 2 has dropped to 75th place, while the newly launched Grok 4 ranks 12th.
▲Arena comprehensive list (Source: Lmarena)
03. Conclusion: Open-sourcing Grok 2 may matter more for marketing than in practice
A year is long enough to reshuffle the landscape of the large-model market.
Compared with the open-source models launched this year, Grok 2's performance is unremarkable, and the many restrictions in its license put up real obstacles for developers who want to use it.
Two reasons can be speculated for why Musk chose to open-source Grok 2 now: first, to fulfill his earlier promise of "open-sourcing the old version when a new version is released"; second, to generate buzz ahead of the open-sourcing of Grok 3 and keep the Grok series in the public eye.
This article is from the WeChat official account "ZDONGXI" (ID: zhidxcom), written by Wang Han, and is published by 36Kr with authorization.