xAI Opens Up Grok's Base Model to the Public—Training Code Not Included

Elon Musk's xAI has made a notable move by open-sourcing the base model weights and network architecture of its Grok AI model, though the training code is not included. The model, which has 314 billion parameters, is described in its GitHub repository as a Mixture-of-Experts model.
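In a Mixture-of-Experts architecture, each input is routed to only a small subset of the model's "expert" sub-networks, so only a fraction of the total parameters is active per token. The following toy sketch of top-k expert routing is purely illustrative and is not xAI's actual implementation; all names and shapes here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def moe_forward(x, experts, gate_w, k=2):
    """Toy Mixture-of-Experts forward pass with top-k routing."""
    # Gate produces one score (logit) per expert.
    logits = x @ gate_w
    topk = np.argsort(logits)[-k:]        # indices of the k highest-scoring experts
    weights = np.exp(logits[topk])
    weights /= weights.sum()              # softmax over the selected experts only
    # Only the k selected experts run; their outputs are mixed by gate weight.
    return sum(w * experts[i](x) for w, i in zip(weights, topk))

d, n_experts = 4, 8
# Each "expert" is a simple linear map for illustration.
expert_mats = [rng.standard_normal((d, d)) for _ in range(n_experts)]
experts = [lambda x, M=M: x @ M for M in expert_mats]
gate_w = rng.standard_normal((d, n_experts))

x = rng.standard_normal(d)
y = moe_forward(x, experts, gate_w, k=2)
print(y.shape)  # (4,)
```

The key property this illustrates is that compute per token scales with k (the number of active experts), not with the total expert count, which is how a model can carry hundreds of billions of parameters without activating all of them on every input.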

In a recent blog post, xAI clarified that the model has not been fine-tuned for any specific application, such as conversational use. The company said Grok-1 was trained on a "custom" stack but did not provide further details. The model is released under the Apache License 2.0, which permits commercial use.

Just last week, Musk announced on X that the open-sourcing of the Grok model was imminent. Originally launched in a chatbot format for Premium+ users of the X social network last year, the Grok chatbot was capable of accessing certain X data. However, the newly open-sourced version does not feature these connections to the social platform.

Several prominent organizations have released open AI models, including Meta (Llama), Mistral, TII (Falcon), and AI2. In February, Google released two open models, Gemma 2B and Gemma 7B.

Some AI tool developers are already expressing interest in incorporating Grok into their products. Aravind Srinivas, CEO of Perplexity, announced on X that his company plans to fine-tune Grok for conversational search, making it available for Pro users.

"Thanks to @elonmusk and the xAI team for open-sourcing the Grok base model! We will optimize it for conversational search and inference, providing access for all Pro users!" Aravind tweeted on March 17, 2024.

Meanwhile, Musk is currently engaged in a legal dispute with OpenAI, having filed a lawsuit earlier this month citing a "betrayal" of the nonprofit AI mission. He has since publicly criticized OpenAI and its CEO, Sam Altman, on X multiple times.
