Paris-based AI startup Mistral AI is carving out a place in the competitive landscape of artificial intelligence, positioning itself as a notable alternative to leading companies like OpenAI and Anthropic. Its latest announcement is Mistral Large, a powerful new large language model designed to rival the reasoning abilities of top-tier models such as GPT-4 and Claude 2.
Alongside Mistral Large, the company is introducing Le Chat, its own chat assistant currently available in beta. Users can sign up to experience its capabilities at chat.mistral.ai.
If you’re unfamiliar with Mistral AI, the company has quickly garnered attention for its impressive funding achievements. Incorporated in May 2023, Mistral AI raised a substantial $113 million seed round shortly thereafter. By December, the startup secured an additional $415 million, with prominent venture capital firm Andreessen Horowitz (a16z) leading the round.
Founded by former employees of Google’s DeepMind and Meta, Mistral AI initially emphasized an open-source approach in AI development. While its first model was released under an open-source license, granting access to model weights, the same does not apply to its larger models.
Mistral AI's business strategy increasingly mirrors that of OpenAI, offering Mistral Large through a paid API with usage-based pricing. Queries currently cost $8 per million input tokens and $24 per million output tokens. In the context of AI, tokens represent small segments of words; for example, "TechTom" would be split into two tokens, "Tech" and "Tom."
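To illustrate how that usage-based pricing adds up, here is a minimal Python sketch that estimates the cost of a single query from its token counts. The rates are the ones quoted above; the function name and token counts are hypothetical, purely for illustration.

```python
# Minimal sketch of usage-based pricing at the rates quoted above:
# $8 per million input tokens, $24 per million output tokens.
# The token counts below are hypothetical.

MISTRAL_LARGE_INPUT_PER_M = 8.00    # USD per 1,000,000 input tokens
MISTRAL_LARGE_OUTPUT_PER_M = 24.00  # USD per 1,000,000 output tokens

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate the cost in USD of one query at the quoted Mistral Large rates."""
    return (input_tokens / 1_000_000) * MISTRAL_LARGE_INPUT_PER_M \
         + (output_tokens / 1_000_000) * MISTRAL_LARGE_OUTPUT_PER_M

# Example: a 1,500-token prompt that produces a 500-token answer
print(f"${estimate_cost(1_500, 500):.4f}")  # -> $0.0240
```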
By default, Mistral AI provides a context window of 32k tokens, which generally corresponds to more than 20,000 words of English text. Mistral Large supports multiple languages, including English, French, Spanish, German, and Italian.
For comparison, GPT-4 Turbo, which features a 128k-token context window, charges $10 per million input tokens and $30 per million output tokens. Consequently, Mistral Large is currently 20% less expensive than GPT-4 Turbo. Given the fast-evolving nature of the AI sector, pricing is subject to frequent updates.
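For a quick sanity check of that 20% figure, here is a short back-of-the-envelope comparison using the per-million-token rates cited in this article (the variable names are mine, and prices may change).

```python
# Price gap between Mistral Large and GPT-4 Turbo,
# using the per-million-token rates cited in this article.
mistral_large = {"input": 8.00, "output": 24.00}  # USD per 1M tokens
gpt4_turbo = {"input": 10.00, "output": 30.00}    # USD per 1M tokens

for kind in ("input", "output"):
    saving = 1 - mistral_large[kind] / gpt4_turbo[kind]
    print(f"{kind}: {saving:.0%} cheaper")  # -> 20% cheaper on both
```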
So, how does Mistral Large compare to GPT-4 and Claude 2? Mistral AI claims it ranks second only to GPT-4 on several benchmarks, but benchmark results can be cherry-picked and may not reflect real-world performance, so the claim deserves scrutiny. We plan to test the model ourselves to evaluate how it performs in practice.
Introducing Le Chat
Mistral AI is launching its chat assistant, Le Chat, which anyone can try for free during its beta phase. Users can choose from three models: Mistral Small, Mistral Large, and a prototype designed to give brief, concise answers, called Mistral Next. It is worth noting that Le Chat cannot access the web while in use.
There are plans to roll out a paid version of Le Chat aimed at enterprise clients, which will include central billing options and customizable moderation tools.
Mistral AI Partners with Microsoft
In tandem with today’s announcements, Mistral AI has revealed a partnership with Microsoft. In addition to being available through Mistral’s own API, Mistral’s models will be offered to Azure customers, expanding Microsoft’s model catalog.
While this may seem like a minor addition, it signals the beginning of collaborative discussions between Mistral AI and Microsoft. This partnership will likely enhance Mistral AI’s customer reach through a new distribution channel.
Microsoft, a significant investor in OpenAI’s capped-profit subsidiary, has also embraced other AI models on its cloud platform; it partners with Meta, for instance, to offer Llama large language models on Azure. This openness helps keep Azure customers within Microsoft’s ecosystem and may also ease concerns about anticompetitive practices.
Correction: A prior version of this article mistakenly compared Mistral Large’s pricing to an outdated version of OpenAI’s GPT API. Mistral Large is, in fact, 20% cheaper than the current iteration of GPT, known as GPT-4 Turbo.