Snowflake Partners with Mistral to Enhance AI Capabilities
Today, Snowflake announced a multi-year partnership with Mistral, the Paris-based AI startup that raised Europe’s largest seed funding round in June 2023 and has since rapidly established itself in the global AI landscape.
This collaboration will bring Mistral’s open large language models (LLMs) into Snowflake’s Data Cloud, letting customers use these models to build LLM-powered apps. Snowflake is also investing in Mistral through its corporate venture capital arm, though the amount remains undisclosed.
What to Expect from the Snowflake-Mistral Partnership?
Since its inception, Snowflake has focused on building a robust data infrastructure known as the Data Cloud, which supports various downstream applications, including AI and analytics. In response to the demand for generative AI solutions, Snowflake launched Cortex, a fully managed service designed specifically for LLM app development.
Introduced at last year’s Snowday event, Snowflake Cortex equips enterprises with a suite of AI resources, including open-source LLMs, so they can analyze their data securely while building applications for specific business needs. The service launched with task-specific functions for jobs like sentiment analysis, translation, and summarization, alongside general-purpose LLMs such as Meta’s widely used Llama 2.
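For a sense of how those task-specific functions are typically invoked, here is a minimal sketch that calls Cortex's SENTIMENT, TRANSLATE, and SUMMARIZE functions from Python through the Snowflake connector. The connection parameters are placeholders, and availability of each function depends on your account and region; treat this as an illustrative sketch rather than a definitive integration.

```python
# Minimal sketch: calling Snowflake Cortex task-specific LLM functions via SQL.
# Connection parameters below are placeholders, not real credentials.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    warehouse="<warehouse>",
)

review = "The onboarding flow was confusing, but support resolved my issue quickly."

with conn.cursor() as cur:
    # Sentiment score (a value between -1 and 1)
    cur.execute("SELECT SNOWFLAKE.CORTEX.SENTIMENT(%s)", (review,))
    print("sentiment:", cur.fetchone()[0])

    # Translate the English text to French
    cur.execute("SELECT SNOWFLAKE.CORTEX.TRANSLATE(%s, 'en', 'fr')", (review,))
    print("translation:", cur.fetchone()[0])

    # Summarize the text
    cur.execute("SELECT SNOWFLAKE.CORTEX.SUMMARIZE(%s)", (review,))
    print("summary:", cur.fetchone()[0])

conn.close()
```

Because the functions are exposed as plain SQL, the same calls can be run directly in a worksheet or embedded in pipelines without moving data out of the Data Cloud, which is the core pitch behind Cortex.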
Through the new partnership, Snowflake will add Mistral’s high-performing models to Cortex, including Mistral Large, the startup’s new flagship model. On Mistral’s reported benchmarks it ranks just below GPT-4 and ahead of Claude 2, Gemini Pro, and GPT-3.5, with native proficiency in five languages and a 32K-token context window. The integration also covers Mixtral 8x7B and Mistral 7B.
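Once the Mistral models are live in Cortex, the expectation is that they will be reachable through the same SQL surface, for example Cortex's general-purpose COMPLETE function. The sketch below assumes the model identifiers 'mistral-large', 'mixtral-8x7b', and 'mistral-7b'; those names and the placeholder connection details are assumptions, not confirmed values from Snowflake.

```python
# Hedged sketch: prompting the Mistral models through Cortex's general-purpose
# COMPLETE function. Model identifiers below are assumed, not confirmed.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    warehouse="<warehouse>",
)

prompt = "List three risks of relying on a single LLM provider, one sentence each."

with conn.cursor() as cur:
    # Compare responses across the three Mistral models expected in Cortex.
    for model in ("mistral-large", "mixtral-8x7b", "mistral-7b"):
        cur.execute("SELECT SNOWFLAKE.CORTEX.COMPLETE(%s, %s)", (model, prompt))
        print(f"--- {model} ---")
        print(cur.fetchone()[0])

conn.close()
```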
Sridhar Ramaswamy, CEO of Snowflake, stated, “By partnering with Mistral AI, we are providing our customers access to some of the most powerful LLMs available, enabling them to create innovative, AI-driven applications effortlessly.” He emphasized Snowflake’s commitment to transforming how enterprises utilize LLMs through Snowflake Cortex while maintaining security and privacy within the Data Cloud.
Although Snowflake declined to disclose its investment amount and the names of early customers testing Mistral’s models, Baris Gultekin, Head of Product for AI at Snowflake, noted that many clients are eager for Mistral Large. Gultekin also confirmed that Google’s new Gemma 7B open model will join Cortex alongside Mistral 7B. The aim is to provide customers with curated access to top-performing conversational models tailored to their needs.
“Snowflake is dedicated to offering flexibility without complexity, enabling organizations to swiftly and securely harness the value of generative AI,” Gultekin added. He noted that the company’s engineering teams are exploring ways to implement LLMs in additional areas, though specifics remain under wraps.
Mistral’s Growing Industry Presence
The agreement with Snowflake marks another significant achievement for Mistral, further solidifying its role as a credible player in the AI sector. Last week, the startup received a $16 million investment from Microsoft, allowing its models to be featured on the Azure cloud platform. Mistral is now the second company to offer models on Microsoft's platform, following OpenAI.
Additionally, Mistral has formed partnerships with IBM, which makes its Mixtral 8x7B model available on watsonx, as well as with Perplexity and Amazon, although the latter has yet to bring Mistral’s models to its Bedrock platform.
As Mistral continues to expand its partnerships, it will be interesting to watch how the startup extends its influence and drives AI applications across sectors. For context, Databricks, a Snowflake competitor in the data ecosystem, also offers open, commercially usable language models for building generative AI applications, including models it gained through its $1.3 billion acquisition of MosaicML last year.