Ola Founder Bhavish Aggarwal Invests $230 Million in Indian AI Startup Krutrim to Boost Local AI Development

Bhavish Aggarwal, the founder of Ola, is injecting $230 million into Krutrim, an AI startup he founded, as India pushes to establish its presence in a market historically dominated by U.S. and Chinese tech giants.

Aggarwal is financing the investment primarily through his family office, and Krutrim aims to raise a total of $1.15 billion by next year. According to sources familiar with the matter, he plans to secure the remainder of that funding from external investors.

The announcement coincides with Krutrim's release of open-source AI models and its plans to build what it says will be India's largest supercomputer in collaboration with Nvidia.

Krutrim has also launched Krutrim-2, a 12-billion-parameter language model optimized for Indian languages. According to the company, the model scored 0.95 on sentiment analysis, well above the 0.70 achieved by competing models, and it has an 80 percent success rate on code-generation tasks.

In a bid to foster collaboration within India's AI ecosystem, Krutrim has made several specialized models open-source, including those for image processing, speech translation, and text search, all fine-tuned for Indian languages.

Aggarwal, whose other ventures have received backing from SoftBank, expressed confidence in the company’s progress on X, stating, "We’re nowhere close to global benchmarks yet, but have made good progress in one year. By open-sourcing our models, we hope the entire Indian AI community collaborates to create a world-class Indian AI ecosystem."

India's push to become a significant player in AI follows recent breakthroughs such as DeepSeek’s R1 “reasoning” model, which has shaken up the tech industry. India recently praised DeepSeek’s success and announced that the company’s large language models would be hosted on domestic servers; Krutrim's cloud service already offers DeepSeek on Indian servers.

In addition, Krutrim has developed BharatBench, an evaluation framework to assess AI models' proficiency in Indian languages. This initiative seeks to address the gap in existing AI benchmarks, which largely focus on English and Chinese.
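The article does not detail BharatBench's interface, but the core idea of a per-language benchmark can be sketched as a scoring loop over test items grouped by language. The snippet below is a minimal, hypothetical illustration in Python; the sample items, the model_answer placeholder, and the exact-match metric are assumptions for demonstration, not the actual BharatBench code.

```python
from collections import defaultdict

# Hypothetical mini-benchmark: exact-match items grouped by Indian language.
# Illustrative only; not the actual BharatBench dataset or API.
test_items = [
    {"lang": "hi", "prompt": "भारत की राजधानी क्या है?", "expected": "नई दिल्ली"},
    {"lang": "ta", "prompt": "இந்தியாவின் தலைநகரம் எது?", "expected": "புது தில்லி"},
]

def model_answer(prompt: str) -> str:
    """Placeholder for a call to the model being evaluated."""
    return "..."  # a real harness would call the model's API here

def evaluate(items):
    correct, total = defaultdict(int), defaultdict(int)
    for item in items:
        total[item["lang"]] += 1
        if model_answer(item["prompt"]).strip() == item["expected"]:
            correct[item["lang"]] += 1
    # Per-language accuracy: the kind of score a benchmark like this reports.
    return {lang: correct[lang] / total[lang] for lang in total}

print(evaluate(test_items))
```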

Krutrim-2 supports a 128,000-token context window, enabling longer inputs and more complex multi-turn conversations. The startup’s reported metrics show the model scoring 0.98 on grammar correction and 0.91 on multi-turn conversations.
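The article does not specify Krutrim-2's tokenizer, but a rough back-of-the-envelope check, assuming about four characters per token for English text (Indic scripts often tokenize less compactly), gives a sense of how much material a 128,000-token window can hold:

```python
CONTEXT_WINDOW_TOKENS = 128_000
CHARS_PER_TOKEN = 4  # rough heuristic for English text; an assumption, not Krutrim's tokenizer

def fits_in_context(document: str, reply_budget_tokens: int = 2_000) -> bool:
    """Roughly estimate whether a document plus a reply budget fits in the window."""
    estimated_tokens = len(document) / CHARS_PER_TOKEN
    return estimated_tokens + reply_budget_tokens <= CONTEXT_WINDOW_TOKENS

# Around 500,000 characters of English (~100,000 words) roughly fills a 128K window.
print(fits_in_context("word " * 90_000))   # True: ~112,500 estimated tokens
print(fits_in_context("word " * 120_000))  # False: ~150,000 estimated tokens
```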

The new investment follows the January launch of Krutrim-1, a 7-billion-parameter language model that marked India’s first significant entry into the large language model space. The supercomputer being deployed with Nvidia is expected to go live in March, with further expansion planned throughout the year.
