Competition in the large language model (LLM) market, once a topic few would discuss openly, has become the subject of open debate over the past two months. During the 2024 World Artificial Intelligence Conference, held from July 4 to 6, the price war surrounding LLMs remained a hot topic, but stakeholders emphasized business models and sustainability rather than prices alone. "In the last two months, downloads of the Tongyi Qianwen open-source model have doubled, surpassing 20 million," one industry figure noted. "In a fiercely competitive environment, it's crucial to improve business efficiency while keeping costs lower than competitors'."
Although the conference concluded on July 6, the positioning battle among major tech firms and startups in the LLM space is far from over. No one can yet say definitively which commercial model will succeed, but the companies that survive this price war will be best placed to find out.
In early May, AI company DeepSeek claimed to offer performance on par with GPT-4 at a fraction of the cost, upending the assumption that greater capability must mean higher prices. Shortly afterward, Chinese LLM companies began cutting prices aggressively, with media outlets playing up ever-steeper discounts that eventually gave way to free services, a scenario reminiscent of the ride-hailing and group-buying wars. "Lower LLM prices are a very positive development, because prices should decline," said Yan Junjie, founder and CEO of MiniMax, at the conference. Since 2024, MiniMax has focused more on productivity-driven scenarios and has built closer ties with enterprise clients. Yan argued that falling LLM prices benefit most businesses by attracting more users and increasing online engagement, ultimately creating greater value.
According to Moore's Law, the number of transistors that fit on an integrated circuit roughly doubles every 18 to 24 months, bringing corresponding gains in performance. Whether or not Moore's Law still holds in the LLM industry, the underlying principle, that technological progress lowers costs, remains true. "Price reductions are driven by technology; as technology improves, costs decrease, but going too far can be detrimental," noted Zhang Peng, CEO of Zhipu AI, citing Michael Porter's value chain theory. A healthy value chain, he said, is one in which all parties keep adding value and providing better services for greater collective gains, precisely what price wars tend to disrupt.
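The doubling described above is simple exponential growth. As a rough illustration (the function name and the starting figures are hypothetical, not from the article), a 24-month doubling period implies roughly a 32-fold increase over a decade:

```python
def transistor_count(initial, years, doubling_months=24):
    """Project a transistor count after `years`, doubling every `doubling_months`."""
    return initial * 2 ** (years * 12 / doubling_months)

# With a 24-month doubling period, 10 years means 5 doublings, i.e. a 32x increase:
print(transistor_count(1_000_000, 10))  # 32000000.0
```

The same compounding logic is what makes cost curves fall so quickly when each hardware generation delivers more performance per dollar.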
Yang Zhilin, CEO of Yuezhi Anmo, likewise linked price competition to value creation. He argued that a viable LLM business model depends on the computing power devoted to inference eventually far exceeding that devoted to training. Once the cost of serving inference to consumers falls well below the cost of acquiring those customers, new business models can emerge, moving the industry beyond today's price wars.
Every major player in the LLM market is trying to weave a network of ecosystem partnerships, whether through complementary offerings or alliances, opening multiple pathways into both consumer (C-end) and enterprise (B-end) markets. Recently, DingTalk announced an ecosystem strategy involving collaborations with six major LLM firms, including Yan Junjie's MiniMax and Yang Zhilin's Yuezhi Anmo, and said it would tailor the form of each collaboration to the characteristics of the partner's model.
Demand for effective applications is also driving innovation in the LLM space. Before joining a panel at the conference, Xie Ling, founder and CEO of Yufeng Future, received a request from a foreign client for a solution to detect oil leaks in pipelines. He said he hoped AI costs would fall further so that he could solve customer problems more effectively and win more sales.
Large companies are backing their claims with numbers. During the conference, Alibaba Cloud CTO Zhou Jingren said that many clients have begun adopting LLMs since the recent price reductions: over the past two months, downloads of the Tongyi Qianwen open-source model doubled to more than 20 million, and the customer base of Alibaba Cloud's Bai Lian service grew sharply. Baidu likewise reported strong metrics for its LLM services, including significant increases in daily usage and in the number of enterprise clients served.
Discussions about open-source versus closed-source models continue to be a hot topic. Baidu's CEO, Li Yanhong, noted the confusion surrounding the concepts of model open-sourcing versus code open-sourcing. He emphasized that merely having access to model parameters doesn't guarantee effective use without robust systems for fine-tuning and secure alignment. This often results in models that cannot leverage continuous upgrades or share computational resources.
Zhou Jingren reaffirmed Alibaba Cloud's commitment to open-sourcing, pointing out that their series of core models has achieved genuine full-scale, multimodal open-source status, successfully narrowing the gap between open-source and closed-source models.
Overall, the convergence of lower costs, enhanced collaboration, and evolving business models signals exciting developments in the landscape of artificial intelligence as the industry navigates its transformative price war.