LangChain Secures $25M Funding Round, Unveils New Platform to Enhance the Complete LLM Application Lifecycle

Today, LangChain, the startup behind the popular open-source framework for building large language model (LLM) applications, announced a $25 million Series A funding round led by Sequoia Capital. The company is also launching LangSmith, its first commercially available LLMOps product.

LangSmith is designed as an all-in-one platform that accelerates LLM application workflows across the complete project lifecycle, from development and testing to deployment and monitoring. Launched in closed beta in July 2023, it is already used by thousands of companies every month.

The introduction of LangSmith comes at a critical time as developers seek robust solutions for building and maintaining high-performance, reliable LLM applications.

Key Features of LangChain’s LangSmith

LangChain's open-source framework empowers developers with a comprehensive toolkit, providing a common set of best practices and modular building blocks for creating LLM-powered applications. Developers can integrate multiple LLMs through APIs, chain them together, and connect them with data sources to perform a variety of tasks. What began as a side project has now become the backbone of over 5,000 LLM applications, spanning domains such as internal tools, autonomous agents, gaming, and chat automation.
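For context, the building-block pattern looks roughly like the sketch below. This is an illustrative example only, assuming the langchain-openai package and an OpenAI API key; the model name, prompt, and input are placeholders rather than anything specific to the applications mentioned above.

```python
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

# Prompt template -> chat model -> plain-string output, composed into one chain.
prompt = ChatPromptTemplate.from_template(
    "Summarize the following support ticket in one sentence:\n\n{ticket}"
)
llm = ChatOpenAI(model="gpt-3.5-turbo")  # placeholder model name
chain = prompt | llm | StrOutputParser()

print(chain.invoke({"ticket": "The dashboard won't load after the latest update."}))
```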

Nevertheless, simply providing a development toolkit isn’t enough to overcome the numerous challenges of bringing LLM applications into production. This is where LangSmith steps in, offering capabilities to debug, test, and monitor LLM applications effectively.
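For LangChain-based applications, tracing into LangSmith is typically switched on through environment variables rather than code changes. The snippet below is a minimal sketch under that assumption; the API key and project name are placeholders.

```python
import os

# Tracing is controlled by environment variables, so the chain code itself
# does not change. The key and project name below are placeholders.
os.environ["LANGCHAIN_TRACING_V2"] = "true"            # send run traces to LangSmith
os.environ["LANGCHAIN_API_KEY"] = "<langsmith-api-key>"
os.environ["LANGCHAIN_PROJECT"] = "ticket-summarizer"  # groups runs in the LangSmith UI

# From this point on, every chain or model call made through LangChain is
# logged as a run tree (inputs, outputs, latency, token counts) that can be
# inspected, filtered, and turned into test datasets in LangSmith.
```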

Supported Workflows with LangSmith

During the prototyping phase, LangSmith provides developers with complete visibility into the sequence of LLM calls, allowing them to identify errors and performance bottlenecks in real time. They can collaborate with subject-matter experts to refine application behavior and utilize human feedback or AI-assisted evaluation to ensure relevance, correctness, and sensitivity.
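A rough sketch of what that feedback loop can look like with the langsmith Python client is shown below. The dataset name, example contents, and run identifier are illustrative placeholders; in practice the run ID would come from a traced call.

```python
from langsmith import Client

client = Client()  # reads LANGCHAIN_API_KEY from the environment

# Curate a small evaluation dataset from interesting prototype runs.
dataset = client.create_dataset(dataset_name="ticket-summaries-eval")
client.create_example(
    inputs={"ticket": "The dashboard won't load after the latest update."},
    outputs={"summary": "User reports the dashboard fails to load after an update."},
    dataset_id=dataset.id,
)

# Attach human feedback to a specific traced run so subject-matter experts
# can grade relevance and correctness over time.
client.create_feedback(
    run_id="<run-id-from-a-traced-call>",  # placeholder; taken from a real trace
    key="correctness",
    score=1,
    comment="Summary captures the reported issue.",
)
```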

Once a prototype is refined, LangSmith simplifies the deployment process through hosted LangServe while offering real-time insights into production metrics such as costs, latency, anomalies, and errors. This empowers enterprises to deliver LLM applications that optimize both performance and cost efficiency.
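As a rough illustration of the LangServe deployment model (shown here with the open-source package rather than the hosted service), a chain can be exposed as a REST API in a few lines. The snippet assumes the langserve, fastapi, and langchain-openai packages; the chain and route path are placeholders.

```python
from fastapi import FastAPI
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI
from langserve import add_routes

app = FastAPI(title="Ticket summarizer")  # placeholder service name

# Placeholder chain, reused from the earlier sketch.
chain = (
    ChatPromptTemplate.from_template(
        "Summarize the following support ticket in one sentence:\n\n{ticket}"
    )
    | ChatOpenAI(model="gpt-3.5-turbo")
)

# Exposes /summarize/invoke, /summarize/batch, and /summarize/stream endpoints.
add_routes(app, chain, path="/summarize")

if __name__ == "__main__":
    import uvicorn

    uvicorn.run(app, host="0.0.0.0", port=8000)
```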

Strong Early Adoption

In a recent blog post, Sonya Huang and Romie Boyd from Sequoia reported that LangSmith has garnered over 70,000 signups since its closed beta launch. More than 5,000 companies, including renowned names like Rakuten, Elastic, Moody’s, and Retool, are currently leveraging the technology.

“Elastic uses LangChain to power their Elastic AI Assistant for security and relies on LangSmith for visibility, enabling rapid production. Rakuten employs LangSmith for rigorous testing and benchmarking of their Rakuten AI for Business, built on LangChain. Moody’s benefits from LangSmith’s automated evaluation for quick iteration and innovation,” Huang and Boyd highlighted.

While LangSmith is already gaining traction, its adoption is expected to surge as it becomes publicly available in a fast-evolving AI landscape.

“The team is addressing a rich problem space, guided by a passionate user community eager for solutions,” the Sequoia investors remarked, emphasizing that this is only the beginning for LangChain.

Moving forward, LangChain plans to expand LangSmith's functionality with features like regression testing, online evaluators using production data, enhanced filtering, conversation support, and simplified application deployment with hosted LangServe. The company also aims to introduce enterprise-grade features for administration and security.

With this round led by Sequoia, LangChain’s total funding reaches $35 million, following an earlier $10 million round led by Benchmark. LangSmith also enters a competitive field: other solutions for evaluating and monitoring LLM applications include TruEra’s TruLens, W&B Prompts, and Arize’s Phoenix.
