Tidalflow Enables Seamless Integration of Any Software with ChatGPT and Other LLM Ecosystems

As businesses increasingly adapt their software to run across diverse desktop, mobile, and cloud environments, they must now also prepare their systems for the rapid evolution of artificial intelligence (AI). This shift has been driven most visibly by the rise of large language models (LLMs), which interpret and generate human-like text and underpin a growing class of AI applications.

Although companies can create an LLM instance of their software using existing API documentation, the challenge lies in ensuring that the wider LLM ecosystem can effectively utilize it, while also gaining visibility into its performance in real-world conditions.

Tidalflow aims to address this challenge with a comprehensive platform designed to help developers integrate their existing software with the LLM ecosystem seamlessly. Emerging from stealth mode, Tidalflow has secured $1.7 million in funding, co-led by Google’s Gradient Ventures and Dig Ventures, a VC firm founded by MuleSoft co-founder Ross Mason, with additional support from Antler.

Building Confidence in AI Integration

Imagine an online travel platform that wants customers to request airfares and book tickets in natural language through LLM-enabled chatbots such as ChatGPT and Google Bard. The company sets up an LLM instance for each platform, but faces uncertainty: perhaps 2% of ChatGPT's responses point to irrelevant destinations, and the error rate on Bard is even higher. Without insight into those error rates, the company cannot make an informed decision about rolling the integration out.

If a company's fail tolerance is less than 1%, it may postpone adopting generative AI until it better understands its LLM instance's performance. This is where Tidalflow plays a vital role, providing modules that not only assist in creating LLM instances but also enable testing, deployment, monitoring, security, and eventual monetization. Companies can refine their LLM instances in a controlled sandbox environment until they fall within their acceptable fail-tolerance levels.
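To make the fail-tolerance arithmetic concrete, here is a minimal sketch in Python of how a team might gate a rollout on the error rate observed in a sandbox. It is not Tidalflow's code: the call_llm_instance stub, the 2% simulated failure rate, and the 1% threshold are illustrative assumptions drawn from the scenario above.

```python
import random

FAIL_TOLERANCE = 0.01  # hypothetical threshold: at most 1% of responses may be wrong


def call_llm_instance(query: str) -> str:
    """Stand-in for a real call to an LLM instance of a booking API.
    Here it simply fails ~2% of the time to mimic the scenario above."""
    return "wrong destination" if random.random() < 0.02 else "correct booking"


def measure_error_rate(queries: list[str]) -> float:
    """Run every sandbox query through the LLM instance and count failures."""
    failures = sum(1 for q in queries if call_llm_instance(q) == "wrong destination")
    return failures / len(queries)


if __name__ == "__main__":
    sandbox_queries = [f"Book me a flight, test case {i}" for i in range(10_000)]
    error_rate = measure_error_rate(sandbox_queries)
    verdict = "ship it" if error_rate <= FAIL_TOLERANCE else "keep iterating in the sandbox"
    print(f"observed error rate: {error_rate:.2%} -> {verdict}")
```

With a roughly 2% simulated failure rate, the observed error rate sits above the 1% threshold, so the sketch reports "keep iterating in the sandbox"; only once the measured rate falls under the tolerance does the gate open.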

“The key issue is that when you deploy something like ChatGPT, you don’t have a clear understanding of user interactions,” said Tidalflow CEO Sebastian Jorna. “This lack of confidence in software reliability is a significant barrier to implementing tools in LLM ecosystems. Tidalflow’s testing and simulation module addresses this challenge.”

Tidalflow can be characterized as an application lifecycle management (ALM) platform that integrates with developers' OpenAPI specifications. The platform then produces a “battle-tested LLM instance” of their product, with monitoring and observability into how it performs in the real world.
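Tidalflow has not published how this conversion works, but the general “OpenAPI spec in, LLM tool out” idea can be sketched roughly as follows. The sample spec, the searchFlights operation, and the OpenAI-style function-calling schema are assumptions chosen purely for illustration, not Tidalflow's actual pipeline.

```python
import json

# A tiny, hypothetical OpenAPI fragment for a flight-search endpoint.
openapi_spec = {
    "paths": {
        "/flights/search": {
            "get": {
                "operationId": "searchFlights",
                "summary": "Search for available flights between two airports",
                "parameters": [
                    {"name": "origin", "in": "query", "required": True,
                     "schema": {"type": "string"}},
                    {"name": "destination", "in": "query", "required": True,
                     "schema": {"type": "string"}},
                ],
            }
        }
    }
}


def operations_to_tools(spec: dict) -> list[dict]:
    """Convert each OpenAPI operation into a function-calling tool definition
    an LLM can be handed, so natural-language requests map onto API calls."""
    tools = []
    for path, methods in spec["paths"].items():
        for method, op in methods.items():
            params = op.get("parameters", [])
            properties = {p["name"]: {"type": p["schema"]["type"]} for p in params}
            required = [p["name"] for p in params if p.get("required")]
            tools.append({
                "type": "function",
                "function": {
                    "name": op["operationId"],
                    "description": op.get("summary", f"{method.upper()} {path}"),
                    "parameters": {
                        "type": "object",
                        "properties": properties,
                        "required": required,
                    },
                },
            })
    return tools


print(json.dumps(operations_to_tools(openapi_spec), indent=2))
```

The point of the sketch is only that an existing API description already carries most of what an LLM needs (operation names, parameters, descriptions); the hard part, as the article argues, is everything that comes after: testing, monitoring, and building confidence in real-world behavior.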

Testing Beyond Conventional Methods

“With traditional software testing, you typically run a select number of scenarios,” noted Jorna. “In our stochastic environment, however, we must employ higher volumes of data to achieve statistical significance. Our testing and simulation module allows us to run simulations as if the product were live, revealing potential user interactions.”

Tidalflow's platform empowers businesses to explore countless edge cases that might challenge their generative AI capabilities. This becomes critical for larger enterprises, where any compromise in software reliability can lead to substantial risks.
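As a rough illustration of why volume matters in a stochastic setting (not a description of Tidalflow's methodology), the sketch below uses a standard normal-approximation confidence interval to show how the uncertainty around an observed 2% failure rate shrinks as the number of simulated interactions grows.

```python
import math


def failure_rate_ci(failures: int, trials: int, z: float = 1.96) -> tuple[float, float]:
    """95% confidence interval for the true failure rate, using the normal
    approximation to the binomial (reasonable when trials are large)."""
    p = failures / trials
    margin = z * math.sqrt(p * (1 - p) / trials)
    return max(0.0, p - margin), min(1.0, p + margin)


# The same 2% observed failure rate is far more informative at 50,000
# simulated interactions than at 500: the interval tightens around 2%.
for trials in (500, 5_000, 50_000):
    failures = int(trials * 0.02)
    low, high = failure_rate_ci(failures, trials)
    print(f"{trials:>6} runs: observed 2.0%, true rate likely in [{low:.2%}, {high:.2%}]")
```

At 500 runs the interval is wide enough that a sub-1% true failure rate cannot be ruled in or out; at tens of thousands of runs the estimate is tight enough to act on, which is the sense in which high-volume simulation buys statistical significance.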

“Larger clients simply cannot afford to launch products without the assurance that they function correctly,” Jorna emphasized.

Rapid Growth and Future Plans

Tidalflow was formed just three months ago by Jorna (CEO) and Coen Stevens (CTO), who met through Antler’s entrepreneur-in-residence program in Amsterdam. “Once our official program kicked off in the summer, Tidalflow secured funding faster than any other company in Antler Netherlands’ history,” Jorna revealed.

Currently, Tidalflow is a three-person team consisting of the co-founders and Chief Product Officer (CPO) Henry Wynaendts. With its recent $1.7 million in funding, the company is now hiring for various engineering roles in preparation for a full commercial launch.

This swift transition from inception to funding exemplifies the current generative AI boom. With ChatGPT offering an API and supporting third-party plugins, and Google following suit for Bard, in addition to Microsoft's integration of Copilot within Microsoft 365, businesses now have unprecedented opportunities to leverage generative AI to enhance their products and engage with a broader user base.

“Much like the iPhone revolutionized mobile software in 2007, we are at a pivotal moment where software must become compatible with LLMs,” Jorna concluded.

Tidalflow will remain in closed beta for the time being, with plans to launch publicly by the end of 2023.
