Wells Fargo's AI Assistant, Powered by Google's Technology, Set to Achieve 100 Million Interactions Each Year

Wells Fargo's Chief Information Officer Chintan Mehta shared insights into the bank's use of generative AI applications, highlighting that their virtual assistant app, Fargo, has handled 20 million interactions since its launch in March.

“We believe it has the potential to manage close to 100 million interactions annually,” Mehta stated at an event in San Francisco, emphasizing that volume will grow as the bank expands Fargo's capabilities and the types of conversations it can handle.

The bank’s progress in AI is noteworthy, especially compared to many large companies still in the proof-of-concept phase. Despite expectations that major banks would proceed cautiously due to regulatory concerns, Wells Fargo is advancing rapidly. The bank has enrolled 4,000 employees in Stanford’s Human-Centered AI (HAI) program and has numerous generative AI projects already in production, focusing on increasing back-office efficiency.

Mehta spoke at the AI Impact Tour event, which aimed to help enterprise companies establish an AI governance blueprint, particularly concerning generative AI applications that leverage large language models (LLMs) for intelligent responses. As one of the top three banks in the U.S., with $1.7 trillion in assets, Wells Fargo is actively utilizing LLMs across its services.

Fargo, the smartphone-based virtual assistant, offers customers real-time answers to banking inquiries via voice or text. It currently averages 2.7 interactions per session and can perform tasks such as bill payments and transaction inquiries. Built on Google Dialogflow and utilizing Google's PaLM 2 LLM, Fargo is evolving to incorporate multiple LLMs for different functionalities—“you don’t need the same large model for everything,” Mehta asserted.
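Mehta's point about mixing models can be illustrated with a small routing layer. The sketch below is purely hypothetical and is not Wells Fargo's implementation: the model names and the classify_intent helper are made up, and it simply shows how lighter requests could go to a smaller model while open-ended conversations go to a larger one.

```python
# Hypothetical sketch: route each customer utterance to a different LLM
# depending on how demanding the request is. Model names and the
# classify_intent() heuristic are illustrative, not Wells Fargo's.

LIGHTWEIGHT_MODEL = "small-model-for-lookups"      # e.g. balance checks, bill due dates
GENERAL_MODEL = "large-model-for-conversation"     # open-ended questions

def classify_intent(utterance: str) -> str:
    """Very rough intent classifier used only for this example."""
    lookup_keywords = ("balance", "due date", "transaction", "bill")
    if any(word in utterance.lower() for word in lookup_keywords):
        return "lookup"
    return "conversation"

def pick_model(utterance: str) -> str:
    """Return the model that should handle this utterance."""
    if classify_intent(utterance) == "lookup":
        return LIGHTWEIGHT_MODEL
    return GENERAL_MODEL

if __name__ == "__main__":
    for text in ("What's my checking balance?", "Help me plan for a house down payment"):
        print(f"{text!r} -> {pick_model(text)}")
```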

Another application, LifeSync, aids customers in goal-setting and planning. Launched recently, it attracted one million monthly active users in its first month.

Wells Fargo has also embraced open-source LLMs, including Meta’s Llama 2 model, for internal applications. Although adoption of open-source models has been slow since the surge of interest in OpenAI’s ChatGPT in late 2022, these models allow for greater customization and control, which is beneficial for specific use cases, according to Mehta.
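Open-source models such as Llama 2 are typically run in-house through libraries like Hugging Face's transformers. The snippet below is a generic sketch of that pattern, not Wells Fargo's setup; it assumes access to the gated meta-llama/Llama-2-7b-chat-hf checkpoint and enough GPU memory to load it.

```python
# Generic sketch of running an open-source Llama 2 model locally with
# Hugging Face transformers. Checkpoint access requires accepting Meta's
# license on the Hugging Face Hub; this is not Wells Fargo's internal setup.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "meta-llama/Llama-2-7b-chat-hf"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

prompt = "Summarize the key points of this internal policy document: ..."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Greedy decoding keeps the example short; production use would tune
# sampling, batching, and safety filtering around the model.
output_ids = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```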

The bank developed an AI platform named Tachyon to support its AI initiatives. This platform is built on the principles that no single AI model will dominate, that the bank will use multiple cloud service providers, and that data transfer challenges exist between different databases. Tachyon is adaptable, enabling the incorporation of new, larger models while maintaining performance and resilience. Techniques like model sharding and tensor sharding enhance training efficiency and reduce computational demands.
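Tensor sharding can be illustrated with a toy example: a weight matrix is split column-wise across devices, each shard computes a partial result, and the pieces are concatenated to reproduce the full computation. The NumPy sketch below is a conceptual illustration of that idea only, not Tachyon code.

```python
# Toy illustration of column-wise tensor sharding (the idea behind tensor
# parallelism): each "device" holds only a slice of the weight matrix and
# computes a partial output. Conceptual only; not Tachyon code.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(1, 1024))           # one activation vector
W = rng.normal(size=(1024, 4096))        # full weight matrix

num_shards = 4
shards = np.split(W, num_shards, axis=1)  # each shard: (1024, 1024)

# Each shard would live on a different accelerator; here we just loop.
partial_outputs = [x @ shard for shard in shards]

# Concatenating the partial results reproduces the unsharded matmul.
y_sharded = np.concatenate(partial_outputs, axis=1)
assert np.allclose(y_sharded, x @ W)
print("sharded and unsharded outputs match:", y_sharded.shape)
```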

Looking ahead, Mehta mentioned that multimodal LLMs, which facilitate communication through images, video, and text, will be crucial. He provided a hypothetical scenario of a commerce app where users could upload images and, using a virtual assistant, seamlessly book services related to those images. While current multimodal models require significant text input for context, he noted that increasing the model's ability to understand intent with less text is a key area of interest.
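Multimodal chat interfaces generally accept a message whose content mixes image and text parts. The sketch below shows that generic shape for the commerce scenario Mehta described; the payload fields and the stand-in photo bytes are hypothetical, and no specific model endpoint is assumed.

```python
# Hypothetical sketch of the commerce scenario: a user uploads a photo plus
# a short prompt, and the app packages both for a multimodal model. The
# field names mirror the image-plus-text message shape common to
# multimodal chat APIs; every name here is illustrative.
import base64

def build_multimodal_request(image_bytes: bytes, user_text: str) -> dict:
    """Package an image and a short prompt as one multimodal user message."""
    return {
        "messages": [
            {
                "role": "user",
                "content": [
                    {"type": "image", "data": base64.b64encode(image_bytes).decode("ascii")},
                    {"type": "text", "text": user_text},
                ],
            }
        ]
    }

if __name__ == "__main__":
    fake_photo = b"\x89PNG...stand-in bytes for the uploaded photo..."
    request = build_multimodal_request(fake_photo, "Find a plumber who can fix this sink")
    print(request["messages"][0]["content"][1]["text"])
    # A real app would send `request` to a multimodal model and turn the
    # model's structured reply into a service booking.
```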

Mehta asserted that the fundamental value of banking—matching capital with customer needs—remains stable, with innovation focused on enhancing user experience. He described the potential for LLMs to become more "agentic," enabling users to complete tasks smoothly through multimodal inputs.

Addressing AI governance, Mehta emphasized the importance of clearly defining the purpose of each application. Though much of the governance challenge has been addressed, concerns about application security, including cybersecurity and fraud, persist.

Mehta expressed concern regarding the lag in banking regulations, which struggle to keep pace with generative AI advancements and decentralized finance. “There is a growing gap between our aspirations and current regulations,” he said, noting that regulatory changes could significantly affect Wells Fargo's operations and economic strategies.

To navigate this landscape, the bank is investing considerable resources in explainable AI, a research area focused on understanding the reasoning behind AI model conclusions.
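Explainable-AI tooling typically attributes a model's decision to its input features. The sketch below applies the open-source shap library to a toy credit-style classifier trained on synthetic data; it is a generic illustration of the technique, not Wells Fargo's tooling or data.

```python
# Generic illustration of explainable AI: attribute a toy credit-style
# model's predictions to its input features with the open-source `shap`
# library. Synthetic data; not Wells Fargo's models or tooling.
import numpy as np
import shap
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
# Synthetic features standing in for income, debt ratio, years of history.
X = rng.normal(size=(500, 3))
# Synthetic label loosely tied to the first two features.
y = (X[:, 0] - X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# TreeExplainer produces per-feature contributions for each prediction,
# showing which inputs pushed the model toward its conclusion.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X[:5])
print("per-feature attributions for 5 applicants:", np.shape(shap_values))
```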
