AI Training Costs Soar: IBM Suggests Quantum Computing as a Viable Solution

Earlier this month, the Wall Street Journal reported that the owners of roughly one-third of U.S. nuclear power plants are in talks with tech companies to power new data centers. In parallel, Goldman Sachs projected a 160% increase in data center electricity consumption by 2030, driven by AI, a surge that could more than double the sector's current carbon dioxide emissions. Notably, a single query to an AI model like ChatGPT is estimated to consume at least ten times the energy of a standard Google search. This raises a critical question: will the soaring costs of training AI models constrain AI's potential?

At VB Transform 2024, a panel led by Hyunjun Park, co-founder and CEO of CATALOG, took on these questions. Park was joined by Dr. Jamie Garcia, IBM’s director of quantum algorithms and partnerships; Paul Roberts, director of strategic accounts at AWS; and Kirk Bresniker, chief architect at Hewlett Packard Labs and an HPE Fellow.

Unsustainable Resources and Inequitable Technology

“The target year of 2030 gives us a window to make necessary adjustments, yet it's tangible enough that we need to consider the consequences of our current actions,” said Bresniker. He indicated that, on current trends, the cost of training a single AI model could exceed the U.S. GDP and surpass total global IT spending by 2030, a barrier stark enough to demand decisions now.

“Embedded in the discussion of sustainability is equity,” he continued. “If a practice is unsustainable, it’s also inequitable. Thus, as we strive for broader access to AI technology, we must examine what changes are necessary to make this technology universally accessible.”

The Role of Corporate Responsibility

Several corporations are stepping up to address the environmental cost of rising power consumption. AWS is deploying liquid cooling for Nvidia's latest AI chips and exploring more sustainable construction materials.

“We are evaluating enhancements in both steel and concrete to reduce our carbon footprint,” Roberts explained. “Additionally, we are considering alternative fuels, such as hydro-treated vegetable oils, to replace conventional diesel in our generators.”

AWS is also prioritizing efficient custom silicon. Its Trainium chips are purpose-built for training, and its Inferentia inference chips aim to deliver over 50% better performance per watt than comparable alternatives, further decreasing operational costs.

Its second-generation UltraCluster network enhances training capabilities, supporting up to 20,000 GPUs at a network throughput of roughly 10 petabits per second while cutting latency by 25%. The result is faster model training at significantly lower cost.

Can Quantum Computing Change the Future?

Garcia turned to the intersection of quantum computing and AI. Quantum machine learning could drive efficiency through three main categories of application: quantum models trained on classical data, quantum models trained on quantum data, and classical methods applied to quantum data.

“Various theoretical proofs indicate that quantum computers excel in addressing specific challenges, particularly with limited or interconnected data,” Garcia noted, highlighting promising applications in healthcare and life sciences.
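
To make the first of those categories concrete, here is a minimal, illustrative Python sketch of a quantum model over classical data: a classical feature is angle-encoded into a single simulated qubit, a trainable rotation is applied, and the expectation value of Z is read out as the model's score. The encoding choice and one-parameter circuit are assumptions for illustration, not a method described on the panel.

```python
import numpy as np

def quantum_model(x: float, theta: float) -> float:
    """Angle-encode feature x with RY(x), apply a trainable RY(theta), return <Z>."""
    state = np.array([1.0, 0.0])                  # start in |0>
    for angle in (x, theta):                      # RY(x), then RY(theta)
        c, s = np.cos(angle / 2), np.sin(angle / 2)
        state = np.array([[c, -s], [s, c]]) @ state
    return float(state[0] ** 2 - state[1] ** 2)   # <Z> = P(0) - P(1)

print(quantum_model(x=0.3, theta=1.1))            # a score in [-1, 1]
```

On real hardware, the expectation value would be estimated from repeated measurements rather than computed exactly.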

IBM is actively researching quantum machine learning, with applications already in development across life sciences, materials science, and other fields. The company is also building Watson Code Assist to help users work with quantum computing more effectively.

“We’re leveraging AI to help optimize quantum circuits and frame problems effectively for quantum solutions,” she added.
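
IBM did not detail those optimizers on stage, but a toy example of the kind of simplification they target is removing redundant gates. The sketch below, which assumes a made-up list-of-tuples circuit format, cancels adjacent pairs of self-inverse gates; fewer gates means less time on noisy hardware, so this directly affects both accuracy and cost.

```python
SELF_INVERSE = {"H", "X", "CX"}   # gates that are their own inverse

def cancel_adjacent(circuit: list) -> list:
    """Drop pairs of identical adjacent self-inverse gates."""
    out = []
    for gate in circuit:          # gate = (name, *qubit_indices)
        if out and out[-1] == gate and gate[0] in SELF_INVERSE:
            out.pop()             # G followed by G acts as the identity
        else:
            out.append(gate)
    return out

circuit = [("H", 0), ("H", 0), ("CX", 0, 1), ("X", 1), ("X", 1)]
print(cancel_adjacent(circuit))   # -> [('CX', 0, 1)]
```

A production transpiler goes much further, reordering commuting gates and rewriting sequences against hardware constraints, but the goal is the same: do less work on the device.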

The future lies in a seamless integration of classical computing and quantum technology. “We need CPUs, GPUs, and quantum processors working in harmony to maximize resource efficiency and unlock the potential of advanced algorithms,” she explained.
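
The canonical instance of that division of labor is a variational loop: a quantum processor evaluates a parameterized circuit while a classical processor updates the parameters. In the illustrative sketch below, the QPU call is simulated with its closed-form expectation value; on real hardware, `qpu_expectation` (a hypothetical stand-in) would dispatch the circuit to a quantum backend.

```python
import numpy as np

def qpu_expectation(theta: float) -> float:
    """Stand-in for a QPU call: <Z> after RY(theta) on |0> is cos(theta)."""
    return float(np.cos(theta))

def train(steps: int = 50, lr: float = 0.4) -> float:
    theta = 0.1                   # initial circuit parameter
    for _ in range(steps):
        # Parameter-shift rule: an exact gradient from two extra circuit runs.
        grad = 0.5 * (qpu_expectation(theta + np.pi / 2)
                      - qpu_expectation(theta - np.pi / 2))
        theta -= lr * grad        # classical (CPU-side) gradient-descent update
    return theta

theta = train()
print(theta, qpu_expectation(theta))  # converges toward theta = pi, <Z> = -1
```

Each iteration splits naturally across the stack: circuit evaluation on the quantum side, the optimization step on the classical side.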

However, significant infrastructure improvements are necessary for quantum computing to fulfill its promise. “We must tackle energy consumption and component engineering challenges to realize a unified quantum framework,” Garcia emphasized.

Choice and the Hard Ceiling

“Radical transparency is crucial for decision-makers to gain insight into the sustainability, energy, privacy, and security attributes of the technologies we utilize,” Bresniker stated. “This understanding allows us to gauge the actual return on our investments in these technologies.”

Roberts added that organizations are rapidly adopting large language models (LLMs) and generative AI, which makes it vital to match performance characteristics and silicon type to each specific application.

“From a sustainability perspective, organizations should evaluate their use cases and choose the right silicon to drive inference efficiently,” he advised.

Fostering choice in software and infrastructure is essential. “The ability to control these choices enables organizations to optimize deployments based on cost and efficiency,” he noted.

Bresniker concluded with a warning about the unsustainable growth of AI: “Simply increasing data and resources does not guarantee improvements for enterprises. As we move forward, we must demand transparency about data origins and energy consumption, exploring alternatives that prevent us from hitting a hard ceiling in resource usage.”
