Sam Altman Seeks $7 Trillion for AI Chips: The 'Mind-Boggling' Natural Resources Needed

The Wall Street Journal reported yesterday that OpenAI CEO Sam Altman aims to raise up to $7 trillion for an ambitious tech project designed to significantly expand global chip capacity. The initiative, backed by investors including the U.A.E., could dramatically increase the computing power available for AI models.

While some may view Altman's goal as overly ambitious or reminiscent of Elon Musk's grandiose aspirations, the environmental ramifications of such a large-scale project are striking, according to Sasha Luccioni, climate lead at Hugging Face. “If this project materializes, the demand for natural resources will be staggering,” she noted. “Even if renewable energy is utilized—which is not a certainty—the required amount of water and rare earth minerals will be astronomical.”

For comparison, a September 2023 Fortune report highlighted the environmental costs already tied to AI development: Microsoft saw a 34% surge in water consumption driven by AI tools; Meta's Llama 2 model reportedly used twice as much water as Llama 1; and a 2023 study found that training OpenAI's GPT-3 consumed 700,000 liters of water. Meanwhile, the scarcity of essential minerals such as gallium and germanium has intensified the global chip conflict with China.

Luccioni criticized Altman for favoring brute-force solutions over exploring more efficient methods of developing AI. "He's being hailed as a visionary, but his approach is concerning," she argued.

Amid these challenges, Altman's push to address GPU shortages and reshape the semiconductor landscape reflects a broader trend in Silicon Valley. Last summer, reports surfaced about the extreme demand for Nvidia's H100 GPUs for large language model training, which became a hot topic in the tech industry.

In a recent earnings call, Meta CEO Mark Zuckerberg emphasized that building "full general intelligence" requires "world-class compute infrastructure." He announced plans to have approximately 350,000 H100s by year-end, and around 600,000 H100 equivalents in total once other GPUs are factored in. The company intends to keep investing aggressively in cutting-edge computational resources and custom silicon for specialized workloads.

Luccioni has also raised concerns about Nvidia’s lack of transparency regarding the environmental impact of its products. “Nvidia hasn’t published any details about the carbon footprint related to their manufacturing processes,” she explained. Furthermore, e-waste poses a significant challenge, as consumers often discard old GPUs soon after acquiring new ones.

Although Nvidia’s 2023 Corporate Responsibility Report acknowledged emissions generated throughout the product lifecycle, Luccioni believes more transparency is needed. She noted that tech companies have disclosed less environmental information over time: Google’s PaLM 1 paper in 2022 provided enough detail to estimate energy use, whereas the subsequent PaLM 2 release lacked essential information on training duration or chip usage.

For all her concerns, Luccioni remains skeptical that Altman's ambitious project will ever materialize. “I see this as a lofty venture that likely won’t succeed,” she remarked. “However, it does place Altman alongside Elon Musk in terms of audacious projects that capture public attention and generate buzz.”
