Elon Musk leads at least six pioneering ventures: Tesla, SpaceX and its Starlink unit, X (formerly Twitter), Neuralink, and xAI. Yet he has his eyes set on more.
Today on X, Musk announced that xAI, the company behind the Grok family of large language models (LLMs) and a corresponding chatbot available to paying subscribers, has begun training at the Memphis Supercluster in Tennessee, which he touted as "the most powerful AI training cluster in the world."
According to local news outlet WREG, the Memphis Supercluster, located in the city's southwest, is the largest capital investment by a new-to-market company in the area's history. However, xAI does not yet have a contract with the Tennessee Valley Authority, which is required for projects that draw more than 100 megawatts of power.
Equipped with Nvidia GPUs
Musk revealed that the supercluster boasts 100,000 liquid-cooled Nvidia H100 graphics processing units (GPUs), the chips sought after by AI developers, including Musk's rivals at OpenAI. Notably, the cluster runs on a single Remote Direct Memory Access (RDMA) fabric, which lets compute nodes exchange data efficiently without burdening their central processing units (CPUs).
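For readers curious what "bypassing the CPU" looks like in practice, below is a minimal sketch using pyverbs, the Python bindings that ship with the rdma-core project. It only opens an RDMA device and registers a buffer that the network card can read and write directly; a full node-to-node transfer would also require queue pairs and connection setup. The device name mlx5_0 and the buffer size are assumptions for illustration, and the snippet requires an RDMA-capable NIC.

```python
# Minimal RDMA sketch using pyverbs (rdma-core's Python bindings).
# Assumes an RDMA-capable NIC; the device name 'mlx5_0' is illustrative.
import pyverbs.device as d
import pyverbs.enums as e
from pyverbs.pd import PD
from pyverbs.mr import MR

with d.Context(name='mlx5_0') as ctx:   # open the RDMA device
    with PD(ctx) as pd:                 # protection domain owning the resources
        # Register a 4 KiB buffer the NIC can access directly. Once
        # registered, a peer's NIC can read or write this memory over
        # the fabric without involving either host's CPU on the data path.
        flags = e.IBV_ACCESS_LOCAL_WRITE | e.IBV_ACCESS_REMOTE_WRITE
        mr = MR(pd, 4096, flags)
        print(f'Registered memory region, rkey={mr.rkey}')  # key shared with peers
```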
xAI aims for cutting-edge AI by December 2024
xAI's objective is to train its own LLMs at the supercluster. In a follow-up reply, Musk said the company aims to build "the world's most powerful AI by every metric" by December 2024, with the Memphis Supercluster providing a "significant advantage."
Skepticism about the timeline
Despite his many achievements, Musk is known for missing self-imposed deadlines on projects such as fully autonomous vehicles and space missions. Skepticism therefore surrounds the December 2024 target for the new Grok LLM. If it does arrive on schedule, however, it could greatly strengthen xAI's competitive position.
With heavyweights such as OpenAI, Anthropic, Google, Microsoft, and Meta all racing to advance LLMs and small language models (SLMs), xAI must deliver an innovative and valuable model to keep pace in the fierce competition for users and market share.
Additionally, reports suggest that Microsoft is collaborating with OpenAI CEO Sam Altman on a $100 billion AI training supercomputer, codenamed Stargate. Given these developments, xAI's Memphis Supercluster may not hold the title of world's most powerful AI training cluster for long.