Dell reported earnings after the market closed on Thursday, surpassing both earnings and revenue estimates. However, the results indicate that AI adoption among enterprise and tier-2 cloud service providers is progressing more slowly than anticipated.
Following the earnings announcement, Dell's stock dropped 17.78% in after-hours trading, adding to a 5.18% loss during the regular trading session. Nevertheless, the stock is still up 86.79% year-to-date.
“Data is the differentiator—83% of all data is on-premises, and 50% of it is generated at the edge,” remarked Jeff Clarke, Dell’s COO, during the earnings call. He emphasized that “AI is moving closer to the data for efficiency, effectiveness, and security,” adding that on-prem AI inferencing can be 75% more cost-effective than cloud solutions.
Dell's AI strategy centers on the assumption that enterprises will prefer deploying infrastructure on-premises rather than in the cloud so they can put their local data to work. This echoes Dell's playbook during the Great Cloud Wars, when enterprises prized the agility of cloud services but wanted the control of owning their own infrastructure. Ultimately, many were drawn to the advantages offered by the hyperscale clouds anyway.
Analyst Toni Sacconaghi of Bernstein pressed Dell on its AI server narrative, asking whether $1.7 billion in AI server revenue alongside flat operating profit implied near-zero profit margins on those products. In response, CFO Yvonne McGill acknowledged that while AI-optimized servers may be margin-rate dilutive, they contribute positively to overall margin dollars.
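The "margin-rate dilutive but margin-dollar accretive" framing is easier to see with numbers. The sketch below uses entirely hypothetical figures (only the $1.7 billion AI server revenue comes from the call; the margin rates and core-business revenue are assumptions for illustration):

```python
# Illustrating "margin-rate dilutive, margin-dollar accretive."
# All figures except AI server revenue are hypothetical assumptions.

core_revenue = 10_000.0   # existing business, $M (assumed)
core_margin_rate = 0.10   # 10% operating margin (assumed)

ai_revenue = 1_700.0      # AI server revenue, $M (figure cited on the call)
ai_margin_rate = 0.02     # thin AI server margin (assumed)

core_profit = core_revenue * core_margin_rate
ai_profit = ai_revenue * ai_margin_rate

# Blended margin rate falls below the core rate (dilutive) ...
blended_rate = (core_profit + ai_profit) / (core_revenue + ai_revenue)
# ... yet total profit dollars still rise (accretive).
total_profit = core_profit + ai_profit

print(f"Core margin rate:    {core_margin_rate:.2%}")
print(f"Blended margin rate: {blended_rate:.2%}")
print(f"Profit added by AI servers: ${ai_profit:.0f}M")
```

Under these assumptions the blended rate drops from 10% to roughly 8.8%, even though AI servers add profit in absolute terms, which is the distinction McGill was drawing.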
Dell’s longstanding strategy involves selling loss-leading products in hopes of upselling higher-margin equipment shortly thereafter. This approach simplifies purchasing and support for customers, as Dell bundles AI servers with more profitable networking and storage solutions.
Jeff Clarke further outlined the hurdles impeding enterprise AI adoption. Many customers are still determining how to implement AI in their operations, necessitating significant service and consultative selling of Dell’s solutions. Clarke noted six primary use cases that frequently arise in discussions: content creation, support assistance, natural language search, data design and creation, code generation, and document automation. Helping customers prepare their data for these applications is a critical focus of Dell's efforts.
Clarke's remarks underscore that enterprise AI projects remain in their infancy. The complexity of the data processing, training, and deployment pipelines poses significant challenges, akin to a fragile Rube Goldberg machine. Unlike the relatively smoother journey to cloud adoption, the current AI landscape is beset with issues, including a skills gap that echoes the difficulties faced during the Great Cloud Wars.
Today’s AI infrastructure demands deeper domain expertise, making it essential for vendors to bridge this skills gap. Although some argue that on-prem infrastructure costs can be lower at scale, most enterprises weigh those arguments against the operational expenses and complexities that come with managing such setups.
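The cost argument enterprises are weighing can be framed as a break-even utilization question: how heavily must a GPU server be used before owning it beats renting cloud capacity? The back-of-envelope sketch below uses hypothetical prices, not Dell or cloud-provider figures:

```python
# Back-of-envelope break-even for on-prem vs. cloud GPU cost.
# Every figure here is a hypothetical assumption, not vendor pricing.

onprem_capex = 250_000.0     # GPU server purchase price, $ (assumed)
amortization_years = 3       # typical hardware depreciation window
opex_per_year = 40_000.0     # power, cooling, staffing share, $ (assumed)
cloud_rate_per_hour = 30.0   # comparable cloud GPU instance, $/hr (assumed)

hours_per_year = 24 * 365
onprem_cost_per_year = onprem_capex / amortization_years + opex_per_year

# Usage level at which a year of cloud rental costs the same as owning.
breakeven_hours = onprem_cost_per_year / cloud_rate_per_hour
breakeven_utilization = breakeven_hours / hours_per_year

print(f"On-prem annual cost: ${onprem_cost_per_year:,.0f}")
print(f"Break-even cloud usage: {breakeven_hours:,.0f} hrs/yr "
      f"(~{breakeven_utilization:.0%} utilization)")
```

With these assumed numbers, on-prem wins only above roughly half-time utilization; below that, renting is cheaper. The operational overhead folded into `opex_per_year` is exactly the complexity the paragraph above says enterprises weigh against the at-scale savings argument.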
Moreover, supply constraints present an additional challenge. Companies are in competition for Nvidia GPUs, which hyperscale and tier-2 cloud providers are already purchasing in large volumes. While Dell has a strong track record in component sourcing, customers may experience longer lead times for GPU servers.
Dell appears to be playing a long game by betting on the essential need for on-prem AI infrastructure, particularly for latency-sensitive workloads. The company aims to help enterprises navigate the barriers to AI adoption, accepting short-term margin sacrifices on GPU servers to achieve this goal.
Ultimately, whether Dell's strategy will succeed remains uncertain, particularly as cloud providers are already delivering numerous enterprise AI solutions with minimal customer-facing infrastructure requirements. The coming quarters will reveal if Dell’s approach can effectively compete with the robust offerings from hyperscale cloud providers.