Dell CTO's Insights: 2024 Predictions on AI's Practical Applications, the Rise of Zero Trust, and More

By any measure, 2023 was the year of AI.

And as John Roese, global CTO of Dell, noted in a year-end forecast, “Next year, just like this year, will continue to focus on AI.”

Are you prepared for AI agents? While this year’s AI narrative has been largely experimental and aspirational, the technology is evolving roughly seven times faster than traditional technology. Enterprises will quickly move from theoretical concepts to practical implementations, with the rest of the technology stack increasingly organized around AI’s rapidly growing adoption.

“Next year marks year two of the AI era,” Roese stated. “The first wave of operational AI systems will begin to emerge in enterprises.”

Identifying the Core Areas for AI Application

In 2024, enterprises must adopt a top-down strategy as they implement AI in production.

“You need to identify your core competencies,” Roese advised. “Focus on what defines your organization; that's where to apply the significant efforts of AI.”

For example, Dell has around 380 AI projects in the pipeline, but even a large company can realistically pursue only a handful at a time. Rushing to deliver whichever ideas arrive first risks crowding out more transformative projects later.

“You have to prioritize,” Roese emphasized. “Determine which ideas are most critical to your business.”

Transitioning to Inference and Operational Costs

As enterprises transition to inference in 2024, designing and positioning infrastructure will be vital.

“Organizations must consider their topology carefully,” he explained. “As technology is distributed, AI will likely follow suit.”

Security remains paramount as malicious actors target inference directly. Enterprises need to ask: “What’s the security structure surrounding our AI implementation?”

Moreover, the economic focus of AI will shift from training costs to operational expenses. Fine-tuning a model can be expensive, but it represents only a fraction of the overall investment: training is largely a one-time cost driven by model size and the data used, whereas inference costs accrue continuously with usage, the types of data processed, the size of the user base, and ongoing maintenance.
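
As a rough, purely hypothetical illustration of that shift, the sketch below compares a one-time fine-tuning outlay against inference costs that grow with usage; every number in it is an assumption for the example, not a figure from Dell.

```python
# Hypothetical back-of-the-envelope comparison of a one-time fine-tuning cost
# versus ongoing inference cost. All numbers are illustrative assumptions.

FINE_TUNING_COST = 250_000      # one-time cost, USD (assumed)
COST_PER_1K_REQUESTS = 0.50     # serving cost per 1,000 inferences, USD (assumed)
REQUESTS_PER_DAY = 2_000_000    # daily inference volume (assumed)

def cumulative_inference_cost(days: int) -> float:
    """Inference spend grows linearly with usage, unlike the one-time training cost."""
    return days * (REQUESTS_PER_DAY / 1_000) * COST_PER_1K_REQUESTS

for days in (30, 180, 365):
    inference = cumulative_inference_cost(days)
    print(f"After {days:>3} days: training ${FINE_TUNING_COST:,.0f} "
          f"vs. cumulative inference ${inference:,.0f}")
```

Under these assumed figures, inference spending overtakes the one-time training cost within the first year, which is the dynamic Roese is pointing to.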

“The overarching theme is: AI is becoming much more tangible, and that has significant implications,” Roese remarked.

Advancements in Generative AI Supply Chains

Generative AI systems are vast, requiring “more tools, more technology, and a broader ecosystem” to operate effectively, Roese noted.

Despite earlier concerns about the availability of tooling, he anticipates an “abundance” of AI resources in 2024.

“Our ecosystem of AI tools and services is expanding, diversifying, and scaling,” he added.

Improved system-building tools and a more diverse set of AI frameworks, such as the new Linux Foundation UXL project, will expand the supply of both proprietary and open-source models. Developers will find it easier to build interfaces to a variety of accelerated computing and integrated frameworks, including PyTorch on the client side and ONNX on the infrastructure side.

“Next year, we'll have more diverse options at every layer,” Roese stated.
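
As one hedged illustration of that kind of cross-layer interoperability (a generic workflow, not a Dell-specific one), a model built in PyTorch can be exported to ONNX and served by any ONNX-compatible runtime; the tiny model and file name below are placeholders.

```python
# Illustrative sketch: export a PyTorch model to ONNX and run it with ONNX Runtime.
# The toy model and "model.onnx" file name are placeholders, not part of any Dell tooling.
import numpy as np
import torch
import onnxruntime as ort

model = torch.nn.Sequential(torch.nn.Linear(16, 8), torch.nn.ReLU(), torch.nn.Linear(8, 2))
model.eval()

dummy_input = torch.randn(1, 16)
torch.onnx.export(model, dummy_input, "model.onnx",
                  input_names=["input"], output_names=["output"])

# Any ONNX-compatible runtime (CPU, GPU, or an accelerator backend) can now serve it.
session = ort.InferenceSession("model.onnx")
outputs = session.run(None, {"input": np.random.randn(1, 16).astype(np.float32)})
print(outputs[0].shape)  # (1, 2)
```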

The Realization of Zero Trust Security

Despite incorporating advanced security measures, cybersecurity remains a significant challenge for enterprises.

“Zero trust architecture is essential,” Roese explained. “Every element must be authenticated and authorized in real-time.”
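
As a minimal sketch of that principle, the example below authenticates and authorizes every request with no implicit trust based on network location; verify_token and policy_allows are hypothetical stand-ins for an identity provider and a policy engine, and this is not a description of Project Fort Zero.

```python
# Minimal zero-trust request check: every request is authenticated and authorized,
# and the default answer is deny. The stubs below stand in for whatever identity
# provider and policy engine an enterprise actually uses.
from dataclasses import dataclass

@dataclass
class Request:
    token: str
    resource: str
    action: str

def verify_token(token: str) -> str | None:
    """Return the caller's identity if the token is valid, else None (stub)."""
    return "service-a" if token == "valid-demo-token" else None

def policy_allows(identity: str, resource: str, action: str) -> bool:
    """Consult an explicit allow-list; anything not listed is denied (stub policy)."""
    allowed = {("service-a", "orders-db", "read")}
    return (identity, resource, action) in allowed

def handle(request: Request) -> str:
    identity = verify_token(request.token)           # authenticate every request
    if identity is None:
        return "401 Unauthorized"
    if not policy_allows(identity, request.resource, request.action):
        return "403 Forbidden"                       # authorize every request
    return "200 OK"

print(handle(Request("valid-demo-token", "orders-db", "read")))   # 200 OK
print(handle(Request("valid-demo-token", "orders-db", "write")))  # 403 Forbidden
```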

While zero trust has mostly been viewed as a concept, its implementation can be complex, especially within existing infrastructures.

“It’s challenging to transition an established enterprise to a zero-trust model,” Roese acknowledged. “You’d need to reverse every security decision made previously.”

Fortunately, with AI’s continuous evolution, zero trust can be integrated into new systems from the outset. Roese highlighted Dell's upcoming zero trust tool, Project Fort Zero, which aims to undergo validation by the U.S. Department of Defense in 2024.

“We are currently losing the cyber battle,” Roese warned. “The solution lies in the adoption of zero trust.”

Emergence of the Common Edge

To maximize data value, enterprises should process data closer to its source.

“In the future, we will perform more data processing in real-world environments than in traditional data centers,” Roese stated.
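
A hedged sketch of that pattern, assuming a generic sensor workload rather than any specific Dell platform: raw readings are summarized where they are generated, and only the compact result is sent upstream.

```python
# Illustrative edge-processing pattern: summarize high-volume sensor data at the
# edge site and ship only the compact result upstream. The reading source and the
# upstream hand-off are placeholders, not any particular product.
import random
import statistics

def read_sensor_batch(n: int = 1_000) -> list[float]:
    """Stand-in for readings collected at the edge site."""
    return [20.0 + random.gauss(0, 0.5) for _ in range(n)]

def summarize(readings: list[float]) -> dict:
    """Reduce thousands of raw samples to a few statistics before transmission."""
    return {
        "count": len(readings),
        "mean": round(statistics.fmean(readings), 3),
        "max": round(max(readings), 3),
        "min": round(min(readings), 3),
    }

raw = read_sensor_batch()
summary = summarize(raw)
# Only the summary (a few dozen bytes) leaves the edge site, not the raw batch.
print(summary)
```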

This trend will usher in what Dell refers to as “modern edge” multi-cloud platforms.

As enterprises leverage various cloud services, the landscape of edge solutions is becoming increasingly complex. Roese pointed out that if each cloud and workload demands its own architecture, the result would be unmanageable.

To mitigate this, Dell recently introduced NativeEdge, a common edge platform designed to support software-defined workloads from any IT, cloud, or IoT system. Roese predicts that this approach will gain traction in 2024 as enterprises recognize the pitfalls of “mono-edges.”

“Now, most edge service providers prefer to deliver edge services as containerized code instead of focusing on hardware development,” he noted.

Looking Ahead: Quantum Computing and AI

Large-scale AI represents a “massive parallel problem,” according to Roese.

“Generative AI methods, including transformers and diffusion models, require extensive computational resources,” he explained.

While the full potential of quantum computing in AI may not manifest for several years, Roese believes it will play a pivotal role in addressing AI's intricate challenges.

“Quantum computing is exceptionally suited for highly-scaled optimization problems, making it ideal for generative AI applications,” he asserted.

Although quantum has been discussed for some time, Roese envisions a future where mature quantum systems are readily available.

“When that happens, the impact on AI will be profound, surpassing even the disruption created by ChatGPT,” he concluded.
