Red Hat Summit 2024: Empowering Enterprises with Generative AI
At the Red Hat Summit 2024 in Denver, Colorado, the open source leader unveiled significant initiatives to harness generative AI's potential for enterprises.
Key Announcements: RHEL AI and InstructLab
The spotlight was on Red Hat Enterprise Linux AI (RHEL AI), a foundational platform for developing and deploying open source language models, alongside InstructLab, a community-driven project for domain experts to enhance AI models with their expertise.
Red Hat's Unique Position in AI Integration
Red Hat CEO Matt Hicks highlighted how RHEL AI stands out from competitors through several unique aspects:
- Open Source and Hybrid Flexibility: Red Hat prioritizes an open source approach and hybrid infrastructure. Hicks stated, "AI is similar to applications—requiring training in some locations and deployment in others. We’re hardware-agnostic and aim to operate anywhere."
- Optimized Hardware Performance: With a strong track record of maximizing performance across various hardware platforms, Hicks emphasized, "We can optimize Nvidia, AMD, Intel, and Gaudi to achieve peak efficiency without producing GPUs."
- Intellectual Property Ownership: Red Hat’s open source model ensures customers maintain ownership of their intellectual property. "You keep your IP while benefiting from our services and subscriptions," Hicks added.
In a rapidly evolving AI landscape, Red Hat asserts that this combination of openness, flexibility, optimized performance, and customer IP ownership will be crucial differentiators for RHEL AI.
RHEL AI: A New Era for Enterprises
RHEL AI merges open source language models, including the Granite models developed by IBM Research, with capabilities from the InstructLab project, facilitating model customization and enhancement.
It features an optimized RHEL operating system image with hardware acceleration and enterprise-grade support. Chris Wright, Chief Technology Officer at Red Hat, explained, "We’re enabling our customers to leverage existing infrastructure investments for enterprise AI, predictive analytics, and generative AI."
The aim is to provide reliability and confidence within a unified platform while improving hybrid cloud infrastructure and advancing cloud-native application development.
InstructLab: Enhancing Language Models
The InstructLab project allows domain experts, even those without data science skills, to enrich language models by contributing their knowledge. Leveraging IBM's LAB (Large-scale Alignment for chatBots) method, InstructLab generates high-quality synthetic training data from limited examples through a simple four-step process:
1. Experts submit their knowledge examples.
2. A "teacher" AI model generates similar training data.
3. The synthetic data undergoes quality checks.
4. The language model learns from approved data, fostering continuous improvement through community contributions.
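In code, that flow might look like the conceptual Python sketch below. It is not Red Hat's or IBM's LAB implementation; the `Example` type, `teacher_generate`, and `passes_quality_check` are hypothetical stand-ins used only to show how expert seeds become an approved synthetic training set (steps 1 through 3) ready for the fine-tuning in step 4.

```python
# Conceptual sketch of the four-step flow above -- NOT Red Hat's or IBM's LAB
# implementation. Example, teacher_generate, and passes_quality_check are
# hypothetical stand-ins that only illustrate how expert seeds become an
# approved synthetic training set.
from dataclasses import dataclass


@dataclass
class Example:
    """One question/answer pair contributed by a domain expert (step 1)."""
    question: str
    answer: str


def teacher_generate(seed: Example, n: int) -> list[Example]:
    """Stand-in for the 'teacher' model that expands one seed into n similar
    synthetic pairs (step 2). A real system would call an LLM here."""
    return [Example(f"{seed.question} (variant {i + 1})", seed.answer)
            for i in range(n)]


def passes_quality_check(candidate: Example) -> bool:
    """Stand-in for the quality gate that rejects empty or malformed
    synthetic examples (step 3)."""
    return bool(candidate.question.strip()) and bool(candidate.answer.strip())


def build_training_set(seeds: list[Example], per_seed: int = 5) -> list[Example]:
    """Expand every expert seed and keep only candidates that pass the check."""
    approved = []
    for seed in seeds:
        for candidate in teacher_generate(seed, per_seed):
            if passes_quality_check(candidate):
                approved.append(candidate)
    return approved


if __name__ == "__main__":
    seeds = [Example("What does the LAB method stand for?",
                     "Large-scale Alignment for chatBots.")]
    training_set = build_training_set(seeds)
    # Step 4 would fine-tune the language model on `training_set`.
    print(f"{len(training_set)} approved synthetic examples ready for fine-tuning.")
```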
This method provides a cost-effective way to enhance models; IBM has already used LAB to improve popular open source models such as Meta's Llama and several Mistral models.
Getting Started with InstructLab
Developers can start with InstructLab for free using the open source InstructLab CLI on their laptops, move to RHEL AI when they need larger models, and scale up further with Red Hat's OpenShift AI platform.
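As a rough illustration of that local starting point, the sketch below drives a typical laptop-scale InstructLab session from Python. It assumes the open source CLI is installed (for example via `pip install instructlab`); the subcommand names reflect early releases of the tool and may have changed since, so treat them as assumptions and confirm with `ilab --help` on your own installation.

```python
# Minimal sketch of a laptop-scale InstructLab session driven from Python.
# Assumes the open source CLI is installed (e.g. `pip install instructlab`).
# The subcommand names below reflect early releases and may differ in newer
# versions -- run `ilab --help` to confirm on your installation.
import subprocess


def run(step: list[str]) -> None:
    """Run one CLI step and stop immediately if it fails."""
    print("->", " ".join(step))
    subprocess.run(step, check=True)


if __name__ == "__main__":
    run(["ilab", "init"])      # create the local config and taxonomy checkout
    run(["ilab", "download"])  # fetch a quantized base model that fits a laptop
    run(["ilab", "generate"])  # synthesize training data from taxonomy seeds
    run(["ilab", "train"])     # fine-tune the local model on the generated data
    run(["ilab", "chat"])      # chat with the updated model to sanity-check it
```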
OpenShift AI 2.9 Updates
Red Hat also announced OpenShift AI 2.9, which introduces new features for deploying predictive and generative models and expands the partner ecosystem, reaffirming the company's commitment to flexibility in AI deployments.
Red Hat's AI offerings will roll out in phases: developers can start enhancing open source models immediately via InstructLab, RHEL AI is available as a developer preview, and the latest MLOps features are available now in OpenShift AI 2.9.
A Community-Centric Vision for AI
Through RHEL AI and InstructLab, Red Hat aims to replicate its success with Linux and Kubernetes, making powerful AI technologies accessible through open source. If successful, this initiative could accelerate generative AI adoption in enterprises by empowering domain experts to contribute their knowledge and deploy AI models with confidence.
Ashesh Badani, Red Hat's Chief Product Officer, remarked, "This illustrates our commitment to the power of open source and community," while Wright added, "We're thrilled to expand the definition of 'open' in this context."