Controlling Shadow AI: Strategic Deployment of Generative AI for Enhanced Security and Performance

Presented by Dell Technologies

Understanding Shadow AI and Its Challenges for IT Departments

Shadow AI is becoming a significant challenge for IT departments tasked with deploying and managing generative AI services. Alarmingly, 45% of enterprises lack a formal policy governing the use of generative AI, leaving them exposed to unnecessary risk.

As large language models (LLMs) proliferate and become more accessible, staff across the business can easily tap LLMs and GPUs-as-a-service from public cloud providers or specialized API offerings. This accessibility makes it simple for knowledge workers to build their own digital assistants through user-friendly interfaces on laptops or mobile devices.

The Rise of Shadow AI

The current state of shadow AI mirrors the rise of SaaS applications and public cloud services, when IT leaders grappled with business units and developers procuring software without approval. IT departments often responded by either restricting shadow IT or reaching uneasy accommodations with employees over their preferred applications.

Simultaneously, cloud consumption spiraled out of control, leading to excess costs due to misconfigurations and overprovisioning. As IT began evaluating investments against business value, the focus shifted towards optimizing cloud spending.

Rebalancing IT workloads became essential as organizations realized many applications might perform better on-premises or in alternative cloud environments. With some cloud vendors reconsidering data egress fees, IT leaders are rethinking their strategies.

While the public cloud is an excellent environment for rapid application testing and scaling, it also increases susceptibility to unauthorized workloads.

Navigating the Governance of AI

The democratization of AI capabilities presents a governance conundrum for IT leaders. Despite the challenges, CEOs are eager to embrace generative AI services, making an outright ban infeasible.

Instead, IT leaders must find a balance between supporting employee initiatives in generative AI and implementing responsible governance that respects budget constraints.

Identifying Ideal AI Use Cases

To achieve this, IT leaders should partner with business executives to identify optimal generative AI use cases. This process will require both parties to make compromises, with IT narrowing down service options and standardizing tools.

Each use case should be assessed for cost-effectiveness and performance, whether deployed in local or hosted environments. Some applications may perform better in public cloud settings, but many will thrive on-premises, benefiting from enhanced oversight and security.

Deploying an LLM on-premises can also lead to cost savings. A recent study by Enterprise Strategy Group (ESG) revealed that conducting inferencing with an open-source LLM using retrieval-augmented generation (RAG) on-premises is often more economical than utilizing public cloud resources or API-based services.

In ESG's tests:

- Running the open-source Mistral 7B model on-premises was 38% to 48% more cost-effective than on Amazon Web Services (AWS) EC2, with savings increasing as the number of users grew.

- Running a 70-billion-parameter Meta Llama 2 model on-premises showed a 69% to 75% cost advantage over AWS EC2.

- For 50,000 enterprise users, on-premises Llama 2 (70B parameters) was 81% to 88% more cost-effective than OpenAI's GPT-4 Turbo.
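
As a rough illustration of what inferencing with an open-source LLM and RAG can look like on-premises, the sketch below pairs an open-source embedding model with a locally hosted Mistral 7B instance. The specific model names, document snippets, and prompt format are illustrative assumptions, not part of the ESG study; any local serving stack would follow the same pattern.

```python
# Minimal on-premises RAG sketch (assumes sentence-transformers, faiss,
# and a locally hosted Mistral 7B served via Hugging Face Transformers).
import faiss
import numpy as np
from sentence_transformers import SentenceTransformer
from transformers import pipeline

# 1. Embed the organization's documents (illustrative corpus).
documents = [
    "Expense reports must be filed within 30 days of travel.",
    "All generative AI tools require approval from the IT governance board.",
    "Customer data may not be uploaded to unapproved cloud services.",
]
embedder = SentenceTransformer("all-MiniLM-L6-v2")
doc_vectors = embedder.encode(documents, normalize_embeddings=True)

# 2. Build an in-memory vector index (inner product on normalized vectors = cosine).
index = faiss.IndexFlatIP(doc_vectors.shape[1])
index.add(np.asarray(doc_vectors, dtype="float32"))

# 3. Load a local open-source LLM; the weights never leave the data center.
generator = pipeline(
    "text-generation",
    model="mistralai/Mistral-7B-Instruct-v0.2",  # assumed local checkpoint
    device_map="auto",
)

def answer(question: str, top_k: int = 2) -> str:
    """Retrieve the most relevant snippets, then generate a grounded answer."""
    query_vec = embedder.encode([question], normalize_embeddings=True)
    _, hits = index.search(np.asarray(query_vec, dtype="float32"), top_k)
    context = "\n".join(documents[i] for i in hits[0])
    prompt = (
        f"Answer using only the context below.\n\nContext:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )
    result = generator(prompt, max_new_tokens=200, return_full_text=False)
    return result[0]["generated_text"].strip()

print(answer("Can I use an unapproved chatbot with customer data?"))
```

The same pattern scales up with a production vector database and a dedicated inference server; the key point is that both the documents and the model weights stay on infrastructure the IT team controls.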

While deploying generative AI services on-premises won't eliminate shadow AI, it can help mitigate its impact. Running models in-house makes it easier for IT teams to spot and address unexpected outputs, and it underscores the importance of grounding AI initiatives in the organization's own data.
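
To make that in-house oversight concrete, here is a minimal, hypothetical monitoring wrapper: it logs every prompt and response from a locally hosted model and flags outputs for human review. The flag patterns, log destination, and the `model_fn` callable are assumptions for illustration only.

```python
# Hypothetical sketch of in-house output monitoring around a local model.
import json
import logging
import re
from datetime import datetime, timezone

logging.basicConfig(filename="genai_audit.log", level=logging.INFO)

# Placeholder patterns an IT team might flag for human review.
REVIEW_PATTERNS = [
    re.compile(p, re.IGNORECASE) for p in (r"\bssn\b", r"password", r"confidential")
]

def monitored_generate(model_fn, user_id: str, prompt: str) -> str:
    """Call the local model, audit-log the exchange, and flag suspect outputs."""
    response = model_fn(prompt)  # e.g. the answer() function from the RAG sketch
    flagged = any(p.search(response) for p in REVIEW_PATTERNS)
    logging.info(json.dumps({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user_id,
        "prompt": prompt,
        "response": response,
        "needs_review": flagged,
    }))
    return response
```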

Collaborating for Success

Organizations will likely operate generative AI workloads across various environments, including public and private clouds and edge locations. Deciding where to deploy LLMs can be complex, and trusted partners like Dell Technologies can provide critical support on this journey, offering AI-optimized servers, modern client devices, and professional services.

Shadow AI poses significant challenges, but with the right strategy and partnerships, businesses can forge a responsible generative AI framework. The right partner can illuminate the path forward.

Learn more about Dell AI solutions.

Clint Boulton

Senior Advisor, Portfolio Marketing, APEX at Dell Technologies
