Edge AI Chip Market Projected to Reach $60B by 2028 Driven by Demand from Compact Models and Personal Computers

New research from Omdia projects that the edge AI processor market will reach $60.2 billion in revenue by 2028, a compound annual growth rate (CAGR) of 11%. Its latest forecast, titled “Processors at the Edge,” attributes the growth to rising demand for advanced hardware as industries and devices increasingly adopt artificial intelligence (AI).
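As a quick sanity check on the compounding arithmetic, the implied starting point can be backed out of the forecast. This sketch assumes a five-year 2023–2028 window, which the article does not state:

```python
# Back out the implied base-year revenue from the Omdia endpoint figures.
target_2028 = 60.2   # $B, from the forecast
cagr = 0.11          # 11% compound annual growth rate
years = 5            # assumed window: 2023 -> 2028 (not stated in the article)

implied_base = target_2028 / (1 + cagr) ** years
print(f"Implied base-year revenue: ${implied_base:.1f}B")  # ≈ $35.7B
```

Under those assumptions, the forecast implies a market of roughly $35–36 billion today.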

Among the key sectors driving market expansion, the personal computer (PC) industry stands out, buoyed by growing product availability from major manufacturers such as Intel, AMD, and Apple. Omdia finds that PC vendors increasingly market AI processors as a distinguishing feature, highlighting their growing significance in modern computing.

In addition to the PC market, the report underscores the swift adoption of AI processors in diverse applications, including automotive technology, drones, security cameras, and robotics. While graphics processing units (GPUs) currently dominate this landscape, there is potential disruption on the horizon from AI accelerator chips, such as application-specific standard products (ASSPs) from industry players like Qualcomm.

"AI ASSPs are anticipated to expand their market share from 19% to 28%, primarily at the expense of GPUs," explains Alexander Harrowell, principal analyst for advanced computing at Omdia. He notes that PCs are increasingly resembling smartphones and tablets by adopting the CPU-GPU-NPU architecture familiar from modern mobile devices. He adds, however, that in-CPU acceleration has been adopted more slowly than expected.

Omdia also forecasts an emerging “architecture split” between data centers and the edge. AI training has traditionally been confined to data centers, but demand for inference capabilities at the edge is growing, driven largely by developers' needs. "The developer experience is crucial, and there is a pressing requirement for software tools that bridge the gap between cloud training and edge inference," the report states.

A notable trend highlighted in the findings is the rise of smaller language models. Companies are increasingly introducing capable models that require far less memory and computational power than their larger counterparts. For instance, Microsoft recently showcased Orca-Math, which, despite having only seven billion parameters, outperforms GPT-3.5, Gemini Pro, and Llama 2 70B at solving elementary math problems.

The rise of small models is creating a surge in demand for edge processors capable of running them locally for inference and fine-tuning. Omdia emphasizes, "Applications that previously necessitated bulky models can now be efficiently serviced by smaller models capable of operating on standard PCs or smartphones."
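To put "operating on standard PCs" in perspective, a back-of-the-envelope sketch of the memory that a 7-billion-parameter model's weights alone occupy at common precisions. This is a rule of thumb only; it ignores activations, the KV cache, and runtime overhead:

```python
def weight_footprint_gb(n_params: float, bits_per_weight: int) -> float:
    """Approximate memory needed just for model weights, in gigabytes."""
    return n_params * bits_per_weight / 8 / 1e9

# A 7B-parameter model (the size class of Orca-Math) at common precisions:
for bits, label in [(16, "fp16"), (8, "int8"), (4, "int4")]:
    print(f"{label}: ~{weight_footprint_gb(7e9, bits):.1f} GB")
# fp16: ~14.0 GB, int8: ~7.0 GB, int4: ~3.5 GB
```

At 4-bit quantization the weights fit comfortably in the RAM of a typical laptop, which is why this model class maps naturally onto edge processors.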

Vendors are also actively exploring the burgeoning small-model segment, typically spanning one billion to ten billion parameters, aimed at spatial and multi-modal applications. As small language models move to the edge, they are expected to add flexibility through multi-modal AI and simple plain-language prompts.
