Apple Unveils M4 Chip, Infusing AI-Powered PC Performance into iPads

Apple has officially launched the M4 chip in its latest iPad Pro lineup, claiming the new tablets outperform any AI PC on the market today. The announcement came during the company's "Let Loose" event on May 7. The M4, designed as a system on a chip (SoC), integrates a powerful CPU and GPU to deliver high performance while maximizing power efficiency in the new iPad Pro.

The M4 delivers up to 1.5 times the CPU performance of the M2 chip found in the previous iPad Pro models. The new SoC also features a neural engine capable of processing up to 38 trillion operations per second (TOPS), which Apple asserts is faster than the neural processing unit of any AI PC on the market today. Johny Srouji, Apple's senior vice president of hardware technologies, emphasized that these advancements position the iPad Pro as “the most powerful device of its kind.”

Srouji said the M4's power-efficient performance, complemented by a new display engine, enables the iPad Pro's thin design and new display. He noted that significant enhancements to the CPU, GPU, neural engine, and memory system make the M4 exceptionally well-suited for the latest applications leveraging artificial intelligence.
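In practice, apps reach the neural engine through Apple's Core ML framework rather than by programming it directly. The minimal Swift sketch below shows how a developer can ask Core ML to keep inference on the CPU and Neural Engine; the model file name is a hypothetical example, and Core ML treats the setting as a preference rather than a guarantee.

```swift
import CoreML

// Ask Core ML to schedule work on the CPU and Neural Engine rather than the GPU.
// .cpuAndNeuralEngine is a standard MLComputeUnits option (iOS 16 / macOS 13 and later).
let config = MLModelConfiguration()
config.computeUnits = .cpuAndNeuralEngine

do {
    // "SummarizerModel.mlmodelc" is a placeholder for a compiled Core ML model
    // bundled with the app; it is not an Apple-provided asset.
    guard let modelURL = Bundle.main.url(forResource: "SummarizerModel",
                                         withExtension: "mlmodelc") else {
        fatalError("Model not found in app bundle")
    }
    let model = try MLModel(contentsOf: modelURL, configuration: config)
    print("Loaded \(model.modelDescription.metadata[.description] ?? "model") on compute units \(config.computeUnits.rawValue)")
} catch {
    print("Failed to load model: \(error)")
}
```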

Alexander Harrowell, principal analyst for advanced computing at Omdia, said Apple's hardware gains should not come as a surprise. He pointed to the A17 Pro chip in the iPhone 15 Pro, which delivers 35 TOPS through its neural engine ASIC block, and noted that the Apple Silicon lineup has its roots in mobile technology, having originated as the A-series chips before the M-series branched off.

He added, “The key distinction between the A17 Pro and the M4 seems to be the additional cores in both the CPU and GPU. This allows for more power, price flexibility, greater area, and enhanced thermal capacity within a tablet form factor.” That extra headroom is what lets the M4's neural engine push the new iPad Pro to 38 TOPS, a notable figure for a tablet.

According to a recent report from Omdia, Apple has emerged as a leading force in the AI PC sector, particularly for creative workloads. Harrowell described the M4's raw performance uplift as a "nice to have" but singled out increased memory bandwidth as the more meaningful improvement. “As Transformer inference performance relies heavily on memory for model size and memory I/O for token processing, the reported memory bandwidth of 120GB per second is intriguing but slightly confusing, given that the earlier M3 chips had 150GB per second,” he explained.

He added that the fairer comparison is with the previous generation, the baseline M2 chip in the 2022 iPad Pro, which offered 100GB per second and 15.8 TOPS. Against that baseline, Harrowell noted, a 20% improvement in memory I/O and a more-than-doubling of accelerator TOPS would clearly enhance the user experience.
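To put those figures in context, here is a back-of-the-envelope Swift sketch. The bandwidth and TOPS numbers are the ones quoted above; the assumption of a purely memory-bandwidth-bound decode phase and the hypothetical 7-billion-parameter model quantized to 4 bits per weight are illustrative choices, not figures from Apple or Omdia.

```swift
import Foundation

// Figures quoted in the article: memory bandwidth (GB/s) and accelerator TOPS.
let m2Bandwidth = 100.0, m2Tops = 15.8   // baseline M2 iPad Pro (2022)
let m4Bandwidth = 120.0, m4Tops = 38.0   // new M4 iPad Pro

// Generational ratios implied by those numbers.
print(String(format: "Memory I/O gain: %.0f%%", (m4Bandwidth / m2Bandwidth - 1) * 100)) // 20%
print(String(format: "Accelerator TOPS gain: %.1fx", m4Tops / m2Tops))                  // ~2.4x

// Rough token-throughput estimate for a hypothetical 7B-parameter model at
// 4 bits per weight (~3.5 GB). If decoding is memory-bandwidth bound, each
// generated token requires streaming roughly the whole model once.
let modelSizeGB = 7.0 * 0.5                        // 7B params * 0.5 bytes each
let m2TokensPerSec = m2Bandwidth / modelSizeGB     // ≈ 29 tokens/s
let m4TokensPerSec = m4Bandwidth / modelSizeGB     // ≈ 34 tokens/s
print(String(format: "Estimated decode rate: M2 ≈ %.0f tok/s, M4 ≈ %.0f tok/s",
             m2TokensPerSec, m4TokensPerSec))
```

On this simplified view, generation speed for a large model tracks the memory bandwidth figure far more closely than the headline TOPS number, which is the point Harrowell was making.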

During a recent earnings call, CEO Tim Cook said Apple is set to make “significant investments” in generative AI in the upcoming quarter, with more details expected at the WWDC event in June. Lian Jye Su, Omdia’s chief analyst of applied intelligence, suggested that the M4 chips indicate Apple’s readiness to support large language models across its tablet lineup.

Su noted, “Apple has referenced diffusion and generative models, which are commonly used in foundational models explored by other AI chipset vendors.” While he speculated that Apple's large language model efforts might not compete head-on with those of Qualcomm and Intel, he acknowledged that Apple typically does not prioritize first-mover advantage, instead choosing the technologies best suited to its product ecosystem.

Despite recently unveiling its own OpenELM model, Apple is rumored to be exploring partnerships with other technology vendors to bring large language models to its devices. Su remarked, “It will be interesting to see which companies Apple collaborates with for on-device LLM deployment.” Reports point to possible deals with Google for the global market and Baidu for China, alongside further use of Apple's own OpenELM model, setting up a contest among would-be technology partners.
