Microsoft Copilot AI: Soon Available for Local Use on Your PC

Microsoft's Copilot AI service is transitioning to operate locally on PCs, according to Intel. The company revealed that next-generation AI PCs will need built-in neural processing units (NPUs) capable of delivering over 40 trillion operations per second (TOPS) — a requirement that exceeds the capabilities of current consumer processors.

Intel indicated that with local NPU processing, AI PCs would support "more elements of Copilot" directly on the device. Currently, Copilot relies heavily on cloud computing, which causes noticeable lag even for small tasks. Local processing would not only reduce this lag but also improve performance and privacy.

Previous rumors suggested that Microsoft intends to mandate 40 TOPS for future AI PCs, alongside 16GB of RAM. Currently, Windows makes minimal use of NPUs, primarily for Windows Studio Effects webcam features such as background blurring. In contrast, ChromeOS and macOS leverage NPU capabilities for a range of functions, including video and audio processing, optical character recognition (OCR), translation, and live transcription.

The Apple M3 processor currently leads with the highest NPU speed at 18 TOPS across its lineup (M3, M3 Pro, M3 Max). AMD's Ryzen 8040 and 7040 laptop chips deliver 16 and 10 TOPS, respectively, while Intel's Meteor Lake laptop chips also reach 10 TOPS. Qualcomm's Snapdragon X Elite is anticipated to be the first processor to meet the 40 TOPS requirement necessary for Copilot.
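To make the gap concrete, here is a minimal sketch comparing the TOPS figures cited above against the rumored 40 TOPS threshold. The numbers are those reported in this article (the Snapdragon X Elite's 45 TOPS is Qualcomm's claimed figure); actual requirements and benchmarks may differ.

```python
# NPU throughput (TOPS) as cited in the article; illustrative only.
NPU_TOPS = {
    "Apple M3": 18,
    "AMD Ryzen 8040": 16,
    "AMD Ryzen 7040": 10,
    "Intel Meteor Lake": 10,
    "Qualcomm Snapdragon X Elite": 45,  # Qualcomm's claimed figure
}

REQUIRED_TOPS = 40  # rumored Microsoft requirement for next-gen AI PCs

def meets_copilot_bar(tops: float, required: float = REQUIRED_TOPS) -> bool:
    """Return True if a chip's NPU throughput meets the rumored threshold."""
    return tops >= required

qualifying = [name for name, tops in NPU_TOPS.items() if meets_copilot_bar(tops)]
print(qualifying)  # only the Snapdragon X Elite clears the 40 TOPS bar
```

As the comparison shows, every shipping chip in the list falls well short of the bar, which is why the requirement "exceeds the capabilities of current consumer processors."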

Looking ahead, Intel's Lunar Lake chips, expected in 2024, will feature triple the NPU performance of Meteor Lake. Recently, Intel announced 300 new AI features tailored for its OpenVINO platform and introduced an AI PC development kit based on the ASUS NUC Pro, utilizing Meteor Lake technology.

Intel said its roadmap covers both standard and next-generation AI PCs across all segments of the market.
