Samsung Unveils Ultra-Slim Chips for Accelerated On-Device AI

Samsung has begun mass production of its thinnest LPDDR5X DRAM packages to date, measuring roughly 0.65 millimeters thick and built on a 12-nanometer-class process, in 12GB and 16GB capacities. The chips are designed to boost mobile device performance by speeding up AI workloads processed directly on the device.

The slimmer packages free up space inside a handset, improving airflow and leaving room for a larger processor dedicated to AI tasks, while increasing heat resistance by 21%.

Samsung is integrating AI capabilities across its mobile lineup, including its Galaxy AI software, whose generative AI features such as Circle to Search let users launch web searches directly from objects in a photo. The new DRAM is intended to optimize memory handling for Galaxy AI workloads on upcoming Samsung phones, and it can also be used in smartwatches and IoT devices that need faster on-device memory processing.

Beyond delivering higher LPDDR performance with advanced thermal management in a compact package, Samsung plans to develop 24GB and 32GB modules for future devices. The company says it remains committed to innovation and close collaboration with customers as the low-power DRAM market evolves.
