Micron is joining Samsung and SK Hynix in the LPCAMM2 race for next-generation AI notebook memory. Following the high-bandwidth memory (HBM) surge, the AI notebook memory sector is shaping up as a three-way competition.
Dt.co.kr reported that Micron has raised the data transfer speed of its LPCAMM2 modules to 8.5Gbps, aligning performance with rival products from Samsung and SK Hynix. Lenovo and Dell are expected to equip their next-generation AI notebooks with Micron's LPCAMM2.
Samsung was the first among the three memory leaders to unveil LPCAMM in September 2023. Compared with SO-DIMM, the module delivers up to 50% higher performance and 70% greater power efficiency. Samsung's initial LPCAMM supported 7.5Gbps, later boosted to 8.5Gbps with the 2024 release of LPCAMM2.
SK Hynix showcased its LPCAMM2 at CES 2025 in January, delivering a similar performance of about 8.5Gbps. The company confirmed it started supplying high-performance LPCAMM2 modules for AI PCs to select clients in the first quarter of 2025.
With AI demand surging, LPCAMM2, a compact module format built on low-power LPDDR5X DRAM, is increasingly seen as the successor to SO-DIMM in notebooks.
Most notebooks still rely on SO-DIMM, valued for its low power use and easy replacement but constrained by a bulk that rules out slimmer designs. LPCAMM2 addresses this shortcoming, meeting AI notebooks' demand for high performance in thinner, lighter builds, and adoption is rising fast as a result.
In addition, SO-DIMM routes data through connectors, sockets, and the motherboard, a longer signal path that can degrade signal integrity. LPCAMM2 places its connector underneath the module, enabling a direct, socketless connection that reduces signal loss.
Market studies show that as AI server demand soars, new memory solutions like LPCAMM2 are gaining traction. With low power consumption and modular architecture, LPCAMM2 is poised for large-scale deployment as major customers diversify their supplier base.
Article edited by Jack Wu