Micron ships HBM4 to key customers, targets solid market share in 2025

Siu Han, Taipei; Charlene Chen, DIGITIMES Asia

As data centers face increasing demands from AI training and inference workloads, high-bandwidth memory (HBM) has become a critical competitive edge for memory manufacturers. Micron announced that samples of its 12-layer stacked 36GB HBM4 have been delivered to multiple major customers. Built on Micron's 1β DRAM process technology, the parts are expected to enter mass production in 2026 to support customer growth on next-generation AI platforms.
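
These headline figures imply a per-die density that is easy to sanity-check. A minimal back-of-envelope sketch in Python, assuming the stack capacity divides evenly across the twelve DRAM dies; the per-die figure is derived from the article's numbers, not a separately published spec:

```python
# Back-of-envelope: per-die capacity of a 12-high, 36GB HBM4 stack.
# Assumes capacity splits evenly across the DRAM dies (standard for HBM).
stack_capacity_gb = 36   # GB per stack, per Micron's announcement
dram_dies = 12           # 12-high stack

per_die_gb = stack_capacity_gb / dram_dies
print(f"{per_die_gb:.0f} GB ({per_die_gb * 8:.0f} Gb) per DRAM die")
# -> 3 GB (24 Gb) per die
```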

Targeting market share parity

Sumit Sadana, Micron's executive vice president and chief business officer, told DIGITIMES that AI is fundamentally reshaping the memory industry, creating significant growth opportunities. Micron continues to ramp HBM capacity, with both HBM revenue contribution and market share rising notably each quarter.

Previously, Micron projected that its HBM market share would reach parity with its DRAM market share—around 20% to 25%—in the second half of 2025.

Sadana stated that Micron's HBM share is on track to reach parity with its overall DRAM share. The company plans to disclose further growth targets for 2026 at an appropriate time and will invest in customized HBM designs aligned with customer needs.

Expanding global manufacturing footprint

Sadana highlighted that Micron maintains a crucial DRAM center in Taiwan, covering manufacturing and packaging technologies, and has established a vital partnership with TSMC.

Micron is expanding DRAM capacity at wafer fabrication facilities in Taiwan and Japan in preparation for mass production. Additionally, a new fabrication facility in Idaho is planned to begin production in 2027.

Regarding HBM backend packaging, operations are split between Taiwan and Singapore. Given HBM's need for more cleanroom space, Micron will expand backend capacity in Singapore to meet future sustained growth requirements.

Power efficiency advantages

With data centers consuming massive amounts of power and straining global grids, Micron noted that its existing HBM3E consumes 30% less power than competitors' products, a major advantage. Furthermore, Micron is currently the only company worldwide mass-producing LPDRAM for data center applications.

HBM4 performance breakthrough

Amid growing adoption of generative AI applications, Micron stated that HBM4 memory will feature a 2048-bit interface with per-stack transfer rates exceeding 2.0TB/s, delivering more than a 60% performance improvement over the previous generation.
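
The two figures are internally consistent, as a quick calculation shows. A minimal sketch, assuming the quoted 2.0TB/s refers to raw per-stack bandwidth; the implied per-pin data rate below is a derived estimate, not a Micron-published number:

```python
# Back-of-envelope: per-pin data rate implied by a 2048-bit interface
# delivering 2.0 TB/s per stack. Derived estimate, not an official spec.
interface_width_bits = 2048   # HBM4 interface width per stack
stack_bandwidth_tbps = 2.0    # TB/s per stack (article figure)

bits_per_second = stack_bandwidth_tbps * 1e12 * 8  # TB/s -> bits/s
pin_rate_gbps = bits_per_second / interface_width_bits / 1e9
print(f"~{pin_rate_gbps:.1f} Gb/s per pin")  # ~7.8 Gb/s per pin
```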

This expanded interface will facilitate high-speed communication and high-throughput design, enhancing inference performance for large language models and chain-of-thought reasoning systems. In other words, HBM4 will enable AI accelerators to achieve faster response times and more efficient inference capabilities.

Additionally, HBM4 improves energy efficiency by more than 20%, providing maximum throughput at minimal power consumption, thereby maximizing data center efficiency.
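
To make the percentage concrete: memory energy efficiency is commonly quoted as energy per bit transferred. The sketch below uses a purely hypothetical HBM3E baseline to show how a 20% efficiency gain translates; the pJ/bit values are illustrative assumptions, not Micron figures:

```python
# Illustrative only: mapping a ">20% better energy efficiency" claim to
# energy per bit. The HBM3E baseline below is a placeholder assumption.
hbm3e_pj_per_bit = 4.0   # hypothetical baseline, pJ/bit (assumed)
efficiency_gain = 0.20   # the article's ">20%" improvement

# 20% more bits per joule means each bit costs 1/1.2 as much energy
hbm4_pj_per_bit = hbm3e_pj_per_bit / (1 + efficiency_gain)
print(f"HBM4: ~{hbm4_pj_per_bit:.2f} pJ/bit vs HBM3E: {hbm3e_pj_per_bit} pJ/bit")
```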

Customization and next-generation development

Raj Narasimhan, senior vice president and general manager of Micron's cloud memory business unit, said that HBM4 production schedules will closely align with customers' next-generation AI platform readiness to ensure seamless integration and timely scaling of output to meet market demand.

Sadana mentioned that, beyond delivering the latest HBM4 to mainstream customers, Micron sees demand for customized versions, and development of the next-generation HBM4E is underway. Collaborative efforts with specific customers to co-develop tailored HBM solutions will add further value to its memory offerings.

Article edited by Jerry Chen