Compared with other DRAM products, HBM, which stacks multiple DRAM dies, delivers the high bandwidth that high-performance computing (HPC) applications demand. Chipmakers Nvidia, AMD, and Intel are therefore actively introducing HBM-equipped products to capture rising AI demand.
This has prompted memory makers to accelerate HBM development: SK Hynix, Samsung, and Micron all plan to launch HBM3E in 2024, narrowing the generation gap between their products.
In addition to expanding facilities to ramp up capacity, the three firms have unveiled mass-production roadmaps for their next-generation products. Micron's deployment plans and product specifications are the most detailed, signaling its eagerness to accelerate its HBM rollout.
Chart 1: Bandwidth (GB/s) and specifications of each generation of HBM
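The per-generation bandwidths such a chart plots follow directly from the HBM interface design: each stack exposes a 1024-bit-wide interface, so peak bandwidth is the per-pin data rate times 1024 bits, divided by 8 bits per byte. The sketch below illustrates the arithmetic; the per-pin rates listed are commonly cited figures for each generation, not values taken from this report's chart.

```python
# Peak per-stack HBM bandwidth = interface width (bits) * per-pin rate (Gb/s) / 8.
INTERFACE_WIDTH_BITS = 1024  # standard HBM stack interface width

def stack_bandwidth_gbps(pin_rate_gbps: float) -> float:
    """Peak bandwidth of one HBM stack in GB/s, given the per-pin rate in Gb/s."""
    return INTERFACE_WIDTH_BITS * pin_rate_gbps / 8

# Commonly cited per-pin data rates (Gb/s); assumed here for illustration.
generations = {
    "HBM1": 1.0,
    "HBM2": 2.4,
    "HBM2E": 3.6,
    "HBM3": 6.4,
    "HBM3E": 9.2,
}

for gen, rate in generations.items():
    print(f"{gen}: {stack_bandwidth_gbps(rate):.1f} GB/s")
```

For example, HBM3's 6.4 Gb/s per pin yields 1024 × 6.4 / 8 = 819.2 GB/s per stack, which is why accelerators pairing several stacks reach multiple terabytes per second of memory bandwidth.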
Table 3: HBM roadmap of SK Hynix, Samsung and Micron, 2015-2024
Table 7: Specifications of AI chips by Korean IC designers to equip HBM from SK Hynix and Samsung
Table 8: Improvement in stacked layers and latest packaging technologies
Table 9: Status in player competition and future technologies