What's behind Samsung and SK Hynix's intense HBM competition?

Daniel Chiang, Taipei, DIGITIMES Asia

Credit: Samsung

Samsung Electronics and SK Hynix are locked in fierce competition in the high-bandwidth memory (HBM) market, with attention turning to Nvidia, the leader in artificial intelligence (AI) semiconductors. Nvidia has recently been interacting frequently with the two South Korean manufacturers, which is widely seen as an effort to spur price and technological competition.

Analysts point out that although Nvidia has been engaging with HBM suppliers, there has been no clear information about actual orders, fueling industry suspicion that Nvidia is deliberately stoking competition between the two HBM makers.

Nvidia CEO Jensen Huang previously mentioned that the company was testing Samsung's 12-layer HBM3E product, but no further details have emerged since. In April 2024, however, Huang unexpectedly met with SK Group chairman Chey Tae-won in Silicon Valley, a meeting interpreted as an attempt to push Samsung to accelerate production and lower prices.

Industry analysts note that the price of HBM3 has increased more than fivefold since 2023, and the upcoming generation of HBM3E products will likely be even more expensive. For Nvidia, this threatens to drive up development and production costs, which is one reason the company would want to stimulate competition between the two South Korean chipmakers.

According to reports from South Korea's Economic Daily, Samsung has recently mobilized some 400 experts, assigning about 100 of them to a task force dedicated to improving the yield of 12-layer HBM3E, with the goal of passing Nvidia's quality certification tests in May. The remaining 300-plus have been assigned to HBM4 development, with plans to complete HBM4 research by the end of 2024 and begin supplying Nvidia in 2025.

Previously, SK Hynix announced plans to provide 12-layer HBM3E samples in May 2024, with mass production scheduled for the third quarter of 2024. SK Hynix also revealed a roadmap to mass-produce 12-layer HBM4 in 2025 and supply 16-layer products in 2026.

Since Samsung announced the development of 12-layer HBM3E ahead of its competitors in February 2024, there have been continuous rumors of early production. The industry generally expects mass production in the second quarter of 2024, earlier than SK Hynix's announced third quarter (July to September). The deployment of 400 experts highlights Samsung's determination to claim the title of first to mass-produce 12-layer HBM3E.

Samsung has lagged behind SK Hynix in the HBM business because it disbanded its HBM business unit in 2019, leaving SK Hynix as the only stable producer of HBM3 when major tech firms began investing in generative AI development in 2023.

Currently, SK Hynix holds over 90% market share in the high-yield HBM3 market and has established a deep partnership with Nvidia, while Samsung mainly supplies lower-priced HBM2 and HBM2E. Therefore, Samsung attaches great importance to leading in the mass production of 12-layer HBM3E, aiming to seize the market and further increase profitability.

It is reported that Nvidia's new-generation AI accelerators, the B200 and GB200 and their upgrades, may feature eight or more 12-layer HBM3E stacks. Samsung believes that if it can compete with SK Hynix on equal terms, it has a strong chance of reversing its position, and Nvidia's intention to diversify its HBM suppliers also works in Samsung's favor.

Some analysts also point out that Samsung started mass production of 8-layer HBM3E about a month later than SK Hynix, and its production volume is currently lower. In 2024, when demand for 12-layer HBM3E is not yet pronounced, stable yield, rather than early mass production, is what matters most for ensuring profitability and technological competitiveness.