Samsung breaks through memory bottleneck; announces research results on HBM-PIM and LPDDR-PIM

Jessica Tsai, Taipei; Jack Wu, DIGITIMES Asia

At the 2023 Hot Chips forum, alongside Intel's announcement of its data center chip products, Samsung Electronics presented its research results on high bandwidth memory (HBM) processing-in-memory (PIM) and low-power DDR (LPDDR) PIM as part of its push into the AI sector, according to a report from Korea's TheElec.

Previously, Samsung and AMD began a collaboration on PIM technology, with Samsung equipping HBM-PIM memory onto AMD's commercial GPU accelerator card, the MI100. According to Samsung's research results, applying HBM-PIM to generative AI more than doubles the accelerator's performance and energy efficiency compared to existing HBM.

To solve the memory bottleneck that has emerged in the AI semiconductor sector in recent years, next-gen memory technologies like HBM-PIM have received significant attention. HBM-PIM performs computation within the memory itself through PIM technology, simplifying data movement and thereby enhancing performance and power efficiency.
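The data-movement savings described above can be illustrated with a toy traffic model. Everything here is an illustrative assumption for exposition (element sizes, command size, function names), not Samsung's actual design: a conventional accelerator must pull operands across the memory bus and push results back, while a PIM device only receives a command and computes in place.

```python
# Toy model of memory-bus traffic for an elementwise vector operation,
# illustrating why processing-in-memory (PIM) reduces data movement.
# All names and sizes are illustrative assumptions, not Samsung's design.

def bytes_moved_conventional(n_elems: int, elem_size: int = 2) -> int:
    """Conventional path: read two operand vectors into the processor,
    then write the result vector back to memory (FP16 elements assumed)."""
    return (2 * n_elems + n_elems) * elem_size  # 2 reads + 1 write per element

def bytes_moved_pim(n_elems: int, cmd_size: int = 64) -> int:
    """PIM path: only a small command descriptor crosses the bus;
    the arithmetic happens inside the memory banks themselves."""
    return cmd_size  # independent of vector length

n = 1_000_000  # one million elements
conv = bytes_moved_conventional(n)
pim = bytes_moved_pim(n)
print(f"conventional: {conv:,} B, PIM: {pim:,} B, ratio: {conv // pim:,}x")
```

The point of the sketch is that conventional traffic scales with data size while PIM traffic does not, which is why workloads that repeatedly touch large, resident datasets benefit most.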

Furthermore, to verify the Mixture of Experts (MoE) model, Samsung used 96 HBM-PIM-equipped MI100 units to build an HBM-PIM cluster. On the MoE model, the HBM-PIM accelerator doubled the performance and tripled the power efficiency compared to HBM.

Industry sources explained that memory development has advanced more slowly than AI accelerator technology. Alleviating this memory bottleneck requires expanding the application of next-gen semiconductors like HBM-PIM. Additionally, in areas like large language models (LLMs), many datasets are frequently reused, so performing computation with HBM-PIM can further reduce data movement.

On the other hand, Samsung also introduced "LPDDR-PIM," which combines mobile DRAM with PIM to enable processing and computing directly within edge devices. Notably, because LPDDR-PIM is designed for edge devices, it offers a lower bandwidth (102.4GB/s) and consumes 72% less power than conventional DRAM.

Previously, Samsung revealed its AI memory plans during its 2Q23 earnings call. It not only mentioned that its HBM3 supply was undergoing customer verification but also stated that it's actively developing new edge AI memory products and PIM technology. Looking ahead, both HBM-PIM and LPDDR-PIM remain some time away from commercialization, and PIM is still considerably more expensive than existing HBM.

The Hot Chips forum is a prominent academic event in the semiconductor industry, typically held in late August. Apart from Samsung, other major companies including SK Hynix, Intel, AMD, and Nvidia also participated in this year's event.