Seoul, January 6, 2026 – SK hynix Inc. (or "the company", www.skhynix.com) announced today that it will open a customer exhibition booth at the Venetian Expo and showcase its next-generation AI memory solutions at CES 2026, held in Las Vegas from January 6 to 9 (local time).
The company said, "Under the theme 'Innovative AI, Sustainable tomorrow,' we plan to showcase a wide range of next-generation memory solutions optimized for AI and will work closely with customers to create new value in the AI era."
SK hynix has previously operated both an SK Group joint exhibition and a customer exhibition booth at CES. This year, the company will focus on the customer exhibition booth to expand touchpoints with key customers and discuss potential collaboration.
The company will showcase its 48GB 16-layer HBM4, its next-generation HBM product, for the first time at the exhibition. The product is the successor to the 36GB 12-layer HBM4, which demonstrated the industry's fastest speed of 11.7Gbps, and is under development in line with customers' schedules.
The 36GB 12-layer HBM3E, which will drive the market this year, will also be presented. In particular, the company will jointly exhibit with a customer GPU modules for AI servers that adopt HBM3E, demonstrating the product's role within AI systems.
In addition to HBM, the company plans to showcase SOCAMM2, a low-power memory module specialized for AI servers, to demonstrate the competitiveness of its diverse product portfolio in response to the rapidly growing demand for AI servers.
Also, SK hynix will exhibit its lineup of conventional memory products optimized for AI, demonstrating its technological leadership across the market. The company will present its LPDDR6, optimized for on-device AI, offering significantly improved data processing speed and power efficiency compared to previous generations.
In NAND flash, the company will present its 321-layer 2Tb QLC product, optimized for ultra-high-capacity eSSDs, as demand surges with the rapid expansion of AI data centers. With industry-leading integration, the product significantly improves power efficiency and performance over previous-generation QLC products, making it particularly advantageous in AI data center environments where lower power consumption is essential.
The company will also set up an 'AI System Demo Zone', where visitors can experience how the memory solutions it is preparing for future AI systems interconnect to form an AI ecosystem.
In this zone, the company will present cHBM customized for specific AI chips or systems, the PIM-based AiMX, CuD which performs computation inside memory, CMM-Ax which integrates computing capabilities into CXL memory, and the data-aware CSD.
For cHBM (Custom HBM), reflecting strong interest from customers, a large-scale mock-up has been prepared so that visitors can visually examine its innovative structure. As competition in the AI market shifts from raw performance to inference efficiency and cost optimization, the mock-up visualizes a new design approach that integrates into HBM some of the computation and control functions previously handled by the GPU or ASIC.
"As innovation triggered by AI accelerates further, customers' technical requirements are evolving rapidly," said Justin Kim, President & Head of AI Infra at SK hynix. "We will meet customer needs with differentiated memory solutions, and through close cooperation with customers, we will create new value and contribute to the advancement of the AI ecosystem."

SK hynix's next-generation AI memory products on display at CES 2026. Credit: SK hynix