Memory bottlenecks threaten data-center GPU efficiency as AI inference scales, says Micron SVP

Amanda Liang
Micron's senior vice president, Jeremy Werner, told The Circuit Podcast that memory has become a strategic bottleneck for data-center inference, warning that insufficient memory can sharply cut GPU utilization while faster, larger memory can theoretically...
