Despite headwinds such as the ongoing global tariff wars, persistent doubts over slow returns on generative AI investments, and the continued US expansion of technical and product restrictions on China's AI computing capabilities, major IT players remain undeterred and continue to invest heavily in AI datacenter development.
Meanwhile, the performance and applications of large language models (LLMs) are steadily advancing, driving sustained investment in and demand for AI servers. As a result, global AI server shipments are projected to reach 1.81 million units in 2025, with shipments of high-end models equipped with high bandwidth memory (HBM) surpassing one million units, up 40% from the previous year, according to DIGITIMES estimates.
In high-end AI server procurement, North America's major cloud service providers are expected to account for 70% of purchases in 2025, driven by rapid advances in commercial GPU cluster performance and the accelerated adoption of AI ASICs by some cloud users.
Chart 3: Global high-end and general AI server shipments, 2022-2025 (k units)
Chart 4: Global high-end AI server shipments by major CSPs and brands, 2023-2025 (k units)
Chart 5: Global high-end AI server shipment share by customer type, 2023-2025
Chart 6: Global high-end AI server shipments by end customers, 2023-2024 (k units)
Chart 7: Global high-end AI server shipment share by major configuration, 2023-2025
Chart 8: Global high-end AI server shipments by L6 manufacturers, 2023-2025 (k units)
Chart 9: Global high-end AI server shipment share by L6 manufacturers, 2023-2025
Chart 10: Global high-end AI server shipments by L10 manufacturers, 2023-2025 (k units)
Chart 11: Global high-end AI server shipment share by L10 manufacturers, 2023-2025