Digitimes Research: FPGAs getting a foothold in deep learning inference chip market

Osiris Hu, DIGITIMES Research, Taipei

While Nvidia currently leads the deep learning inference chip market with its general-purpose GPU (GPGPU) processors, other vendors are offering FPGA-based solutions to compete for market share.

For example, Alibaba Cloud has chosen Intel (Altera) Arria 10 FPGAs to power its F1 instances and Xilinx KU115 FPGAs for its F2 instances.

Microsoft's FPGA-based Project Catapult servers are likewise designed to improve its Bing search engine and Azure cloud computing services, further highlighting the increasing influence of FPGAs in the deep learning inference processor market.

Because an FPGA can be reconfigured and reprogrammed at any time, FPGA-based products can be developed rapidly, shortening time to market; and because the designs are written in a hardware description language (HDL), the same chip can be re-synthesized to accelerate deep learning inference and keep pace with rapidly evolving algorithms.
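
To illustrate the kind of workload such accelerators implement, below is a minimal sketch in C of an 8-bit quantized dot product, the multiply-accumulate (MAC) primitive at the core of neural-network inference. This is a hypothetical host-side reference model, not any vendor's code; in an actual FPGA design the loop would be expressed in an HDL (or through a high-level synthesis flow) and unrolled so that the multiplies map onto parallel DSP blocks.

    #include <stdint.h>
    #include <stdio.h>

    /* Hypothetical reference model of an 8-bit quantized dot product,
     * the MAC primitive that FPGA inference accelerators replicate
     * across DSP blocks. On an FPGA this loop would be written in an
     * HDL (or via HLS) and fully unrolled so the multiplies run in
     * parallel each clock cycle. */
    static int32_t dot_product_int8(const int8_t *weights,
                                    const int8_t *activations,
                                    int n)
    {
        int32_t acc = 0;  /* wide accumulator avoids 8-bit overflow */
        for (int i = 0; i < n; i++)
            acc += (int32_t)weights[i] * (int32_t)activations[i];
        return acc;
    }

    int main(void)
    {
        const int8_t w[4] = { 12, -7, 33, 5 };  /* example weights     */
        const int8_t x[4] = {  4, 19, -2, 8 };  /* example activations */
        printf("acc = %d\n", (int)dot_product_int8(w, x, 4));  /* -111 */
        return 0;
    }

Because the datapath width, degree of parallelism, and numeric precision of such a kernel are fixed at configuration time rather than at chip-design time, an FPGA can be re-synthesized whenever an algorithm's quantization scheme or layer structure changes, which is the adaptability advantage described above.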

Meanwhile, Google is also developing custom ASIC chips for deep learning inference. Such ASIC-based solutions can be optimized for specific algorithms and tasks, and their streamlined hardware designs and smaller die sizes give them an advantage in terminal-end devices, Digitimes Research believes.