With silicon-based computing increasingly constrained by the slowdown of Moore's Law, many IT companies have shifted their development focus to AI computing, which requires new architectures to resolve its performance and power consumption issues. Another focus is quantum computing, which differs completely from existing silicon-based architectures but has strong capability in solving complex problems, according to DIGITIMES Research's latest study.
As cloud-related applications rely increasingly on AI computing and high-performance computing (HPC), and as demand for lower power consumption rises from smart vehicles, AR/VR and IoT devices, the development of new computing technologies is expected to take off in the next couple of years.
AI computing, thanks to advances in deep learning model training since 2012 and GPUs' strong computing power, already outperforms humans in segments related to computer vision and natural language processing (NLP). It has also helped related application markets grow, DIGITIMES Research's findings show.
The training of AI models is heading in two major directions at the moment. First, medium to ultra-large neural networks are pre-trained on supercomputers or systems with massive GPU clusters. Second, small, low-power systems such as smartphones and AR/VR devices conduct training via deep learning accelerators (DLAs) and microcontrollers (MCUs) to achieve the best performance per watt.
Different heterogeneous processors deliver different levels of performance and per-watt efficiency, but they all follow the same rule: the higher the performance, the lower the per-watt efficiency. Currently, processors with ultra-high per-watt efficiency (near or greater than 1 petaOPS/W) are still at the R&D stage, with neuromorphic computing's compute-in-memory (CIM) and analog AI processors being key development focuses.
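The per-watt metric above is simply throughput divided by power draw. The sketch below illustrates the calculation with made-up figures (the chips and numbers are hypothetical examples, not measurements of real processors):

```python
# Per-watt efficiency: operations per second divided by power draw.
# All figures below are hypothetical, chosen only to illustrate the
# performance-vs-efficiency tradeoff described in the text.

def ops_per_watt(ops: float, watts: float) -> float:
    """Return efficiency in OPS/W."""
    return ops / watts

# A hypothetical large GPU cluster: very high absolute performance,
# but modest efficiency once total power is counted.
gpu_cluster = ops_per_watt(ops=1e18, watts=1e6)  # 1 exaOPS at 1 MW -> 1e12 OPS/W

# A hypothetical edge DLA: far lower absolute performance,
# but higher efficiency per watt.
edge_dla = ops_per_watt(ops=4e12, watts=2.0)     # 4 teraOPS at 2 W -> 2e12 OPS/W

# The 1 petaOPS/W target mentioned in the text would mean delivering
# a full petaOPS within a single watt.
cim_target = ops_per_watt(ops=1e15, watts=1.0)   # 1e15 OPS/W

print(gpu_cluster, edge_dla, cim_target)
```

Note how the edge device beats the cluster on OPS/W despite being orders of magnitude slower in absolute terms, which is the tradeoff the research describes.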
Quantum computing, through the design of quantum circuits and quantum gates, exploits the physical properties of qubits to solve simulation, search or algebra problems that contemporary computers cannot. A qubit can be realized with electrons, atoms or photons. Superconducting technology, developed mainly by IBM and Google, is currently the most mainstream approach to building a quantum computer, but other technologies such as photonics and trapped ions still have an opportunity to catch up.
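The circuit-and-gate model mentioned above can be sketched in a few lines: a qubit's state is a pair of complex amplitudes, and a gate is a 2x2 unitary matrix applied to that pair. This is a minimal textbook state-vector simulation, not any vendor's actual hardware or SDK:

```python
# Minimal sketch of the gate model (textbook state-vector simulation;
# not tied to IBM, Google, or any real quantum SDK).
import math

def apply_gate(gate, state):
    """Apply a 2x2 gate matrix to a 2-element qubit state vector."""
    (g00, g01), (g10, g11) = gate
    a, b = state
    return (g00 * a + g01 * b, g10 * a + g11 * b)

# Hadamard gate: sends the |0> basis state into an equal superposition.
H = ((1 / math.sqrt(2), 1 / math.sqrt(2)),
     (1 / math.sqrt(2), -1 / math.sqrt(2)))

zero = (1 + 0j, 0 + 0j)            # the |0> basis state
superposition = apply_gate(H, zero)

# Measurement probabilities are |amplitude|^2 for each basis state;
# after H, measuring yields 0 or 1 with equal probability.
p0 = abs(superposition[0]) ** 2
p1 = abs(superposition[1]) ** 2
print(p0, p1)
```

A quantum circuit is just a sequence of such gate applications across one or more qubits; the superposition shown here is the property that lets quantum algorithms explore many basis states at once.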
A quantum computer's performance is commonly judged by the number of its qubits. IBM's newest quantum processor, Osprey, announced in November 2022, already features 433 qubits. The US company is set to launch a quantum computer with over 1,000 qubits and commercialize the product in 2023.
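Qubit count matters because the state of an n-qubit register spans 2^n complex amplitudes, so representational capacity grows exponentially with each added qubit. A quick illustration (the calculation is standard; the framing is ours, not from the study):

```python
# Why qubit count is used as a headline figure: an n-qubit register's
# state vector has 2**n complex amplitudes, growing exponentially.

def state_space(n_qubits: int) -> int:
    """Number of basis-state amplitudes for an n-qubit register."""
    return 2 ** n_qubits

print(state_space(1))   # 2
print(state_space(10))  # 1024

# For IBM's 433-qubit Osprey, 2**433 is astronomically large;
# printing the number of decimal digits gives a sense of the scale.
print(len(str(state_space(433))))  # 131 digits
```

Raw qubit count is not the whole story (gate fidelity and error rates also matter), but it is the figure most often cited, as in the Osprey announcement above.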