With high-tech industries focusing on artificial intelligence (AI), businesses across all sectors are eager to use AI to raise operational efficiency and uncover new business models. However, they must overcome high barriers to building, training and optimizing AI models on either cloud-based systems or high-end workstations. Not only is installing and configuring the hardware and software a cumbersome task, but such systems also consume a great amount of power and network bandwidth, squeezing profit margins and hindering widespread AI development.
To meet these challenges, QNAP and IEI, through their "speeding into the future" initiative, are committed to developing high-performance, easy-to-use AI deep learning training and inference solutions that bring the benefits of AI to more people. Central to this initiative is QNAP's pioneering QuAI solution. QNAP and IEI demonstrated QuAI at Computex 2018, showing how it helps diagnose macular degeneration as a real-world use case.
"The development of an AI application essentially includes building and training models, adjusting parameters and putting them to real use," said Ruei-De Chang, AI Product Manager of QNAP. "QuAI is designed to help data scientists, engineers and even students easily control these processes without having to spend time installing or dealing with drivers, libraries, containers or virtual machines. Nor do they need to worry about data backup, data sharing or network settings," Chang added.
QNAP NAS coupled with QuAI delivers a deep learning solution with low TCO
QNAP believes the success of deep learning hinges on three critical factors: data, computing power, and algorithms. As NAS has always been ideal for data storage, QNAP leverages its accumulated advantages in this field and adds QTS support for deep learning accelerator cards, equipping the NAS with additional computing power. QNAP NAS users can now easily tap into GPU-accelerated computing to speed up AI model training and inference.
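The three factors QNAP names map directly onto the basic training loop: data feeds the model, computing power runs the parameter updates, and the algorithm defines both. As a minimal, framework-agnostic sketch of that build-train-infer cycle (plain NumPy on CPU here; on a QuAI setup the same loop would run inside a container using a GPU-backed framework such as TensorFlow, and the dataset and model below are purely illustrative):

```python
import numpy as np

# Toy dataset: 2-D points labeled by which side of a line they fall on.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# Model parameters: a single linear layer with a sigmoid output.
w = np.zeros(2)
b = 0.0

def predict(X, w, b):
    """Forward pass: sigmoid(X @ w + b), i.e. P(label == 1)."""
    return 1.0 / (1.0 + np.exp(-(X @ w + b)))

# Training: gradient descent on the cross-entropy loss.
for _ in range(500):
    p = predict(X, w, b)
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * np.mean(p - y)

# Inference: threshold the predicted probability.
accuracy = np.mean((predict(X, w, b) > 0.5) == (y == 1))
print(f"training accuracy: {accuracy:.2f}")
```

The loop is trivially small, but the structure (forward pass, gradient update, thresholded inference) is the same workload that accelerator cards speed up at scale.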
QuAI is built on QNAP NAS hardware with the QTS operating system on top. It also includes Container Station, which supports Docker and other container technologies. With containers, users can freely select AI frameworks and libraries such as TensorFlow, Caffe, MXNet, Neon, the Microsoft Cognitive Toolkit, and Torch for easier development.
Getting started with QuAI is straightforward. First, install and activate QuAI from the QTS App Center. Next, install a deep learning accelerator card in the NAS along with the corresponding driver from the QTS App Center. Finally, build the needed containers in Container Station and start developing your AI application. QTS can allocate the accelerator card's resources throughout the execution process.
QNAP emphasizes that building, training and optimizing AI models are the first and fundamental steps of AI application development. Deep learning accelerator cards, whether installed in workstations, cloud-based systems or QNAP NAS, are the key component. The deciding considerations are convenience and ease of use. QNAP NAS has the advantage of being close to the data, eliminating the need to move data around. Furthermore, with comprehensive data management features including data compression, deduplication, automated storage tiering, SSD caching, all-round data protection technologies, and quick deployment, QNAP NAS is an incredibly cost-effective option.
QNAP is well aware that QuAI and cost-effective NAS are not quite enough to bring AI implementations to reality. It needs one last push: putting them to practical use in vertical applications. For this purpose, QNAP embraced the healthcare industry and developed a medical diagnosis system for age-related macular degeneration using QNAP NAS coupled with QuAI. The system can significantly shorten the time it takes to read OCT images, providing a successful use case that has impressed the market.
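At inference time, a diagnosis system like this reduces to running each scan through a trained model and thresholding the result. The article gives no details of QNAP's actual model, so the following is only an illustrative stand-in: a nearest-centroid classifier over tiny synthetic "scans" (a bright patch stands in for a lesion), showing the shape of the inference step a real trained model performs on OCT images.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for two classes of 8x8 grayscale scans:
# "normal" scans are uniformly dark; "degenerated" scans have a bright patch.
normal = rng.normal(0.2, 0.05, size=(50, 8, 8))
degenerated = rng.normal(0.2, 0.05, size=(50, 8, 8))
degenerated[:, 2:5, 2:5] += 0.6  # lesion-like bright patch

# "Training": compute a mean template (centroid) per class.
templates = {
    "normal": normal.reshape(50, -1).mean(axis=0),
    "degenerated": degenerated.reshape(50, -1).mean(axis=0),
}

def classify(image):
    """Inference: assign the scan to the class with the nearest template."""
    flat = image.reshape(-1)
    return min(templates, key=lambda c: np.linalg.norm(flat - templates[c]))

# A new scan with a bright patch should be flagged.
scan = rng.normal(0.2, 0.05, size=(8, 8))
scan[2:5, 2:5] += 0.6
print(classify(scan))  # expected: "degenerated"
```

A production system would replace the centroid comparison with a deep network's forward pass, but the batch-in, label-out pattern, and the value of keeping the scans on the same NAS that runs inference, is the same.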
QNAP provides a wide range of QuAI solutions targeting different applications, including data analysis, image classification/object detection, and image segmentation. "IEI also provides bare-metal AI solutions for users to combine with QuAI libraries to meet wide-ranging AI model training needs," added Bo-Hong Yu, Product Management Director of IEI. For inference systems, supported models include the TANK AIoT Dev Kit (with Intel OpenVINO), the RACK-500AI, and the PAC-400, which supports high-end NVIDIA graphics cards. For training systems, there is the GRAND-C422 with an Intel Xeon W processor. AI accelerator cards are also available for different architectures, including Intel Kaby Lake ULT CPUs, Intel FPGAs and Intel VPUs.
QNAP presents its latest AI achievements and applications at Computex 2018
DIGITIMES' editorial team was not involved in the creation or production of this content. Companies looking to contribute commercial news or press releases are welcome to contact us.