By introducing AI computing into cloud and edge computing servers, Chenbro facilitates smart transportation applications

Press release

With the high bandwidth, low latency, and expansive connectivity of 5G, the combination of edge computing and AI will maximize the digitization of smart cities and become one of the driving forces of future growth in the server market. Within the architecture of smart transportation, "cloud computing" is the brain, handling relatively complex processes, while "edge computing" acts as the nerve endings, performing real-time computation directly at the site of data generation and responding appropriately. Introducing AI computing into both the cloud and the edge is the next step in the development of smart cities and smart transportation, enabling AI applications to be deployed from the edge to the cloud.

AI integrated with edge computing upgrades smart transportation applications

In smart transportation, edge computing delivers real-time traffic information over the 5G network to edge computing servers for preliminary assessment. The data is then standardized for further analysis and handling. The edge computing servers can also run AI workloads, storing, filtering, and processing the transportation data. With these edge computing capabilities, feedback can be monitored in real time, realizing smart transportation management.
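As a rough illustration of the edge-to-cloud flow described above, the following is a minimal Python sketch (not Chenbro code; the sensor fields, congestion threshold, and forward_to_cloud placeholder are hypothetical) showing how an edge node might standardize raw traffic readings, take time-critical action locally, and forward only compact summaries to the cloud for deeper analysis.

```python
from dataclasses import dataclass
from statistics import mean
from typing import List

# Hypothetical raw reading from a roadside sensor (field names are illustrative).
@dataclass
class RawReading:
    sensor_id: str
    vehicles_per_minute: float
    avg_speed_kph: float

# Standardized summary that the edge node sends upstream.
@dataclass
class Summary:
    sensor_id: str
    vehicle_count: float
    avg_speed_kph: float
    congested: bool

CONGESTION_SPEED_KPH = 20.0  # illustrative threshold, not taken from the source

def standardize(readings: List[RawReading]) -> Summary:
    """Preliminary assessment at the edge: aggregate raw readings into a
    standardized summary so only compact data travels to the cloud."""
    speed = mean(r.avg_speed_kph for r in readings)
    count = sum(r.vehicles_per_minute for r in readings)
    return Summary(
        sensor_id=readings[0].sensor_id,
        vehicle_count=count,
        avg_speed_kph=speed,
        congested=speed < CONGESTION_SPEED_KPH,
    )

def handle_locally(summary: Summary) -> None:
    """Time-critical response handled directly at the edge node."""
    if summary.congested:
        print(f"[edge] {summary.sensor_id}: congestion detected, adjust signal timing")

def forward_to_cloud(summary: Summary) -> None:
    """Placeholder for sending the summary to a cloud server for deep-learning
    analysis; a real deployment would use an HTTP or MQTT client here."""
    print(f"[cloud] queued summary: {summary}")

if __name__ == "__main__":
    batch = [
        RawReading("junction-42", vehicles_per_minute=35, avg_speed_kph=18.0),
        RawReading("junction-42", vehicles_per_minute=40, avg_speed_kph=15.5),
    ]
    s = standardize(batch)
    handle_locally(s)    # immediate action at the edge
    forward_to_cloud(s)  # aggregated data goes to the cloud for further analysis
```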

Chenbro has introduced the RM352 3U GPGPU Short-Depth Edge Computing Server Chassis, which can be fitted with up to four GPU or FPGA accelerators to improve the performance of edge AI computing and machine learning. The chassis also provides FIO (front I/O) and RIO (rear I/O) connection options that are easy to maintain and manage and can satisfy different application needs. Because many edge computing devices must be built into limited space, Chenbro uses an ultra-short 450mm chassis that combines high-performance computing, ample storage, and networking kits in a shallow, compact form factor for flexible edge computing deployments.

Cloud servers with AI computing analyze smart transportation big data

When the recognition and judgment accuracy of AI edge computing equipment needs to be improved, data can be collected and sent back to the cloud server for deep learning. Cloud servers depend heavily on computing performance; servers equipped with GPUs/GPGPUs can satisfy the deep learning needs of AI in the big data era, especially for image recognition in vehicle flow identification and smart transportation data analysis.

Chenbro has launched the SR113 4U Rackable Tower Server Chassis, a GPGPU server chassis specialized for AI processing and deep learning that supports up to five GPGPU cards. An AI processing server built on it can use high-performance computing to satisfy large-scale processing needs and apply pretrained deep learning models to analyze data. Sustained high-performance computing and effective chassis heat dissipation are the keys to cloud server stability. To improve cooling, the chassis is designed with four pre-installed fans; two additional fans can be installed at the rear for CPU cooling, and two exterior fans can be installed for GPU cooling. Backed by strong hardware technical support, Chenbro provides AI processing server chassis with proven cooling and tested, integrated electromechanical components.

From edge to cloud, hardware performance improvements accelerate AI implementation

Edge computing will not replace cloud computing; rather, it complements it, and the trend of the two technologies working together is becoming increasingly apparent, especially with the introduction of AI. Smart, high-performance edge computing will be the final step in implementing AI applications, so Chenbro is actively investing in the development of edge computing servers and AI server chassis. Recently, Chenbro has introduced products in the RM238 and RM245 series of 2U Versatile Storage Server Chassis that can each be installed with a GPU card. These products satisfy customer application needs from the terminal to the cloud, helping customers implement AI in smart transportation and other aspects of smart cities.

As 5G, AI, and edge computing technologies revolutionize the industry, Chenbro will continue developing AI and edge computing server chassis, supporting multi-platform motherboards and GPU cards through its design concept of "compatibility". Chenbro provides compatible server chassis and helps system integrators choose motherboards suited to corporate users' needs. Besides supplying OTS standard products to white-box server providers, system integrators, and channel partners, Chenbro also designs customized server solutions for different customers and develops new products with its partners. Drawing on many years of R&D and manufacturing experience, Chenbro has developed diverse service models to satisfy customer needs and to seize the new business opportunities of the 5G and AIoT age.
