Tuesday 20 May 2025
From cloud to edge computing, Silicon Motion enables high-performance and low-power storage for AI Era
With 2025 Computex Taipei focusing on the three major themes of "AI & Robotics", "Next-Gen Tech" and "Future Mobility", global technology giants have gathered to display their AI technology prowess around the core concept of "AI Next". The rapid deployment of AI applications has also created urgent demand for high-efficiency storage technologies across a wide range of application scenarios. As the global leader in NAND flash controllers, Silicon Motion plays a key role in the development of the AI ecosystem.

Meeting diverse storage requirements, from low latency and power efficiency to high data throughput, to support edge AI growth

"The emergence of DeepSeek has greatly lowered the threshold for AI applications," pointed out Mr. Kou, President and CEO of Silicon Motion. As an open-source technology, DeepSeek has reduced the cost of training language models, gradually overturning the industry's traditional views on AI and accelerating the popularization of edge applications. He emphasized that a wave of AI adoption has already begun for devices ranging from smartphones and laptops to wearables, and that storage technologies are crucial in supporting this revolution.

In his analysis of AI storage architecture, Mr. Kou noted that when implementing AI applications in various scenarios, the storage requirements differ at each stage of the pipeline, from initial data ingestion through data preparation, training, and inference. For instance, data ingestion imports large volumes of data and therefore demands high write throughput, whereas low latency and support for a wide variety of I/O sizes matter more during model training. Although these requirements vary, the overall architecture must still possess five core characteristics to meet the needs of AI applications: high data throughput, low latency, low power consumption, scalability, and high reliability.

In response to the massive data demands of AI applications, Silicon Motion leads innovation in storage by upgrading its NAND controller technology. Mr. Kou said that data application processes can be effectively optimized through hierarchical management and smart identification mechanisms. Flexible Data Placement (FDP) technology further improves efficiency and endurance while offering the advantages of low latency and low cost. For data security and reliability, the products adopt advanced encryption standards and a tamper-proof hardware design; combined with end-to-end data path protection and Silicon Motion's proprietary NANDXtend™ technology, this enhances data integrity and prolongs SSD lifespan. In addition, Silicon Motion supports 2Tb QLC NAND and 6/8-plane NAND, combining smart power management controllers (PMC) with advanced process technology to reduce energy consumption while improving storage density.

FDP can also be paired with Silicon Motion's unique PerformaShape technology, which uses a multi-stage architecture algorithm to optimize SSD performance based on user-defined QoS sets. Together, FDP and PerformaShape not only help users manage data effectively and reduce latency, but can also improve overall performance by approximately 20-30%.
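To make the placement idea concrete, below is a minimal, conceptual Python sketch of FDP-style data placement. It is not Silicon Motion's firmware or API; the tenant names and reclaim-unit size are illustrative only. It shows how tagging writes with a placement handle keeps each tenant's data in separate reclaim units, so overwriting one tenant's hot data never forces the controller to relocate another tenant's still-valid data.

# Conceptual sketch only: FDP-style write steering by placement handle.
from collections import defaultdict

RECLAIM_UNIT_SIZE = 4  # writes per reclaim unit (tiny, for illustration)

class FdpLikeDevice:
    def __init__(self):
        self.open_units = {}            # placement handle -> currently open unit
        self.units = defaultdict(list)  # unit id -> list of (handle, lba)
        self.next_unit = 0

    def write(self, placement_handle, lba):
        # Route each handle's writes into its own reclaim unit.
        unit = self.open_units.get(placement_handle)
        if unit is None or len(self.units[unit]) >= RECLAIM_UNIT_SIZE:
            unit = self.next_unit
            self.next_unit += 1
            self.open_units[placement_handle] = unit
        self.units[unit].append((placement_handle, lba))

dev = FdpLikeDevice()
for i in range(8):
    dev.write("tenant-A-log", i)      # hot, frequently overwritten data
    dev.write("tenant-B-model", i)    # cold model weights
# Each reclaim unit now holds data from a single handle, so invalidating
# tenant A's log pages never triggers copying of tenant B's data.
for unit, pages in dev.units.items():
    print(unit, {handle for handle, _ in pages})

Conceptually, a shaping layer such as PerformaShape would then arbitrate between such streams according to the user-defined QoS sets described above.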
These technologies are particularly well suited to AI data pipelines in multi-tenant environments, covering key stages such as data ingestion, data preparation, model training, and inference.

Creating comprehensive solutions to realize customer AI applications across cloud and edge computing

In response to data center and cloud storage needs, Silicon Motion has launched the world's first 128TB QLC PCIe Gen5 enterprise SSD reference design kit. Built on the MonTitan SSD development platform and equipped with the SM8366 controller, it supports the PCIe Gen5 x4, NVMe 2.0, and OCP 2.5 standards. With sequential read speeds of over 14 GB/s and random access performance of over 3.3 million IOPS, it delivers a performance improvement of more than 25%. The design accelerates the training of large language models (LLMs) and graph neural networks (GNNs) while reducing AI GPU energy consumption, allowing it to meet high-speed data processing demands.

For edge storage, Mr. Kou stated that the number of edge devices with AI capabilities will grow rapidly, forecasting: "The AI humanoid robot market will see explosive growth in the next 5 to 10 years." Systems at different levels have different storage requirements. At the sensor level, data needs to be processed and filtered in real time to ensure accurate sensing; the decision-making level relies on multi-modal fusion reasoning, which demands higher storage performance and data integration capabilities; and at the execution level, various calibration parameters must be stored so the robot can act and think more like a human. In response, Silicon Motion has actively deployed NVMe SSD, UFS, eMMC, and BGA SSD storage solutions, and values greater cross-industry collaboration to build a shared ecosystem that promotes the further evolution of smart terminal storage technologies.

Additionally, Silicon Motion has launched a variety of high-efficiency, low-power controllers to meet the AI application needs of edge devices. The SM2508 PCIe Gen5 controller is designed for AI laptops and gaming consoles, featuring up to 50% lower power consumption than comparable products. The SM2324 supports USB 4.0 high-speed portable storage devices of up to 16TB. The SM2756 UFS 4.1 controller offers 65% higher power efficiency than UFS 3.1, providing an excellent storage experience for AI smartphones. In response to the urgent need for high-speed, high-capacity storage in self-driving cars, Silicon Motion has also joined hands with global NAND manufacturers and module makers to jointly create storage solutions for smart automobiles.

"Storage technology undoubtedly acts as a core link in the AI ecosystem," emphasized Mr. Kou. Taiwan has a complete and highly integrated semiconductor and information and communications industry chain; it is capable not only of building AI servers, but also possesses great potential for driving the development of AI applications. He believes that more practical AI edge computing devices and groundbreaking applications will be launched at a rapid pace, and that storage solutions will face increasingly demanding requirements as they handle massive amounts of data. Silicon Motion will continue to use technological innovation as a driving force to actively support AI development.
Mr. Kou added that the fast-paced development of generative AI has lowered the barriers to adoption for related applications, and that Silicon Motion aims to satisfy the market's needs by offering a diverse range of high-efficiency, low-power storage solutions.

Photo: Silicon Motion Technology
Tuesday 20 May 2025
Montage Technology embraces CXL innovation to scale memory and bandwidth for enhanced data center performance
Montage Technology is a global fabless semiconductor design company specializing in data processing and interconnect chip solutions. Founded in 2004, the company is headquartered in Shanghai, with operations and partnerships spanning key international markets. It focuses on DRAM memory modules and data processing solutions for the cloud computing and data center markets, addressing the soaring demand for high-bandwidth, large-capacity memory driven by artificial intelligence (AI) and enterprise workloads. Its product portfolio includes memory interface chips, memory module controller ICs, PCIe Retimer chips, and more.

Driven by the increasing demand from generative AI, machine learning, big data analytics, and data center construction, one of Montage Technology's recent product development focuses is Compute Express Link® (CXL®) technology, which enables a high-speed interconnect between CPUs and DRAM memory and addresses memory challenges by enhancing system performance, scalability, and cost-efficiency. Built on the PCIe physical layer, CXL introduces capabilities such as DRAM memory expansion, sharing, pooling, and dynamic configuration, effectively eliminating traditional data processing bottlenecks in data-intensive systems and data center servers.

In this interview, Geof Findley, World Wide VP of Sales & Business Development at Montage Technology, discusses the company's recent advancements in memory technologies aimed at increasing data center performance. Montage delivers versatile memory solutions that unlock next-generation memory bandwidth and performance, specifically tailored for AI and data-intensive workloads. These solutions are already in use by major global DRAM manufacturers and electronics OEMs/ODMs, underscoring strong partnerships with companies such as SK hynix, Samsung Electronics, and Micron Technology.

Credit: Montage Technology

Specialty memory buffers and MRDIMM modules

With more than 20 years of experience in memory products, the company maintains stable profits and a proven track record. According to Findley, three key product lines are driving growth for Montage, the first being specialty DRAM buffers. Montage began developing its DRAM module buffer technologies early on. Given the buffer chip's critical role between the processor and DRAM memory, Montage collaborates with CPU giants such as Intel and AMD, leading silicon IP vendors, and the world's top three memory manufacturers: Micron, Samsung, and SK hynix.

Montage attributes its current strong growth to the rapidly increasing demand for DDR5 memory in data centers. Shipments of its DDR5 Registering Clock Driver (RCD) chips have grown substantially. Its 4th-gen DDR5 RCD chips deliver data rates of up to 7200 MT/s, a 50% increase over the 1st-gen products. Alongside its RCD portfolio, the company also provides DDR5 Data Buffers (DB) and other essential DDR5 module supporting chips such as SPD EEPROMs with Hub, temperature sensors, and power management ICs.

For high-throughput, low-latency data processing use cases, MRDIMMs (Multiplexed Rank Dual In-line Memory Modules) are particularly useful for handling larger datasets in workloads such as large-scale databases, virtualization, and real-time analytics. In January 2025, Montage successfully sampled its Gen2 Multiplexed Rank Registering Clock Driver (MRCD) and Multiplexed Rank Data Buffer (MDB) chipset to leading global memory manufacturers in South Korea, Japan, and North America.
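As a rough back-of-the-envelope illustration of what those RCD data rates mean, the short Python sketch below converts the transfer rates quoted above into peak per-channel bandwidth, assuming a standard 64-bit DDR5 data bus (8 bytes per transfer) and ignoring ECC and protocol overhead.

# Back-of-the-envelope sketch: peak per-channel bandwidth implied by the
# RCD data rates mentioned above. Assumes a 64-bit DDR5 data bus.
def peak_bandwidth_gb_s(mt_per_s, bus_bytes=8):
    """Peak bandwidth in GB/s for a transfer rate given in MT/s."""
    return mt_per_s * bus_bytes / 1000

for label, rate in [("DDR5 RCD Gen1 (4800 MT/s)", 4800),
                    ("DDR5 RCD Gen4 (7200 MT/s)", 7200)]:
    print(f"{label}: ~{peak_bandwidth_gb_s(rate):.1f} GB/s per channel")

# 7200 / 4800 = 1.5, matching the 50% data-rate increase over Gen1.
# MRDIMMs push effective throughput further by multiplexing two ranks
# behind the MRCD/MDB chipset, letting the host channel run faster than
# any single rank of DRAM.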
These IC solutions ensure interoperability and system-level performance for customers seeking to leverage the DDR5 MRDIMM Gen2 standard in high-throughput, data-intensive applications.

Montage's MRCD and MDB chips are fundamental to MRDIMM operation. Credit: Montage Technology

CXL adoption in data centers has moved from crawling to toddling

The second product focus for Montage is CXL memory. CXL memory expansion is particularly valuable in scenarios where high DRAM capacity is required with only one DIMM per channel, or when servers lack available DIMM slots. Montage delivers CXL Memory eXpander Controller (MXC) chips supporting the CXL 1.0, 2.0, and 3.1 specifications. These MXC chips comply with JEDEC specifications for both DDR4 and DDR5 memory. The mass-produced MXC Gen1 chips support CXL 2.0 with DDR4-3200/DDR5-5600, while the MXC Gen2 chips support CXL 2.0 and are compatible with DDR5-6400 memory.

The development of CXL controllers is closely tied to PCIe interface technology. The current CXL 1.0 and 2.0 specifications primarily align with PCIe 5.0, while future CXL 3.x specifications are expected to align with PCIe 6.x to support even higher-speed memory channels. As memory pooling becomes more ambitious and widely adopted, the industry anticipates a surge in CXL deployment and volume scaling starting in 2026.

The MXC product line is designed for use in add-in cards (AICs), backplanes, or EDSFF memory modules to enable significant scalability in both memory capacity and bandwidth. Montage's MXC controllers are currently deployed by the world's top three memory manufacturers in E3.S form factor CXL memory modules. In parallel, Montage has launched several new product development projects in collaboration with Taiwan OEM/ODM partners. One such project involves working with a Taiwanese memory module manufacturer to develop CXL expansion card solutions; another involves co-designing CXL memory AICs with major Taiwanese motherboard and server manufacturers, targeting OEM/ODM opportunities with global cloud data center providers.

Credit: Montage Technology

The second-largest PCIe 5.0 and PCIe 4.0 Retimer supplier

The third product line for Montage is its Retimer chips, originally designed to enhance connectivity between GPUs, AI accelerators, CPUs, and other components within server systems. Retimer chips regenerate high-speed digital signals to extend reach and improve signal integrity in high-speed data processing systems. Currently, a typical AI server, often equipped with eight GPUs, requires 8 or even 16 PCIe 5.0 Retimer chips.

Montage began delivering its PCIe 5.0/CXL 2.0 Retimer chips in January 2023, putting massive effort into interoperability testing with a wide variety of compute, storage, and networking components, including CPUs, PCIe switches, SSDs, GPUs, and NICs. As a result, Montage is now the second-largest PCIe 5.0 and PCIe 4.0 Retimer supplier globally and dominates the market in China.

Montage's Retimer chips are integrated into a variety of systems, including AI accelerator baseboards, server motherboards, and riser cards. As part of its new product roadmap, Montage is now providing customer samples of its 16-lane PCIe 6.x/CXL 3.x Retimer chips.

Activities at COMPUTEX Taipei 2025

Montage has a global team of over 700 employees. During the trade show season and COMPUTEX Taipei 2025, Montage and its Taiwan partners will showcase a series of silicon products in a hotel suite showroom at the Place Taipei Hotel in the Nangang district.
Findley describes this initiative as a strategic marketing campaign focused on highlighting the company's latest products, including its Retimer chips, and on introducing the Montage brand to attract new customers from Taiwan's electronics supply chain and server manufacturing sector. In addition, the company will host advanced product training sessions to capture new business opportunities.

AI applications are rapidly increasing in computational demand, doubling every few months, and now represent the primary driver of the PCIe roadmap, making PCIe Gen 6 a key requirement for the next generation of data centers. Meanwhile, CXL technology is reshaping the industry with its innovative memory architecture. Montage looks forward to working with Taiwan-based electronics OEMs/ODMs, server manufacturers, and ecosystem partners to unlock the transformative potential of CXL technology and drive future success.
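As a closing illustration of how CXL expansion memory is typically consumed by software: on Linux, a CXL Type 3 memory expander commonly appears as a NUMA node that has memory but no CPUs. The Python sketch below is an exploratory example under that assumption, not a Montage tool; the workload binary name is a placeholder, and node discovery details vary by platform.

# Exploratory sketch: find CPU-less NUMA nodes (how CXL Type 3 expanders
# often appear on Linux) and bind a workload's memory to one via numactl.
import glob
import os
import subprocess

def cpuless_numa_nodes():
    """Return NUMA node IDs whose cpulist is empty (no local CPUs)."""
    nodes = []
    for path in sorted(glob.glob("/sys/devices/system/node/node[0-9]*")):
        node_id = int(os.path.basename(path)[4:])
        with open(os.path.join(path, "cpulist")) as f:
            if f.read().strip() == "":
                nodes.append(node_id)
    return nodes

if __name__ == "__main__":
    candidates = cpuless_numa_nodes()
    if not candidates:
        print("No CPU-less NUMA nodes found (no CXL expander visible?)")
    else:
        node = candidates[0]
        print(f"Binding workload memory to NUMA node {node}")
        # --membind places all allocations on the chosen node;
        # './my_inference_job' is a hypothetical workload binary.
        subprocess.run(["numactl", f"--membind={node}", "./my_inference_job"])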
Tuesday 20 May 2025
IEI Insight Days 2025, May 21–23: Industrial AI, resilient networking, and smart healthcare in action
As Computex 2025 draws global attention to Taipei, IEI Integration Corp., in collaboration with QNAP Systems, Inc., will host its annual technology showcase, IEI Insight Days, from May 21 to 23 at POPOP Taipei. Designed for industry professionals, this three-day event focuses on actionable solutions in Industrial AI, resilient network infrastructure, and smart healthcare applications, bringing together real-world use cases, expert insights, and emerging technologies that address the evolving needs of edge computing and system integration. Hosted alongside Computex, this exclusive event by IEI and QNAP invites industry professionals to explore edge innovations in a relaxed and focused environment at POPOP Taipei.

A Dedicated Experience Beyond the Show Floor

Located just one MRT stop from the Computex exhibition hall, POPOP Taipei offers a refreshing alternative to traditional trade show venues. Combining historical charm with modern design, the event space provides an ideal setting for relaxed yet focused dialogue. Whether you're exploring partnership opportunities, seeking insights on deployment strategies, or simply taking a break from the busy show floor, IEI Insight Days offers a curated environment to connect, learn, and exchange ideas.

Event Highlights

🔹 Focus Areas: Edge AI / Network Integration / AI-powered Healthcare
🔹 Showcase Solutions:
 • Redundancy-enabled and recovery-ready edge platform
 • Enterprise-grade networking infrastructure
 • AI healthcare computing with real-time image processing and voice command capabilities
🔹 Networking Space: Open demo zones and seating areas designed for spontaneous technical conversations and business engagement

IEI Insight Days is more than a product showcase; it is a hub for collaboration and conversation. We warmly welcome Computex attendees, industry partners, and decision-makers to stop by and engage with us. Dive into the latest trends in edge computing, network infrastructure, and medical AI.