Wednesday 14 May 2025
VIA Labs presents USB4 gaming dock reference design compliant with the latest EU EUP/ERP regulations
VIA Labs, Inc. (VLI), a leading provider of USB4, SuperSpeed USB, USB PD Controllers, and Display Controllers, has unveiled a USB4 gaming dock reference design that meets the latest EU EUP/ERP energy efficiency requirements. The design features the VL832, a USB-IF certified USB4 Device Controller; the VL109, a USB PD 3.2 certified Power Delivery Controller; and the VL605, a DisplayPort to HDMI 2.1 Protocol Converter. VIA Labs will showcase its latest product solutions at Computex 2025 from May 20 to 23 at the Nangang Exhibition Center, Hall 1, USB-IF Community – VIA Labs, Inc. (Booth N0313).

Since the launch of Nintendo's Switch in 2017, which became a massive hit thanks to its support for both handheld and TV modes, the market has seen the emergence of various handheld gaming consoles. Valve's Steam Deck has attracted significant attention for its impressive ability to run PC games, and handhelds such as the ASUS ROG Ally, MSI Claw, Lenovo Legion Go, and Acer Nitro Blaze series have followed, rapidly expanding the handheld gaming market. There is even speculation that Xbox and PlayStation will launch handheld gaming consoles in 2027. These devices are equipped with powerful and efficient graphics processors capable of running AAA titles on battery power. They also support VRR (Variable Refresh Rate) technology, which enhances the gaming experience by eliminating screen tearing, reducing input lag, and improving overall smoothness. Moreover, they offer USB-C connectivity, allowing users to switch from handheld mode to TV mode and play on a television or gaming monitor for a more immersive visual experience. Most of these devices also support USB4 or Intel Thunderbolt, enabling high-speed video, audio, and USB data transmission at up to 40Gbps.

Concurrently, within the wider regulatory environment for electronic goods, the EU's EUP (Energy-Using Products) and ERP (Energy-Related Products) regulations are also evolving. They aim to reduce the environmental impact of energy-consuming products and improve energy efficiency. The updated Regulation (EU) 2023/826 expands the scope of control and imposes stricter power consumption limits on electronic and electrical equipment in off, standby, and networked standby modes. Starting May 9, 2025, products must comply with the new requirements, with power consumption not exceeding 0.5 watts in standby or off mode. From 2027 onward, the limit will be further reduced to 0.3 watts.

In response to these trends in the gaming market and the upcoming 2027 EU EUP/ERP requirements, VIA Labs has introduced a USB4 gaming dock reference design that delivers new levels of functionality and performance while meeting these stringent energy efficiency standards. The design features the VL109, a 3-port USB PD 3.2 controller optimized for ultra-low standby power consumption. It supports the VL832 USB4 device controller, enabling USB4 40Gbps or legacy DisplayPort Alternate Mode, and also provides USB PD EPR charge-through, delivering up to 140W of power to the host. In standby mode, when the upstream port is not connected to a host, the design intelligently manages power to the other key components on the board, reducing total power consumption to only 10 milliwatts. This is well below the 300 milliwatt limit mandated by the 2027 EUP/ERP standby/off-mode requirements, leaving developers a generous margin for implementing additional features or functionality.
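For context, the headroom implied by those figures is easy to check. The short Python sketch below is purely illustrative: the 0.5 W and 0.3 W limits and the 10 mW standby figure come from the article, while the script itself (names, structure, output) is an assumption for demonstration, not part of VIA Labs' design.

# Back-of-the-envelope headroom check for the standby figures quoted above.
# The limits (0.5 W from May 2025, 0.3 W from 2027) and the 10 mW measurement
# are taken from the article; everything else here is illustrative.

EUP_ERP_LIMITS_W = {
    "standby/off limit from May 2025": 0.5,
    "standby/off limit from 2027": 0.3,
}

MEASURED_STANDBY_W = 0.010  # 10 mW reported for the reference design

for rule, limit_w in EUP_ERP_LIMITS_W.items():
    headroom_mw = (limit_w - MEASURED_STANDBY_W) * 1000
    budget_used = MEASURED_STANDBY_W / limit_w
    print(f"{rule}: headroom {headroom_mw:.0f} mW "
          f"({budget_used:.1%} of the allowed budget used)")

Even against the tighter 2027 limit, the reference design would use only about 3% of the allowed standby budget, which is the margin the announcement refers to.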
To provide flexible yet cost-effective display options for gamers, the reference design features an HDMI 2.1 port and a USB-C port supporting DisplayPort Alternate Mode. It avoids the cost of a DP MST hub (which would be required for simultaneous dual output) by activating video output on a first-come, first-served basis. Gamers can easily connect either HDMI displays or the growing range of USB-C portable monitors, and through the VL832's USB4 DP tunneling mode, either interface can deliver refresh rates of up to 4K@240Hz.

Another highlight of the reference design is the VL605, a fully VRR-capable HDMI 2.1 Protocol Converter. The VL605 converts DisplayPort signals from the VL832 into HDMI 2.1 output at up to 8K@60Hz or 4K@240Hz. It preserves VRR by translating DisplayPort Adaptive-Sync for operation over HDMI 2.1, ensuring seamless VRR functionality across AMD, Intel, and NVIDIA G-SYNC Compatible platforms. Furthermore, the VL605 has passed AMD FreeSync validation testing, and its OUI (Organizationally Unique Identifier) has been incorporated into AMD's latest GPU drivers, enabling FreeSync Premium Pro functionality on compatible AMD platforms as well. Additionally, the VL605 supports LPCM and various compressed audio formats. It integrates a DSC decoder and complies with the latest DisplayPort 2.1 standard, supporting both Autonomous Mode and Source-Controlled Mode, which give the signal source enhanced control capabilities. To safeguard against malicious firmware attacks, the VL605 also enables secure firmware updates via built-in ECDSA authentication.

Huineng Chang, Director of the Product Management Division at VIA Labs, emphasized the potential of the VL605 in the gaming market: "With increasingly powerful handheld game consoles entering the market, mobile chipsets from Apple, Qualcomm, and MediaTek now support hardware-based ray tracing. This marks a shift where gamers are no longer confined to traditional consoles like Xbox or PlayStation, but can enjoy AAA titles on handheld game consoles. On the other hand, many of today's TVs support high refresh rates and Game Mode, while portable monitors have also become more popular and widely adopted. In response to these trends, VIA Labs has introduced this reference design alongside a range of USB Hub and USB PD controllers, enabling VL605 to support USB-C multifunction docks with USB PD charging and compliance with the 2027 EUP/ERP energy regulations. With VRR support across platforms, VL605 empowers handheld game consoles to transform into high-end home gaming consoles, delivering a gaming experience that rivals traditional console systems."

VIA Labs VL605: https://www.via-labs.com/product_show.php?id=123
VIA Labs VL832: https://www.via-labs.com/product_show.php?id=119

VIA Labs Exhibition Information
Computex Taipei 2025
Date: 2025/5/20-23
Location: Nangang Exhibition Hall, Hall 1, Taipei, Taiwan
Booth: VIA Labs, Inc. at USB-IF Community (N0313)

About VIA Labs, Inc.
VIA Labs, Inc. (VLI) is a leading supplier of USB4, SuperSpeed USB, USB PD Controllers, and Display Controllers.
A subsidiary of VIA Technologies, Inc., VLI leverages its expertise in analog circuit design, high-speed serial interfaces, and systems integration to create a rich product portfolio that includes USB Host, Hub, and Device controllers in addition to USB PD and charging controllers. VIA Labs, Inc. has demonstrated technology and industry leadership through standards development and by bringing newly developed USB technologies to market. www.via-labs.com
Friday 9 May 2025
Is your AI system built to last—or bound to fail? Only DEEPX has the answer
DEEPX ensures unmatched AI reliability with lower power, lower heat, and a total cost of ownership lower than even "free" chips.

For Lokwon Kim, founder and CEO of DEEPX, this isn't just an ambition—it's a foundational requirement for the AI era. A veteran chip engineer who once led advanced silicon development at Apple, Broadcom, and Cisco, Kim sees the coming decade as a defining moment to push the boundaries of technology and shape the future of AI. While others play pricing games, Kim is focused on building what the next era demands: AI systems that are truly reliable.

"This white paper," Kim says, holding up a recently published technology report, "isn't about bragging rights. It's about proving that what we're building actually solves the real-world challenges faced by factories, cities, and robots—right now."

Credit: DEEPX

A new class of reliability for AI systems

While GPGPUs continue to dominate cloud-based AI training, Kim argues that the true era of AI begins not in server racks, but in the everyday devices people actually use. From smart cameras and robots to kiosks and industrial sensors, AI must be embedded where life happens—close to the user, and always on.

Because these devices operate in power-constrained, fanless, and sometimes battery-driven environments, low power isn't a preference—it's a hard requirement. Cloud-bound GPUs are too big, too hot, and too power-hungry to meet this reality. On-device AI demands silicon that is lean, efficient, and reliable enough to run continuously—without overheating, without delay, and without failure.

"You can't afford to lose a single frame in a smart camera, miss a barcode in a warehouse, or stall a robot on an assembly line," Kim explains. "These moments define success or failure."

GPGPU-based systems and many competing NPUs fail this test. With high power draw, significant heat generation, the need for active cooling, and cloud latency issues, they are fundamentally ill-suited to the always-on, low-power edge. In contrast, DEEPX's DX-M1 runs under 3W, stays below 80°C with no fan, and delivers GPU-class inference accuracy with no dependence on cloud latency. Under identical test conditions, the DX-M1 outperformed competing NPUs by up to 84% while running 38.9°C cooler and using a die 4.3× smaller.

This is made possible by rejecting the brute-force, SRAM-heavy approach in favor of a lean on-chip SRAM + LPDDR5 DRAM architecture that enables:
• Higher manufacturing yield
• Lower field failure rates
• Elimination of PCIe bottlenecks
• 100+ FPS inference even on small embedded boards

DEEPX also developed its own quantization pipeline, IQ8™, preserving <1% accuracy loss across 170+ models. "We've proven you can dramatically reduce power and memory without sacrificing output quality," Kim says.

Credit: DEEPX

Real customers. Real deployments. Real impact.

Kim uses a powerful metaphor to describe the company's strategic position: "If cloud AI is a deep ocean ruled by GPGPU-powered ships, then on-device AI is the shallow sea—close to land, full of opportunities, and hard to navigate with heavy hardware."

GPGPU vendors, he argues, are structurally unsuited to this space. Their business models and product architectures are simply too heavy to pivot to low-power, high-flexibility edge scenarios. "They're like battleships," Kim says. "We're speedboats—faster, more agile, and able to handle 50 design changes while they do one."

DEEPX isn't building in a vacuum. The DX-M1 is already being validated by major companies like Hyundai Robotics Lab, POSCO DX, and LG Uplus, which rejected GPGPU-based designs over energy, cost, and cooling concerns. These companies found that even "free" chips resulted in a higher total cost of ownership (TCO) than the DX-M1 once electricity bills, cooling systems, and field failure risks are added in. According to Kim, "Some of our collaborations realized that switching to DX-M1 saves up to 94% in power and cooling costs over five years. And that savings scales exponentially when you deploy millions of devices."
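The TCO argument can be made concrete with a simple model. The Python sketch below is only an illustration of the reasoning above: the cost structure (purchase price plus energy, cooling, and expected failure costs over five years) follows the article, but every number in it is a hypothetical placeholder, not a figure from DEEPX or its partners.

# Illustrative five-year TCO comparison for an always-on edge AI accelerator.
# All numbers are hypothetical placeholders chosen to show the structure of the
# calculation, not data from DEEPX, Hyundai Robotics Lab, POSCO DX, or LG Uplus.

def five_year_tco(unit_price, avg_power_w, cooling_overhead,
                  annual_failure_rate, replacement_cost,
                  electricity_rate=0.15, years=5):
    """Per-device TCO: purchase + energy (incl. cooling) + expected failure cost."""
    hours = years * 365 * 24                        # device runs 24/7
    energy_kwh = avg_power_w * (1 + cooling_overhead) * hours / 1000
    energy_cost = energy_kwh * electricity_rate     # electricity_rate in $/kWh
    failure_cost = annual_failure_rate * years * replacement_cost
    return unit_price + energy_cost + failure_cost

# A "free" but hot, fan-cooled accelerator vs. a paid low-power module.
free_hot_chip = five_year_tco(unit_price=0, avg_power_w=25, cooling_overhead=0.5,
                              annual_failure_rate=0.05, replacement_cost=200)
paid_cool_chip = five_year_tco(unit_price=60, avg_power_w=3, cooling_overhead=0.1,
                               annual_failure_rate=0.01, replacement_cost=200)

print(f"'Free' high-power chip: ${free_hot_chip:,.0f} per device over 5 years")
print(f"Paid low-power module: ${paid_cool_chip:,.0f} per device over 5 years")

Under these made-up assumptions, the "free" part ends up roughly three times more expensive per device over five years, which is the shape of the argument Kim is making; the 94% figure he cites refers specifically to power and cooling savings reported by DEEPX's partners, not to this sketch.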
Building on this momentum, DEEPX is now entering full-scale mass production of the DX-M1, its first-generation NPU built on a cutting-edge 5nm process. Unlike many competitors still relying on 10–20nm nodes, DEEPX has achieved an industry-leading 90% yield at 5nm, setting the stage for dominant performance, efficiency, and scalability in the edge AI market.

Looking beyond current deployments, DEEPX is now developing its next-generation chip, the DX-M2—a groundbreaking on-device AI processor designed to run LLMs under 5W. As large language model technology evolves, the field is beginning to split in two directions: one track continues to scale up LLMs in cloud data centers in pursuit of AGI; the other, more practical path focuses on lightweight, efficient models optimized for local inference, such as DeepSeek and Meta's LLaMA 4. DEEPX's DX-M2 is purpose-built for this second future.

With ultra-low power consumption, high performance, and a silicon architecture tailored for real-world deployment, the DX-M2 is intended to support LLMs like DeepSeek and LLaMA 4 directly at the edge—no cloud dependency required. Most notably, the DX-M2 is being developed to become the first AI inference chip built on a leading-edge 2nm process, marking a new era of performance-per-watt leadership. In short, the DX-M2 isn't just about running LLMs efficiently—it's about unlocking the next stage of intelligent devices, fully autonomous and truly local.

Credit: DEEPX

If ARM defined the mobile era, DEEPX will define the AI era

Looking ahead, Kim positions DEEPX not as a challenger to cloud chip giants, but as the foundational platform for the AI edge—just as ARM once was for mobile. "We're not chasing the cloud," he says. "We're building the stack that powers AI where it actually interacts with the real world—at the edge."

In the 1990s, ARM changed the trajectory of computing by creating power-efficient, always-on architectures for mobile devices. That shift didn't just enable smartphones—it redefined how and where computing happens. "History repeats itself," Kim says. "Just as ARM silently powered the mobile revolution, DEEPX is quietly laying the groundwork for the AI revolution—starting from the edge."

His 10-year vision is bold: to make DEEPX the "next ARM" of AI systems, enabling AI to live in the real world—not the cloud. From autonomous robots and smart city kiosks to factory lines and security systems, DEEPX aims to become the default infrastructure wherever AI must run reliably, locally, and on minimal power.

Everyone keeps asking about the IPO. Here's what Kim really thinks.

With DEEPX gaining attention as South Korea's most promising AI semiconductor company, one question keeps coming up: when's the IPO? But for founder and CEO Lokwon Kim, the answer is clear—and measured. "Going public isn't the objective itself—it's a strategic step we'll take when it aligns with our vision for sustainable success," Kim says.
"Our real focus is building proof—reliable products, real deployments, actual revenue. A unicorn company is one that earns its valuation through execution—especially in semiconductors, where expectations are unforgiving. The bar is high, and we intend to meet it."That milestone, Kim asserts, is no longer far away. In other words, DEEPX isn't rushing for headlines—it's building for history. DEEPX isn't just designing chips—it's designing trust.In an AI-powered world where milliseconds can mean millions, reliability is everything. As AI moves from cloud to edge, from theory to infrastructure, the companies that will define the next decade aren't those chasing faster clocks—but those building systems that never fail."We're not here to ride a trend," Kim concludes. "We're here to solve the hardest problems—the ones that actually matter."Credit: DEEPXWhen Reliability Matters Most—Industry Leaders Choose DEEPXVisit DEEPX at booth 4F, L0409 from May 20-23 at Taipei Nangang Exhibition Center to witness firsthand how we're setting new standards for reliable on-device AI.For more information, you can follow DEEPX on social media or visit their official website.
Tuesday 6 May 2025
Best and brightest AI minds celebrated at Taiwan's inaugural Best AI Awards
To cultivate a stronger foundation of artificial intelligence (AI) talent and encourage greater investment in AI research and development, Taiwan's Ministry of Economic Affairs (MOEA) hosted the inaugural 'Best AI Award Competition,' culminating in the finals held on May 3, 2025, at the Taipei World Trade Center, Hall 1. This year's competition attracted participation from 1,253 teams spanning 36 countries. From this pool, 233 teams advanced to the finals, where they competed for gold, silver, and bronze awards, as well as honorable mentions, across the 'AI Application' and 'IC Design' categories. These categories were further divided into groups: Public Corporations, SMEs and Startups, Students, and International Teams.

The Gold Award was bestowed upon eight winners: HiTRUSTpay, EYS3D Microelectronics, Daya Yoo, Jmem Tek, National Central University, National Taiwan University, Touch Lab (Philippines), and Arba Lab (UK).

According to an MOEA press release, the 'Best AI Awards' aspires to be Taiwan's equivalent of the "Oscars for AI", embodying diversity, global reach, and a forward-looking vision. Its core objectives are to ignite the creativity of the next generation, foster stronger ties between academia and industry, and nurture a deeper pool of AI-savvy talent and innovative enterprises. Ultimately, the competition aims to drive the industrialization of AI and the adoption of AI across industries, thereby solidifying Taiwan's position in the AI landscape. The competition offers substantial prizes, with student group winners vying for up to NT$300,000 and winners in the enterprise open, startup and SME, and international groups competing for up to NT$1 million.

Credit: MOEA

At the awards ceremony, Deputy Minister Ho Chin-tsang emphasized AI's accelerating transformation of the global industrial structure, impacting sectors from manufacturing and healthcare to finance and everyday services. He stressed the 'Best AI Awards' competition platform's crucial role in forging stronger links between talent development, technological applications, and industry demands. In response to the widespread anticipation for AI technologies, the MOEA seeks to foster a practical, application-oriented approach, encouraging innovative concepts to address real-world industry challenges. This strategy aims to continuously cultivate new talent and generate cutting-edge solutions within Taiwan's evolving AI ecosystem. Deputy Minister Ho also highlighted that the competition entries exemplify the fusion of AI technology with tangible needs and creative execution, showcasing the immense potential of translating innovative ideas into viable solutions. Looking ahead, the MOEA will sustain its commitment to facilitating the adoption of innovative solutions and maximizing their value through strategic policy initiatives and industry partnerships.

Credit: MOEA

Director-General Kuo Chao-chung of the MOEA's Department of Industrial Technology underscored the impressive international engagement of the inaugural Best AI Awards, which attracted both enthusiastic domestic participation from enterprises and academic institutions and the involvement of 353 international professionals from 36 countries, including India, the Philippines, the United States, and the United Kingdom. This global participation establishes the competition as a vital platform for the international exchange of AI innovation.
To expedite the industry's capacity building in AI talent and applications, the MOEA will not only continue to host the Best AI Awards but also leverage the pilot production capabilities of schools and research institutions to support businesses in design, new product development, and prototyping. Furthermore, the MOEA will collaborate with agencies such as the Small and Medium Enterprise Administration and the Industrial Development Administration to facilitate industry-wide AI transformation and develop practical AI expertise.

Chiu Chui-hui, Director-General of the Industrial Development Administration, cited a report by the Artificial Intelligence Technology Foundation, which identifies the 'shortage of relevant technical talent' as Taiwan's primary obstacle to AI advancement. The Global Artificial Intelligence Index corroborates this, revealing that while Taiwan excels in infrastructure (ranking 4th), it faces challenges in talent (38th), research (27th), and commercialization (39th). Consequently, the Best AI Awards are designed to expedite the real-world application of AI and the cultivation of skilled professionals. Chiu emphasized the competition's significance as Taiwan's premier AI competition, characterized by its scale, prestigious awards, and high standards (with a highly selective 7.4% award rate). He expressed his hope for collaborative efforts across all sectors to broaden the adoption of AI and harness its power to drive industrial innovation.

The 'Best AI Awards' entries spanned a diverse range of application areas, including information and communication technology (18.4%), manufacturing (16.2%), healthcare (15.9%), wholesale and retail (10.2%), education (8.6%), and finance (7.8%). This diversity not only underscores the application of cutting-edge technologies but also highlights the immense potential of AI to be successfully integrated into various industries.

The gold medal winners, the domains they address, and the problems they set out to solve are summarized as follows:

• AI / Student: National Central University, "Realization of Highly Flexible Production Lines" (manufacturing automation). Traditional production lines are inflexible and struggle to adapt to rapidly changing and varied production needs.
• AI / SME and Startup: Data Yoo, "FarmiSpace" (farming). Enhances the efficiency and management of agricultural production and optimizes the allocation of agricultural resources.
• AI / Public Company: HiTRUSTpay, "Veri-id equipment ID verification and real-time AI anti-scam/fraud service" (cybersecurity, fintech). Solves device identity verification issues and provides real-time prevention of financial fraud, thereby enhancing transaction security.
• AI / International: TouchLab (Philippines), "AI driven TOUCH System – Digitizing Touch" (digitalization, human-machine interaction). Digitizes touch, enabling machines to understand and simulate tactile sensations and enhancing the precision and scope of human-computer interaction.
• IC / Student: National Taiwan University, "Ternary-weight Transformer model software-hardware synergetic design and neural network accelerator IC design and implementation" (IC design, AI hardware acceleration). Designs higher-performance, lower-power AI chips to accelerate the computation of Transformer models and improve the efficiency of AI applications.
• IC / SME and Startup: Jmem Tek, "ArgusNPU – PQC security edge AI inference system" (cybersecurity, edge AI, AI IC design). Provides AI inference capabilities with post-quantum security on edge devices to protect sensitive data and enable high-performance AI applications.
• IC / Public Company: eYs3D Microelectronics, "AI Edge Computing Car Parking Management System" (smart city transportation management). Uses edge AI to improve parking management efficiency, optimize the allocation of parking resources, and alleviate traffic congestion.
• IC / International: ArbaLab (UK), "ArbaEdge" (AI edge computing). Realizes high-performance AI computing on edge devices to reduce reliance on cloud computing, enhance response speed, and improve privacy protection.
Source: MOEA

The winning entries of the 'Best AI Awards' showcased the dynamic development and diverse applications of AI technology, with particularly strong innovation evident in sectors such as healthcare and manufacturing, as well as in the burgeoning fields of AIoT and edge AI integration.

The Ministry of Economic Affairs pointed out that, looking ahead, it will continue to collaborate with industry, academia, and research institutions, in the hope that the "Best AI Awards" can become an important annual platform for promoting Taiwan's AI technology development, talent cultivation, and innovative applications, and for continuously discovering promising new talent. In addition, matchmaking events will be held concurrently during the COMPUTEX exhibition every May, with plans to invite more than 20 domestic and foreign venture capitalists and buyers to participate, fostering in-depth exchanges and cooperation between participating teams and the industry and further expanding commercialization opportunities for innovative AI applications.

Through this comprehensive strategy, the Ministry of Economic Affairs aims to expedite the creation of groundbreaking AI applications and the cultivation of interdisciplinary AI expertise, ultimately steering Taiwan toward its ambitious vision of becoming a globally recognized "AI Island". For the latest updates, follow the official LinkedIn page of the Best AI Awards.