iCatch and Prophesee collaborate on development of AI vision processor natively compatible with Prophesee Event-based Metavision sensing technologies

Press release

iCatch Technology (iCatch) has focused on image signal processing (ISP) technology and camera SoCs for over two decades, and has invested aggressively in research and development of machine learning (ML) technologies and applications.

iCatch has now collaborated with Prophesee on Event-based Metavision sensing projects that integrate iCatch's V57 AI vision processor with the new Sony Semiconductor Solutions ("SSS") stacked Event-based Vision Sensor IMX636, realized in collaboration between SSS and Prophesee.

iCatch built the OpenMVCam development platform so that algorithm partners and ODM customers can design a variety of AI products, systems and applications for market segments such as surveillance, smart healthcare, in-cabin monitoring systems, smart homes, industrial automation and smart cities.

iCatch also provides highly customized services and an excellent image quality system, aiming to become the eyes of future machine vision equipment and smart devices.

SSS's Event-based Vision Sensor can serve as the input source for the iCatch AI vision processor, which relies on a built-in NPU acceleration engine and a proprietary CV processing hardware engine. The processor can integrate the Prophesee Metavision Intelligence SDK and third-party machine vision algorithms to support end customers' AI vision applications, such as driver and occupant monitoring (DMS/OMS) in automotive cabins, patient and elderly fall detection in home healthcare and hospitals, intruder detection in home surveillance, anomaly detection in industrial automation, gesture control in smart home appliances, and eye tracking in AR/VR.

Prophesee, the inventor of the world's most advanced neuromorphic vision systems, has developed the groundbreaking Event-based Vision method for computer vision applications. Inspired by the human retina, an Event-based vision sensor features pixels, each powered by its own embedded intelligent processing, allowing them to activate independently when they detect a relative change in illumination intensity (i.e., motion).

This new approach significantly reduces power consumption and the amount of processed data while offering ultra high-speed acquisition, strong low-light and HDR performance.
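To make the principle concrete, the following is a minimal toy model (not Prophesee's actual sensor pipeline or the Metavision SDK) of how an event-based pixel array behaves: each pixel compares its current log intensity against the level at which it last fired, and emits a timestamped positive or negative event only when the relative change exceeds a contrast threshold. Static regions produce no output at all, which is the source of the sparsity and low data rate described above. The function name, event format and threshold value here are illustrative assumptions.

```python
import numpy as np

def generate_events(frames, times, threshold=0.2):
    """Toy event-camera model: emit (t, x, y, polarity) events when a
    pixel's log intensity changes by more than `threshold` relative to
    the level at which that pixel last fired. Illustrative only."""
    eps = 1e-6  # avoid log(0)
    ref = np.log(np.asarray(frames[0], dtype=float) + eps)  # per-pixel reference level
    events = []
    for t, frame in zip(times[1:], frames[1:]):
        log_i = np.log(np.asarray(frame, dtype=float) + eps)
        diff = log_i - ref
        # positive events for brightening, negative for darkening
        for polarity, mask in ((1, diff >= threshold), (-1, diff <= -threshold)):
            ys, xs = np.nonzero(mask)
            events.extend((t, int(x), int(y), polarity) for x, y in zip(xs, ys))
            ref[mask] = log_i[mask]  # firing pixels reset their reference
        # pixels with no change emit nothing -> sparse, motion-only output
    return events
```

Feeding two nearly identical frames through this model yields events only at the pixels that changed, illustrating why an event stream is far smaller than a sequence of full frames.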

The iCatch AI vision processor, with its built-in hardware event decoder, natively supports SSS's Event-based Vision Sensor IMX636 without an external FPGA or interface chip acting as the event decoder, reducing system cost. Its built-in NPU acceleration engine and CV processing engine provide the acceleration and optimization required by machine vision algorithms. Combining iCatch's key advantages in low-power design and flexible optimization for both human vision and machine vision applications with event-based vision sensors and third-party machine vision algorithms, the iCatch AI vision processor can serve a wide variety of machine vision use cases across the edge computing ecosystem.

More about iCatch: https://www.icatchtek.com
More about Prophesee: https://www.prophesee.ai/
More about Sony Semiconductor Solutions: https://www.sony-semicon.co.jp/e/

iCatch V57 + IMX636, realized in collaboration between SSS and Prophesee, can be used in home care and gesture detection applications
Photo: iCatch/Sony/Prophesee