ONNX support added to CDNN Neural Network Compiler

CEVA, a licensor of signal processing platforms and artificial intelligence processors for smarter, connected devices, has announced that the latest release of its CEVA Deep Neural Network (CDNN) compiler supports the Open Neural Network Exchange (ONNX) format.

“CEVA is fully committed to ensuring an open, interoperable AI ecosystem, where AI application developers can take advantage of the features and ease-of-use of the various deep learning frameworks most suitable to their specific use case,” said Ilan Yona, Vice President and General Manager of CEVA’s Vision Business Unit. “By adding ONNX support to our CDNN compiler technology, we provide our CEVA-XM and NeuPro customers and ecosystem partners with much broader capabilities to train and enrich their neural network-based applications.”

ONNX is an open format created by Facebook and Microsoft, with backing from AWS, to enable interoperability and portability within the AI community. It allows developers to use the right combination of tools for their project without being locked in to any one framework or ecosystem: a neural network can be trained in any ONNX-compatible machine learning framework and then deployed using another.

With ONNX support, CDNN now enables developers to import models generated in any ONNX-compatible framework and deploy them on CEVA-XM vision DSPs and NeuPro AI processors.
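As a rough illustration of this workflow (not CEVA's own tooling; the model, file name and shapes below are hypothetical), the sketch exports a small PyTorch model to an .onnx file, the kind of artifact an ONNX-compatible compiler such as CDNN could then take as input.

```python
# Minimal sketch: exporting a trained PyTorch model to the ONNX format.
# The network is a placeholder for illustration only.
import torch
import torch.nn as nn

class TinyClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 8, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(8, 10),
        )

    def forward(self, x):
        return self.net(x)

model = TinyClassifier().eval()
dummy_input = torch.randn(1, 3, 224, 224)  # example input shape

# Serialize the graph and weights to a file that any ONNX-compatible
# deployment toolchain can consume.
torch.onnx.export(
    model,
    dummy_input,
    "tiny_classifier.onnx",
    input_names=["input"],
    output_names=["logits"],
)
```

The resulting .onnx file is framework-neutral, which is the point of the standard: the training environment and the deployment toolchain no longer need to come from the same vendor.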

Source: CEVA