SOMs based on Zynq UltraScale+ MPSoC put AI vision at the edge

Update: April 21, 2021

The first product released is the Kria K26 SOM, designed specifically for vision AI in smart cities and smart factories. Xilinx has coupled the hardware and software with production-ready, vision-accelerated applications. The small-form-factor embedded board is based on the Zynq UltraScale+ MPSoC architecture, which combines a quad-core Arm Cortex-A53 processor, more than 250,000 logic cells, and an H.264/H.265 video codec. The SOM also features 4GB of DDR4 memory and 245 I/Os, which allow it to adapt to virtually any sensor or interface. It delivers 1.4 TOPS of AI compute performance, which Xilinx says is sufficient to build vision AI applications with three times the performance of GPU-based SOMs at lower latency and power. The SOMs are intended for smart vision applications such as security, traffic and road management cameras, retail analytics, machine vision, and vision-guided robotics.

The SOMs are designed for rapid deployment with a pre-built software stack and adaptive modules. According to Xilinx, the SOMs can reduce time to deployment by up to nine months.

The SOMs remove the need for FPGA hardware design, allowing developers to integrate custom AI models and application code, and to modify the vision pipeline if the application requires it. The SOMs can be used with the TensorFlow, PyTorch, or Caffe frameworks, as well as the C, C++, OpenCL, and Python programming languages, enabled by the Vitis unified software development platform and libraries. The SOMs also support the standard Yocto-based PetaLinux.
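As a rough illustration of what "integrating a custom AI model" means at the application level, the sketch below defines a small convolutional network in PyTorch and saves its weights; the model architecture, the file name, and the assumption that the exported model would subsequently be quantized and compiled with the Vitis AI tooling are illustrative only and are not taken from Xilinx documentation.

```python
# Illustrative sketch only: a small PyTorch vision model exported for later
# deployment. The subsequent quantize/compile step with the Vitis AI tools
# is assumed and not shown here.
import torch
import torch.nn as nn


class TinyClassifier(nn.Module):
    """Minimal CNN standing in for a real vision-AI model."""

    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(1))


if __name__ == "__main__":
    model = TinyClassifier().eval()
    dummy = torch.randn(1, 3, 224, 224)        # stand-in for one camera frame
    print(model(dummy).shape)                   # sanity check: torch.Size([1, 2])
    torch.save(model.state_dict(), "tiny_classifier.pt")  # hand-off artifact
```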

Ubuntu Linux support will also be available, following an agreement with Canonical. The Linux distribution is widely used by AI developers and is interoperable with existing applications. Both PetaLinux and Ubuntu Linux will come pre-built with a software infrastructure and utilities.

Xilinx has also announced accelerated applications for the SOMs, developed by Xilinx and its ecosystem partners for edge applications and available in its App Store. The apps are open-source accelerated applications provided free of charge. Examples include camera tracking, face detection, and natural language processing with smart vision.
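To give a sense of what an application of this kind does, the sketch below runs plain face detection with OpenCV on frames from an attached camera; it uses the Haar cascade bundled with OpenCV on the CPU and is not the Xilinx-accelerated pipeline, which would offload the detector to the SOM's programmable logic.

```python
# Illustrative sketch only: application-level face detection of the kind the
# App Store examples target, here with stock OpenCV rather than the
# Xilinx-accelerated pipeline.
import cv2

# Haar cascade model shipped with OpenCV; a production Kria app would run an
# accelerated detector instead.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(0)  # first attached camera
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("faces", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to quit
        break

cap.release()
cv2.destroyAllWindows()
```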

Announced at the same time is the Kria KV260 Vision AI Starter Kit. It is designed to support the accelerated vision applications and, according to Xilinx, can be operational within an hour, even for someone with no knowledge of FPGAs or FPGA tools.

A set of online tutorial videos and training courses allows hobbyists, makers, and developers to develop and deploy a design for production. Part of the Kria development experience is a self-enabled path for exploration, design, and ultimately production deployment through a vast set of online resources.

Confident developers can then move to the Kria K26 production SOM, which is available in commercial and industrial versions.

The KV260 Vision AI Starter Kit is available immediately. The commercial-grade Kria K26 SOM ships in May 2021, and the industrial-grade K26 SOM ships in the summer. Ubuntu Linux on Kria K26 SOMs is expected to be available in July 2021.