Tuesday, September 25, 2018

Lattice Semiconductor pushes ahead with AI software stack for edge

Lattice Semiconductor is expanding the capabilities of its "sensAI" stack, which is designed for machine-learning inference in consumer and industrial IoT applications.

Specifically, Lattice is releasing new IP cores, reference designs, demos, and hardware development kits that provide scalable performance and power for always-on, on-device artificial intelligence (AI) applications. The release includes an updated neural network compiler tool with improved ease of use and support for both Caffe and TensorFlow models targeting iCE40 UltraPlus FPGAs.

“Flexible, low-power, always-on, on-device AI is increasingly a requirement in edge devices that are battery operated or have thermal constraints. The new features of the sensAI stack are optimized to address this challenge, delivering improved accuracy, scalable performance, and ease-of-use, while still consuming only a few milliwatts of power,” said Deepak Boppana, Senior Director, Product and Segment Marketing, Lattice Semiconductor. “With these enhancements, sensAI solutions can now support a variety of low-power, flexible system architectures for always-on, on-device AI.”

Examples of the architectural choices that sensAI solutions enable include:

• Stand-alone, always-on integrated solutions based on iCE40 UltraPlus or ECP5 FPGAs, with latency, security, and form-factor benefits.
• Solutions using the iCE40 UltraPlus as an always-on processor that detects key phrases or objects and wakes up a high-performance AP SoC / ASIC for further analytics only when required, reducing overall system power consumption.
• Solutions using the scalable performance/power benefits of the ECP5 for neural network acceleration, along with I/O flexibility to seamlessly interface with on-board legacy devices, including sensors and low-end MCUs for system control.
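The power-saving logic behind the second architecture can be sketched in a few lines: a low-power always-on detector screens every frame, and the high-power application processor is billed only for frames that trigger a wake-up. The detector interface, wake threshold, and power figures below are illustrative assumptions, not Lattice specifications.

```python
# Sketch of the always-on wake-up pattern: a small FPGA-resident detector
# runs continuously, and the application processor (AP) is powered up only
# when the detector fires. All names and power figures are assumptions.

FPGA_POWER_MW = 5.0    # assumed always-on detector power (a few mW)
AP_POWER_MW = 2000.0   # assumed AP power while awake
THRESHOLD = 0.9        # detection confidence needed to wake the AP

def average_power_mw(frames, detect):
    """Average system power over a stream of audio/video frames.

    `detect` maps a frame to a key-phrase confidence in [0, 1];
    the AP draws power only for frames where the detector fires.
    """
    awake = sum(1 for f in frames if detect(f) >= THRESHOLD)
    total = FPGA_POWER_MW * len(frames) + AP_POWER_MW * awake
    return total / len(frames)

# Toy detector: a frame "contains the key phrase" if its score is high.
frames = [{"score": 0.1}] * 98 + [{"score": 0.95}] * 2  # 2% wake rate
avg = average_power_mw(frames, lambda f: f["score"])
# 5 mW baseline plus 2000 mW for 2 of 100 frames -> 45.0 mW average
print(f"{avg:.1f} mW")
```

The arithmetic makes the trade-off concrete: at a 2% wake rate the average draw stays in the tens of milliwatts, even though the AP alone would burn watts if left running continuously.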
