Lattice Enhances its Ultra-Low Power sensAI Stack
September 25, 2018
Lattice Semiconductor revealed improvements to its popular sensAI™ stack designed to accelerate time-to-market for developers of flexible machine learning inferencing in consumer and industrial IoT applications.
The company is introducing new IP cores, reference designs, demos and hardware development kits that provide scalable performance and power for always-on, on-device artificial intelligence (AI) applications.
Updates to the sensAI stack include:
• IP Cores – New CNN Compact Accelerator IP core for improved accuracy on iCE40 UltraPlus FPGAs, and enhanced CNN Accelerator IP core for improved performance on ECP5 FPGAs
• Software Tools – Updated neural network compiler tool with improved ease-of-use and both Caffe and TensorFlow support for iCE40 UltraPlus FPGAs
• Reference Designs – New human presence detection and hand gesture recognition reference designs and demos
• Modular Hardware Platforms – New iCE40 UltraPlus development platforms, including the Himax HM01B0 UPduino Shield and the DPControl iCEVision Board
• Design Service Partners – New vehicle classification and package detection demos from sensAI Design Services partners
“Flexible, low-power, always-on, on-device AI is increasingly a requirement in edge devices that are battery operated or have thermal constraints. The new features of the sensAI stack are optimized to address this challenge, delivering improved accuracy, scalable performance, and ease-of-use, while still consuming only a few milliwatts of power,” said Deepak Boppana, Senior Director, Product and Segment Marketing, Lattice Semiconductor. “With these enhancements, sensAI solutions can now support a variety of low-power, flexible system architectures for always-on, on-device AI.”
For more information, please visit www.latticesemi.com.