Edge-to-Cloud AI Deployment with Microsoft Azure and Lanner Intelligent Edge Appliances
June 16, 2021
Advanced AI applications face fundamental challenges in latency and reliability. Edge-to-Cloud AI solutions that can be deployed alongside existing core IT infrastructure have therefore become key AI enablers. Building on existing workflows is instrumental to Edge-ready solutions that can close the gap between core business IT investments, in-house competencies, and next-generation AI opportunities.
Edge-to-Cloud AI deployment distributes workloads across Edge and Cloud networks to reduce latency and increase efficiency. While workloads on the Edge AI appliance can leverage GPU acceleration for tasks such as video analytics, advanced analytics still requires a scalable cloud platform for further analysis on centralized infrastructure.
To put all of that into perspective, check out the Embedded Solutions Video featuring Microsoft’s Principal Solution Specialist, Ben Lee, and Maulik Upala, AI Product and Market Development Manager at Lanner Electronics. You will learn how the two companies have partnered to simplify AI deployments at the Edge.
The video goes through:
- An introduction to Azure IoT Edge services and AI functionality
- How to leverage Azure IoT Edge to run AI inference models on Lanner Edge AI appliances and manage containers, Edge devices, and data in a Cloud-native environment
- A technical demonstration leveraging the Lanner AI appliance and Azure IoT Edge
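In practice, Azure IoT Edge deploys AI inference workloads as container modules described in a deployment manifest that is pushed to the device from IoT Hub. The fragment below is a minimal sketch of such a manifest; the module name, container registry URI, and the NVIDIA runtime option are illustrative assumptions, not details from the video, and the system modules ($edgeAgent/$edgeHub) and routes that a complete manifest requires are omitted for brevity.

```json
{
  "modulesContent": {
    "$edgeAgent": {
      "properties.desired": {
        "modules": {
          "VideoInference": {
            "type": "docker",
            "status": "running",
            "restartPolicy": "always",
            "settings": {
              "image": "myregistry.azurecr.io/video-inference:1.0",
              "createOptions": "{\"HostConfig\":{\"Runtime\":\"nvidia\"}}"
            }
          }
        }
      }
    }
  }
}
```

A manifest like this can be applied to an Edge device with the Azure CLI (`az iot edge set-modules --hub-name <hub> --device-id <device> --content manifest.json`), after which the IoT Edge runtime on the appliance pulls and runs the container, letting the Cloud manage the module lifecycle.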