Perform AI at the Edge with NGC-ready Hardware
April 09, 2021
A key technology behind AI at the Edge is the GPU. Harnessing a GPU for applications like inference isn't easy for a novice engineer. That's where companies like Lanner come into play. With the company's LEC-2290 Edge AI Appliance, engineers can focus on their specific application without having to deal with the underlying hardware.
The Lanner hardware is NGC-Ready, which means it's validated to run containers from the NVIDIA GPU Cloud (NGC), a GPU-accelerated platform optimized for deep learning, inference, and related AI applications.
In this video, you'll get a glimpse of how to use an AI-powered hyper-converged server to enable intelligent services, how to build efficient and intelligent networks with a network Edge AI platform, and a hardware perspective on Edge AI inference and NGC-Ready servers.