With Map in Hand, Machine Learning is the Next Stop
June 24, 2019
While machine learning is still relatively new to the embedded computing space, each of the major processor vendors has its own methodology for implementing machine learning.
Once you’ve plotted the course for your IoT development, you’re ready to take it to the next level. (To be sure you’re on the right path, check out my recent blog, Chart the Right Course for Your IoT System Development.) That next level could include features such as active speaker recognition or predictive maintenance, to name a few. Machine learning is one technique used to deliver features like these, and it’s the best way to maximize your system’s performance by taking advantage of higher-performance hardware and its associated software.
Machine learning may be relatively new to embedded computing, but each of the major processor vendors, including Intel, AMD, and NVIDIA, has its own methodology for implementing it, and each is advancing quickly with newer techniques. Any of those processors can host an implementation built with the SUSE tools. And for each, a container is available (or will be very shortly) based on a flash image with support for TensorFlow, an open-source framework for machine learning.
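To make the idea concrete, here is a minimal, framework-free sketch of what on-device inference can look like for a feature such as predictive maintenance. The weights, features, and threshold are purely hypothetical; in a real deployment, a model would be trained offline in TensorFlow and exported to the device, but the shape of the computation at the Edge is the same.

```python
import math

# Hypothetical pre-trained parameters for a two-feature predictive-maintenance
# model (normalized vibration level and temperature). In practice these would
# come from a TensorFlow model trained offline and deployed to the device.
WEIGHTS = [2.5, 1.8]
BIAS = -4.0

def predict_failure(vibration: float, temperature: float) -> float:
    """Return the estimated probability of imminent failure (logistic model)."""
    z = WEIGHTS[0] * vibration + WEIGHTS[1] * temperature + BIAS
    return 1.0 / (1.0 + math.exp(-z))

# Inference runs entirely on the edge device: no round trip to the Cloud.
risk = predict_failure(vibration=1.2, temperature=0.9)
if risk > 0.5:
    print("schedule maintenance")
```

Because the decision is made locally, the device can act on a high-risk reading immediately instead of waiting on a network round trip.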
It’s becoming commonplace to deploy machine learning at the Edge of the IoT, moving away from the Cloud for a variety of reasons. First, the aforementioned processor capabilities are adequate for many of the processing needs. Second, handling machine learning at the Edge eliminates the latency associated with pushing data out to the Cloud and back. In a real-time or near-real-time application, that time savings could make or break your design. For example, if you’re building a medical device, having an action occur in real time could mean life or death for a patient. And in autonomous driving applications, a round trip to the Cloud could result in a crash (a literal one).
Finally, processing the data at the Edge removes the cost of transmitting and receiving data to and from the Cloud. If cellular is your transmission medium, costs can rise quickly given the huge amount of data a machine-learning application pushes around. Generally, only the most compute-intensive applications require the massive resources available only in the Cloud.
SUSE’s Embedded Linux operating system can be built into an end application to enable a secure, flexible, and scalable system, one that’s based on a proven, time-tested OS. As IoT system designers have discovered, the OS makes it easy to develop, maintain, grow, and manage embedded Linux IoT systems across an array of platforms, applications, and industries. Its open-source nature accommodates a wide variety of devices, hardware, and appliances.
While letting developers work in an environment they’ve grown accustomed to, the SUSE Embedded Linux operating system provides immediate access to tools and training, real-time security, and full support throughout the product lifecycle, including code fixes, patches, and updates. A flexible subscription model can minimize costs and allow for product expansion.
Now you have your roadmap and your marching orders. Let machine learning guide your application and, if implemented properly, it will point you toward the next steps. All that’s left is ensuring a secure connection throughout the entire IoT system, and that’s what we’ll look at in the next blog in this series.
Marketing Manager | Embedded Systems
Jared has been a part of the embedded business unit at SUSE for three years. During his time there he has served a cross-functional role, collaborating and planning with product management, solution architects, and sales management to identify embedded applications for SUSE’s open source products and solutions. He also creates content and stories focused on the benefits of using Linux for embedded system development.