RTOS Containers for the Intelligent Edge
May 24, 2021
Applications built on top of Real-Time Operating Systems (RTOS) are being incorporated into larger and more complex environments every day. A fighter jet, for example, contains many different computer systems, each of which may be running a different operating system.
The question is, how do we accelerate the deployment of software on such large systems? How do we make the deployment process uniform so that there’s no workflow change from one subsystem to another?
Container Technology Energized Edge Computing
The answer is a containerized local infrastructure — or edge cloud — in the plane, car, or factory, ready to serve the software for the various subsystems. That edge cloud connects in turn to another cloud, which allows you to push information and software updates to manage and orchestrate your heterogeneous software subsystems.
In an article for ARC Advisory Group titled “Industrial Edge Containers,” Harry Forbes noted, “Thinking about these new capabilities and how they might be used, it seems to me that in the longer term, today’s sharp border between embedded systems and edge computing will become much blurrier. Indeed, today’s real-time embedded applications may eventually become a special case within a broader set of edge applications that are containerized and orchestrated very much the way cloud apps are deployed today.”
Open Container Initiative (OCI)–compliant containers can use the same cloud infrastructure, the same tooling, and the same workflow as any other application in a more traditional IT environment. With container support, the RTOS world becomes more accessible to modern application development, IT methods, and DevOps practices.
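That uniformity comes from the OCI image specification itself: the same image configuration format describes a workload whether it targets a cloud server or an embedded board. The sketch below is a minimal OCI image config; the entrypoint, environment variable, and layer digest are illustrative placeholders, not a real image:

```json
{
  "architecture": "arm64",
  "os": "linux",
  "config": {
    "Entrypoint": ["/usr/bin/rtos-telemetry"],
    "Env": ["LOG_LEVEL=info"]
  },
  "rootfs": {
    "type": "layers",
    "diff_ids": [
      "sha256:0000000000000000000000000000000000000000000000000000000000000000"
    ]
  }
}
```

Because this artifact format is standardized, the same registries, CI pipelines, and orchestration tooling can handle an embedded workload and a cloud workload alike.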
Use Cases to Resolve Challenges at the Edge
Computing solutions suited to the intelligent edge are in growing demand as more and more applications require low-latency, high-bandwidth performance to meet their design requirements. To deliver that performance, the intelligent edge relies on several emerging technologies, including 5G networking, artificial intelligence and machine learning, IoT and mobility advances, and — increasingly — container technology.
One good example of this trend: Avionics systems have evolved from fundamentally hardware-based solutions to agile, highly upgradable, software-defined infrastructures, enabling new technologies to be incorporated into systems post-deployment, without substantial hardware replacements. Software container technology promises to be an effective means of countering cybersecurity threats through quick updates and patches, delivering benefits to both the commercial and aerospace/defense sectors.
Other use cases in which containers can help resolve challenges at the edge include:
- Manufacturing operations and industrial robotics: AI-based automation is bolstered by compact, low-power installations that require the mission-critical reliability delivered by an embedded RTOS.
- Innovative healthcare delivery: Remote patient care, health monitoring systems, counseling, and other healthcare practices that have helped medical organizations deal with the COVID-19 pandemic can benefit from the security and agility of container technology.
- Autonomous vehicles and smart city operations: Lightweight, low-power operations are a vital factor in many embedded use cases involving AI-controlled vehicles, communication between vehicles, traffic flow monitoring, advanced driver assist systems (ADAS), and citywide warning and alert systems.
- Retail customer personalization and communication: Automated information kiosks, personalized signage displays, rich media product demonstrations, and online ordering systems can tap into the flexibility and power of container technology implemented at the intelligent edge.
The benefits of containers for embedded applications, however, can only be fully realized if the security aspects of the technology are well understood and necessary protections are integrated into solutions as part of a DevSecOps process.
Ensuring Container Security
Security is a vital issue in any type of software deployment, and if container technology is to become successful in environments that call for heightened security — such as aerospace and defense, automotive applications, energy grids and subsystems, robotics implementations, and so on — extra measures for hardening solutions are needed.
Cloud-native, open-source registries typically provide a layer of security when using containers. For example, Harbor employs policies and role-based access control to secure container components. Each container image is scanned to ensure that it is free of known vulnerabilities and then signed as trusted before distribution. For sensitive, mission-critical deployments, Harbor provides a level of assurance when moving containers across cloud-native compute platforms.
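The trust model that registries such as Harbor build on rests on OCI content addressing: every image layer and config blob is referenced by a cryptographic digest, so any client can independently recompute the hash and detect tampering. A minimal sketch of that verification step (the blob and helper function here are illustrative, not Harbor's API):

```python
import hashlib

def verify_blob(blob: bytes, expected_digest: str) -> bool:
    """Check an OCI blob against its content-addressable digest.

    OCI digests take the form "<algorithm>:<hex>", e.g. "sha256:ab12...".
    Registries and clients recompute this hash to detect tampering.
    """
    algorithm, _, expected_hex = expected_digest.partition(":")
    if algorithm != "sha256":
        raise ValueError(f"unsupported digest algorithm: {algorithm}")
    return hashlib.sha256(blob).hexdigest() == expected_hex

# Hypothetical config blob and the digest a registry would store for it.
blob = b'{"architecture":"arm64","os":"linux"}'
digest = "sha256:" + hashlib.sha256(blob).hexdigest()

print(verify_blob(blob, digest))         # True: content matches digest
print(verify_blob(blob + b" ", digest))  # False: any change breaks the hash
```

Image signing adds a second layer on top of this: a signature binds a trusted identity to the digest, so a verified digest plus a verified signature together establish that the image is both untampered and from an approved publisher.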
Following DevSecOps software development best practices is one of the most effective means of protecting container security. The Department of Defense has published the Container Hardening Guide (October 2020), which outlines DevSecOps processes that are important for guarding against security breaches.
Expectations for embedded systems cascade down to the operating systems that power them. Real-time operating systems must keep pace with innovation and embrace modern development practices. This means being compatible with the frameworks, languages, and methodologies embraced by the new generation of embedded system developers, while allowing no compromise in terms of security, safety, performance, or reliability.
As the complexity of applications and their supporting infrastructures creates new potential attack vectors for increasingly sophisticated hackers to exploit, containers in embedded systems offer a means of responsive, secure application delivery to the intelligent edge. With these capabilities, aerospace and defense organizations, energy providers, large-scale manufacturers, and medical organizations can take advantage of low-latency, high-bandwidth performance for the most challenging applications.
Michel Chabroux is responsible for the Product Management team driving technology and business strategies for Wind River’s runtime environments, including the VxWorks and Wind River Linux families of products. He has more than 20 years of industry experience including roles in technical sales, support, training, and product management. Prior to joining Wind River, he was a consultant in Business Management and Information Systems working with a variety of clients. He holds a Master’s degree in Computer Science Applied to Business Administration from Université de Lorraine.