Building better benchmarks for IoT systems

By Brandon Lewis

Editor-in-Chief

Embedded Computing Design

December 02, 2016

Benchmark testing is a favorite of electronic engineers, as it provides a quick reference on performance, power consumption, latency, or other component metrics that can have a significant impact on an overall system. Component benchmarks are widely available from organizations like the Embedded Microprocessor Benchmark Consortium (EEMBC), and measure everything from the energy efficiency of microcontrollers (MCUs) and their peripherals to the performance of multicore and heterogeneous computing architectures to the latency and throughput of data center servers. The scores provided by these standardized tests offer a comparative snapshot of how devices function in a controlled setting, and can help engineering decision makers sift through numerous component options quickly and settle on a solution.

But it’s never that easy, as any good engineer will tell you, because benchmark results often only measure the performance of isolated components that aren’t operating in the context of a complex, real-world system. To be fair, the infinite variables of a given system design translate into infinite possible test vectors, making the Holy Grail of benchmarking – system-level benchmarks – impractical.

A perfect example of the complexity of system-level benchmarking is the Internet of Things (IoT). IoT deployments are sometimes referred to as "system of systems" architectures, so benchmarking an IoT device would require testing not only multiple components, but also the security, latency, and throughput of the connections between devices and data center servers. The variety of IoT deployment models and the unique requirements of different vertical markets present further challenges to developing standardized tests around IoT systems, again making benchmarking impractical.

Or is it?

“The traditional benchmark is, ‘Here’s a handful of code. Go compile the code with your favorite compiler and run it just on the processor,’” says Markus Levy, President of EEMBC. “As you get more into the system level, you find that you need to have more external influence in order to make it real-world. This is something that we’re going to do with the IoT-Gateway and IoT-Secure benchmarks, and something we have been doing with the IoT-Connect benchmark.”

The EEMBC IoT Working Group was formed in 2015 and set out to develop more holistic benchmarks for the IoT community. The initial product of its efforts was the aforementioned IoT-Connect benchmark, which provides a reliable method for determining the energy efficiency of connected systems by taking into account the power consumption of wireless interfaces such as Bluetooth, Wi-Fi, and Thread (and possibly LoRa in the future), as well as the effect of sensor inputs. As shown in Figure 1, the IoT-Connect benchmark comprises an Arduino-based I/O controller that uses SPI or I2C interfaces to simulate sensor inputs, as well as a Raspberry Pi outfitted with a connectivity shield that acts as an RF controller. Both of these devices communicate with the device under test (DUT), which is also connected to an Energy Monitor, similar to the one used in EEMBC's ULPBench test, that measures energy consumption and collects timestamp information on the various simulation activities.


[Figure 1 | The EEMBC IoT-Connect benchmark setup, in which an Arduino I/O controller, a Raspberry Pi with a Bluetooth Low Energy (BLE) shield (in this case), and an Energy Monitor are used to evaluate the performance and power consumption of an STMicroelectronics Nucleo platform.]
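To make the role of the Energy Monitor's timestamps concrete, here is a minimal, purely illustrative sketch (not EEMBC code). It assumes the monitor exports timestamped current samples and that the I/O and RF controllers log start/end times for each simulated activity; integrating the samples inside each window yields the kind of per-activity energy figure such a harness could derive:

```python
# Illustrative sketch only: the real IoT-Connect framework defines its own data
# formats. Here we assume the Energy Monitor exports (timestamp_s, current_A)
# samples at a fixed supply voltage, and the I/O and RF controllers log
# (activity, start_s, end_s) windows for each simulated event.

SUPPLY_VOLTAGE = 3.3  # assumed DUT supply rail, in volts

def energy_per_activity(samples, windows, voltage=SUPPLY_VOLTAGE):
    """Integrate energy (in joules) consumed inside each activity window.

    samples -- list of (timestamp_s, current_A), sorted by timestamp
    windows -- list of (activity_name, start_s, end_s)
    """
    results = {}
    for name, start, end in windows:
        energy_j = 0.0
        for (t0, i0), (t1, i1) in zip(samples, samples[1:]):
            if t1 <= start or t0 >= end:
                continue  # sample interval lies outside this window
            # Clip the interval to the window and integrate trapezoidally
            lo, hi = max(t0, start), min(t1, end)
            avg_current = (i0 + i1) / 2.0
            energy_j += voltage * avg_current * (hi - lo)
        results[name] = results.get(name, 0.0) + energy_j
    return results

if __name__ == "__main__":
    # Fabricated example data: a BLE transmit burst and a sensor read
    samples = [(t / 1000.0, 0.012 if 40 <= t < 60 else 0.002) for t in range(100)]
    windows = [("ble_tx", 0.040, 0.060), ("sensor_read", 0.010, 0.020)]
    for activity, joules in energy_per_activity(samples, windows).items():
        print(f"{activity}: {joules * 1e6:.1f} uJ")
```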

The IoT-Connect benchmark represents an initial framework for testing the various elements of a complex IoT system, and also laid the foundation for the two latest EEMBC IoT benchmarks: IoT-Secure and IoT-Gateway.

Developing deeper benchmarks for IoT decision makers

Both the IoT-Secure and IoT-Gateway benchmarks are currently under development, but they are significant in that they attempt to provide tools that are standardized yet flexible – all based on the notion of profiles. For example, the IoT-Secure benchmark, co-chaired by Mike Borza and Ruud Derwig of Synopsys, builds on the framework provided by IoT-Connect to quantify the impact of cryptographic algorithms such as SHA-256, AES, and ECC implemented in hardware accelerators, cryptographic co-processors, software alone, or a combination thereof. However, the IoT-Secure benchmark will also be rolled out in subsequent phases that combine these standalone functions into profiles, allowing users to calculate how combinations of technology and implementation methodology can optimize security at the system level.

“Associated with that there’s going to be some form of performance – whether it’s throughput, whether it’s latency, there’s also an energy factor, and also a memory utilization factor,” Levy says. “What we want to find out is, ‘What is the overhead associated with doing this security?’”
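As a rough, software-only illustration of the latency and throughput side of that overhead question (this is not the EEMBC harness; it simply uses Python's built-in hashlib SHA-256 as a stand-in security workload), one can compare message processing with and without a hashing step:

```python
# Illustrative overhead measurement only; the actual IoT-Secure benchmark
# defines its own workloads and targets embedded implementations. This sketch
# uses Python's built-in hashlib SHA-256 purely to show the shape of a
# "with security vs. without security" latency/throughput comparison.
import hashlib
import os
import time

PAYLOAD_SIZE = 256      # bytes per simulated sensor message
NUM_MESSAGES = 10_000   # messages per measurement run

def run(messages, secure):
    """Process messages, optionally hashing each one; return elapsed seconds."""
    start = time.perf_counter()
    for msg in messages:
        if secure:
            hashlib.sha256(msg).digest()  # stand-in for the security step
        # (a real workload would also transmit or store the message here)
    return time.perf_counter() - start

if __name__ == "__main__":
    messages = [os.urandom(PAYLOAD_SIZE) for _ in range(NUM_MESSAGES)]
    baseline = run(messages, secure=False)
    secured = run(messages, secure=True)
    total_bytes = PAYLOAD_SIZE * NUM_MESSAGES
    print(f"baseline:  {baseline * 1e6 / NUM_MESSAGES:8.2f} us/msg")
    print(f"with hash: {secured * 1e6 / NUM_MESSAGES:8.2f} us/msg")
    print(f"hash throughput: {total_bytes / secured / 1e6:.1f} MB/s")
    print(f"overhead: {(secured - baseline) * 1e6 / NUM_MESSAGES:.2f} us/msg")
```

A full harness would extend the same comparison to energy and memory utilization, the other two factors Levy mentions.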

The other, and perhaps more ambitious, of the two benchmarks is IoT-Gateway, chaired by Rory Rudolph of Dell. IoT-Gateway aims to evaluate the basic functions of a gateway, as well as its performance against the requirements of select vertical markets. Principally, IoT-Gateway will employ a distributed benchmarking architecture that measures both client-server interactions and workloads generated across multiple physical ports, providing metrics on the latency and throughput of an entire system up to a saturation value.

As mentioned, beyond these generic tests, typical workloads and usage models from various markets will be used to create additional benchmark profiles (for instance, in automation, media, and transportation) that allow gateways to be tested according to the requirements of their target industry (Figure 2).


[Figure 2 | The differing requirements of various vertical markets mean that separate, industry-specific profiles must be developed for the IoT-Gateway benchmark.]

As shown in Figure 3, the IoT-Gateway benchmark can be described at a high level as the process of connecting to a server, receiving test data from the server, performing computations on that data, and then transferring the results back. In Phase 1 of IoT-Gateway, this will be conducted using a software-based benchmark controller, typical workload generator, and execution engine that all run locally on the gateway under test (Figure 4). In Phase 2, the benchmark will evolve to run real-world gateway workloads through an external controller and benchmarking device (Figure 5).


[Figure 3 | The IoT-Gateway benchmark endeavors to test the basic functionality of gateway devices, as well as their performance based on industry requirements.]


[Figure 4 | Phase 1 of the IoT-Gateway benchmark consists entirely of software running locally on the gateway, will not use physical ports, and generates workloads typical of various industry usage models.]


[Figure 5 | Phase 2 of the IoT-Gateway benchmark will be a distributed architecture test that includes not only the gateway under test, but separate server-based workload generators and controllers.]
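As a loose sketch of the Phase 1 arrangement shown in Figure 4 (all function names here are hypothetical, not EEMBC's), a single local process can play the roles of benchmark controller, workload generator, and execution engine, reporting throughput and per-message latency for a synthetic, industry-flavored workload:

```python
# Hypothetical, simplified mock-up of the Phase 1 structure: everything runs
# locally on the gateway, no physical ports are exercised, and the "workload"
# is a synthetic stand-in for an industry usage model (here, aggregating
# JSON-encoded sensor readings, loosely "building automation"-flavored).
import json
import random
import statistics
import time

def generate_workload(num_messages=5_000, sensors=50):
    """Workload generator: synthetic JSON sensor readings."""
    return [
        json.dumps({"sensor": f"node-{random.randrange(sensors)}",
                    "temp_c": round(random.uniform(18.0, 28.0), 2),
                    "seq": i})
        for i in range(num_messages)
    ]

def execution_engine(messages):
    """Execution engine: parse each message and aggregate per-sensor averages."""
    latencies, totals, counts = [], {}, {}
    for raw in messages:
        start = time.perf_counter()
        reading = json.loads(raw)
        key = reading["sensor"]
        totals[key] = totals.get(key, 0.0) + reading["temp_c"]
        counts[key] = counts.get(key, 0) + 1
        latencies.append(time.perf_counter() - start)
    return latencies

def benchmark_controller():
    """Benchmark controller: drive the run and report latency/throughput."""
    messages = generate_workload()
    start = time.perf_counter()
    latencies = execution_engine(messages)
    elapsed = time.perf_counter() - start
    print(f"throughput: {len(messages) / elapsed:,.0f} msg/s")
    print(f"median latency: {statistics.median(latencies) * 1e6:.1f} us")

if __name__ == "__main__":
    benchmark_controller()
```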

“What we’re doing is developing a proof of concept that has communication to the cloud, and from there we’re going to develop components that will hook up to Bluetooth, for example, or other RF methods,” Levy explains. “The real challenge is developing the client-server relationship.

“With the IoT-Connect benchmark it’s a little bit simpler because you’re simulating an IoT edge node device,” Levy continues. “In the case of a gateway, you might have 50 or 100 Bluetooth connections, so it gets a little bit more complicated when you’re trying to saturate an interface. You can’t just have one BLE device talking to a gateway. Ideally you’d keep throwing devices at it, including a mixture of protocols because a gateway could be doing Wi-Fi and Bluetooth and other types of RF all at the same time with different devices.”
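Levy's saturation point can be illustrated very loosely with a plain TCP echo server standing in for the gateway and a ramp of concurrent clients (the real benchmark targets mixed BLE, Wi-Fi, and other RF traffic over physical ports; none of this is EEMBC code). Aggregate throughput climbs with the client count until it flattens out at the saturation value:

```python
# Loose illustration of the saturation idea: a local TCP echo server stands in
# for the gateway, and we ramp the number of concurrent clients while measuring
# aggregate throughput. The real IoT-Gateway benchmark targets mixed RF traffic
# (BLE, Wi-Fi, etc.) over physical ports; nothing here is EEMBC code.
import socket
import socketserver
import threading
import time

PAYLOAD = b"x" * 1024      # 1 KiB message
MESSAGES_PER_CLIENT = 500

class EchoHandler(socketserver.BaseRequestHandler):
    def handle(self):
        # Echo everything back until the client closes the connection
        while data := self.request.recv(4096):
            self.request.sendall(data)

def client_worker(port):
    with socket.create_connection(("127.0.0.1", port)) as sock:
        for _ in range(MESSAGES_PER_CLIENT):
            sock.sendall(PAYLOAD)
            remaining = len(PAYLOAD)
            while remaining:  # read back the full echo
                chunk = sock.recv(remaining)
                if not chunk:
                    return
                remaining -= len(chunk)

def measure(port, num_clients):
    """Run num_clients concurrent clients and return aggregate MB/s."""
    threads = [threading.Thread(target=client_worker, args=(port,))
               for _ in range(num_clients)]
    start = time.perf_counter()
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    elapsed = time.perf_counter() - start
    total_bytes = num_clients * MESSAGES_PER_CLIENT * len(PAYLOAD)
    return total_bytes / elapsed / 1e6

if __name__ == "__main__":
    server = socketserver.ThreadingTCPServer(("127.0.0.1", 0), EchoHandler)
    port = server.server_address[1]
    threading.Thread(target=server.serve_forever, daemon=True).start()
    for clients in (1, 2, 4, 8, 16, 32):
        print(f"{clients:3d} clients: {measure(port, clients):7.1f} MB/s")
    server.shutdown()
```

On a loopback interface the plateau reflects the host's processing limits rather than any radio, but the measurement shape – ramp the load, measure, find the knee – is the same one a distributed Phase 2 setup would automate across real interfaces and mixed protocols.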

The processor-agnostic gateway benchmark will target Linux-based operating systems (OSs) in order to facilitate a level of standardization that is applicable to the broadest range of systems.

Up to the test?

Alpha and beta versions of the IoT-Gateway and IoT-Secure benchmarks, respectively, will be available in Q1 2017, and EEMBC encourages vendors and manufacturers to join its working groups as the tests are built out. Significant work remains in the definition, development, and deployment of these system-level benchmarks, but ultimately, they are what the IoT industry needs.

“The tricky thing is that the more complex the platform you’re testing, the harder it is to isolate the bottlenecks as far as what’s causing them or what’s causing the performance benefits,” Levy concludes. “In order to make a benchmark more realistic, it has to become a system-level benchmark.”

 

Brandon Lewis, Technology Editor

Brandon is responsible for guiding content strategy, editorial direction, and community engagement across the Embedded Computing Design ecosystem. A 10-year veteran of the electronics media industry, he enjoys covering topics ranging from development kits to cybersecurity and tech business models. Brandon received a BA in English Literature from Arizona State University, where he graduated cum laude. He can be reached at [email protected].
