Is Your Wearable Design Wasting Too Much Power?
May 29, 2019
Are you minimizing power consumption? What if you had a way to also dynamically adjust the voltage level at the same time? How much more power could you be saving?
You’ve probably designed your optical-sensing wearable device to accommodate different use cases. Chances are, you’ve already built into the design a method to tune the current up and down as the device performs continuous, real-time monitoring under varying conditions, thereby ensuring the device minimizes current consumption at every given point in time. But are you really minimizing the power consumption? What if you had a way to also dynamically adjust the voltage level at the same time? How much more power could you be saving?
To illustrate the potential for additional power savings, let’s compare a couple of simple examples. To highlight the point, I will use some simple math. Your device will likely be much more complex, but the principle should be the same. All optical sensing systems are designed to operate under the most challenging, unfavorable measurement conditions – someone running and working up a sweat on a trail where there’s a mix of shade and sunshine, for example. Under these difficult conditions, the optical sensor algorithm attempting to measure a parameter such as heart rate will need to turn the LED current up to the highest rating for a better signal-to-noise ratio (SNR) and, thus, a more accurate continuous reading. Typically, the optical sensing circuit comprises an LED in series with an optical analog front end (AFE), driven from a voltage VLED. Let’s consider a system based on green LEDs where the VLED powering the LED and AFE chain is fixed at 5V. At the highest current rating (100mA), let’s imagine that the forward voltage drop across the LED is 4V, which leaves a 1V drop across the optical AFE.
- VLED = 5V
- VF = 4V at 100mA
- VDRV = 1V
Now, let’s consider what happens in this system when the measurement condition turns favorable, such as when that same person is working at a desk inside an office or even sleeping. In favorable conditions, the algorithm will reduce the optical sensor current down considerably. At the lower current (5mA, say) the forward voltage drop across the LED will drop to 3V, which leaves 2V across the optical AFE.
- VLED = 5V
- VF = 3V at 5mA
- VDRV = 2V
However, at the lower current, the voltage required across that AFE to operate correctly (its compliance voltage) drops. But, because of the fixed 5V on VLED, VDRV is actually higher than under the unfavorable conditions. If, for example, the compliance voltage of the AFE is VDRV_COM = 0.16V, we can see that we have an excess of 1.84V across the AFE. That effectively means the system is dissipating 1.84V x 5mA = 9.2mW more power than actually needed.
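The arithmetic above can be sketched in a few lines of Python, using the example values from the text (the numbers are illustrative, not from any particular datasheet):

```python
# Excess power dissipated in the AFE under favorable conditions.
VLED = 5.0        # fixed supply rail, volts
VF = 3.0          # LED forward voltage at 5 mA, volts
I_LED = 0.005     # LED current, amps (5 mA)
VDRV_COM = 0.16   # AFE compliance voltage, volts

VDRV = VLED - VF              # voltage actually across the AFE (2 V)
excess_v = VDRV - VDRV_COM    # headroom beyond what the AFE needs
excess_p = excess_v * I_LED   # wasted power, watts

print(f"Excess voltage: {excess_v:.2f} V")         # 1.84 V
print(f"Wasted power:   {excess_p * 1e3:.2f} mW")  # 9.20 mW
```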
In fact, any system which uses a fixed VLED architecture that needs to be scaled for unfavorable conditions ends up burning too much power under favorable conditions. If, however, VLED were set by a regulator with dynamic voltage scaling (DVS), which could be adjusted up or down in tandem with the current settings, power consumption could be minimized. In effect, the system could adjust VLED to minimize, with some appropriate headroom, the following expression for every current setting.
VLED – (VF + VDRV_COM)
Even with a simple look-up table method, setting the voltage appropriate to certain current ranges could effect considerable power savings, thereby extending the battery life of the device.
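A look-up table scheme like the one just described might look something like the sketch below. The current thresholds and VLED settings are hypothetical placeholders; a real design would derive them from the LED's forward-voltage curve plus the AFE compliance voltage and some headroom.

```python
# Minimal sketch of a look-up-table DVS scheme: map LED current
# ranges to the lowest VLED that still covers VF + VDRV_COM plus
# headroom. All values here are illustrative.

# (max current in mA, VLED setting in V) — ascending current
VLED_TABLE = [
    (10,  3.3),   # favorable conditions: low current, low VF
    (50,  4.1),
    (100, 5.0),   # unfavorable conditions: full current
]

def select_vled(i_led_ma: float) -> float:
    """Return the VLED setting for the requested LED current."""
    for i_max, vled in VLED_TABLE:
        if i_led_ma <= i_max:
            return vled
    return VLED_TABLE[-1][1]  # clamp to the highest setting

print(select_vled(5))    # 3.3 — desk work or sleep
print(select_vled(100))  # 5.0 — running in mixed sun and shade
```

Whenever the sensing algorithm changes the LED current setting, it would also write the corresponding VLED value to the DVS-capable regulator.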
Extending the Battery Life in Your Wearables
So how much power could be saved using DVS? Considering that most wearable device users will only be in unfavorable measurement conditions for a small portion of a 24-hour measurement cycle, it should be significant. Precisely how much will vary, of course, from user to user, given different blood perfusion levels, skin types, and use cases.
As a simple illustration, let’s consider a typical scenario:
- LED pulse current = 38.9mA
- Pulse width = 117µs at 100sps, giving a 1.17% duty cycle
- Buck-boost efficiency = 95%
Now, let’s assume that these parameters apply for four hours in a 24-hour cycle. If we were able to achieve a DVS savings at 1V, we can calculate the battery savings as follows:
1V x 38.9mA x 1.17% x 95% x 4 hours = 1.73mWHr per 24-hour day
For a 100mAHr battery (370mWHr), that would represent a power saving of 4.67% over 10 days of use. And this analysis doesn’t consider any savings contribution over the other 20 hours in the 24-hour cycle. So the potential savings offered by DVS are considerable.
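The back-of-envelope arithmetic above can be reproduced directly, using the article's example figures:

```python
# Battery-savings estimate for the DVS scenario in the text.
dvs_saving_v = 1.0       # volts recovered by DVS
i_pulse_ma = 38.9        # LED pulse current, mA
duty = 117e-6 * 100      # 117 µs pulses at 100 sps -> 1.17%
efficiency = 0.95        # buck-boost efficiency
hours = 4                # active hours per 24-hour cycle

saved_mwh_per_day = dvs_saving_v * i_pulse_ma * duty * efficiency * hours
battery_mwh = 100 * 3.7  # 100 mAHr at a nominal 3.7 V = 370 mWHr

pct_over_10_days = saved_mwh_per_day * 10 / battery_mwh * 100
print(f"{saved_mwh_per_day:.2f} mWHr per day")   # 1.73
print(f"{pct_over_10_days:.2f}% over 10 days")   # 4.67
```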
Other Power-Saving Possibilities with DVS
The example above focused on the typical approach to optical sensing on wearables using green LEDs. However, many wearable designs now incorporate LEDs of other wavelengths (infrared and red) into their detection schemes. These LEDs require a lower forward voltage, at any given current or measurement condition, than their green counterparts. Here again, DVS could be used to adjust the voltage down when the infrared LED sensor is conducting measurements and up again for the green LEDs. Similar to the previous example, we are trimming the voltage appropriate to the measurement being made rather than sticking with the highest voltage that fits all scenarios. Exploiting the forward voltage variations for different LEDs can yield power savings in a similar manner to the conditions described above. Both techniques can be used in parallel to maximize the savings.
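Extending the same idea across wavelengths might look like the following sketch. The forward voltages are rough, illustrative figures; actual values depend on the specific LEDs and their drive current.

```python
# Hedged sketch: per-wavelength VLED selection. VF figures below are
# approximate placeholders, not datasheet values.
VDRV_COM = 0.16  # AFE compliance voltage, V (example from the text)
HEADROOM = 0.1   # V, illustrative safety margin

VF_TYPICAL = {   # approximate forward voltage by LED color, volts
    "green": 3.2,
    "red": 2.0,
    "infrared": 1.5,
}

def vled_for(wavelength: str) -> float:
    """Minimum VLED for a given LED color, with headroom."""
    return VF_TYPICAL[wavelength] + VDRV_COM + HEADROOM

for color in ("green", "red", "infrared"):
    print(f"{color}: {vled_for(color):.2f} V")
```

As each wavelength takes its turn in the measurement sequence, the regulator would be stepped to the matching VLED rather than held at the green LED's worst-case voltage.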
Maxim offers two options that provide an ultra-low quiescent current buck-boost architecture featuring DVS in two scenarios. The first is part of a power-management IC (MAX20345), which also includes other resources necessary to create a wearable system (charger, bucks, and LDOs). The second is a standalone version of the same buck-boost. Whichever option you choose, we believe substantial power savings, largely untapped to date, are there for the taking.