Jack Ganssle's Blog

This is Jack's outlet for thoughts about designing and programming embedded systems. It's a complement to my bi-weekly newsletter, The Embedded Muse. Contact me at jack@ganssle.com. I'm an old-timer engineer who still finds the field endlessly fascinating (bio).

Low-Power Mischief

September 17, 2018

The latest issue of IEEE Embedded Systems Letters (full disclosure: I'm on the advisory board) has an article titled The IoT Energy Challenge: A Software Perspective. It's not particularly useful for practicing engineers, but does raise some worthwhile points.

The authors correctly state that the structure of the software inside a low-power IoT device is hugely important in determining the energy needs of such a product. They cite a study claiming that the way the code is implemented can account for 80% of the power used. I have no idea how accurate that claim is, but it strikes me as high. Maybe the biggest factor is how much time the device spends sleeping, which is probably mostly application dependent. That is, a system that only needs to wake up and take data once an hour is inherently more frugal than one that has to be less somnambulant.

The article surveys the state of the art in measuring energy needs of IoT systems and notes that today it's basically impossible to understand how much energy is needed.

And yet… the market has addressed this, albeit in imperfect ways. I have reviewed a number of these tools here. No doubt more are available. Segger's and IAR's JTAG probes do a great job of measuring current consumed, and can correlate it to the executing code. There are a lot of other tools, mostly pretty inexpensive, that measure current without syncing it to the executing code; these are useful as well. Note well that in selecting these instruments it's critical to consider their bandwidth, as typical IoT systems have quickly-changing energy needs.

The article does mention using an oscilloscope, but in my opinion that is about the worst possible way to monitor current in a low-power IoT device. While sleeping the system might consume microamps or less; when doing wireless comm, perhaps a hundred mA. Scopes just don't have the dynamic range to provide any sort of accuracy over the µA-to-many-mA span. There is an exception to this: ee-quipment's Real-Time Current Monitor converts the voltage drop across a resistor from linear to log, overcoming the dynamic-range problem. I tried to explain this in the article at that link, but it seems I didn't do a good job, as I get a lot of email from people who don't "get" the value of measuring the log of the current.

So we do have options, and I highly recommend that anyone building something that has to run for a long time from a battery investigate these products.

But there is another issue which the Embedded Systems Letters article vaguely refers to: predicting a system's energy needs. This is akin to figuring worst-case execution time. That's easy enough to measure, and it is important to measure it! But that's reactive. The code has been written, more or less works, and only then do you discover that the system is too slow or needs too much power. The boss is yelling about shipping and you just found that the promised two-year battery life will be three months.

I've chastised compiler vendors for years that they don't give us a clue how fast the code runs. Not even vague guidelines. In the assembly-language days the assembler would print the number of T-states each instruction took. One could, admittedly tediously, get a pretty good idea how long things would take. The vendors tell me they can't do the same for C because of a lot of reasons that just seem to skirt the issue. Even just a sense of WCET would be better than the current situation, where we're coding into a black hole, hoping for divine intervention that things won't be too slow, and when they are, making sometimes random changes to hopefully save a few microseconds.

I have no idea how tools could predict power needs as this is such a complex issue. But until a solution comes along all we can do is hope for the best, take some measurements, and curse our foul luck when the coulomb count is too high.

(For more on ultra-low power design see this.)

Feel free to email me with comments.

