The Embedded Muse
Issue Number 418, March 15, 2021
Copyright 2021 The Ganssle Group

Editor: Jack Ganssle, jack@ganssle.com


You may redistribute this newsletter for non-commercial purposes. For commercial use contact jack@ganssle.com. To subscribe or unsubscribe go here or drop Jack an email.

Contents
  • Editor's Notes
  • Quotes and Thoughts
  • Tools and Tips
  • Freebies and Discounts
  • Frustrating Off-The-Shelf Software, and How Much We Need It... Or Do We?
  • The Pandemic and Us
  • R&D? No Such Thing
  • Review of the MAGICDAQ
  • Failure of the Week
  • This Week's Cool Product
  • Jobs!
  • Joke For The Week
  • About The Embedded Muse

Editor's Notes


Tip for sending me email: My email filters are super aggressive and I no longer look at the spam mailbox. If you include the phrase "embedded muse" in the subject line your email will wend its weighty way to me.

Quotes and Thoughts

Measure what is measurable, and make measurable what is not. Galileo Galilei

Tools and Tips

Please submit clever ideas or thoughts about tools, techniques and resources you love or hate. Here are the tool reviews submitted in the past.

Caron Williams had two more book suggestions:

Two oldies but goodies: Brooks' Mythical Man Month and A M Lister's Fundamentals of Operating Systems. I wish someone at Microsoft had read that prior to starting Windows...

Freebies and Discounts

Courtesy of the folks at MAGICDAQ, March's giveaway is one of their Python-powered USB DAQs and a compatible hardware testing module. See the review later in this issue.

Muse readers can get a 10% discount (valid till the end of June) on these if they send an email to support@magicdaq.com and mention the Embedded Muse.

Enter via this link.

Frustrating Off-The-Shelf Software, and How Much We Need It... Or Do We?

A reader wrote to relate a story of how they found both Integrity and VxWorks to be unusable in their application due to bugs... bugs the vendors were not interested in fixing. They switched to Linux and happily shipped a half-million units. Have you had similar experiences using software, like OSes, from vendors? In particular:

  1. Did you find any bugs in the OS (not the tools - although those can be maddening too)?
  2. How were the bugs handled by the supplier?
  3. How did you demonstrate the bug to the OS developer?
  4. Were you able to get a workaround? (In this reader's case they had to abandon an important feature.)

And, to add some spice to this, Daniel Way wrote about the problems we create for ourselves by not using off-the-shelf components:

I enjoyed reader James Fowkes' comments contrasting software development with civil engineering and have some feedback of my own to add to the conversation.

[Y]ou are building a bridge over a river. You know the weights and strengths of materials, you have weather data for the area, you have knowledge of tides and currents, you have a specification for static and dynamic loads, etc, etc. Since all these components obey physical laws, you can have a high degree of confidence in how they interact, how changes in one affect another, etc. It's complicated, but it's logical.

In software, you simply cannot make these mathematical, grounded-in-physical-law assumptions.

I contend, however, that software is just as grounded in math and logic as mechanical and civil engineering are in physics. Software engineering is an engineering discipline, but it hasn't had the centuries or millennia that other engineering disciplines have had to develop. To catch up with its peers, one specific deficiency software engineering must address is leveraging reusable, off-the-shelf components.

As an example: If my company builds hydraulic pistons, we probably won't (can't) make the internal seals as well. Owning that additional equipment is expensive and doesn't provide a competitive advantage. Companies which specialize in seals may be able to innovate and sell us a better-quality seal at a lower price.

Too much software is bespoke. Writing your own RTOS kernel may not have the same upfront cost as new manufacturing equipment, though it may cost more to maintain long-term. Development and maintenance costs can be spread across multiple customers when the software is reused, and improvements to the code benefit all customers and multiply the software's value.

Why is software reuse such a hurdle when the lessons of specialization in other disciplines are clear? Is it fear, uncertainty, and doubt about quality; not-invented-here syndrome; fragmentation of languages and toolchains; or simply the difficulty of integrating third-party dependencies into a project?

Then there's this report which shows how most routers are running old, unsupported versions of Linux with unpatched security holes. It re-raises an old question: how does one support products based on complex and well-known software components? No one can support a product forever, so is using these packages inherently making your customers vulnerable?

The Pandemic and Us

In Muse 397 last May I reported on a survey of Muse readers about how they were faring in the SARS-CoV-2 pandemic. At that point 2% of us had lost our jobs. Another 2% were not working and not being paid. Only 1% were not working but continued to draw a salary. 13% were still going to the office every day, which left a whopping 82% of us working from home.

How about now, a year into the pandemic? Specifically:

  • Are you working from home?
  • Working in the office?
  • Some mix of the two?
  • Lost your job?

A year from now, what do you expect your work life to look like?

Are you happy with your current work environment?

Any other comments?

Email me and I'll report the results.

R&D? No Such Thing

I was reviewing some code not long ago and had the not-so-wonderful privilege of digging through a change log consisting of hundreds of entries for a single module. This was a state machine, and a rather complex beast at that. All of the device's actions were sequenced through the state machine, and each entry consisted of the usual function pointers as well as dozens of constants that defined delays, steppings, and the like.
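
For readers who haven't built one, here's a minimal sketch of the sort of table-driven state machine I'm describing. It's illustrative only - the names and constants are invented, not taken from the code I reviewed - and it's in Python to keep it short:

import time
from dataclasses import dataclass
from typing import Callable

@dataclass
class StateEntry:
    action: Callable[[], None]   # work performed when the state runs
    delay_ms: int                # dwell time before moving to the next state
    step_count: int              # e.g., motor steps for this state (unused in this toy driver)
    next_state: str              # unconditional next state, to keep the sketch short

def home_axis():
    print("homing axis")

def dispense():
    print("dispensing")

# Every constant in a table like this should trace back to a requirement,
# a datasheet, or a measurement - not to trial-and-error tweaking.
STATE_TABLE = {
    "HOME":     StateEntry(home_axis, delay_ms=50,  step_count=0,   next_state="DISPENSE"),
    "DISPENSE": StateEntry(dispense,  delay_ms=120, step_count=400, next_state="HOME"),
}

def run(state, cycles):
    for _ in range(cycles):
        entry = STATE_TABLE[state]
        entry.action()
        time.sleep(entry.delay_ms / 1000)
        state = entry.next_state

run("HOME", cycles=2)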

The final, shipped, state table looked reasonable, though I wondered where the various constants came from. Digging through the history made it pretty clear: absolute chaos. The designers had no idea what they were doing. Updates to those constants were sometimes made a half-dozen times a day. Like high-tech prestidigitators, the developers were fumbling through changes and apparently just looking to see what happened with each one. The notion of a design just did not exist. I couldn't help but wonder how well the final version worked, as there seemed to be no science behind the state table entries.

This was one of too many examples I have seen of a classic project failure mode: poor science. The developers are thrust into building something before it's absolutely clear how things should work.

Now, sometimes that's unavoidable. But it's always a schedule killer. And when a product goes out the door tuned to perfection for reasons no one understands, it's usually a perfect example of chaos theory. Fragility. A butterfly flapping its wings in Brazil might cause a complete meltdown.

The better part of a half century ago we were building a device which required calibration to known standards. The cal process wound up determining coefficients of a polynomial; that polynomial was then used to solve for measured parameters. We discovered, as oh-too-many had before us, that with enough terms over a limited range you can get a polynomial to solve for pretty much anything. And with pretty much zero accuracy. A perfect calibration could be so fragile that a sample microscopically outside the calibration range gave wild results.
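
To see just how fragile such a fit can be, here's a little illustrative sketch in Python/NumPy (my own toy example, not the original instrument's code). A 9th-order polynomial fit to ten slightly noisy calibration points looks fine inside the fitted range, then falls apart just outside it:

import numpy as np

rng = np.random.default_rng(0)

# Ten "calibration" points over a narrow range, with a little measurement noise;
# the true behavior is just a straight line.
x = np.linspace(0.0, 1.0, 10)
y = 2.0 * x + 0.5 + rng.normal(0, 0.01, x.size)

# A 9th-order polynomial threads through every noisy point
coeffs = np.polyfit(x, y, deg=9)

print("inside the range, x=0.50:", np.polyval(coeffs, 0.50))  # close to the true value of 1.5
print("a hair outside,   x=1.05:", np.polyval(coeffs, 1.05))  # already drifting off the line
print("well outside,     x=1.50:", np.polyval(coeffs, 1.50))  # wildly wrong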

When the science or technology behind a product is not well understood you can expect one of two results (maybe both): huge schedule delays, and/or a device that just does not work very well.

We routinely use a term that should not exist: R&D. There's no such thing. There's R (research), which cannot be scheduled. It's a dive into the unknown, it's where we make mistakes and learn things. Finding the cure for cancer (schedule that!). Then there's D (development). D can be scheduled. We have a pretty darn good idea of what we're going to build and how it will work. Maybe not perfect, but there's more clarity than fog.

Conflating R and D is mixing development with unknown science, and is one of my "Top Ten" reasons for project failures.

Review of the MAGICDAQ

One of the hassles I struggle with is sampling and generating signals from my PC to different circuits on the bench. I generally wind up using a single-board computer's analog and digital inputs. This used to be simple, using mbed.com's once-nifty web compiler. Alas, that has become so complexified I've abandoned it and now use regular IDEs. That's a less than ideal solution when you want to log data to a disk file.

The MAGICDAQ is a nice alternative (it's also this month's giveaway). It's a general data acquisition device seemingly targeted at benchtop automation and testing. As the picture shows it lives in a uniquely-shaped box which seems designed to be mounted in a test stand. Or, one could use it just lying on the bench.

The specs tell most of the story:

  • 8 analog input channels (or 4 differential)
    • 14-bit ADC
    • +/-10 V input range, though it can withstand -14 V to +24 V
    • 48 k samples/second
  • 8 digital channels which can be inputs or outputs
    • 5 V max output, diode protected
  • 2 analog output channels
    • 0 to 5 V
    • 12-bit DAC
    • 31.25 kHz max frequency
  • 1 counter or PWM output channel
    • 5 MHz max frequency of pulses being counted
    • 3.3 V output
    • 100 kHz max PWM frequency
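
As a quick back-of-the-envelope check of what 14 bits buys you over that +/-10 V span (my arithmetic, not a figure from the datasheet):

# 14-bit ADC spanning a 20 V (-10 V to +10 V) input range
span_volts = 20.0
counts = 2 ** 14
print(span_volts / counts * 1000, "mV per LSB")   # about 1.22 mV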

Like so much test equipment, the MAGICDAQ is controlled from a host computer over USB. It's electrically isolated from the USB, a very nice feature.

As the picture shows, connections are via screw terminals. That's nice in a lab environment where you might be connecting all sorts of random circuits to it.

It is controlled via Python programs, and the provided library makes this very simple. For instance, to read an analog input:

# Import MagicDAQDevice object
from magicdaq.api_class import MagicDAQDevice

# Create daq_one object
daq_one = MagicDAQDevice()

# Connect to the MagicDAQ
daq_one.open_daq_device()

# Single ended analog input voltage measurement on pin AI0
pin_0_voltage = daq_one.read_analog_input(0)

The upside to a Python interface is that it's largely platform independent, and you can make the thing do pretty much anything. I do, though, wish it had a Windows GUI interface for routine tasks when one doesn't want to write code.
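
Since I mentioned wanting to log data to a disk file, here's a quick sketch of how one might do that with the same read_analog_input() call shown above plus Python's standard library. It's my own example, not one of the vendor's scripts, and the file name and sample rate are arbitrary:

import csv
import time

from magicdaq.api_class import MagicDAQDevice

daq_one = MagicDAQDevice()
daq_one.open_daq_device()

# Log one reading per second from pin AI0 for a minute
with open("ai0_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["elapsed_seconds", "volts"])
    start = time.time()
    for _ in range(60):
        writer.writerow([round(time.time() - start, 3), daq_one.read_analog_input(0)])
        time.sleep(1.0)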

The vendor provides plenty of sample scripts, making the learning curve pretty short.

All in all, a very nice device at an attractive price point ($185 USD).

Also available is an expansion board ($235) which connects to the MAGICDAQ via a ribbon cable. That provides:

  • 3 general current measurement circuits (5 A max)
  • A low-current measurement channel with a range down to +/-100 uA
  • 4 temperature channels with probes (-55 C to +125 C)
  • 4 relays that can each handle up to 7 A
  • A variable power supply (1 to 10 V, 2 A max)

More info here.

Failure of the Week

From Scott Rosenthal:

This Week's Cool Product

One bottleneck we sometimes face is getting stuff from memory at a rapid rate. GSI Technology has a novel solution: move processors into the memory array. Their Gemini "In-Place Associative Computing" device places millions of processors right in the SRAM array. Now, these are pretty brain-dead CPUs, capable of only simple operations. But the company claims that for some applications you can expect a two-orders-of-magnitude speedup over a conventional CPU-to-RAM configuration.

Note: This section is about something I personally find cool, interesting or important and want to pass along to readers. It is not influenced by vendors.

Jobs!

Let me know if you’re hiring embedded engineers. No recruiters please, and I reserve the right to edit ads to fit the format and intent of this newsletter. Please keep it to 100 words. There is no charge for a job ad.

Joke For The Week

These jokes are archived here.

In honor of yesterday being pi day:

3.14% of sailors are pi-rates.

Never talk to pi. He'll go on forever.

Come to the nerd side. We have pi.

Simple as 3.141592…

The roundest knight at King Arthur's table was Sir Cumference. He ate too much pi.

The worst thing about getting hit in the face with pi is that it never ends.

What do you get when you take green cheese and divide its circumference by its diameter? Moon pi.

What was Sir Isaac Newton's favorite dessert? Apple pi.

What is the official animal of Pi Day? The pi-thon.

A pizza has a radius z and thickness a. Its volume is pizza (or pi*z*z*a)

What is a math teacher's favorite dessert? Pi!

The mathematician says, "Pi r squared." The baker replies, "No, pies are round. Cakes are square."

Just saw American Pi. I gave it a rating of 3.14.

In Alaska, where temperatures get below freezing, pi is only 3.00. After all, everything shrinks in the cold.

What do you get when you take the sun and divide its circumference by its diameter? Pi in the sky.

How many pastry chefs does it take to make a pie? 3.14.

What do you get when you cut a jack-o'-lantern by its diameter? Pumpkin pi.

What is 1.57? Half a pie.

What is the ideal number of pieces to cut a pie into? 3.14.

How many calories are there in that slice of chocolate pi? Approximately 3.14.

About The Embedded Muse

The Embedded Muse is Jack Ganssle's newsletter. Send complaints, comments, and contributions to me at jack@ganssle.com.

The Embedded Muse is supported by The Ganssle Group, whose mission is to help embedded folks get better products to market faster, with steps you can take now to improve firmware quality and decrease development time.