On Accurate Tools and Calibration

I'm a sucker for tools that measure things, especially when they offer a great deal of precision. Or, better, accuracy: accuracy is how closely a measurement conforms to reality, while precision is a measure of repeatability. Most of us, myself included, carelessly conflate these terms in casual conversation.

Perhaps one of the more incredible measuring tools we all carry around is a smartphone GPS. In 1991 a friend and I sailed to England with an early Magellan unit. Big, costly, and clunky, it burned through AAs at an amazing rate, so it had to be left off except when we needed a position. The slow MCUs of the day required 20 minutes to obtain and reduce a position; I used to race it with the sextant. Today it's amazing that a tiny bit of electronics can suck data from spacecraft 20,000 km above the earth and almost instantly compute a position accurate to a few meters. What Captain Cook would have given for such technology! (I bet he would have liked decent charts even better.)

One of my favorite measuring tools in the woodworking shop is this 6" scale:

6" scale and calipers.

These were available at Home Depot for $2 each. I bought two and now wish I had purchased more, for Home Depot, in the inevitable cheapening of things, has replaced them with Chinese crap.

One side is divided into 32nds, the other into 64ths. The calipers confirm as much accuracy as I can discern with aging eyes. But are the calipers accurate? According to a precision calibration block they're within about 2 thousandths of an inch (0.2%) at the one-inch setting. That's more than good enough for my woodworking.

But how accurate is the cal block? It was sold as being good to one ten-thousandth, but that was years ago. And what about its tempco? Presumably the variation is low at room temperature, but who knows? We all rely on some sort of faith to guide our philosophical underpinnings; mine is in that cal block, but it could be that it isn't the One True Religion after all.

One neat little item is this $30 Wixey tilt-box, which measures angles.

Wixey tilt box.

MEMS accelerometers are so cheap today that a lot of these sorts of devices are available. A rare-earth magnet in the bottom of the unit sticks it to whatever is being measured; in this photo it's measuring the angle of a table-saw blade. At $30 it has a claimed accuracy of +/- 0.2 degrees. I've checked it against my best square, which is supposed to be within 0.0001" over 6". But is the square really that good?

I have to rely on faith. But these accuracies are good enough for woodworking.

The tilt-box can't measure absolute angles; every reading is referenced to the position at the moment one presses the "zero" button. So I put it on the saw's table, zero it, and then mount it on the blade.
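Under the hood it's most likely doing something like the following. This is a minimal sketch, not Wixey's actual firmware: I'm assuming a two-axis MEMS accelerometer with hypothetical accel_read_x()/accel_read_y() driver calls that return acceleration in g. Tilt falls out of the arctangent of the two axes, and the "zero" button just records an offset:

#include <math.h>

/* Hypothetical driver calls: each returns one axis of acceleration, in g. */
extern float accel_read_x(void);
extern float accel_read_y(void);

static float zero_offset_deg;

/* Tilt relative to gravity, in degrees. */
static float tilt_deg(void)
{
    return atan2f(accel_read_x(), accel_read_y()) * (180.0f / 3.14159265f);
}

/* The "zero" button: remember the current tilt as the reference. */
void on_zero_pressed(void)
{
    zero_offset_deg = tilt_deg();
}

/* What the display shows: tilt relative to the last zero. */
float displayed_angle_deg(void)
{
    return tilt_deg() - zero_offset_deg;
}

Referencing everything to a user-set zero neatly sidesteps mounting error: any fixed offset in how the magnet seats cancels out in the subtraction.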

The darn thing eats coin cells. After six months the CR2032 is dead, though I probably use it for a total of ten minutes in that time. This puzzle got me interested in embedded systems that have to work for a long time from a battery, which resulted in this study. The unit pulls 10 µA when "off" – the MCU is probably in some inefficient sleep mode. Interestingly, those $4 digital thermometers that live for years in a medicine cabinet yet spring to life when needed draw about a single nanoamp when off. "Off" means different things to different engineers.
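A little arithmetic shows why those currents matter. This sketch simply divides a CR2032's capacity (I'm assuming the typical 225 mAh datasheet figure) by the average "off" drain:

#include <stdio.h>

/* Rough battery-life estimate: capacity / average current.
   225 mAh is a typical CR2032 datasheet figure; real life is worse
   once wake-up peaks and self-discharge are counted. */
int main(void)
{
    const double capacity_mAh = 225.0;
    const double off_current_uA[] = { 10.0, 0.001 };  /* tilt-box vs. thermometer */

    for (int i = 0; i < 2; i++) {
        double hours = capacity_mAh / (off_current_uA[i] / 1000.0);
        printf("%6.3f uA \"off\" -> %.0f hours (about %.1f years)\n",
               off_current_uA[i], hours, hours / 8766.0);
    }
    return 0;
}

At 10 µA the cell should, in theory, last a couple of years, so a six-month life suggests the average drain – wake-ups, the display, current peaks – is well above the quiescent figure. At a nanoamp, the cell's own self-discharge becomes the limit.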

How about tape measures? Here are two:

Two tape measures.

One is a ¾" Dewalt, the other a ½" Stanley. At 7 feet they disagree by 1/16". Which one is correct? An old saw states that a man with one clock knows what time it is; one with two has no idea. That seems to be the case with these two tapes. We're building a 30'-long barn, so the total difference over that run works out to about a quarter inch. Horrors! But the truth is I screwed it up and there's over an inch of skew in the building. Entirely my fault, though it would be nice to blame the tools.

In the electronics lab we all have a plethora of equipment for measuring things. My work is mostly experimentation and really doesn't need much in the way of high accuracy (or precision). Obviously, others will have different requirements. It is nice to know that the gear isn't lying too much, but for me it makes no economic sense to send the stuff out to a cal lab every year.

Take, for instance, a DMM. This HP 3468A is old but, with the exception of a display that's a little hard to read, is quite a nice unit. Does "old" mean "out of calibration"?

DMM being tested by the DMMCheck.

The little board in front of the meter is a DMMCheck from VoltageStandard. For about $50 this thing provides a 5.000 volt reference accurate to 0.01% (±500 µV), a 1.000 mA reference accurate to 0.1% (±1 µA), and three 0.1% resistors. Recalibration of the unit is free for the first two years and $10 after that. It's really not important that my meter is super-accurate… but it sure feels good to know that it's pretty darn close. (I know that a measurement at just one voltage doesn't mean the meter is accurate at any voltage; the DMMCheck is just a reassuring sanity check.)
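Checking a reading against those specs is just a limit comparison. Here's a trivial sketch using the tolerances quoted above; the readings are made-up examples, not measurements from my meter:

#include <math.h>
#include <stdbool.h>
#include <stdio.h>

/* Is a reading within a reference's stated percentage tolerance? */
static bool within_tol(double reading, double nominal, double tol_pct)
{
    return fabs(reading - nominal) <= nominal * tol_pct / 100.0;
}

int main(void)
{
    double volts = 4.9998;  /* hypothetical reading of the 5.000 V reference */
    double mA    = 1.0006;  /* hypothetical reading of the 1.000 mA reference */

    printf("5 V at 0.01%%:  %s\n", within_tol(volts, 5.000, 0.01) ? "pass" : "fail");
    printf("1 mA at 0.1%%:  %s\n", within_tol(mA,    1.000, 0.1)  ? "pass" : "fail");
    return 0;
}

A "pass" only says the meter and the reference agree within the reference's tolerance; a rigorous check would also fold the meter's own accuracy spec into the limits.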

Other gear is harder to evaluate. I check various instruments against each other from time to time. For instance, the HP 8640B signal generator's indicated frequency is always within 1 kHz of what the Advantest R3132 spectrum analyzer reports – not bad at the signal generator's 500 MHz maximum; that's agreement to 2 ppm. My scopes are all digital and, connected to that signal generator, display the correct frequency to within the relatively low (few digits) resolution that most scopes provide in their automatic measurements.

Voltage measurements are much more distressing, since scopes are generally spec'd at -3 dB at their maximum bandwidth. Even if the scope is in calibration, it bugs me to see the amplitude drop off as one cranks up the sig gen's frequency. If you don't know how a piece of test equipment is spec'd, you may be deceived by what is displayed.
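It helps to put a number on that rolloff. A dB figure converts to an amplitude ratio as 10^(dB/20), so a scope sitting exactly at its -3 dB point shows about 71% of the true amplitude:

#include <math.h>
#include <stdio.h>

/* Amplitude ratio implied by a dB figure: ratio = 10^(dB/20). */
int main(void)
{
    double ratio = pow(10.0, -3.0 / 20.0);  /* about 0.708 */
    printf("A 1.000 V signal at the -3 dB point displays as %.3f V\n", ratio);
    return 0;
}

So a perfectly in-spec 500 MHz scope can legitimately show a 1 V, 500 MHz sine as roughly 0.71 V. That's not a calibration problem; it's the spec.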

My Agilent MSO-X 3054A scope does have an internal calibration feature, but I can't find any information about how accurate it is. The manual does say that for NIST traceability one must follow a normal (and carefully outlined) procedure using traceable test equipment.

A recent EDN article discusses the merits of having an in-house calibration capability. For me, it doesn't make much sense but for others the trade-offs are different.

Published August 2016