Guardian Angels

Every project needs a guardian angel, someone who watches over the code.

Published in Embedded Systems Programming, July 1996


By Jack Ganssle

The nuns of St. Camillus pounded all sorts of interesting ideas into the heads of myself and my grade school pals. Some will no doubt come out in the inevitable therapy sessions I'll need when my kids crash into teenagerhood. Others, though scary at the time, make for good stories today.

"Guardian Angels" were the nuns' invisible avatars, always watching over us, often helping us, sometimes recording black marks in that cosmic book of personal errors. No doubt my guardian angel is a UNIX guru, working full time just to keep the sheer volume of my black marks from crashing all of heaven's database engines. "Reboot heaven? Y/N?"

The nuns neglected to tell us we're each other's guardian angels. Parents watch over their kids with a wary eye no supernatural creature can match. Spouses look out for each other's interests. Friends in need find friends indeed.

These thoughts bubbled to the surface recently when I asked an engineer to recode a few functions in assembly language. The system worked great but just seemed a bit slow. Some analysis turned up a busy interrupt service routine, supported by a couple of equally stressed main-line subroutines.

I wasn't looking for much speed - a 20 or 30% improvement was more than enough to give the system a snappy feel. Why not translate a bit of code to assembly and be done with the problem?

The engineer's boss collared me, and in no uncertain terms said "No!" Though he knew that the translation would work, he figured the resulting assembly language - the only non-C code in the system - would reduce the system's maintainability.

He is the system's code guardian. As such he has the authority - and the common sense - to ensure we don't do dumb things that erode any aspect of the software, software in which we've invested in a big way. Though I might disagree with his approach, he watches over his code like a parent over children. He protects it from the whims of marketing types, and from expedient solutions like mine.
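Had the translation gone ahead, one compromise - a sketch only, with hypothetical names rather than anything from the project's actual code - is to keep the portable C as the reference implementation and gate the hand-tuned assembly behind a compile-time switch, so the lone piece of non-C code is easy to find, audit, or rip out later:

/* Portable C stays as the reference; the assembly version lives in
 * its own file and must match it bit for bit. USE_ASM_CHECKSUM and
 * the function names here are hypothetical. */
#ifdef USE_ASM_CHECKSUM
extern unsigned short checksum(const unsigned char *buf, int len);
#else
unsigned short checksum(const unsigned char *buf, int len)
{
    unsigned short sum = 0;
    int i;

    for (i = 0; i < len; i++)
        sum += buf[i];
    return sum;
}
#endif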

The code guardian is the quality angel. When a dozen approaches will all work, the code guardian selects one that most closely meets the needs of the customer, yet that preserves the software's integrity.

Every project needs a code guardian. Usually this should be the team's highest-ranking technical person, someone with the authority to say "This stinks! We're not going to do it this way!" The code guardian needs the freedom to do work management might consider non-productive, like instituting version control systems, developing and maintaining software standards, and sometimes rejecting working routines that just don't measure up to his demanding standards.

It's hard to chuck working code. Sometimes it's simply the right thing to do.

It never ceases to amaze me that companies hire huge accounting staffs to guard financial performance, and security firms to protect physical assets, yet seem to neglect the critical importance of their software. Since you can't see or feel code, management leaves it on its own to wither or thrive, protected only by the techies who understand it.

Look at a high tech company's balance sheet: you'll see all sorts of assets, from cash to accounts receivable to chairs and furniture. They count the ICs and soldering irons, yet the millions spent on developing a code base seem to be mysteriously expensed, never appearing in the financials as a critical, valuable part of the company's portfolio and success.

The role of the code guardian, of course, is to protect the software from outsiders (marketing) as well as from the developers themselves. Most projects start with the best of intentions: Code reviews, structure charts, careful design are our mantra when delivery is a year away. After a few schedule slips, with the deadline looming, it's all too easy to abandon a reasonable methodology in our panic to get a product out the door. The code guardian has got to institute a discipline of methodology. He or she becomes the code Gestapo as the crunch hits, ensuring that things are done right. Bad code always comes back to haunt us.

Compiler Guardians

One difference between hardware and firmware development is the shelf life of the tools. Even when pushing the bleeding edge of technology, a five-year-old scope or logic analyzer is usually more than adequate. Some dedicated folks still crank out 8-bit systems using old Tektronix 545s - 30-year-old vacuum tube beasts that invite a hernia each time they're moved.

Our software tools have lifetimes of microseconds. It won't be long before we use the Internet for real-time updating of our word processors and spreadsheets. Much of the current embedded hardware technology is designed to speed firmware upgrades - flash memory lets us reprogram devices on the fly. I visited a company this week that uploads updated firmware to point-of-sale terminals in large stores via a satellite link.

Of all the tools we use, though, compilers are the most problematic. Compilers create the code that is our product. A line of C is nothing more than a series of commands asking (pleading?) for a stream of machine code that implements an algorithm. Most other tools, like debuggers, fly in an orbit around the designer, helping him design and implement the product, but never becoming a part of the unit a customer sees.

A compiler problem becomes a product problem. A compiler change changes the product. When your compiler vendor releases a version that generates tighter code, simply recompiling changes the product. Though the tighter code is nice, it raises the specter of new, lurking bugs.

Before the vendors start sending Unabomber email, I'm not implying that compilers are bug-ridden monstrosities. In my travels I see system after system that works until something changes. It may be a compiler upgrade, or perhaps a different set of parts used in manufacturing. Bugs lurk everywhere, and a compiler change often uncovers problems previously hiding in your code.
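Here's a contrived sketch - the function and names are invented for illustration, not lifted from any real system - of the kind of latent bug that "works" until the tools change:

/* mode is never initialized. With one compiler the stack slot happens
 * to hold zero, so the code behaves; a new release that allocates
 * registers differently exposes the bug - which was in the application
 * all along, not in the compiler. */
int select_mode(int override)
{
    int mode;               /* uninitialized - undefined behavior */

    if (override)
        mode = override;

    return mode;            /* garbage when override == 0 */
}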

However, the tools themselves can have problems. We use an array of programmable logic devices here; each comes with its own overpriced toolchain that converts equations or schematics to bits burned into the PLDs. Without exception, every time a new version of the software comes out, one or more of our devices no longer build properly. Yes, we push the devices to their limit. And, yes, the vendors are good about making it right. We've learned, though, that change has risks.

What's a programmer to do? Do we upgrade every time new software becomes available? We've found that the risks sometimes outweigh the benefits.

The problem is more severe for older products. If your five-year-old widget works great, and only needs a very occasional software tweak, do you instantly rebuild the code on each new compiler release? This implies a product change, with perhaps little benefit. Or, do you wait till a crisis requiring a code change hits, and then both fix the bug and use the new version - perhaps changing a ten-minute fix to a week of hell?

I don't have an answer. We maintain a half dozen or so compiler versions here. An old product that hasn't changed in years was built with version 2.0 of Borland C. We keep the ancient compiler around, just in case something happens. Other products use intermediate versions, which we also keep. Later we changed to Microsoft C (for embedded and non-embedded work). Great compiler, but there's no way we'll ever go back and rebuild all of the older products.

Since hard disk space is cheap, we litter old tools in carefully protected directories, just in case they'll ever be needed again. It drives our code guardians out of their minds.
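One habit that makes the hoard pay off is stamping every image with the toolchain that built it. A minimal sketch, assuming the makefile passes a TOOL_VERSION macro (a made-up name) on the compiler's command line:

/* __DATE__ and __TIME__ are standard predefined macros; TOOL_VERSION
 * is a hypothetical macro defined by the build, e.g.
 * -DTOOL_VERSION="\"Borland C 2.0\"". The string ends up in the
 * binary, so a unit returned from the field years later tells you
 * which antique compiler to dust off. */
#ifndef TOOL_VERSION
#define TOOL_VERSION "unknown toolchain"
#endif

const char build_id[] =
    "Built " __DATE__ " " __TIME__ " with " TOOL_VERSION;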

The right thing to do - technically speaking - would probably be to rebuild all the old stuff. Who can afford that? The testing alone consumes vast amounts of valuable development time.

The risk is high, the benefits are vague. We (the embedded community, that is) need compiler guardians who keep old versions around, with associated antique linkers and other support tools.

Perhaps "compiler guardian" is too narrow a focus. Most tools require some level of attention and management. Someone, or some group of people, have to keep all electronic tools metaphorically sharp.

Take, for instance, a PCB layout program, a staple of every engineering department. When our vendor recently released a major upgrade to the PCB software, we discovered to our horror that all of the libraries were now obsolete. Now we have the great joy of maintaining both old and new versions of the software.

One friend addresses these sorts of problems by never upgrading. When the path becomes too rocky he idles, using the old but presumably reliable software for years after the vendor declares it obsolete. I worry that getting stuck in such a tool time warp will backfire. Support becomes non-existent.

However, "good enough" may be all that's required, as the goal of engineering is to get a product out on time, on budget, that meets specifications. Using the latest and greatest goodies does, I feel, keep us more productive. Does the continual upgrade cost balance the productivity increases? I wish I knew.

Backup Guardians

A close friend called last month. He had just accidentally deleted 1.5 GB of critical information from his hard disk, not realizing that he was working on a network drive, not a personal drive.

Now, Dave is religious - some would say anal retentive - about making backups. Everything was on tape - but the Conner tape drive was in for repair. Seagate bought Conner; the two companies are moving, so he doesn't expect to see a replacement drive for another couple of weeks. Meanwhile he has no way to restore the data.

I used his misfortune to reexamine my own backup procedures. A utility runs at midnight to copy everything to a spare (huge) hard disk. At 1 AM the tape starts whirring away at its own backup strategy. Perhaps the lesson is to have backup backup hardware.

A few years ago thieves broke into the office and stole all of the new computers. The backup tape was in the tape drive they carted out the door! Fortunately we rotate tapes to off-site locations, but clearly just having data on tape is not enough.

We need backup guardians. Someone who understands the importance of data. Someone with the authority to insist that everyone keeps the most important files in a logical place.

Be An Angel

The best guardian, of backups or compilers or whatever, is half evangelist. It's important to take care of the problem, as well as to convince others there is a problem. Backups are a great example: everyone gets complacent when systems work well and no data has been lost for a year or more. That's when the guardian needs to cajole developers into saving their data per the plan.

It's a message of doom, I suppose. "Backup Now or Repent Later."