Embedded Muse 215 Copyright 2011 TGG October 17, 2011
You may redistribute this newsletter for noncommercial purposes. For commercial use contact firstname.lastname@example.org. To subscribe or unsubscribe go to https://www.ganssle.com/tem-subunsub.html or drop Jack an email at email@example.com.
EDITOR: Jack Ganssle, firstname.lastname@example.org
- Editor's Notes
- Quotes and Thoughts
- C Rant
- The Dumbest Thing I Did
- Joke for the Week
- About The Embedded Muse
Better Firmware Faster:
- Chicago October 21
- London, October 28
Lower your bug rates.
And get the product out faster.
Sounds too good to be true, but that's the essence of the quality movement of the last few decades. Alas, the firmware community didn't get the memo, and by and large we're still like Detroit in the 1960s.
It IS possible to accurately schedule a project, meet the deadline, and drastically reduce bugs. Learn how at my Better Firmware Faster class, presented at your facility. See https://www.ganssle.com/onsite.htm .
I'll be conducting public versions of this class in Chicago on October 21 and in London on October 28. Why not rev up your team? There's more info here: https://www.ganssle.com/classes.htm .
Unfortunately, few teams keep detailed metrics, so most have no quantitative measure of their productivity. A few do. On average, those groups report a 40% increase in productivity after adopting the suggestions from my Better Firmware Faster course.
Quotes and Thoughts
"Your work is going to fill a large part of your life, and the only way to be truly satisfied is to do what you believe is great work. And the only way to do great work is to love what you do." - Steve Jobs
C Rant
In the last issue I had some fun at C's expense. The responses were overwhelming! Many of you agreed. Others didn't. But I appreciate all of the dialog and well-thought-out positions.
Many people wrote in with a variation of "A craftsman does not blame his tools." But if my C compiler miscompiled, I'd blame it. If I struggled to build a complex system using Dartmouth Basic, I'd place at least some of the blame on that language (though it's my fault if I selected the language). Then there's APL, which was so bad everyone abandoned it. APL was like a hammer with no handle: a tool that could be made to work, but one that is fundamentally flawed.
If the C language wasn't well defined, I'd blame it for making me work harder. Uh, it isn't! What's an int? No one knows, so we each use our own typedef to tame the language.
And there's another issue: not everyone is a master craftsman or superprogrammer. If the tool is safe only in the hands of a best-in-class person, then what do the other 90% of developers do? I'm not advocating that everyone have training wheels, but don't think that a unicycle should replace all forms of transportation.
Several people pointed out that C is not the only language with an obfuscated code contest. Perl has one, too. Though one would think Perl is pre-obfuscated.
Here are a few responses from readers:
The best was this: "Sorry, Jack. I think you're full of crap. Please remove me from your mailing list."
Ben Meyer wrote: "It's true - C and C++ will never go away. But they'll never go away simply because of the flexibility they offer, which is required for some areas of programming - e.g., interrupt vectors where zero is a valid address, or creating structures of unknown length dynamically. For example, at work we have a structure whose length is defined at run time; it also gets passed around the network, so there is no way to specify a known size at compile time, which is really the only time the language can truly help resolve memory leaks or buffer overruns. So, while many may complain about some of these aspects of C/C++, they are also often required for many different kinds of programming - both OS and application level. A real programmer would learn how to use them instead of complaining about them. And honestly, they are only a problem for programmers who are not disciplined enough to do what is necessary to overcome the issues, and personally I wouldn't want one of those programmers working in any programming language. Languages that do the work for you only dumb down the field and allow programmers to be lazier, which is not a good thing IMHO; they also make it harder to resolve those issues when they do happen - and yes, you can have memory leaks and buffer overruns in Java too, but you have a much harder time doing anything about it when you do.
"And quite honestly, I'd like to see some proof backing up Geodesic's numbers, as most programs I use certainly don't have memory leaks caused by the program itself. That is, I am aware of some programs that have inherent memory leaks due to the Win32 API, where there is nothing that can be done at the program level (it's somewhere in the Windows NT APIs, and completely untraceable by the application programmer); however, I would qualify that as a leak in the OS - it wouldn't matter if you were using Java or .Net/C#/VB/F#/etc. or C/C++; for your application the leak will occur regardless. So still, I'd like to see some numbers on that one."
James Irwin commented: "There is always a tension between freedom and conformity. I choose freedom. C as it stands lets me program in ways that no other "constrained" language will let me. Sure, publish some standards and suggest or require conformity, but much innovation comes from not being standard. C is still here; Ada is not. (And yes, I think code should be clear and human-understandable*. The obscure C code makes for fun puzzles, just like the many mechanical puzzles that exist but have no "useful" purpose.)
"*Especially if I am expected to maintain it."
Mike Silva wrote: "Boy, ain't that the truth! And I mostly earn my living programming in C. Right now I'm rewriting some hyper-macro-ized code that nobody can understand, much less maintain. One of our other programmers says he documented a macro 37 layers deep. Talk about people who should be hunted down and shot! Or at least scolded severely...
"I figured you'd mention Ada, since you've talked about it now and then in the past. Every time I fix a bug I think "would this have happened in Ada?" and of course the answer is usually either "No" or "Yes, but we would have found the problem the first time we ran the build." I am amazed that our industry is so unprofessional and unserious as to keep using the equivalent of 1850-style power take-off belts and exploding boilers, and thinking we're so clever by wrapping a bit of duct tape around the boilers and calling it safer C.
"In my vast spare time I'm actually hoping to design up some simple ARM Cortex M3 boards and somehow cajole the Ada community into porting GNAT onto the M3, and then try to get these boards out into the world. Ada is such a vastly cool realtime language, on top of all its other abilities. We'll see...
"OK, I've vented. Now to earn some more money fixing bugs that wouldn't have happened in Ada. What a life."
Paul Bennett is a Forther: "As one who uses Forth for high-integrity work, I am happy to agree with the sentiment. However, Forth is probably more open to abuse than C. Forth, as a malleable tool for building application-specific languages, means that the programmer can put in the required bounds checking and sanity checks. It is up to the programmer to decide how much or how little of that is sensible.
"Interestingly, even Ada got subsetted into SPARK-Ada, an even more restrictive arrangement that tightened the straitjacket further. There are automated checking tools that ensure full compliance with SPARK-Ada and MISRA C, but, as you observed, it is a case of playing the game according to the tools' rules.
"Personally, I would rather the programmers took the responsibility upon themselves. By consistent application of self-checking, peer reviews, and adherence to coding standards, it is possible to produce a software component that is robust, performs according to its specification, and is an elegantly crafted product. Working this way might seem slower, but it can turn out quicker because it lessens the amount of rework needed to fix bugs. The goal should always be zero bugs."
Don Dobbs agreed: "Couldn't agree with you more. Like you, I prefer Assembly because I know I am always in total control. C can be just the opposite. When it comes to higher-level languages I use PL/I. I've never had a memory leak, bad pointer, an undetected exception, or an out-of-range subscript. On Intel 80x86-based architectures I use Dr. Gary Kildall's (Digital Research Corp.) 30-year-old version of Subset G. It's limited to 64K bytes of code and 64K bytes of data. If 64K bytes of code is not enough (though it usually is) the linker supports overlays. It runs in a command prompt (DOS) window and doesn't have any links to Windows displays but it is perfect for utility processing of files, report generation, character string manipulation and bit fiddling. Because it supports packed decimal format (natively) I never get the rounding errors that can show up in products that use binary floating point for computations (e.g., spreadsheets). In summary, I avoid C like the plague."
Lou C wrote: "I share in many of your diatribes regarding C. It is the language you can truly love to hate. I kind of feel the same way about assembly language, maybe even to a higher degree. I have often heard C described as "shorthand assembly language," to which I would add the subtitle, "A faster, easier, and more elegant way to get into trouble."
"As far as code obfuscation goes, there is one legitimate benefit I can see of allowing it: distribution of source code without disclosure of the algorithm. With obfuscated (albeit unreadable) source code, you can compile something to object code on any platform without giving away any secrets of how the code actually functions. As you probably know, there are programs available that intentionally obfuscate code just for that reason. But I can imagine there are also programs that can un-obfuscate code as well, turning it back into readable form and thus killing my reasoning for allowing it.
"Just remember, any problems you can get into with C you can also get into with assembly language - it just takes longer to code them! One thing I will say for assembly language is that it genuinely looks dangerous in the first place, not cloaked in some FORTRAN- or ALGOL-like wrapper that hides all the nastiness.
"C is one of those languages that one almost has to learn to love in order not to get sucked into its myriad paths to mistakes. And when that is mastered, you can truly love hating it!"
Greg Gentile is in the freedom camp: "Maybe we should constrain English so we cannot drive on a parkway after we've parked in our driveway, but then it would be a diminished and less expressive language.
"See Kurt Gödel's great work to understand that any language that can express all of the complexity we may encounter will, of necessity, be internally inconsistent. Which is to say that bad programs can (and will) be written in any language which has sufficient complexity to fully embrace a real-life problem. There is NO ultimate language immune to this problem. Russell tried to prove that such a language could exist and Gödel proved him wrong.
"Clarity is the luster of the mind and the responsibility of the programmer.
"Your argument seems to reduce to the assertion that C is a more powerful tool than the mind can discipline. I say it depends on the mind."
Vlad Zeylikman wrote: "I have to voice disagreement. C does not suck; it does what it was designed to do: provide a more convenient tool than assembly language without imposing additional constraints. That is why it does not have type checking worth a damn, or any other constraints. It was simply misused since inception: it should have stayed at the low level of system programming. But when you teach college kids C and tell them to do the assignments in C, that is what they learn and tend to use. C++ is not any better in that respect: objects and inheritance and other high-level concepts, but no basic safety mechanisms.
"I cannot say why Ada was not adopted widely, much like I cannot say why Modula-2 was not adopted either. I will let someone more knowledgeable speak to that, but C was most certainly misapplied. It's like insisting on using a hammer and chisel instead of a router for mass production of furniture. Sure, it's fun, and just as surely you get very non-uniform quality and appearance. And it takes a long time to acquire the skill."
Howard Smith agreed with me: "You sure hit the nail on the head with the 'C Sucks' part of Embedded Muse 214.
"The biggest problem with Ada was that it needed a huge footprint in memory and support libraries to work. Nobody made a version tailored for embedded systems. And, yes, the syntax was very restrictive, and many people hated that. However, the syntax for VHDL is also very restrictive. Hardware designers learned to accept the restrictions, and did great things with that language. There would not be a new microprocessor every 18 months without it!
"I would love to see a new language created for embedded firmware development, but there are at least three things that will need to change before that can happen. First, software managers need to accept that the new tool chain will be expensive, because somebody has to make money for taking the risk of creating it. Remember, a full seat of VHDL is certainly not free. Second, the embedded firmware community will need to accept the new paradigm, learn it, and use it. This is probably as hard as getting the software managers to pay for the tool set. Finally, a new language has a half-life of about 10 years. If its usefulness becomes apparent in that time, the language will survive. Otherwise, it dies a slow death.
"In the mid-80s, Niklaus Wirth created a language called Modula-2. Many people at the time thought it should be called Pascal 2. The language had many of the good characteristics of Ada, but was certainly more suitable for embedded firmware. I liked it because it removed the 64K code limitation that Pascal had, but kept a lot of the Pascal syntax. It also used a concept similar to the entity and architecture concepts in VHDL to ensure that the interfaces between functions/subroutines were always correct. This made compilations a little crazy when changes were made, but it was easier for the compiler to identify the problem than to find the resulting bugs later.
"I think it may be interesting to look at Modula-2 again as a starting point for a new embedded firmware language. Or maybe a reasonable subset of Ada could be considered.
"It all goes back to this: But We've Always Done It This Way! Change takes a lot of effort. So does chasing all of those bugs in the 'C' based embedded firmware. I would rather change the paradigm, and reap those benefits."
Luca Matteini wrote: "I totally agree that computer languages aren't _anymore_ "a way of getting a computer to perform operations," as things have changed radically. When I was a kid I loved to find every book I could about electronics and computers, and by the end of the 70s that wasn't always so easy here. In particular, regarding programming languages, the best I had at hand covered some FORTRAN implementation, or assembly code for a large machine that I couldn't even try out in real life. At that time, programming was more about machines than applications, maybe.
"When personal computing started giving us real general-purpose machines, things evolved rapidly, since all the programming languages of the time began to be truly public. You could code in BASIC as in FORTRAN, or in FORTH, or C, plus the billion dialects out there. That phase elevated the programming paradigm; while assembly language remained strong, there was a significant take-off from board-level software design.
"The next big change was making object-oriented programming effective. In the 80s I started gathering documentation about Smalltalk and Prolog, and later even the heavyweight Borland made a (blind?) step in that direction with Turbo Prolog. Eventually they returned to C-alikes, as with C++. OOP gave us new paradigms, enlarging the distance from program counters and stack pointers. This wind of change already carried the anticipation of the following step.
"The fourth step (on my scale, not in recognized "language generation" units) turned into a "babelization" of programming languages, very different from the variety of the 70s or the 80s. New programming languages, from the 90s on, had to be OO, but started to be self-referential too. They all(?) came to life as "the definitive programming language," collecting all the knowledge from the past while keeping that past buried deep. This wave of languages has spawned paradigms that live for themselves: programming with a language has become not just the tool, but the final purpose.
"The feeling I still have today is that programmers no longer employ computer languages to develop applications, but to develop the language itself. So the (apparent?) simplicity of C, which spawned C++, then Java, or whatever, has given way to a complexity that requires more work on "proper use of the language" than on application development. And I'm sure any expert will disagree with my opinion, showing how a "hello, world" takes fewer lines to write in modern languages. Don't take me wrong: I use C++ myself too, but I know well that I'm far from the elegant code written by purists. I just need it working, long before I've learned the best practices for STL, which maybe I could never fit in a small MCU.
"Then all this falls back to C programming, which remains my favorite despite all of its pitfalls. I like reading new C coding styles proposed by anyone, from time to time, just to exercise my mind or to learn something new. I was happy, in one of my first readings of the MISRA rules, to find that I already complied and agreed with many of them. Even so, I strongly disagree with some of them - like avoiding certain standard library calls I use: should I rewrite them, maybe poorly, rather than use the compiler's well-crafted ones?
"A quote from a forum on AVRfreaks.net: "Learning those practices and applying them properly would qualify me as being MISRA-able?"
"Long live the C language. It will stay, maybe forever, since there's no substitute language yet: new ones (with the same minimal features) are usually far more complex, and that alone is a good reason.
"One note: in the middle-earth conflicts between C and C++, many often forget the existence of EC++ ..."
The Dumbest Thing I Did
When interviewing I always ask candidates (those with experience) about their dumbest mistake and what they learned from it. Those with no mistakes are generally those with no experience - or are perhaps truthiness-challenged. Do you have any?
Pete Friedrichs contributed: "In reading the "Dumbest Thing I Did" feature, I was struck with the idea that in the situation depicted, the "dumb" thing actually occurred farther upstream of the event. Let me explain:
"When I design test equipment electronics, I try to anticipate the kinds of mistakes an end-user is likely to make because he is tired, distracted, overloaded, or rushed to make some arbitrary delivery schedule.
"If DC power, for example, is being applied to an instrument through banana jacks, terminal strips, or any other means besides a keyed connector, sooner or later, power *will* be applied backwards. You can bank on that.
"Since I accept this as an inevitability, my feeling is that "dumb" is not rooted in the depicted polarity reversal, as much as in the failure of the equipment's designer to engineer it to tolerate such a mistake in the first place.
"In some cases, bullet-proof polarity-reversal protection can be accomplished with nothing more than a fuse and a reverse-biased diode. In all likelihood, then, the $4000 loss in the story could have been prevented through the inclusion of $2 worth of additional parts."
And this is from William Carroll: "My first job out of college was with a startup building a digital music sampler. We targeted a large annual convention as our make-or-break opportunity to introduce our sampler to the industry.
"Our prototype was several wire-wrapped boards in a chassis with a wire-wrapped backplane. To increase our chances of a successful demonstration (and to get the final bugs worked out before that demonstration), the company sent the hardware tech and the software engineer (me) out to the convention site a day early to set up the prototype and make sure everything was working.
"The evening before the convention opened, I was sitting in the demonstration suite, debugging furiously, when I decided I needed to update the PROMs. I had foreseen this eventuality, and had shipped the UV eraser (remember UV-erasable PROMs?) and the PROM programmer along with the prototype (and the ICE - remember those?).
"I pulled the PROMs out of the CPU board, popped them in the eraser, then turned to the PC to build my changes. I don't remember what PC I was using, or why it wasn't the one from my desk back at the office (this was long before the era of laptops). But as the compiler ran through the project, it became obvious that the PC on which I was building didn't have all the include files.
"Did I mention that I had already popped the PROMs into the eraser? I quickly rushed over and removed them. I put them in the programmer to read their contents. They weren't completely erased, but they were more than far enough along to be beyond recovery.
"So I was sitting in a hotel meeting room on the opposite coast from my office with the company's make-or-break trade show opening the next morning, with effectively-blank PROMs, and no way to build an image I could program into them. Did I mention that I had not thought to bring an image of the PROMs, or save one before erasing them?
"I have wiped most of the next few hours out of my memory. (Much like what I did to the PROMs!) Somehow, someone managed to arrange a modem connection between the PC in the office and a PC in the hotel business office, and somewhere around 2 AM I was able to modem the files I needed. (Yes, there was a time before the Internet, and in between evading the dinosaurs, we transferred files over phone lines using a modem.) Using the files I was able to build a firmware image. And when the convention opened the next morning, the prototype was operating. (About as well as it ever did!)
"I did not get a lot of sleep that night. But I also didn't lose my job that night. And ever since, before going into the field, I have insisted on being able to do a complete build on the PC I was taking with me before leaving the office.
"And though some of my co-workers, past and present, might argue the point, I like to think that that was the dumbest thing I ever did."
Steve Litt sent a link: "Here are descriptions of several bonehead maneuvers I made during my career" http://www.troubleshooters.com/tpromag/200610/200610.htm ."
William Carroll had a backup story: "The company kept all of its shipping and receiving records on a Macintosh, operated by the wife of the engineering VP. One day, she got some errors and called her husband over. He worked on it a while, before deciding that the disk was beyond repair. He wiped it clean, re-installed the system, then turned and asked where the backups were.
"Oops. No one had done a backup in months. They were done to floppies (remember those?), took a lot of time, and the wife didn't like doing it. So they restored from the most recent backup, then spent several weeks re-entering data from the paperwork.
"I was already doing backups for the engineering PCs (or at least those willing to let me interrupt them to do one) and rotating tapes through the off-site storage (a safe deposit box at a nearby bank). So the VP bought a tape drive for the Mac, and I was asked to collect a tape every week from his wife, and include it in the weekly tape rotation.
"She had a tape for me every week for a while, but gradually (much like the engineers), she lost the sense of urgency and stopped giving me a new tape regularly.
"One day, she got some errors and called her husband over. He worked on it a while, before deciding that the disk was beyond repair. He wiped it clean, re-installed the system, then turned and asked for the latest backup.
"Oops. No one had done a backup in months. I gave him the most recent tape, which he restored. And they once again spent several weeks re-entering data from the paperwork.
"The company went out of business before we got to the third iteration. And in fairness, the engineering PCs would have been a bigger disaster. There was no LAN to which to back up files, no version control to serve as an archive, nothing. A disk crash would have simply lost everything that engineer had ever worked on. Luckily, that never happened."
Let me know if you're hiring firmware or embedded designers. No recruiters please, and I reserve the right to edit ads to fit the format and intents of this newsletter. Please keep it to 100 words.
Joke for the Week
Note: These jokes are archived at www.ganssle.com/jokes.htm.
Adam Braun sent this one:
A mathematician, a physicist, an engineer, and a computer programmer are given a problem. All odd numbers are prime - prove or disprove. They all go off and work on the problem for a while.
The mathematician comes back first. He says "3 is a prime, 5 is a prime, 7 is a prime, 9 is not a prime ... therefore the statement is false".
The physicist comes in next. He says "3 is a prime, 5 is a prime, 7 is a prime, 9 is not an experimental error ... therefore the statement is true".
After a while the engineer comes back. He says "3 is a prime, 5 is a prime, 7 is a prime, 9 is a prime ... therefore the statement is true".
The computer programmer is gone for hours. When he finally comes back, he says "You three cannot agree, so I've written a program that will solve this once and for all." He sits down at the machine, types a few keys, and hits enter. Immediately the program starts running and the answer pops up on the screen:
3 is a prime
3 is a prime
3 is a prime
About The Embedded Muse
The Embedded Muse is a newsletter sent via email by Jack Ganssle. Send complaints, comments, and contributions to me at email@example.com.
The Embedded Muse is supported by The Ganssle Group, whose mission is to help embedded folks get better products to market faster. We offer seminars at your site with hard-hitting ideas - and action - you can take now to improve firmware quality and decrease development time.