Embedded Systems Design, Arnold Berger
Arnold Berger's new book 'Embedded Systems Design' (CMP Books, 2002, ISBN 1-57820-073-3) is an introduction to how we go about building embedded systems. Arnie teaches a class on the subject, so this book distills his wisdom in getting the message across to EE students. At $34.95 and 236 pages it offers welcome value to the world of overpriced technical books.
The book has a very powerful focus on tools, a direction no doubt gleaned from Arnie's many years at AMD and Applied Microsystems. The explanation of BDM/JTAG debuggers is one of the best I've seen. In addition to the thorough coverage of standard BDMs, he also discusses the very important Nexus efforts to standardize these debuggers, and to extend them to offer more capability for dealing with real time systems.
You'll immediately note his bias towards using the most powerful debuggers, especially In-Circuit Emulators. Not sure how an ICE works? Read this book for a great overview and advice about using the most important features of these tools.
Most welcome are the 23 pages devoted to selecting a CPU. That subject gets too little attention, and can sometimes be more a matter of faith than of science. The book covers all of the selection criteria in a readable and comprehensive manner.
The book is a needed addition to our art. It's not aimed at the experienced developer, though. Couple this with Simon's An Embedded Software Primer and you'll have a good start on the basics of building embedded systems.
Embedded Systems Design using the Rabbit 3000 Microprocessor, Kamal Hyder and Bob Perrin
Embedded Systems Design using the Rabbit 3000 Microprocessor, by Kamal Hyder and Bob Perrin (ISBN 0750678720), is a complete introduction to programming with this popular microprocessor.

Rabbit Semiconductor sells a popular range of 8-bit microprocessors that offer quite high-end performance. My son and I just finished a project for his high school with one, and I've used them for a number of other applications. The R3000 is sort of like a Z80 on steroids, with many new instructions, a wider address bus and a wealth of on-board peripherals.

Like any modern high-integration CPU, the Rabbit offers so much it's sometimes hard to get a handle on managing all of the I/O. This book will get you started, and is a must-read for developers using the part.

The first few chapters describe the CPU in general and the development environment provided by Rabbit (Dynamic C). Chapter 5, though, is a description of interfacing to the real world, using all sorts of devices. It's aimed at engineers, not raw newbies, but, for an engineer at least, is an easy and descriptive read.
The chapter on interrupts is one of the best I've seen in any book. It covers the hard stuff, like writing ISRs in C and assembly, with real-world examples. If you're using the R3000, just cut and paste the code into your application.

It seems today that if there's a transistor in a product then it needs an Internet connection. Rabbit has several development kits that include everything needed to connect to the 'net. The authors devote considerable space to networking, but thankfully with only a cursory explanation of protocols. Rather, they give step-by-step instructions on implementing a working network, and conclude with a complete web server for monitoring water sprinklers.

The final chapter covers an alternative toolchain from Softools. Dynamic C is a single-module compile-it-all paradigm that's highly interactive. Softools (http://www.softools.com/) sells a well-supported, reasonably-priced conventional C compiler, assembler and IDE. I only recommend products I've used and like, and the Softools products are first-rate.

Embedded Systems Design using the Rabbit 3000 Microprocessor is required reading for users of the R3000, and a pretty darn good introduction to the entire realm of embedded systems development as well.
Embedded Systems Security, David Kleidermacher
At last! Finally, a book about building secure embedded systems.
Read the newspaper and you get an almost daily litany of stories about leaks and exploits. Credit card numbers seem to leap out of data bases. Personal information spews from corporate servers.
But the elephant in the room, the real threat vector, has been largely ignored. For every PC in the world there are dozens of embedded systems, which are increasingly interconnected via an ever-growing array of communications channels. Wi-fi, Bluetooth, Ethernet, RFID, Firewire - the list is endless. A smart phone has at least four antennas, reflashable memory, and, if compromised, could be an electronic Typhoid Mary.
Anything with a connection can be a threat vector. A USB toothbrush (yes, at least one is on the market) might carry both biological and computational infectious elements. As can a smart mousepad (which uses the connection to display custom pictures). That wi-fi electronic picture frame showing cute photos of your loved ones may have a wicked heart as it poisons the neighbor's network.
Consider that some automobiles currently have brake-by-wire. Tractors have had steer-by-wire for some time; the steering wheel merely spins an encoder, and that technology will certainly make it into cars. But consumers want Internet connections at work, at play, and on the road. How hard will it be for a bad guy halfway around the world to bring the I-5 to a disastrous crunch by issuing TCP/IP packets?
Not hard at all, apparently. Researchers have already taken control of cars using a variety of vectors.
Engineers at the DHS destroyed a large diesel generator by infiltrating it remotely. Stuxnet showed that a boring old SCADA control system can be a state-sponsored weapon. SCADA systems control power plants, factories, and, well, just any industrial process. A sewage plant in Queensland was successfully attacked, causing it to leak noxious goop over a wide area. The Feds admit that foreign states have infiltrated the USA's aging power grid, perhaps leaving rogue code behind.
What happens when the grid becomes smart?
At least one company sells bearings that use a magnetic field to suspend a shaft; a DSP performs some 15,000 calculations per second to keep things running smoothly. And they have Ethernet connections! A coordinated attack on bearings -- bearings! -- could cripple manufacturing.
Internet-connected toasters are on the market, as are networked coffee makers. When 100 million bleary-eyed, and possibly hung over, Americans wake one Monday morning and find their caffeine fix unavailable, business will come to a standstill. But the truth is they'll still be blissfully sleeping, since the alarm clocks were all reset by a teenager in Nigeria.
There's another security issue: non-malware bugs. As embedded systems become increasingly part of the fabric of civilization any bug that compromises the reliability of a system can become a mission-critical security threat. When the automated jail control doors fail to close, the prison becomes a hotel. And not Hotel California, because the guests are sure to check out. A task that errantly consumes too many resources (like memory) or CPU cycles can prevent other activities from running. The traffic light fails to turn red, the railroad signal remains open or the ATM's bill counter fails to stop spewing money (one can hope).
Most firmware developers have little training in security, and are largely unaware of both the threats and the techniques and technologies needed to make their creations robust. For the first time there's a book on the subject. Dave and Mike Kleidermacher clearly describe the issues, and offer a five-point "Principles of High Assurance Software Engineering" guide. Need a clear but complete description of encryption techniques? Or details about protecting data? There's a lavishly-illustrated chapter on each.
Many developers are resigned to building insecure systems, figuring the enemy will always be one step ahead. And embedded systems aren't upgraded every 15 minutes, like PCs are, so improved defenses always lag new attacks. But Dave and Mike show that it is indeed possible to build firmware which is inherently secure.
The authors don't claim that making secure embedded systems is easy. In some cases even the tools themselves must be evaluated and qualified. But secure firmware will be both a national priority and a competitive advantage. This book offers answers. I hope it serves as a wake-up call as well.
An Embedded Software Primer, David Simon
An excellent book, 'An Embedded Software Primer', by David E. Simon (1999, Addison-Wesley, ISBN 0-201-61653-X) came across my desk. Embedded titles are becoming more common - not so long ago ANYTHING embedded was worthy of attention - but this book is a standout.
It's aimed at the novice or nearly novice embedded person, one with experience in C but little feel for the unique issues of embedded systems. The book starts with the standard introduction to microprocessor hardware (which could have been left out), but quickly moves on to a very good description of interrupts; this section alone is quite worthwhile.
Three of the 11 chapters are devoted to real time operating systems. The included CD has a copy of the older version of Jean LaBrosse's uC/OS RTOS. Whether you use uC/OS or a commercial product, Mr. Simon's discussion of RTOS issues is a very good introduction to the subject.
If you've never used an RTOS, this is a pretty good reference (but also check out MicroC/OS-II, LaBrosse's new companion volume to his upgraded RTOS). If you're trying to figure out what firmware is all about, and get a sense of how one should write code, this book is for you.
This is one of my all time favorite books on embedded systems. Very highly recommended.
The Essential Guide to Digital Signal Processing, Richard Lyons and D. Lee Fugal
Summary: A new book that could be called "Signal Processing for the Masses."
Rick Lyons sent me a copy of his latest book about DSPs back in June, and I finally had a chance to read it. It's titled "The Essential Guide to Digital Signal Processing," by Richard Lyons and D. Lee Fugal.
Most embedded.com readers probably will not want to read this book, as the material is elementary. It's aimed at those with essentially no understanding of DSP. However, I could be wrong, as the DSP segment of the market is small compared to general MCU usage, and perhaps many readers haven't had a chance to delve into this topic. Signal processing is a very important and interesting subject, and I do think every embedded person should be familiar with it.
How elementary is the material? Chapter 2 describes analog signals in general. That's followed by a chapter about frequency. Now, what could be simpler than that? However, the authors wisely talk in terms of degrees and radians, and the latter may be unfamiliar to non-EEs. You cannot understand any of the literature about signals without a solid foundation in radians. Every EE remembers the cos(x*pi+phase) notation, but perhaps CS graduates don't. The chapter covers spectra and shows how a spectrum analyzer differs from an oscilloscope, in really clear prose.
Chapter 4 is about digital signals. Yawn -- that's what we're all mired in. But are you familiar with decimation? That's another critical part of signal processing.
Later chapters are meatier for techies. I have never seen a better explanation of aliasing, which is depicted with simple but very clear diagrams. Anyone using a digital scope needs to understand this. I once walked in on an EE who couldn't understand why his 32 MHz clock looked like 32 KHz on the scope; a quick spin of the time base knob cured that woe, and an explanation of aliasing lifted the fog from his brain.
The two most demanding chapters cover, unsurprisingly, transforms. The FFT is described in general and a good example shows how it is computed. Wavelet transformations are increasingly important and the book does a good job describing them. Wavelet transforms weren't known when I went to college, yet now my son uses them extensively in processing seismic signals. The ten pages devoted to the subject give a sense of what they are, when they should be used, and how they differ from Fourier transforms.
Filters are covered superficially. More detail would improve the book as they are so critical in many applications.
The rest of the volume is old stuff to old hands. Scientific notation and binary numbers are part of your DNA and there's nothing new about them. An appendix on dBs is quite complete and worthwhile if you're not comfortable thinking in logs. I wish the person on the street would read it and realize that "100 dB" is meaningless since dBs are always referenced to some value. Of course, these are the same folks who are giving 110% in football.
Interestingly, there's nothing on DSP processors. This is a book about signals, not implementations.
The book is very well written with a quick, breezy style. Most engineers will get through it in an hour. My only complaint is that the $39.95 list price is an awful lot for a 188 page tome. Amazon lists it at $28, or $17 for the Kindle edition.
This is a book for the vaguely-techie who needs just a bit more than a little familiarity with the subject. DSPers working for a non-technical boss should slip a copy under the bigwig's door. It's also probably the best work for a practicing engineer who wants a passing familiarity with the subject, as it is such an easy read. Early chapters will seem like CS101; skip the stuff you know but do look at every page as there are gems buried even in that material that might surprise.
The Existential Pleasures of Engineering, Samuel Florman
This review was published in May, 2011
A number of readers have written recommending The Existential Pleasures of Engineering by Samuel Florman. Normally one doesn't hear the words "existential" and "engineering" in the same paragraph... or even the same book!
This is an old book first released in 1976 and updated in 1994 so it's a bit dated and is very much a reaction to the anti-establishment 60s where a distrust in technology burned. (Today I'd argue that rather than distrust there's a complete lack of understanding about technology and science, which is arguably worse). And a lot of the context is in terms of the new (at the time) rise of the environmental movement, heralded by Silent Spring and President Nixon's creation of the EPA.
He paraphrases philosophers of technology who "reserve some of their disdain for the activity [engineering] itself, and for the men who make it their life's work. As seen by the antitechnologists, engineers and scientists are half-men whose analysis and manipulation of the world deprives them of the emotional experiences that are the essence of the good life." Hmm. I've yet to meet Mr. Spock despite nearly four decades as an engineer. But that sure does make a fun sound bite.
Mr. Florman discusses the golden age of engineering, which he pegs from 1850 to 1950. Certainly in the 19th century engineers were the rock stars of the age, their glories trumpeted widely and their ambitions admired by all. He worries that the field has been in decline since then. I'm not so sure; 1950 signaled, roughly, the birth of digital computing which has accomplished marvels that could not have been even dreamed of. Even civil engineering, which one would think had reached a stable maturity, has blossomed with creations like the stunning Puente de la Unidad bridge that offers a slender and functional beauty unknown a century ago.
I found his railing against other peoples' perceptions somewhat tiresome, while agreeing completely with him. But some of the writing is nearly poetic: "My proposition is that the nature of engineering has been misconceived. Analysis, rationality, materialism, and practical creativity do not preclude emotional fulfillment; they are pathways to such fulfillment. They do not reduce experience, as is so often claimed, they expand it. Engineering is superficial only to those who view it superficially. At the heart of engineering lies existential joy."
I suspect a lot of engineers won't care for this book. After all, who cares what people think about us? But underneath Mr. Florman's concerns is a paean to engineers, and to the work we do. And for that I enjoyed the book, and recommend it to those who find the history and philosophy of this field compelling.
Extreme Programming Explained, Kent Beck
Software engineering is a field that seems to proceed in fits and starts. Most of us write code the same way we did back in college, though occasionally a new approach does come along. I'd count Fagan Inspections as one, OOP another.
In the last couple of years, though, Kent Beck's Extreme Programming (XP) has surfaced as another interesting approach to writing code. And code is the operative word. XP starts with the requirements in the form of user stories. The customers deliver and prioritize the user stories. The developers analyze the stories and write tests for them.
Everything ends with code. The code is developed in pairs of developers to increase quality. Quality code is the goal, and that's obtained by constantly rewriting it (refactoring in XP lingo), pair programming so two pairs of eyes look at it all, and constant testing/integration. The output is clean, clear code that fulfills the customer's wishes, with no extra frills or hooks for extensibility.
One book that does a great job of describing XP is Kent Beck's Extreme Programming Explained (ISBN 201616416), a slender but complete $29.95 volume.
I sometimes find these sorts of books tiresome. An evangelist pushes what some might see as a wild-eyed new way to create software, while the evening wears on and my interest flags. This one is different. Between the writing style and the quite fascinating ideas behind XP I found the book compelling.
XP requires a customer who lives on-site, constantly providing feedback to the development team. A very cool idea. Practical? I have doubts, especially in the embedded world where so many of us build products used by thousands of disparate customers. But a cool idea nonetheless.
XP demands conformance to a coding standard. Excellent! The pair programming I'd find a little too 'in your face', but is an interesting concept that builds on the often-proven benefits of code inspections, though in my experience two pairs of eyes are not enough.
XP teams focus on validation of the software at all times. Programmers develop software by writing tests first, then software that fulfills the requirements reflected in the tests. Customers provide acceptance tests that enable them to be certain that the features they need are provided. There's no such thing as a big integration phase. This is the XP practice I find most brilliant. Even if you're not going to pursue XP, study it and take the testing ideas to heart.
Constant testing plays into the 'frequent releases' XP requirement. Don't build a big thing and then dump it on a customer. Release portions of the project often to get feedback. This is much like the spiral development model, which seems to offer the only practical hope to meet schedules. Of course, neither spiral nor XP development promises that we'll know a real delivery time at the project's outset; instead, we evolve the product and schedule together. Most managers can't accept that.
Finally, I'd imagine most of us would quickly buy in to XP's belief in 40 hour work weeks. Tired programmers make mistakes. Go home!
Extreme Programming Refactored, by Matt Stephens and Doug Rosenberg
Like Martin Luther's 95 Theses, Matt Stephens and Doug Rosenberg's new book 'Extreme Programming Refactored' (Springer-Verlag, NY NY 2003, ISBN 1-59059-096-1) lifts the hood on the hype and exposes the problems that come with XP.
Just as educated Christians should read what's available of the Talmud (at least, the little that's been translated into English) to understand better an important and interesting part of our world, all educated developers should go dig through a couple of XP tomes. And then read this book, which in the Agile spirit I'll acronym to XPR.
It's the most infuriating programming book I've read. The message is spot-on, but is told in such an awful manner that it's sometimes hard to hear the reasonable thoughts for the noise. Like the lyrics to 40 (I counted) annoying XP-bashing songs littered randomly in every chapter.
Sometimes witty, it's often entertaining in the manner of the National Enquirer or a car wreck. Though the authors repeatedly express dismay at how XP zealots attack their doubting Thomases, XPR wages near-war against the XP personalities. An entire chapter belittles the opposition's personas. A special overused icon warns the reader of yet another tiresome bout of sarcasm.
XPR carefully and correctly demonstrates how all 12 of XP's practices are interrelated. Drop one and the entire game falls apart like a house of cards. Testing is the only defense against poor specs; pair programming an effort to save the code base from a poorly thought-out, frantically hacked-together creation. The book is worthwhile for this analysis alone. The XPers don't stress how vital all 12 steps are to success on a project.
Yet the authors, in the few demonstrations of failed XP projects they present (no successes are noted), sheepishly admit that none of these programs were built using an unmodified form of XP. All used subsets - the very approach XPR demonstrates cannot succeed. So the credibility of these examples suffers.
A sidebar cleverly titled 'The Voice of eXPerience' quotes disgruntled programmers who used (subsetted) XP. Actually, I think there are two programmers quoted, the same ones over and over. One pontificates: 'My feeling is that XP wouldn't score highly at all when compared to other available principles'. That may be true - but it isn't a very convincing proof.
The authors do miss a couple of other arguments that indict XP-like development processes. The Agile community calls the test strategy XP's 'safety net'; they say it ensures bad code never makes it to the field. Yet study after study shows tests don't exercise all of the software - in some cases less than half! I'd argue that tests are the safety net that catch problems that leak through the code inspections, design checks, and careful design. In the embedded world, the automated tests required by XP are devilishly hard to implement, since our programs interact with users and the real world.
XPR completely ignores embedded systems, rather like, well, rather like every other software book. One anti-XP argument for an embedded project is that without some level of up front design you can't even select the hardware. Do we need an 8051 or a PowerPC? Is data trickling in or gushing at 100k samples per second?
XPR concludes with a modified version of XP that's less eXtreme, more logical, and better suited to firmware development. That chapter is the best part of the book.
Now don't get me wrong - I do believe there are some programs that can do well with XP. Examples include non-safety critical apps with rapidly changing requirements that simply can't be nailed down. Web services come to mind. I know of one group that has been quite successful with XP in the embedded space, and numerous others who have failed.
Should you read the book? If the siren song of XP is ringing in your ears, if pair programming sounds like the environment you're dying to work in, read XPR today. Others wishing for a balance to the torrent of pro-XP words flooding most software magazines will find this book interesting as well. If it had been a third as long, without the revisionist Beatles lyrics, and, well, more polite, it would deserve 5 stars.
Feature Driven Development, A Practical Guide to, by Stephen Palmer and John Felsing
The last decade or so has been an exciting time to be in software development. Hardware design has, in my opinion, lost some of the fun now that ICs are so dense and speeds so high. But the software world has been flooded with new ideas and methodologies. Some are brilliant, others goofy, but all are fascinating.
One is Feature-Driven Development (FDD). Do read 'A Practical Guide to Feature-Driven Development', by Stephen Palmer and John Felsing (ISBN 0-13-067615-2, Prentice Hall, 2002), which is a readable treatise on this important topic.
Feature-Driven Development (FDD) is a relatively agile development methodology that is, in my opinion, much more suited to most embedded efforts than techniques like eXtreme Programming. XP is an interesting idea with lots of fabulous concepts we should steal, but I'm concerned about how XP shortchanges design. FDD requires considerable initial design, yet preserves much agility in the feature implementation phase.
As an aside, a new article, People Factors in Software Management: Lessons From Comparing Agile and Plan-Driven Methods (by Richard Turner and Barry Boehm), gives a quite good analysis of where Agile methods fit in the spectrum of projects. The quick summary: Agile methods are best if you have lots of superior people, a project whose reliability isn't critical, quickly-changing requirements, a small team, and a culture that thrives on chaos. UPDATE: Crosstalk no longer exists, which is sad as it had a lot of great articles. Here's an article on what happened to it.
FDD, too, requires above average to superior developers. That seems to be a characteristic of most new methods. Where do all of the average and below-average people go? Obviously, simple math tells us an awful lot of developers won't score in the superprogrammer category.
FDD has a Project Manager who owns the project and acts as a force field to shield the developers from interruptions and administrivia. Day to day direction falls to the Development Manager, a person endowed with both people and technical skills.
A Chief Architect is responsible for the system's overall design.
Chief Programmers own feature sets and lead small teams implementing the code. They rely on Class Owners - the actual workers cranking out software. Unlike XP, where everyone owns all of the code, in FDD the Class Owner is responsible for a particular class.
FDD has 5 processes. The project starts with an overall design, called a 'domain object model'. From there a features list is constructed. A plan is made, grouping features into sets which are assigned to Chief Programmers.
The fourth and fifth processes comprise the project's engine room. A Chief Programmer selects a set of features that can be implemented in two weeks or so. He or she designs each feature, creating a list of classes which are designed and implemented. This closely resembles Gilb's well-known Evolutionary process, which focuses on 'time-boxing' a schedule - that is, figuring out what can be done in a two week timeframe, implementing that, and iterating.
The book includes a 10 page cheat-sheet that details each part of FDD. It's a handy guide for outfits just embarking on a new project using the methodology.
The book has frequent sidebars featuring a dialog between an experienced FDD developer and one just embarking on a project using the technique. I found this distracting and not terribly enlightening. And the authors push TogetherSoft's product just enough to be annoying.
But these are minor complaints. Unlike some programming books that are long on passion while shortchanging substance, this volume gives a clear introduction to FDD, with enough 'how-to' to use the method in your own projects. Highly recommended.
The GE Transistor Manual
In Ogilvy On Advertising David Ogilvy claims there are three magic words in the ad world. Put one in the headline of your ad and people will read it. The magic words are new, free, and sex.
But the one sure way to get my attention when flipping through a magazine is to have a schematic diagram. Any kind. A radio. Vacuum tube circuits. Logic. Piles of op amps. For some reason I find schematics arresting and always stop and take a closer look.
Clicking around the web recently I stumbled across some vacuum tube sites, which brought back fond high school memories of building tube ham radio transmitters. Everyone relied on the RCA Vacuum Tube Handbook as the bible for specs on the parts. Wouldn't it be cool to find an old copy? And wouldn't that be utterly pointless? That thought morphed to memories of the other indispensable tome of the time: The GE Transistor Manual. Not too many clicks later and one was on its way here.
I ordered the 1964 version. What's astonishing is how much was known about transistor theory by that date; transistors had been in common use for just a handful of years at the time. And this is the seventh edition! Yet it explains transistor theory in a level of detail that my college classes almost a decade later never approached. Read - and understand - the first 170 pages and you'll be a transistor expert. But no attempt is made to make the subject easy.
The price on the cover is $2, though it cost me, used, $6.98. Alas, the one that arrived is the "light-weight edition," a 594 page subset of the full-blown one I remembered. The light-weight version is missing all of the detailed specs of the transistors GE once made.
The GE Transistor Manual was, and still is even though it has been out of print for generations, one of the best compendiums of information about designing transistor-based circuits. Part of its appeal was that it's just stuffed with schematics of every conceivable kind of circuit. One can get lost for hours and days studying the cool ways the authors crafted designs with an astonishing economy of parts. It's engineer porn, graphic illustrations that makes one's heart beat a little faster as one furtively flips from page to page, mostly not reading the "story" but gazing deeply at the pictures.
Old timers will remember the unijunction transistor. There's an entire chapter dedicated to its use. These were used in timer circuits in the pre-555 days. UJTs are still available, though it has been a very long time since I've seen one in use.
But there's no discussion at all about FETs, which today represent, to a first approximation, 100% of all of the quadzillion or so transistors made every year. Though FETs existed at the time, they enjoyed little commercial success, and even into the 70s were seen as niche products. Their exclusion from this book suggests that GE did not make any at the time.
Some of the components discussed are obsolete. Or, at least I thought they were till checking the web. Stabistors, for instance, were low-voltage zener diodes, but it seems these are still available, and one can even get them in modern SOT packages. Are SNAP diodes still around? There's a good description of them in the book.
Those who enjoy tech nostalgia - or schematics - will get a kick out of the book. If you want a deep look into transistor theory and use, this is a great resource. Copies can be found on Amazon.
Guidelines for the Use of the C Language in Vehicle Based Software, by MISRA
Frequent contributors to the comp.arch.embedded newsgroup sometimes refer to the MISRA (Motor Industry Software Reliability Association) publication 'Guidelines For the Use of The C Language in Vehicle Based Software'. As one interested in firmware reliability (is that an oxymoron?) I wanted to check out this publication, but was frustrated by its unavailability on the net. So I ordered a copy from England (35 pounds for overseas shipments) through the web site (http://www.misra.org.uk). The PDF is available there for 15 pounds, and is a bargain.
While C is indeed a very powerful language, it should come with a warning label: 'danger: experts only'. It's so easy to create programs that leak memory, run pointers wildly all over memory, or create other difficult-to-find havoc.
The MISRA standard, a collection of 127 coding rules, tries to prevent problems by limiting the types of C constructs we use, and defining safe ways to use others.
Quite a few of the MISRA rules make tremendous sense: don't redefine reserved words and standard library function names. Document and explain all uses of #pragma. When a function may return an error, always test for that error. Functions should have a single exit point.
Some are interesting: never use recursion. Keep pointer indirection to no more than two levels.
A couple are hard but possibly quite valuable: check every value passed to every library routine. Avoid many common library functions.
Others are trivial: only use characters defined by the ISO C standard. Don't nest comments. Write code conforming to ANSI C. Don't confuse logical and bitwise operators. Don't have unreachable code.
Some of the requirements I find disturbing. For instance, one rule prohibits the use of dynamic memory allocation. Not a bad idea, due to problems associated with fragmentation. But there are alternatives to malloc/free that still give us the benefits of dynamic memory allocation without the pitfalls. More problematic, this rule tells us not to use library functions which employ dynamic memory, specifically mentioning string.h. This seems awfully restrictive to me - I sure don't want to write my own string handlers - and further, how is one to identify the suspect libraries?
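One such alternative is a fixed-block pool: carve memory into equal-sized blocks up front, so there is nothing to fragment and allocation runs in constant time. Here's a minimal sketch - in Python for brevity, though a real firmware version would of course be C, and the block size and count are made-up values:

```python
# Minimal fixed-block pool: every block is the same size, so the pool
# can never fragment, and alloc/free both run in constant time.
class BlockPool:
    def __init__(self, block_size=32, num_blocks=8):
        self.storage = [bytearray(block_size) for _ in range(num_blocks)]
        self.free_list = list(range(num_blocks))   # indices of free blocks
        self.index_of = {id(blk): i for i, blk in enumerate(self.storage)}

    def alloc(self):
        if not self.free_list:
            return None        # pool exhausted; caller must check for this
        return self.storage[self.free_list.pop()]

    def free(self, block):
        self.index_of[id(block)]   # raises if block isn't from this pool
        self.free_list.append(self.index_of[id(block)])
```

Because every block is the same size there's nothing to coalesce, so worst-case timing is knowable in advance - exactly the determinism a safety standard is after.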
Another rule prohibits the use of setjmp and longjmp. These are worse than gotos, of course, in that they let us jump clear across function boundaries. Yet in a few cases longjmp is almost unavoidable.
The MISRA standard has rules that will surely raise some people's hackles. If so, subset it. Even a subset goes a long way toward improving the reliability of C code.
Ham Radio for Dummies, Ward Silver
Ward Silver's Ham Radio for Dummies appeared in my in-box recently. Published in 2004 by Wiley it's a moderately hefty 360 page introduction to the world of Amateur Radio (aka "ham radio.")
For those not in the know, Amateur Radio is a means of communicating world-wide with surprisingly sophisticated equipment using a vast array of frequencies. It's internationally regulated; all hams must have a license which comes only after passing a test.
Ham radio is sort of out of the purview of embedded systems, but this hobby pushed many of us into the world of electronics and computers. I've had a license for many decades; as a teenager building (vacuum tube!) radios I learned an awful lot about electronics. For me designing and building equipment was more fun than chatting with other hams. But that's ham radio's appeal. There are many different facets to the avocation.
First I have to admit that the "For Dummies" books irritate me. I've spent a lifetime studying many subjects and may be uneducated on some, but never consider myself a "dummy." A title like "For Novices" or "An Introduction To" is a bit more seemly, yet for some reason these dummy books have a wide appeal.
This is a book for rank novices - not dummies, but for people who are interested in the hobby but just don't know where to go to learn more. Though the ARRL, the ham radio advocacy group, (http://www.arrl.org) does offer lots of useful information, this book packages the data in a more convenient form than any other publication I know of. The author does a superb job of describing what the hobby is all about. In fact, perhaps half the book discusses different aspects of ham radio. Did you know you can run your own TV station? Mr. Silver shows how. How about radioteletype, moonbounce, or other operating modes? This book gives an overview of each, with good links for more information.
It's peppered with amusing anecdotes and cartoons. The writing is lively and non-technical, easy enough for anyone to grasp.
You can't operate as a ham without a license and Mr. Silver clearly describes the testing process, as well as the different kinds of licenses available. This is not, however, a test preparation manual. You'll need other books, such as those at http://www.arrl.org/ . Thus the book is totally tech-free.
What's the test like? In the US there are 35 or 50 multiple choice questions. Get 75% and you pass. Questions are both technical (electronics) and regulatory (the operating rules). Trust me: it's not hard to pass, especially using the aforementioned study material.
Though there is a license that doesn't need Morse code, any serious operator will want a license with more privileges. That requires passing a code test at 5 words per minute, 25 characters a minute or about two seconds per character. This requirement has been substantially downgraded from the 13 or 20 word per minute test of just a few years ago. With a little study and practice 5 WPM is a breeze. Update: In the USA there is no longer a code requirement.
One strength of the book is that Mr. Silver clearly explains actual operating procedures in a fashion that's more engaging than the ARRL publications.
Even after 35 years as a ham I didn't know about beacons used to check radio propagation (covered on page 101). And he discusses the digital modes which are all the rage today, with which I have no experience and therefore learned a few things.
Ham radio exists in a very different environment than when I first became interested in the hobby. It was relatively easy to build a rig when radios had only a handful of vacuum tubes. Today's multimode transceivers are packed full of surface-mounted ICs. It's harder to build this sort of equipment in a typical home shop. Yet there are still sources for kits and equipment, and a surprising number of hams build their own gear, especially "QRP" (very low power) gear. This book has a list of companies that sell kits.
Appendix B, a list of links and other sources, is invaluable.
The "bible" of ham radio is the ARRL Handbook for Radio Communication, which has a mediocre introduction to the hobby, but is fantastically complete in electronics and radio theory, coupled with plenty of build-it projects. Ward Silver's Ham Radio for Dummies fills the introductory niche left blank by the Handbook.
The Hands-on XBEE Lab Manual, Jonathan Titus
The late twentieth century saw the birth of connectivity. From simple computer networks the Arpanet grew into the Internet. At first the playground of academics, nearly everyone is now connected to the 'net, and therefore to everyone else and to vast amounts of data.
The 21st century has seen the dawn of the wireless era. Cell phones are no longer just telecomm devices; they're part of the vast Internet. Now we're all connected all of the time, in the office or at the beach. Some of us use a smart phone more for data communications than voice transmissions. Indeed, that mobile device may have four or more antennas: one with a link to satellites in space for navigation, another to the 3G network, a wi-fi connection and Bluetooth for near-field links to headsets and the like.
Radio data communications surrounds us, from messages sent to the smart sign hanging over the highway to science instruments transmitting their findings from the remote Arctic to a lab in some pastoral setting via a satellite uplink.
Why have a Bluetooth link from the phone to a headset? A wired approach is a nuisance. It's bulky, in the way, and snags on things. Electronics is cheap; connectors are not, and the wireless version likely saves money and is more reliable. Messing with a tangle of cables in a lab or even with your PC is awkward. It won't be long before those all go away.
It's much more complicated to establish a link over an RF connection than with a wire, but smart hardware costs little today, and canned software is increasingly available. One popular option is Digi International's line of XBee prepackaged radio modules. You don't need to understand the nuances used, like direct-sequence spread spectrum coding or offset quadrature phase-shift keying, because those details are all taken care of by the modules.
You do need to understand how to use and interface to the modules, which is not made any easier by the terse and sometimes cryptic manuals. And XBee also uses the old AT command set, which is increasingly hard to find information about. For these reasons Jon Titus's The Hands-on XBEE Lab Manual is invaluable.
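To give a flavor of what the manuals bury: besides the transparent AT mode, the modules offer an API mode in which every message travels in a framed envelope with a checksum. A sketch of building such a frame, in Python for illustration (the payload here is a hypothetical AT-command frame; real firmware would do this in C over a UART):

```python
def xbee_frame(frame_data):
    """Wrap frame data in the XBee API envelope:
    0x7E start delimiter, 16-bit big-endian length, data, checksum."""
    length = len(frame_data)
    # Checksum is chosen so that data + checksum sums to 0xFF
    checksum = 0xFF - (sum(frame_data) & 0xFF)
    return (bytes([0x7E, (length >> 8) & 0xFF, length & 0xFF])
            + bytes(frame_data) + bytes([checksum]))

# Example: an AT command frame (type 0x08, frame ID 0x01)
# querying the node identifier "NI" (0x4E 0x49)
frame = xbee_frame([0x08, 0x01, 0x4E, 0x49])
```

A receiver verifies a frame by summing the data bytes plus the checksum and checking for 0xFF - a cheap integrity test well suited to small micros.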
We learn in different ways. For many of us a hands-on approach is much more efficient than slogging through data books. Jon has taken that approach here, in a series of experiments designed to teach by doing. From the very beginning you'll be putting together components that make something neat happen. Early experiments emulate a single-direction wire: a receiver module's output pin mirrors the input pin on the transmitter. Look Ma - no wires!
Each lesson is progressively more complicated and useful. Send analog data through the ether. Control multiple XBee modules. Connect other embedded components, like an Arduino Uno or ARM mbed board to the XBee modules. That, of course, is really the basis of embedded wireless networking.
Explore personal-area networks. These are self-assembling communications links where the network discovers at run time which XBee modules are operating. Jon shows how to do this, and how to piece the network together. Along the way you'll learn to handle interrupts, a crucial concept in the world of embedded systems.
Complex stuff. But fear not: Jon shows every step of each experiment. The lavish illustrations leave no chance for confusion. Whether it's a screen shot of a communications tool or a drawing of how to connect a LED, nothing is left out.
Welcome to the second decade of the 21st century, the age of wireless data communications. This book is your essential guide to using XBee modules to toss off the shackles of wires.
High Integrity Software, John Barnes
'High Integrity Software' - the title alone got me interested in this book by John Barnes. Subtitled 'The SPARK Approach to Safety and Security', the book is a description of the SPARK programming language's syntax and rationale.
The very first page quoted C.A.R. Hoare's famous and profound statement: 'There are two ways of constructing a software design. One way is to make it so simple that there are obviously no deficiencies. And the other way is to make it so complicated that there are no obvious deficiencies.' This meme has always rung true, and is one measure I use when looking at the quality of code. It's the basis of the SPARK philosophy.
What is SPARK? It's a language, a subset of Ada that will run on any Ada compiler, with extensions that automated tools can analyze to prove the correctness of programs. As the author says in his Preface, 'I would like my programs to work without spending ages debugging the wretched things.' SPARK is designed to minimize debugging time (which averages 50% of a project's duration in most cases).
SPARK relies on Ada's idea of programming by contract, which separates the ability to describe a software interface (the contract) from its implementation (the code). This permits each to be compiled and analyzed separately.
It specifically attempts to ensure the program is correct as built, in contrast to modern Agile methods which stress cranking out a lot of code fast and then making it work via testing. Though Agility is appealing in some areas, I believe that, especially for safety-critical systems, a focus on careful design and implementation beats a code-centric view hands down.
SPARK mandates adding numerous instrumentation constructs to the code for the sake of analysis. An example from the book:

procedure Add(X: in Integer);
--# global in out Total;
--# post Total = Total~ + X;
--# pre X > 0;
The procedure definition statement is pure Ada, but the following three statements are SPARK-specific tags. The first tells the analysis tool that the only global used is Total, and that it's both an input and output variable. The next tag tells the tool how the procedure will use and modify Total. Finally a precondition is specified for the passed argument X.
Wow! Sounds like a TON of work! Not only do we have to write all of the normal code, we're also constructing an almost parallel pseudo-execution stream for the analysis tool. But isn't this what we do (much more crudely) when building unit tests? In effect we're putting the system specification into the code, in a clear manner that the tool can use to automatically check against the code. What a powerful and interesting idea!
And it's similar to some approaches we already use, like strong typing and function prototyping (though God knows C mandates nothing and encourages any level of software anarchy).
There's no dynamic memory usage in SPARK; not that malloc() is inherently evil, but because use of those sorts of constructs can't be automatically analyzed. SPARK's philosophy is one of provable correctness. Again - WOW!
SPARK isn't perfect, of course. It's possible for a code terrorist to cheat the language, defining, for instance, that all globals are used everywhere as in and out parameters. A good program of code inspections would serve as a valuable deterrent to lazy abuse. And it is very wordy; in some cases the excess of instrumentation seems to make the software less readable. Yet SPARK is still concise compared to, say, the specifications document. Where C allows a starkness that makes code incomprehensible, SPARK lies in a domain between absolute computerese and some level of embedded specification.
The book has some flaws - it assumes the reader knows Ada, or can at least stumble through the language. That's not a valid assumption anymore.
I found myself making hundreds of comments and annotations in the book, underlining powerful points and turning down corners of pages I wanted to reread and think about more deeply.
A great deal of the book covers SPARK's syntax and the use of the automated analysis tools. If you're not planning to actually use the language your eyes may glaze over in these chapters. But Part 1 of the tome, the first 80 pages which describes the philosophy and fundamentals of the language and the tools, is breathtaking. I'd love to see Mr. Barnes publish just this section as a manifesto of sorts, a document for advocates of great software to rally around. For I fear the real issue facing software development today is a focus on code uber alles, versus creating provably correct code from the outset.
High Speed Digital Design, Howard Johnson and Martin Graham
Every embedded hardware designer simply must read High Speed Digital Design (a Handbook of Black Magic) by Howard Johnson and Martin Graham (1993 PTR Prentice Hall, NJ). Though the book will challenge you if your grasp of theory is rusty, it's worth reading even if you must skip the math.
Modern components are so fast that even slowly-clocked systems suffer from all sorts of speed problems. This book leaves no stone unturned in the quest for reliable digital designs.
The authors cover transmission line theory in detail. At first glance I shuddered, remembering with no joy two incomprehensible semesters of electromagnetics. Johnson and Graham balance theory with lots of practical information. For example, a right angle bend on a PCB trace is a transmission disaster... that you can cure simply by rounding the edges of the track.
Most of us vaguely know that corrupting a PCB ground or power plane is not a good thing to do. Yet we sometimes yield to temptation when that board simply will not route on 6 layers, and run a couple of tracks on the plane. In a few paragraphs this book shows why this is a horrible idea, as the current return for any track runs under the track itself. A slot etched in the ground plane, to allow the routing of tracks, may block a return path. Current will flow around the slot, greatly increasing the path's inductance. Even designers with the best of intentions may accidentally create this situation by poorly spec'ing out hole sizes for connectors. If the holes are too large, they may intersect, creating a similar, though unintended, slot.
What's the best way to stack layers on a PCB? The book includes an entire chapter about this, though I would have liked to see more discussion about how signals couple with different stack configurations.
Vias, too, get their own chapter. There's lots of good advice. The best sound bite is that small vias are much faster than larger ones. Small sure helps routing as well, especially with SMT boards, so there's a ray of hope for us yet!
One of the biggest challenges faced by digital designers is propagating signals off-board through cables. A chapter about this subject is worth the price of the book alone. Ribbon cable is far better than I realized, especially when you run grounds as the authors recommend.
What's the best way to use a scope on a high speed system? What is the effect of that short little ground wire coming from the probe? It turns out that the 3 inch ground lead can degrade the displayed risetime by more than 4 nsec! The authors offer the best description of scope probe problems, and solutions, I've ever seen. They show how to build a better probe using parts found in any shop.
Did you know that skin effect, the tendency of high frequency signals to travel only in the outer edges of a conductor, can become important on PCB tracks at frequencies as low as 4 MHz? Halving the length of a conductor improves its frequency response by a factor of 4. Until reading this book I was under the impression that only RF designers needed to worry about this effect.
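That 4 MHz figure is easy to sanity-check with the standard textbook skin-depth formula (this computation is mine, not the book's): skin depth falls as one over the square root of frequency, and for copper it drops to roughly the thickness of 1-ounce PCB foil - about 35 microns - right around 4 MHz, which is why the effect starts to bite there.

```python
import math

RHO_CU = 1.7e-8          # resistivity of copper, ohm-meters
MU_0 = 4e-7 * math.pi    # permeability of free space, henries/meter

def skin_depth(freq_hz):
    """Skin depth for copper: delta = sqrt(rho / (pi * f * mu))."""
    return math.sqrt(RHO_CU / (math.pi * freq_hz * MU_0))

depth = skin_depth(4e6)  # roughly 33 microns at 4 MHz: comparable to
                         # 1 oz copper foil, so current crowds toward
                         # the surface of the trace
```

Above that frequency the current no longer uses the full cross-section of the trace, so resistance climbs with the square root of frequency.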
Read this book. Pass it along to your PCB designers. Then, read it again.
How Computers Do Math, Clive "Max" Maxfield and Alvin Brown
Clive "Max" Maxfield and Alvin Brown have written a wonderful book called "How Computers Do Math" about the essential workings of computers. All of Max's writings are entertaining and offbeat (e.g., "Bebop to the Boolean Boogie").
The book is aimed at people starting out in computers; we embedded experts know this stuff cold. But an interested 15 year old could get truly in-depth insight into the mysteries of computing from this volume.
It's a very readable book laid out with easy-on-the-eyes formatting and a plethora of clear illustrations. The illustration of a LIFO stack just booms clarity. Chapters start with relevant and often amusing quotes; one of my favorites is Lewis Carroll's "The four branches of arithmetic: ambition, distraction, uglification, and derision."
Quickly page through the book and you'll be puzzled by its organization. The first 55 pages (out of 450) comprise its ostensible meat. The rest are labs for each chapter, a series of problems the authors pose to illustrate important concepts. They nudge you through the solutions - there are no proofs left to the confused student.
The labs are very well-written accessible activities in which the authors take the reader along hand-in-hand. They're a bit insidious: work through them and the reader will become a reasonably competent assembly-language programmer, without realizing he's learning one of the more difficult aspects of programming. There's a perverse genius in covertly slipping assembly language into one's head without pain.
The authors' sure hands guide one along each lab, with descriptions and demonstrations till the code that's required is almost anticlimactic: "of *course* it must be like this!"
Where too many computer books have a dreary chapter about number systems, "How Computers Do Math" covers the subject in an entertaining and very complete fashion. From basic binary math they go on to show how one constructs an adder out of gates. Signed, unsigned, multiplication, rounding (9 different approaches!), BCD - it's all there, and it's all extremely comprehensible.
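The gate-level adder they build up is the kind of thing that's fun to poke at in code. Here's a sketch - the gate decomposition is the standard one, and the 8-bit width is an arbitrary choice of mine:

```python
def full_adder(a, b, carry_in):
    """One bit position built from gates: two XORs, two ANDs, one OR."""
    partial = a ^ b
    sum_bit = partial ^ carry_in
    carry_out = (a & b) | (partial & carry_in)
    return sum_bit, carry_out

def ripple_add(x, y, bits=8):
    """Chain full adders, each stage's carry feeding the next."""
    carry, result = 0, 0
    for i in range(bits):
        sum_bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= sum_bit << i
    return result          # wraps modulo 2**bits, just like real hardware
```

Watching the carry ripple through the loop makes it obvious why wide adders in fast CPUs need carry-lookahead tricks: each stage waits on the one before it.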
The book is published by John Wiley & Sons, Hoboken, NJ, copyright 2005, and sells for $26 on Amazon.
If I Only Changed the Software, Why is the Phone on Fire?, Lisa Simone
Lisa Simone's If I Only Changed the Software, Why is the Phone on Fire? isn't the usual dreary work stuffed with arcane wisdom buried beneath paragraph-length sentences seemingly written by someone just starting with English as a second language. This is certainly the first embedded book with characters. The first with action, and with interesting and cool stories.
Bad code that makes a phone burst into flames?
This is a James Patterson-style fast-paced book with dialog as close to gripping as one can imagine for a computer book. Its uniquely-embedded focus twists together elements of hardware and software as we engineers do in our daily design activities. One can't be understood without the other. Code makes the hardware smoke. That's unheard of anywhere but in the embedded industry.
Lisa weaves stories around deep technical issues. She's teaching the way humans have learned for 10,000 years. Most of us fought off sleep with varying levels of success in high school history classes. Who really wants to memorize the date of the First Defenestration of Prague or the name of Polk's vice president?
Yet now as adults we eagerly consume historical fiction (like James Michener) and real history assembled as a story (consider David McCullough). Cro-Magnon Grog taught his sons to avoid poisonous berries by telling them of uncles who died; the Old Testament was passed down orally as a collection of stories rather than a recitation of facts. Not properly casting an unsigned char sounds pretty dull, but when captured as a story, the interaction of people puzzling out a problem in a real-life setting we all identify with, we're engaged and learn the important lessons better.
Lisa shows how people are part of the solution and part of the problem. The concept draws on an oft-neglected axiom of the agile methods: people over process.
Despite the stories and character development, this is a textbook of a sort. There's homework. When Lisa asks you to stop and answer a question, do so! Think. Reflect. Surely Grog asked his sons questions to make them consider the lessons he imparted. We learn best by such interaction. Readers of Watts Humphrey's brilliant yet ineffably dull "A Discipline for Software Engineering" either do the homework and see their skills skyrocket; or read the book, skip the homework, and get no benefit at all.
Buried under the lessons Lisa derives an important zeitgeist, a design pattern if you will, that should guide us in our work. It's one of creating readable work products: use cases, comments, requirement documents, and more. Though we need not emulate her use of story development in writing a report, we should and must abandon our traditional use of tortured English. Write interesting documents. Be lively and engaging. After 2000 years it's time to leave Pollio's legacy behind and realize that if our readers are confused, frustrated or bored by what we produce, we're history.
There's more on her blog at http://www.lisaksimone.com/phoneonfire .
Introduction to the Personal Software Process, Watts Humphrey
The Software Engineering Institute (www.sei.cmu.edu) wages a war on poor software practices via their seminars, conferences, on-line materials, and their Capability Maturity Model (CMM).
The CMM, though, is a bitter pill to swallow. Without total commitment from the top of the organization on down it is doomed to failure, as the practices it entails are far from easy. Going the CMM route is surely as difficult and costly as getting ISO9000 certified.
Watts Humphrey, one of the architects of the CMM, realized that too many organizations will never be able to climb the rungs of the CMM ladder, yet are crying for ways to improve their software processes. His seminal work A Discipline for Software Engineering (1995 Addison-Wesley NY NY) outlined a process he calls the Personal Software Process (PSP) that lets us as individuals take charge of things, and improve the way we generate code, on our own, with no commitment from management.
Now he's followed that book with Introduction to the Personal Software Process (1997, Addison Wesley Longman, NY NY, ISBN 0-201-54809-7). Where the original book was long winded and filled with heady statistics, Introduction is practical, down to Earth, and filled with action plans. Introduction is the book to get if you want a step-by-step way to improve your estimation and coding skills. Humphrey claims that most engineers can achieve a 70% improvement - or better - from a 'one semester' exposure to the PSP.
I presume most people reading this have left 'semesters' long behind in a happily-forgotten past! However, as professionals we can never stop learning new things, even if management is unsupportive of our efforts. Humphrey's original book feels, smells, and reads like a conventional college textbook; this successor is more of an 'Idiot's Guide' to the PSP, and is much more accessible.
However, nothing important ever comes easily. In my experience it takes the average engineer who pursues the PSP on his or her own about 6 months of steady work, a couple of evenings a week, to master the concepts. Though this could be shortened considerably by management that makes a few hours during the workweek available, it's rare to find such enlightened bosses.
If your company won't give you the time to do the PSP, go after it yourself, at night. Shut down the TV for a few hours here and there; the benefits are too great to ignore. Use Humphrey's new book, Introduction, as it's so much more tractable than the first.
But this book and process is not a cakewalk. If you're not willing to put some serious hours into it, don't buy the book.