By Jack Ganssle
Code Inspection Book
I follow a number of companies that offer tools to embedded developers. We humans are tool builders; we engineers in particular create and use tools ranging from the physical (e.g., a wrench or logic analyzer) to the ethereal (software like compilers). Bereft of tools we'd be utterly unable to do our jobs. Just think of our helplessness when the power fails! We stare dumbly at each other for a minute or two before going out for ice cream, or a drink if it's late afternoon.
Now Smart Bear has published a book on code inspections which summarizes the results of 2,500 reviews covering 3.2 million lines of code at Cisco. Using Code Collaborator, which automatically generates and preserves metrics, the authors were able to learn a lot about lightweight reviews in a real-world setting. Too many studies are set in academic communities using undergraduate students on toy projects.
The authors rightly point out that Fagan, Gilb, et al. promote a highly formalized process that we're not supposed to tune much. Yet a Fagan inspection simply isn't possible in a two-person shop. And sometimes it's hard to match the Fagan approach to modern agile methods. Indeed, in eXtreme Programming, for better or for worse, the review takes place in real time as a pair of developers crank out code while sharing a single machine.
The book refers to a number of studies, some of which are relatively obscure. For instance, did you know that when reading a function developers repeatedly return to look at the variable definitions? The implication is that short-term memory doesn't hold a lot, so wise teams will insist that all functions fit on a single page. Then it's easy to glance up at the declarations without shuffling through paper or screens.
The Cisco study showed tremendous variability in inspection rates, for a lot of reasons. But engineers achieved the best results when inspecting at about 300 lines of code per hour or less. And after about an hour, review effectiveness plummets. We get tired.
Expect to find about 15 defects per hour.
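Those numbers lend themselves to a quick back-of-the-envelope plan. The sketch below is illustrative arithmetic only; the `plan_review` helper and the 1,500-line module are my own inventions, not anything from the book. It simply applies the cited rates: 300 lines per hour or less, sessions capped at an hour, and roughly 15 defects found per hour.

```python
import math

def plan_review(lines_of_code, loc_per_hour=300, defects_per_hour=15):
    """Estimate effort and yield for inspecting a body of code,
    using the rates reported in the Cisco study."""
    hours = lines_of_code / loc_per_hour
    sessions = math.ceil(hours)              # cap each sitting at one hour
    expected_defects = hours * defects_per_hour
    return hours, sessions, expected_defects

# A hypothetical 1,500-line module:
hours, sessions, defects = plan_review(1500)
# 5.0 hours of inspection, spread over 5 one-hour sessions,
# turning up on the order of 75 defects
```

The point of such a calculation isn't precision; it's that reviewing even a modest module is a multi-session commitment, which is worth knowing before the schedule gets built.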
Authors who "annotate" and explain the code before the review produce fewer mistakes. It's not clear from the book what "annotate" means or how it differs from decent commenting. But a clear explanation to someone else, presumably in written form, makes the author think more deeply and thus find his or her own problems before the review takes place.
It's a very well-written 164-page book that's a fascinating read. Even better, "Best Kept Secrets of Peer Code Review" is free! Go to http://smartbearsoftware.com/codecollab-code-review-book.php to order it. It's a physical volume, not a PDF (which would be a pain to read on-screen and annoying to print). Yes, it promotes their product, but the authors wisely relegated the sales hype to the last chapter. Do read that section carefully, too. Any tool that can help improve your processes and reduce defect rates is worth investigating.
What do you think? Have you read the book? Do you do code reviews of any sort?