By Jack Ganssle
An article in the April 15, 2004 issue of SD Times (http://sdtimes.com/news/100/story2.htm) suggests that computer science majors get little or no exposure to software testing in their college classes. A researcher at Gartner Labs complained that "The myth is that if you're not coding, you're not working." Several academics said that testing is low-paid work no one wants to do, so why spend good college dollars preparing people for such dead-end jobs?
So I surfed the computer science catalogs from a few universities. Some offer cool courses like artificial intelligence, programming robots (what fun!), and cryptology. Testing? None, zilch, nada. Debugging? Ditto.
The University of Maryland's catalog (http://www.inform.umd.edu/CampusInfo/Departments/InstAdv/UnivPub/ugradcat/0304/Chapter8.pdf) is typical. Only a single undergraduate course mentions "design". The word "requirements" appears only in terms of class prerequisites. Yet these subjects are the holy grail of software engineering. Given solid requirements and a detailed design, anyone can write the program. Given solid requirements and a detailed design, the coding is easily offshored.
Q/A and acceptance testing will generally remain a local effort; it's the last quality hurdle the software passes before the offshore vendor gets their final payment. So it seems the colleges are equipping students with skills easily replaced by low-wage competitors.
Fabulous software engineering can greatly reduce the amount of testing required. But the U of MD, like so many other schools, has only a single class on the subject. The bulk of their courses cover languages, data structures, and the like. The curriculum essentially skips all of the up-front and post-coding efforts in the software lifecycle. It's startlingly naïve.
Various studies and my own observations confirm that testing and debugging consume about half of a typical project's schedule. (I combine the two because debugging is the first set of checks used to isolate the most obvious problems; it's the beginning of the test process.) Coding eats only around 20%.
Testing makes most of us yawn. And that's a tragedy. Though Deming (http://www.deming.org/) taught us that you can't test quality into a system, it's still a critical part of building great code. Today there are all sorts of cool things happening in the test world. The DO-178B (http://www.validatedsoftware.com/cert_faq.html#Q.%20%20What%20levels%20of%20structural%20testing%20are%20required%20by%20DO-178B?) standard for safety-critical software, for instance, requires branch and decision coverage, which may mandate the use of special - and interesting - tools. The SPARK (http://www.praxis-cs.co.uk/sparkada/language.asp) language builds verification statements directly into the source code. SPARK is an Ada subset that's gaining some traction in military and space work, where failures cost megabucks and lives. eXtreme Programming, a novel if wacky approach to programming, brilliantly integrates test into the entire engineering process.
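To see why coverage criteria matter, here's a minimal sketch (my own illustration in Python, not drawn from DO-178B or SPARK; `should_alarm` is a hypothetical function). Two tests can exercise both outcomes of a decision while never showing that one condition, by itself, changes the result:

```python
# Hypothetical example: an alarm that trips when pressure is high
# AND the pressure sensor is reporting as healthy.
def should_alarm(pressure_high, sensor_ok):
    if pressure_high and sensor_ok:
        return True
    return False

# These two tests achieve decision coverage: the `if` evaluates
# both True and False.
assert should_alarm(True, True) is True
assert should_alarm(False, False) is False

# But neither test demonstrates that sensor_ok alone can flip the
# outcome. Stricter structural criteria (e.g. MC/DC, required at
# DO-178B Level A) force additional cases such as:
assert should_alarm(True, False) is False
```

The point isn't the toy function; it's that a coverage tool can tell you your two tests "covered" that `if` statement while a whole class of failures went untested.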
Why is software so bad? It suffers from rushed timetables, lousy design, and inadequate testing. I doubt we'll ever solve the schedule problem, but the latter two are both tractable and teachable.
If only they were taught.
What do you think? Did college teach you coding or real software engineering?