By Jack Ganssle
Microsoft's decision to let Windows XP auto-download updates makes a lot of sense. It's a great way to disseminate security upgrades and bug fixes. As a user I have to believe that my computer's operating system is cheating entropy and improving over time. But how do I know each downloaded patch is benign?
The answer is: trust Microsoft. Fact is, that's worked pretty well so far.
But now the NY Times (http://query.nytimes.com/gst/abstract.html?res=FB0B15FA3C5A0C758CDDA80894DB404482 - sorry, fee required) is reporting a new threat to software, especially to embedded systems. Though no specific instances have surfaced yet, many people worry that outsourced software, especially code created in developing countries, may contain maliciously installed security vulnerabilities.
Officials fear that organized crime, cyberterrorists, or simply rogue programmers may secretly infiltrate overseas software development efforts. Pakistan, the Philippines and Russia are mentioned as the biggest threats.
The company contracting for the project has no idea what's inside the delivered code. Most outfits figure that if it works, it's done; if it passes the tests, it's perfect. Yet sleeper code could lurk there, dormant until some trigger fires (a date, an input, an elapsed interval), and then unleash mayhem.
Currently most contract work takes place in India, where developers have a strong sense of the Right Way to build software. Inspections ensure that no vulnerabilities slip in. I'm sure the biggest of these outfits jealously guard their reputations by conducting very effective inspections. One can't help but wonder, though, whether the smaller, hungrier companies are as thorough. Are developers in other countries as disciplined? Does local management insist on a careful inspection process? Do the customers understand and audit the process?
You can almost define software as the one component of a system that's essential to its operation yet that no one ever sees. The open source movement is one powerful deterrent to code with back doors. But open source will never provide the security safeguards embedded systems need: the market is too fragmented and specialized. How many folks are going to read the code for your smart toaster?
Arup Gupta, head of an Indian contract software company, said "...we can guarantee, basically, that the code we deliver will be bug-free and will perform to specifications and will not have holes in it."
Bug-free code? That's patently absurd. Is the hole-free comment equally ridiculous?