The speaker lineup for the Embedded Online Conference is pretty amazing! Sign up with promo code GANSSLE90 and wonderful things will happen like reversal of male pattern baldness, a guest spot on Teen Vogue magazine, and a boost of what JFK called "vim and vigor." It will also get you $100 off the $190 registration fee (which goes up to $290 May 1).
By Jack Ganssle
What's your system's clock rate?
About a third of the firmware developers I poll can't answer this question. Yet in my twisted EE way of thinking the clock rate is a basic aspect of a system. Firmware people use that figure to program timers, serial channels and PWMs. Sometimes we're required to select a clock frequency as part of the setup procedure for our debugging tools. And it's tough to do performance estimations without some idea of the system's speed.
But the shifting landscape of embedded systems development has changed everything. The olden days of 4k programs are largely history. Reiner Hartenstein, in the April 2004 Proceedings of the First Conference on Computing Frontiers, claims that embedded software doubles in size every 10 months and will reach 90% of all software being written by 2010. Other numbers suggest that, today, firmware consumes 70% of a project's development time. It's clear that firmware is the foundation upon which civilization rests.
Developers are changing, too. In my firmware seminars I'll often ask how many attendees are EEs. A few years ago more than 50% answered "aye." Last week, for the first time, not a single hand shot up. CS and CE majors are slowly crowding out the EEs in software development.
In Controlling Software Projects, Tom DeMarco urged software developers to embrace specialization. Some people are terrible coders but great testers. Some fumble in debugging while others zap bugs lightning-fast. Isn't it silly that each engineer analyzes requirements, designs, codes, inspects, debugs, tests and documents, and then provides tech support? How many of us are really experts at so many diverse specialties?
Doctors have an arguably much less creative profession than we embedded people, yet you'd never go to a GP when confronted with a diagnosis of cancer or heart disease. Specialties abound: cardiologists, podiatrists, gynecologists, sports injury practitioners, dermatologists, ophthalmologists, plastic surgeons and many, many more. And that's not to mention the vast array of docs trained to work on, in, or around your brain.
The field of medicine is so vast that only by specializing can a doctor know enough about one single area to provide good care.
The embedded world is nearly as vast: networking (TCP/IP, I2C, CAN, Bluetooth, Zigbee, Wi-Fi and a hundred others), network management (SNMP and more), data acquisition, signal processing, C, C++, Java, any of a hundred different assembly languages, motor control, electromagnetics, PCB design, Verilog, ASIC design, and more. Each demands a high level of expertise that can take years to acquire. Any one could consume an entire career.
Yet in the embedded world resumes are expected to be acronym soup; we're specialists who are expert at everything. Realistic? Probably not.
IBM's Chief Programmer Teams of the 60s and 70s tried unsuccessfully to divide a team into a group of developers each practicing one specialty. Developers rebelled and the practice soon lost cachet. We like doing it all. I rather miss the early embedded days. It was a kick to design the hardware, tape out PCBs on great sheets of mylar (that was before CAD and autorouters), troubleshoot the boards while simultaneously writing assembly-language device drivers, and then implementing the rest of the firmware. Unlike the cardiologist who wearily sees the same handful of problems day in and day out, building an entire system means exercising a wide array of skills. There's never a chance to get bored.
But engineering is a business affair. Managers only care about the most efficient way to create embedded systems. Do we rely on broadly-educated developers doing all phases of the project, or specialize?
Windows CE and Linux have already made big inroads into this industry. For managers these desktop OSes remove the need for highly skilled embedded developers. Millions of competent desktop programmers can write code for an "embedded" system layered on top of Windows or Linux.
I predict that the embedded world will bifurcate. Real-time, hardware-intensive work will be marginalized by the use of abstracting tools like UML, Java/C++, canned protocol stacks and more. Companies will hire a few great traditional embedded people for the low-level work, using lower-paid IT-like programmers to build the bulk of the code.
In my opinion, such specialization will save the company money, but sure sucks the fun out of projects. What do you think?