Two articles in the last Muse raised the theme of the nature of engineers and engineering. Many, many readers replied. Here are some of those thoughts.
Michael Covington wrote:
"If you are rushing, the planning has failed" – YES! Things I've told my graduate students:
(1) Pulling an all-nighter doesn't make you a hero and doesn't impress us. It's like overdrawing your checking account. We want you to do things on time at a reasonable pace. (Note: I mean pulling an all-nighter to make a deadline. Some people choose to work all night to avoid interruptions and then rest the next day.)
(2) One form of bad management treats every deadline as an emergency. By definition, deadlines are never emergencies.
(3) Treating the situation as an emergency can be an excuse to lower your standards.
Fabrizio Bernardini had some interesting thoughts about "engineer" in different languages:
1. I think it is important to note that in the English-speaking world 'engineer' comes from 'engine' and machinery in general. However, in many EU languages the word, such as the Italian "ingegnere" (which sounds very much like 'engineer'), derives from "ingegno", that is, 'ingenuity'. Maybe the simple exchange of two vowels, from "engineer" to "ingeneer", could help … :)
2. Also, in our (Italian) system, you can get a degree in Engineering ("Ingegneria") and your papers will say you are a "Doctor in Engineering", but to sign yourself as "engineer" (we use the title ing. before the name) you need to pass an additional professional qualification exam. I did that, so I am 'ing. Fabrizio Bernardini' instead of a generic 'dott. Fabrizio Bernardini'.
3. Software engineering is another topic. A while ago a curator at NASM showed me a booklet on Software Engineering dated to the very early '60s. He said it was the first to open the topic. I find it a bit disturbing that we are still discussing it. Even if pure software can be a science, there is no question that good software for control systems (and I mean big systems, like spacecraft for instance) requires a good dose of engineering knowledge of interacting systems. But then, there is a difference between the system software engineer, the software architect/designer, and the programmer.
Lars Pötter submitted:
> Reader Geoff had some interesting thoughts:
> *If you are rushing the planning has failed.* ...
> *Is a bug on a developers machine a "near miss"?* ...
> *Iterate on science rather than on engineering* ...
How do we define "Engineering" in the software context?
I often see talk of "professional" software development. The only valid definition of "professional" seems to be that you get paid for doing it.
One could try to define "professional" by introducing certificates for courses and exams. But that has been tried, and we all know that it does not work.
More important than the coder's education is that the created software is "of good quality". But what does that mean?
Maybe a comparison with manufacturing can help. If we buy a soapbox car from an eight-year-old child, a used car from a newspaper ad, or a new car from a dealer, we have different expectations of the quality of the vehicle we buy. The car dealer gives a warranty on the car. With the other purchases we get the vehicle "as is".
Outside of safety-critical and other certified software development, all software we can get is "as is". It clearly states so in the license agreements. With proprietary software being "as is", the manufacturing equivalent would be a used car for sale where you are not allowed to look under the hood. So if we are unlucky, such a thing could make the eight-year-old proud, but not more.
Why are software buyers agreeing to this?
Again and again the statistics show that most software projects fail. People seem to be lucky if they get something that barely works in the end. So better software, it seems, must be even more expensive.
Shouldn't we more clearly communicate why projects failed? What quality of software we create? What can reasonably be expected?
Every professional knows that if the material he works with is bad, then even the best craftsmanship cannot produce a great product. But we accept "as is" libraries, "as is" drivers, and other "as is" software packages into our solutions. As engineers: why do we expect this to work?
All those war stories of hero coders putting together "as is" libraries and making them work just in time for the customer only show that we waste a lot of effort because of the bad-quality software we build on.
I think it is time to tell the customer that if he buys software "as is", then he gets software that might work. If he wants reliable software, then he needs to get that guarantee from the software vendor. Companies that buy software should have no problem calculating the damage if the software has bugs. So requesting some compensation should be easy to add to the contract.
I don't want to go into certifications and SIL levels. But the awareness that "as is" software is comparable to something picked up at the flea market should become common knowledge.
The only way developers will create higher quality is if customers request, or at least appreciate, that higher quality, because it comes with a cost.
The customer will only request higher quality if he knows that what he is buying now is "as is" crap.
Without customer awareness of software quality I don't see how the situation in software development could improve.
From Pete Friedrichs:
In this issue, you remark: "I hear constantly from developers that they cannot use a particular software component (comm stack, RTOS, whatever) because, well, how can they trust it? One wonders if 18th century engineers used the same argument to justify fabricating their own bolts and other components. It wasn't until the idea of interchangeable parts arrived that engineering evolved into a profession where one designer could benefit from another's work."
Back around the time of the First Gulf War, I remember reading a story in a trade rag that, while the specifics now escape me, the gist of which was this:
The U.S. had sold Saddam Hussein their radar early warning systems in a prior era when we were still on good terms. In a precursor to our Desert Storm activities, we inserted some malignant code into—of all things—a printing peripheral. We then sold that peripheral to a dealer in France, with explicit instructions that it should NOT, under any circumstances, be resold to Iraq, but knowing full well that that would be the FIRST thing this dealer did. The peripheral was sold, shipped, and installed in Hussein's infrastructure.
As the story goes, prior to our attack we did something to activate the malignant code, and Hussein's defenses were substantially blinded. Almost a decade later, I read the same story in another trade rag, which went so far as to confirm its veracity.
I had a friend—a communications guy in the army—tell me that he had been issued a certain brand of laptop for use in configuring and maintaining radio and crypto gear. He (and everyone else who had been issued that laptop) was suddenly ordered to stop using them and turn them in. The manufacturer (my guess, at the direction of the CCP) had installed a back door—not in some application software or in the OS—but in silicon!
And so, back to your comment… When you said: "One wonders if 18th century engineers used the same argument to justify fabricating their own bolts and other components," my first reaction was: "The 18th century guys didn't have to worry about their bolts spying on them, stealing their financial or military information, or failing on-demand, at the command of a foreign power."
In this day and age the lack of trust you observe is, in my opinion, fully justified.
Another thing… as the end-user of a bolt supplied by somebody else, there are any number of tests I can apply to assure it will do what I want, without any nasty surprises. These include non-destructive tests to verify function before installation, and destructive tests that I could apply to representative samples. I may never know precisely what goes on at the manufacturer's plant, but I can build confidence in the parts they supply.
How do you do that to binary "parts" purchased from someone else? Yes, you can disassemble and reverse-engineer assembled or pre-compiled code, but depending on the size, this is likely to be a prohibitively complex and time-consuming endeavor. Not only that, every time the manufacturer releases an upgrade and the package checksum changes, you have to repeat the effort.
From a business standpoint, I suspect it would be cheaper, easier, (and require a smaller staff) to write your own code. For better or worse, you at least know what's in it.
Laszlo Arato compared this to shipbuilders:
"Musing on Engineering" and the Ship-Building comparison:
I think that shipbuilding is a bad example in the mentioned context, as it is an extremely iterative process.
Large shipbuilders like the Meyer Werft in Germany are famous for their innovative luxury ocean liners. Yet each ship they build is 99.9% repetition of what has worked before, with about 0.1% new innovation per ship. In terms of actual welding work, the innovative part per ship is very likely even smaller …
Even for a very conservative embedded engineer like me, about 10% of my work is innovative, which makes my work very engaging, challenging, and fresh. And I honestly like it that way—including the necessary care to avoid bugs!
Bogdan Baudis had a similar thought:
"If the shipbuilding industry can build ocean liners without iterating on multi-thousand-tonne-hulls, could software engineers produce production quality code without iterating?"
A long time ago, as a semi-starving student, I joined a cooperative performing all kinds of odd contract jobs for various employers, including work for the shipyards.
While I never observed hull iterations, I certainly observed many on-the-spot rebuilds to get things done.
One would assume that mating two major parts like the hull and the main engine would be a very well-designed and well-planned operation.
So it was ... until it turned out to be a puzzle, and a blowtorch symphony ensued :-)
John Hawke contemplated changes that "won't affect anything":
The driver for writing this rambling email is the article in Muse 415, "Are we Engineers", and the comment about other trades using standard parts, usually certified by some standards body or another. One of the problems with this is apportioning responsibility when things go wrong. Who is liable? Is it the standards authority, the test house, or the manufacturer/programmer, especially as the cost of accreditation tests increases "exponentially"?
As you will have heard, we have a problem with this in the United Kingdom, as a result of the devastating fire at Grenfell Tower in London. The rapid spread of fire has been attributed to the use of flammable cladding (and other regulation violations). The cladding had passed tests, but, it appears, the cladding company had gamed the tests, and also altered the product without retesting, presumably as the tests were so expensive. The subsequent investigations have identified many tower blocks with similar problems; the estimated cost of reparations is in the region of $20 million!
Small changes which "won't affect anything" are all too easy in software, and have bitten me in a sensitive part all too often. In a commonly used off-the-shelf part, a fault introduced in this way could hit millions of products, all of which might require a field update.
Who pays for such failures? Many of these software parts will be written by small development companies. Sue them and they go bankrupt without paying for a noticeable proportion of the costs. Chase the directors or programmers and they become bankrupt. If the test house or standards body is held responsible the only result is the cost of accreditation soaring, causing more and more companies to skimp on retests or even initial testing. In the Grenfell case the cost will almost certainly eventually fall on the UK government. Could you see the same happening for a coding problem?
The availability of trustworthy reusable code modules transforms the ability to develop complex code cost effectively, but the possible risks have to be assessed at the highest level in an organisation, those way above my pay grade! Unfortunately this due diligence is often undertaken by those who have little knowledge of the processes involved.
Tom Archer wrote:
Too many decades ago my high school physics instructor suggested that "engineering" is "the art of making simplifying assumptions." He said "good engineers are good at making the right assumptions." There is merit to that, as long as there is agreement (and you've alluded to this in this issue of the Embedded Muse) that not all that is labeled "engineering" fits a purist's definition. It applies across many disciplines, including "software." I haven't found an "assumptioneering" program, but good assumptioneers are hard to find.
I am a retired Registered Professional Engineer and am familiar with the various State limitations on the use of the term "engineer." Most states gave up the battle, and the people who need to know the difference do. Corporate America long ago figured out that titles are cheap, and the Society of Manufacturing Engineers, at least by the '70s, was openly challenging convention by accepting "manufacturing engineers" based upon experience rather than degrees. "Sales Engineer" was a common title, though everyone understood few were. My first business card, from a summer job when I was still in school, said "Engineering" but had no title. Automotive and aerospace were, and are, perhaps the most ubiquitous dispensers of "engineering" titles based upon experience or certifications rather than degrees. It's never made any difference to me unless it was a legal job requirement, and that's mostly related to civil engineering and insured products such as pressure vessels.
Philip Banister took a stab at defining "engineer":
Geoff brings up an interesting point when he asks what counts as an engineer. Indeed, in the UK, if your washing machine breaks down, the person called out to repair it (if you don't want to do it yourself) is listed as an engineer, but few of them have attended university and gained a degree in engineering ... again, if that is how we are judging it.
I remember a supervisor in a previous company voicing his opinion on where he drew the line on Engineering/Technician and I find that in general it's a good one.
"An Engineer is someone you can give a problem to and given enough time and coffee will produce a solution for you whether that is a hardware or software fix, a technician on the other hand is someone who can take the provided solution and can build, maintain or repair it."
I'm aware the lines blur quite a lot, but throughout my career I have struggled to feel as though I am actually an engineer and not just a very stressed-out technician expected to troubleshoot and provide solutions above his paygrade.
... as did Jean-Christophe Mathae:
About the definition of an engineer, the one I prefer is "someone who solves problems".
... and so did Bob Collins:
On the topic of "Are we engineers?", I propose the simplest (modern) definition of engineering is: applied science.
Engineering is the practice of developing solutions to real-world problems using theory discovered by science.
I see no reason why software development cannot be engineering.
A reader asked me what should be on an embedded person's bookshelf. That's a tough question as a Raspberry Pi user will have different needs from someone doing VHDL design. However, here's my list of what I find essential.
Books About Electronics
You can't get away from the electronics in embedded development, nor would I ever want to. Here are some titles:
The Art of Electronics, by Horowitz and Hill. This massive tome covers everything from AC circuits to analog to digital. An essential reference.
That book is a bit superficial when it comes to circuit theory. Probably any good college textbook on the subject will suffice, but I like Circuit Analysis, Theory and Practice, by Robbins and Miller, and Electronic Devices and Circuit Theory by Boylestad, Nashelsky, Edgar, Morey and Temple.
The Radio Amateur's Handbook, aka The ARRL Handbook lies between the entirely practical Art of Electronics and a college textbook. While slanted to radio technology it is a very useful guide, especially to AC circuits. There's a new edition every year. My dad bought me my first copy not long ago... actually, come to think of it, it was in 1966!
Working with HDLs? In that case I like Design Recipes for FPGAs by Wilson.
Op Amps For Everyone by Carter is a good reference for analog work. TI's free Handbook of Operational Amplifier Applications is very useful. Then there's Bob Pease's Troubleshooting Analog Circuits, which is more of a fun read than a reference work.
High Speed Digital Design by Johnson and Graham is essential for working with modern fast electronics.
Books About Software
Many would argue for Knuth's The Art of Computer Programming series. I have these and find them good reading but I have never found them all that useful in day-to-day engineering.
The Mythical Man-Month by Brooks is a must read though not a reference.
An Embedded Software Primer by Simon is essential for those without a ton of experience.
Probably the best book on managing risks is Wiegers' Practical Project Initiation. DeMarco's Controlling Software Projects is another solid reference.
For algorithms for approximations I can't do without Hart's Computer Approximations and Muller's Elementary Functions.
No book comes close to Wiegers' Software Requirements for this most important part of any project.
Sampling Theory and Analog-to-Digital Conversion by Jungwirth is a solid guide to these issues. Lots of math, but if you're willing to plow through it you'll learn a lot.
Humphrey's A Discipline for Software Engineering or his Introduction to the Personal Software Process are, in my opinion, the most important books written about software engineering, but few will like his prescriptions.
Fan's Real-Time Embedded Systems is a very complete and heavy read with lots of worthwhile info.
As one who thinks code should be readable and beautiful, Boswell and Foucher's The Art of Readable Code is great and a fast read. Also recommended is Stavely's Writing in Software Development.
Agile! by Meyer is one of the most honest looks at this discipline.
Embedded Systems Security by the father-and-son Kleidermacher duo is important for those engaged in that important area.
C For Everyone by Man and Willrich is an excellent book on the language. The C Programming Language by Kernighan and Ritchie is the classic, though dated, reference.
DeMarco and Lister's Peopleware is the most important book around about productivity in software engineering.
What books do you consider essential?
From Chris Lawson:
Chris writes: The current temperature was evidently "Negative Zero, with a forecasted low of Eight". Yikes! That's all sorts of wrong. Dear software engineers (myself included): please remove the negative sign when the floating-point number you're converting rounds to an integer ZERO. There's no such thing as "negative zero," unless of course you're calculating limits. Also, when the current HIGH or LOW exceeds the forecasted value, why not just update the prediction in real time?!
Jack notes: Of course, ones' complement math does support negative zero, which is one reason we don't use it anymore.