For novel ideas about building embedded systems (both hardware and firmware), join the 35,000 engineers who subscribe to The Embedded Muse, a free biweekly newsletter. The Muse has no hype and no vendor PR. Click here to subscribe.
I'll present my Better Firmware Faster seminar in Melbourne and Perth, Australia, February 20 and 26. All are invited. More info here. The early registration discount ends January 20.
By Jack Ganssle
Del Cross B
I was in New Orleans last weekend visiting my son, who is studying physics there. He complained about the difficulty of only one class: electrodynamics, which, oh-so-many years ago, tormented me, too. We EEs were required to take two semesters of electromagnetics. And I never understood a word of it.
Sure, we learned all about circular integrals and all sorts of math that I no longer recall. There was some physics of electric and magnetic fields as well, but the cryptic equations managed to, for me at least, disguise pretty much all of the meaning. But back then I still had the notion that engineers use a lot of complex mathematics in their jobs, so I struggled, unsuccessfully, to keep up.
We had to take three semesters of calculus, in parallel with physics, which oddly always demanded a bit more math than we had learned. Those classes were essential preparation for other courses, and in my post-college career I've occasionally found basic calculus, and only basic calculus, useful when learning some new subject. A handful of times it has been useful in my job. That foundational knowledge remains important in how I (for one) understand the world.
But after 30+ years it's amazing how much one forgets. I spent a summer a few years ago re-learning calculus to help the same son when he was in high school. That experience was delightful, as there were no tests, and I could skip boring problems. Calculus is pretty cool, for a geek.
Then there was differential equations, an odd class in which the professor introduced the subject by exclaiming that no one knows how to solve the vast majority of diffy-qs, but that we would learn - via rote memorization - solutions to some. Other than the simplest forms, I've never used those solutions in my career; the one place they proved useful was a nuclear engineering class.
Linear algebra was useful preparation for Circuit Theory, and I have used it extensively over the years in implementing algorithms.
I have no idea why we needed abstract algebra. They took a perfectly reasonable subject - algebra - in which we manipulate variables and equations with clearly defined operators, and removed all of the operators. Everything we had learned was wrong. It was a sort of meta-algebra that math majors might glean some benefit from, but that was worthless for working engineers. A semester of basket-weaving would have been more useful as a fall-back career in times of recession or outsourcing.
A mathematical education is important and provides a firm foundation that promotes understanding of other subjects. But in my experience we digital engineers, at least, don't use much math in our jobs. Most of what we learned we never used outside of college.
What's your experience?