I saw an old review of Steve Maguire's Writing Solid Code, in which he gets lambasted (with considerable ad hominem, I notice) by a reviewer who I think missed the point.
While quibbling over minutiae of C development, the reviewer misses the biggest takeaway from the book: developer attitude.
- Developers let bugs happen. (Many developers are judicious, but most bugs are caused by a cavalier attitude or by ignorance.)
- Therefore, they're in the best position to prevent them from happening.
- The best bug is the one that never makes it into the stream.
- If you have a tool that can catch a bug at compile/build time (lint, compiler warnings, etc.), that's the best. If you have a tool (say, a parallel debug-only algorithm) that catches a bug during execution (say, of a test suite), that's good, too; see the sketch after this list.
- Test engineers help, but they're really just keeping software developers honest.
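To make that "parallel debug-only algorithm" idea concrete, here's a minimal sketch in C of the kind of cross-check Maguire advocates. The function names are hypothetical, not from the book: a fast binary search is verified against a slow but obviously correct linear scan whenever `NDEBUG` is not defined, so the check costs nothing in release builds.

```c
#include <assert.h>
#include <stddef.h>

/* Slow but obviously correct reference implementation: a linear scan.
 * Hypothetical helper, used only to cross-check the fast path. */
static ptrdiff_t linear_search(const int *a, size_t n, int key)
{
    for (size_t i = 0; i < n; i++)
        if (a[i] == key)
            return (ptrdiff_t)i;
    return -1;
}

/* Fast path: binary search over a sorted array. In debug builds
 * (NDEBUG undefined), the result is checked against the slow
 * reference; in release builds the check compiles away. */
ptrdiff_t find_sorted(const int *a, size_t n, int key)
{
    size_t lo = 0, hi = n;
    ptrdiff_t result = -1;
    while (lo < hi) {
        size_t mid = lo + (hi - lo) / 2;
        if (a[mid] == key) { result = (ptrdiff_t)mid; break; }
        if (a[mid] < key)  lo = mid + 1;
        else               hi = mid;
    }
#ifndef NDEBUG
    /* Debug-only cross-check: both algorithms must agree on whether
     * the key is present. (Compare presence, not position, since the
     * two may legitimately find different indexes of a duplicate.) */
    assert((result >= 0) == (linear_search(a, n, key) >= 0));
#endif
    return result;
}
```

Run your test suite against the debug build and the slow algorithm audits the fast one for free; any divergence trips the assert long before the bug reaches the stream.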
I don't buy into all of Steve Maguire's suggestions. Some grate, some I simply disagree with, and some have lost their currency now that C has given way to C++, C#, Java, and numerous other production languages. But those differences shouldn't undermine the one key point his examples illustrate: developer attitude. Don't let bugs happen: identify practices that catch bugs before they get into the stream. That advice remains timeless, regardless of language and environment.