The analogy was basically:
If you can drive to work every day without getting into an accident, you should be able to write code every day that works.
At first, I dismissed it as simply a bad analogy.
But why don't most people get into accidents more often? Bugs and accidents both seem like things that will happen. Is coding simply harder than driving?
At some point I realized: it's because there's a margin for error built into driving, and you can increase that margin if you want. You can allow more room ahead of you. And make sure you have an exit plan if something happens ahead.
You don't drive inches from the car ahead. Lanes are not the exact width of cars. You can veer a little and not have a huge accident.
Code generally doesn't have a margin for error built in... at least not much of one. Compilers add some margin, but generally, there isn't a lot.
However, you, as an engineer, can add to the margin. And realizing this changed the way I think about coding.
How? You might ask.
Compilers are one way. Opting for a compiled language increases the margin a little bit, depending on the language.
Tests increase the margin for error. Assuming they're automated, unit, integration, and system tests all help.
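Even a tiny automated test adds margin. As a sketch (the function and its test are made up purely for illustration):

```python
import unittest

def parse_price(text):
    """Parse a price string like '$4.99' into cents.

    A hypothetical function: the point is that the tests below
    catch a regression before a user ever sees it.
    """
    cleaned = text.strip().lstrip("$")
    dollars, _, cents = cleaned.partition(".")
    return int(dollars) * 100 + int(cents or 0)

class TestParsePrice(unittest.TestCase):
    def test_basic(self):
        self.assertEqual(parse_price("$4.99"), 499)

    def test_whole_dollars(self):
        self.assertEqual(parse_price("3"), 300)

if __name__ == "__main__":
    unittest.main()
```

Change `parse_price` and break it, and the test fails in seconds instead of in production.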
Defensive programming can increase it, by making sure you're working with the data you think you are.
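By defensive programming I mean something like this sketch (the function is hypothetical): check inputs at the boundary instead of assuming they're well-formed.

```python
def average_latency(samples):
    """Average of latency samples in milliseconds, with the inputs
    checked rather than assumed. Failing loudly and early keeps
    bad data from silently corrupting results downstream.
    """
    if not isinstance(samples, (list, tuple)):
        raise TypeError(f"expected a list of numbers, got {type(samples).__name__}")
    if not samples:
        raise ValueError("no samples provided")
    for s in samples:
        if not isinstance(s, (int, float)) or s < 0:
            raise ValueError(f"invalid latency sample: {s!r}")
    return sum(samples) / len(samples)
```

The checks cost a few lines, but now a `None` or a negative number blows up at the door instead of three functions later.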
Tools like linters, formatters, and static code analysis all increase the margin for error.
Small merges and deploys increase the margin for error. So does using source control.
And in the code itself, you can check for errors and take some action. Maybe retry a network request? Reset some state and try again? Try some backup path?
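The retry idea can be sketched in a few lines. This is just one way to do it, with exponential backoff; the names here are my own, not from any particular library.

```python
import time

def with_retries(operation, attempts=3, base_delay=0.5):
    """Run `operation`, retrying on failure with exponential backoff.

    Each retry widens the margin for a transient error, like a
    dropped connection or a momentary timeout.
    """
    for attempt in range(attempts):
        try:
            return operation()
        except OSError:
            if attempt == attempts - 1:
                raise  # out of margin: let the caller decide what to do
            time.sleep(base_delay * (2 ** attempt))

# Usage might look like:
#   data = with_retries(lambda: urlopen(url).read())
```

A flaky network call that fails once or twice now succeeds anyway, and the rest of the program never notices.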
Message queues, RAID disks, replicated data, even TCP all increase the margin for error, some of them in ways you never even think about.
All these years later, I still think about that analogy and increasing my margin for error. The experience wasn't awesome, but the lesson was.