In an earlier post, I discussed the dangers of being wedded to your own ideas. I would like to riff on this because of how often it comes up, in others and even in myself.
When you hold onto a belief without sufficient evidence, you run the very real risk of missing the data that would let you debug an issue.
"I did this, which should make that happen. So now that that has happened, let's look over there."
Wait, what? When something "should happen", something that you are relying on in the steps that you are debugging, you had damn well better make sure that happened. Validate everything before moving on.
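To make that concrete, here is a minimal sketch (the table and function are hypothetical, made up for illustration) of checking a step's result instead of assuming it "should" have worked:

```python
import sqlite3

def update_user_email(db, user_id, new_email):
    db.execute("UPDATE users SET email = ? WHERE id = ?", (new_email, user_id))
    # "Should have updated" is not good enough: verify the row actually changed
    # before reasoning about anything downstream of this step.
    row = db.execute("SELECT email FROM users WHERE id = ?", (user_id,)).fetchone()
    if row is None:
        raise RuntimeError(f"user {user_id} does not exist")
    if row[0] != new_email:
        raise RuntimeError(f"email not updated, still {row[0]!r}")

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")
db.execute("INSERT INTO users VALUES (1, 'old@example.com')")
update_user_email(db, 1, "new@example.com")
```

A silent `UPDATE` that matches zero rows is exactly the kind of "it should have happened" that derails a debugging session.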
"Trust no one" – Deep Throat
Holding onto a belief too long can also cause you to discount evidence that may be pivotal to uncovering something, or at least misdirect your efforts. Constantly evaluate and re-evaluate your evidence to see where you might be making assumptions, or holding onto beliefs, that are not justified.
"My code works this way. That other problem is something else."
Uh, no, not necessarily.
Time is on your side here, in the sense that having a very accurate timeline can pinpoint when things go astray. And that time-based data can fly in the face of your sacred cow. You need to let go when the evidence tells you to.
"Shit really started to hit the fan Tuesday night. What did we do Tuesday night?"
What sorts of evidence can you rely on? We have covered log files already, because they are (if properly done) an archive of what has happened in the process you are debugging. And that archive is gold.
Another useful piece of evidence that is often overlooked is the modification timestamp on a database row or file. Knowing when a piece of data changed can be the turning point for knowing what did, or did not, happen.
This came up recently where some database data we were looking at could be updated in one of two ways. The modification timestamp on the row, correlated with the log files, clearly demonstrated what flow was responsible for the update. And one of the core premises on which we were operating, a mighty cow, had to be put down.
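The correlation itself is simple. Here is a sketch of the idea with made-up timestamps and flow names (the real case involved log files, not an in-memory list):

```python
from datetime import datetime, timedelta

# Hypothetical data: the row's modification timestamp, plus parsed
# log entries as (timestamp, flow-that-was-running) pairs.
row_modified = datetime(2021, 6, 1, 22, 13, 45)
log_entries = [
    (datetime(2021, 6, 1, 22, 13, 44), "batch-import"),
    (datetime(2021, 6, 1, 22, 20, 10), "web-edit"),
    (datetime(2021, 6, 1, 23, 5, 2), "batch-import"),
]

# Which flows were active within a couple of seconds of the update?
window = timedelta(seconds=2)
candidates = [
    flow for ts, flow in log_entries
    if abs(ts - row_modified) <= window
]
# candidates -> ["batch-import"]: that flow, not the one we assumed,
# made the update.
```

Narrowing the window until only one flow survives is often all it takes to put the cow down.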
The Scout Mindset
In the weird way that these things happen, the issue of sacred cows came up on a podcast this week during an interview with Julia Galef, the author of "The Scout Mindset", a topic she introduced several years ago in her TED talk.
In a nutshell, we are all predisposed to be in one of two modes, or mindsets:
- the "soldier mindset", where we staunchly defend our deeply held beliefs, sometimes to the point of irrationality;
- the "scout mindset", where we don't care about whether we are right or wrong, and merely want to understand.
Clearly (to me) debugging is about being a "scout", uncovering data to get at an understanding of a problem. And this post has been about being mindful of your inner "soldier". Note that, depending on the circumstances and beliefs, we can quickly flip between these two mindsets, with no one person being entirely a "soldier" or a "scout".
But knowing which one you are, as truthfully and consistently as you can, will help you stay intellectually honest, and be a powerful debugger in the process.