How Do You Define the Term 'Bug'?
In case you missed it, my next webinar is scheduled for March 24. In it, I will present a high-level method for debugging your web applications. If you'd like to attend, please RSVP at benwilhelm.com/webinars.
In preparation for that, I'll be spending the time between now and then talking about debugging here in my daily list.
So if we're going to spend the next two weeks talking about bugs and debugging, we should probably define what we mean by bug.
I like to use this definition:
A bug is when observed behavior in your application differs from expected behavior.
I like this definition for academic purposes because it doesn't worry about whether the bug is acceptable to the user or whether it's worth the time to fix. In a real-life product setting, those are very real concerns. But for purposes of this email list and the upcoming webinar, we're just trying to get better at making our apps do what we tell them to do.
Which brings me to my next point. I very intentionally didn't say that observed behavior differs from specified behavior. The reason is this: The code you write is your specification of the behavior that you desire. The machine does what you tell it to do. If the machine doesn't do what you expected, it's not because it disregarded what you told it. It's because you didn't specify your desired behavior accurately.
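To make that concrete, here's a small hypothetical example (the function name and values are mine, just for illustration). The machine faithfully executes the specification; the specification just isn't what the author meant:

```python
def average(values):
    """Intended to return the arithmetic mean of a list of numbers."""
    total = 0
    for v in values:
        total += v
    # The author meant true division here, but wrote floor
    # division. The machine does exactly what was specified.
    return total // len(values)

print(average([1, 2]))  # observed: 1 — expected: 1.5
```

There's no mystery in the machine's behavior: `//` floors the result, exactly as written. The fix isn't to fight the computer; it's to specify the desired behavior accurately by changing `//` to `/`.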
This might sound intimidating at first, but I find it liberating. It means that I have the power to fix the bug. It's not an externality that I'm in conflict with. It's behavior that I need to describe more accurately within my code. I've said before that I don't like to use the term "bug" for just that reason. It's not a thing that can run away from you.
Frustration is the enemy of effective debugging. A frustrated mind is not a clear-thinking mind. The first thing to remind yourself when you begin debugging is that it's within your power to find and fix this defect in your code.