The other day, I was in a conversation with a group of developers, several of whom work at a large company whose product is built on Ruby on Rails. We were talking about the development environment there, specifically the test suite. The codebase is quite a few years old and has a very slow test suite, in part due to some design issues. I asked how long the full suite currently took to run; I don't remember the exact figure, but it was at least 30 minutes (I think it might have been an hour). I then asked about just the unit test suite, the one you should be running frequently while developing. That was on the order of many minutes. Someone else then asked how long it took to run a single set of examples, the kind that's focused on whatever part of the codebase you're actively working on and should be run almost constantly while writing code. This was 'better,' on the order of 30-45 seconds. I remarked that this was still horrible, especially when working on a piece of the system that didn't need specific features of Rails*.

One person remarked that this was inevitable with a large codebase. I disagreed. That is, I disagreed with the claim that the size of the codebase was the cause. A slow unit test suite and the inability to quickly run focused subsets are an indication of a design problem in the codebase, not its size. As we talked more, I started to notice something I've seen before but had a hard time placing: extreme rationalization of bad situations.
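To make that concrete, here is a minimal sketch (not the company's actual code; all names are hypothetical) of the usual cure: pull the domain logic into a plain Ruby object whose spec requires only the file under test, never the Rails environment, so a focused run comes back in well under a second.

```ruby
# A minimal sketch (all names hypothetical): domain logic extracted into a
# plain Ruby object, so its spec can load just this file instead of booting
# the whole Rails environment.

# lib/price_calculator.rb
class PriceCalculator
  def initialize(base_price, discount_rate)
    @base_price    = base_price
    @discount_rate = discount_rate
  end

  # Price after the discount is applied.
  def total
    @base_price * (1 - @discount_rate)
  end
end

# spec/lib/price_calculator_spec.rb
# Note what is missing: no `require "rails_helper"`. That one line is what
# drags the entire Rails app into every focused run.
require_relative "../../lib/price_calculator"

RSpec.describe PriceCalculator do
  it "applies the discount to the base price" do
    expect(PriceCalculator.new(100, 0.25).total).to eq(75)
  end
end

# Running just this file (or a single example by appending :LINE) now takes
# milliseconds, not 30-45 seconds:
#   rspec spec/lib/price_calculator_spec.rb
```

The point of the sketch isn't the calculator; it's that nothing about this logic needs Rails, so nothing about testing it should either.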
That conversation got me thinking. Often, we choose to be in certain situations and then, rather than admitting that the situation is bad and dysfunctional, convince ourselves -- and justify to others -- that this is the only way it can be. We are free to choose trade-offs for whatever situation we place ourselves in, but it is important to be honest with ourselves about the causes of that situation. For example, the company above has interesting computer-science problems to work on, which makes up for the bad situation in the actual development process.
If we are not honest about the causes of a bad situation, we pose a significant danger to those who are either less experienced or uncertain about their own situation. Take a slow test suite, for example. A suite that actively hinders a developer from running portions of it while developing is a deterrent to running the suite at all. As the run time climbs into minutes, the test suite becomes an antagonist rather than the helper and guide it should be. Instead of rationalizing, say that you have some serious design flaws in your system. Or say that you have a large codebase and a huge team of people with varying desires to write automated tests. When we mask the real causes of a problem, we risk confusing less-experienced developers who look to us for guidance. For example, believing that a slow test suite is inevitable could stop someone from asking for help with their own codebase while there is still time to fix it. If you are talking to a less-experienced developer and you mask the fundamental problems, what message are you conveying to them?
The point of this post is not to point out how people are wrong, or that you have to fix, or are even able to fix, a bad situation. I just want to raise a reminder that there is a difference between rationalizing and being honest about the reasons for something. By rationalizing, we fool ourselves, and those who learn from us, into thinking that we can't do better and it isn't worth trying. By being honest about the reasons, we have the opportunity both to learn from our mistakes and to teach others what pitfalls might await given certain decisions.
So, the next time you find yourself describing a less-than-optimal situation to someone, ask yourself whether you are being honest about the causes. Honesty doesn't mean you have the power or the inclination to fix the problem, but talking about the causes can lead to valuable conversations about how we can do things better in the future.
Also... can we stop using the terms 'ivory tower' and 'real world' when rationalizing our situations? As DHH said, "The real world isn’t a place, it’s an excuse. It’s a justification for not trying."
*In fact, I will put forward that you can't do an effective test-driven development cycle with a feedback loop that long.
As always, thoughtful comments are happily accepted.