
Communications of the ACM


Software Learning: The Art Of Design Regret


"What if?" is a question so fundamental to human learning that it has infused generations of science fiction enthusiasts with the possibility of fixing our mistakes through time travel; 19th-century writer H.G. Wells' "The Time Machine" and Star Trek's "City on the Edge of Forever" episode come to mind. Given that such machines are not on the horizon any time soon, the best we can do is look backward for insight and apply the lessons forward. This is not as easy as it sounds, as there are two equally unhelpful poles: the first is to never look at the past and question what could have been improved, and the second is to persistently ruminate on the past. The goal is to live somewhere in the middle. But how should software engineers classify their reflections?

Retrospectives have long been part of software engineering practice, and it can be tempting to look at prior efforts and label every decision a "flaw" or "bug" if it doesn't agree with one's particular sensibilities. This is not a constructive attitude; understanding the context of decisions is critical, as no effort exists in a vacuum. Factors such as available budget, resources (in quantity, quality, experience, and personality), technical options, and target schedule all interact to affect the decision-making process. Only by understanding design context can we differentiate between the preventable and the unavoidable, and thereby understand what we could reasonably beat ourselves up about and what we need to just let go.

Anachronistic Regret

This is the case where a framework or technology option didn't exist at the time of a design decision, but regret is felt for not having had that option anyway. While this can make for some interesting hypothetical discussions, such as the effects personal computers could have had on the 1960s space race, it can also be taken too far, such as pondering how Abraham Lincoln could have revolutionized space travel as President if only he had rockets and computers during his administration. There was no decision possible because there were no valid options at the time.

Actual Mistake Regret

It happens. Where "it" can be anything from fat-fingered typos to code horrors to bear traps, and many books have been written on the subject, ranging from Code Complete (McConnell) to Refactoring (Fowler). One example I lived through involved a colleague who was enamored with Inversion of Control and the mock object pattern. There were unit tests, but it had gotten to the point where the tests were exercising nothing but mocked objects instead of the actual codebase. When that codebase was deployed to the field, it was a disaster, and I wound up having to clean it up. The issue wasn't that Inversion of Control and mock objects aren't legitimate patterns; they were taken too far, as adherence to the patterns became the goal instead of the functionality of the overall software effort. That is just one example; there are many other possibilities.
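The over-mocking anti-pattern is easy to illustrate. Here is a minimal sketch in Python's standard `unittest.mock` (the `PaymentService` class and its gateway are hypothetical, not from the codebase described above): the test mocks the collaborator and then asserts only on the mock's behavior, so it passes even though the real code it wraps has an obvious bug.

```python
from unittest.mock import Mock

class PaymentService:
    """Real code under test: submits a charge to a payment gateway."""
    def __init__(self, gateway):
        self.gateway = gateway

    def charge(self, amount):
        # Bug: the amount is never validated, and the gateway's
        # response is ignored entirely.
        self.gateway.submit(amount)
        return True

def test_charge_tests_only_the_mock():
    gateway = Mock()
    service = PaymentService(gateway)
    service.charge(-50)  # a negative charge should arguably be rejected
    # The assertion inspects the mock, not the behavior of the service,
    # so this test passes despite the bug.
    gateway.submit.assert_called_once_with(-50)
```

A test like this verifies that the plumbing was wired up, but nothing about correctness; a suite full of them provides a green dashboard and little else.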

Decision Regret

This case is the regret of the "road not taken." In software efforts, some design choices seem forever pitted against each other, such as natural vs. synthetic keys in database design. But there are plenty of other cases where multiple reasonable options exist for a situation and aren't so doctrinally charged, each valid and "appropriate enough." As long as each option was evaluated honestly and thoroughly in terms of strengths and weaknesses, and ideally documented, this is really the best we can expect any software engineer to do in making the best choice under the circumstances. Decision Regret is an inevitable outcome of actually making decisions, and it is far better to make progress and live with some Decision Regret than to be paralyzed into inaction.

Unknown Consequences Regret

This is the head-smacking case of "I wish I had known that at the time," where one encounters an unanticipated side effect or edge case of a design, often at the worst possible time. These don't necessarily invalidate an overall design, but they expose an extra condition or two that needs to be addressed. The Java programming language is filled with these, particularly around memory management and garbage collection. Java has proven itself an effective language in a great many cases, but it also contains surprises for designs that must operate under high memory load, where solutions have a tendency to work … until they don't. This necessitates diving into arcane Java Virtual Machine settings and sometimes redesigning software elements in response. In fairness to Java, every programming language and technology framework has sharp edges lurking somewhere, and finding those edges is a frequent consequence of doing interesting work.

Missed Opportunity Regret

Who hasn't exclaimed, "Why didn't I think of that?" Well, you didn't, and that's life. Again, the best one can do is continually strive to expand one's knowledge horizons and look for opportunities to apply those lessons in the future. Fortune favors the brave, and the prepared. Iteration is equally important: the more practice one obtains, the better one becomes at pattern recognition.


Doug Meil is a software architect at Ontada. He also founded the Cleveland Big Data Meetup in 2010.

