It seems to be a law of software development that things always take longer than we expect. When a project manager talks to a designer, programmer, or tester and tries to get a sense of how "complete" the assigned task is, the normal reply is "about 90%."
This article motivated me to pick up a subject that had seemed, for several reasons, to lose importance. No joke: I used Zeno throughout my career doing development work. It was a nice way to think about earned value.
Modern approaches push the uncertainty onto the user. Who now even thinks of trying to specify how some system will look or work? Any attempt at such a specification is full of problems.
The chaotic collective that we call markets (and their servers) might be an example. Just recently, there was news that an exchange had been processing incorrectly.
The classic Zeno's paradox is, of course, not a paradox at all. The remaining distance after the increments 1/2 + 1/4 + 1/8 + 1/16 + ... does indeed never reach zero, and if we were to spend the same amount of time traveling each increment we would never get anywhere. But of course we take half the time to cover the 1/4 distance increment that we took to cover the 1/2 distance increment, so the time increments shrink at the same rate as the distance increments.
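The argument above can be sketched numerically. This is a minimal illustration (not from the article): at a constant unit speed, each halved distance increment costs a correspondingly halved time increment, so both running totals converge to a finite limit rather than diverging.

```python
# Zeno's series: halving increments of distance, each taking a
# proportionally halved increment of time at constant speed.
distance = 0.0
time = 0.0
increment = 0.5
for _ in range(50):        # 50 halvings is plenty to see convergence
    distance += increment  # 1/2 + 1/4 + 1/8 + ...
    time += increment      # at unit speed, time mirrors distance
    increment /= 2

print(round(distance, 12))  # converges toward 1.0
print(round(time, 12))      # total travel time is finite as well
```

The partial sums approach 1 but never exceed it; the "paradox" only appears if you imagine each increment taking a fixed, rather than shrinking, amount of time.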
However, the surface paradox does somewhat resemble the eternal 90%-complete problem, in that teams halving the distance to the goal line each week has a basis in fact: by most measures of "production" in software, overt production does slow down as we get toward the end. This is the long tail of the cumulative Rayleigh function that I showed in the article.
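A small sketch of that long tail, assuming the standard Rayleigh CDF F(t) = 1 - exp(-t^2 / (2*sigma^2)) as the cumulative effort curve; the scale parameter `sigma` here is an arbitrary illustration value, not a figure from the article:

```python
import math

def rayleigh_cdf(t, sigma=1.0):
    """Cumulative Rayleigh: fraction of total effort expended by time t."""
    return 1.0 - math.exp(-t * t / (2.0 * sigma * sigma))

def time_to_reach(fraction, sigma=1.0):
    """Invert the CDF: t = sigma * sqrt(-2 * ln(1 - fraction))."""
    return sigma * math.sqrt(-2.0 * math.log(1.0 - fraction))

t90 = time_to_reach(0.90)  # calendar time to be "90% complete"
t99 = time_to_reach(0.99)  # calendar time to be "99% complete"
print(t90, t99)
```

With sigma = 1, reaching 90% takes about 2.15 time units while reaching 99% takes about 3.03, so the last nine percentage points cost roughly 40% more calendar time on top of everything that came before: the tail is where "about 90%" lives.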
As an aside, Diogenes the Cynic is said to have refuted Zeno's argument simply by standing up and walking.
With forty years of IT development and support work behind me, the only projects I have ever seen completed on time and within budget were those that delivered much less than what was explicitly or implicitly promised when the project began. Having been a contractor for over half of that time, I have seen a large number of projects at nearly two dozen different companies. So I don't think my experience is atypical.
But I do think that there is a way to do better, although I was never given the go-ahead to try it. All too briefly, it is this: treat the deliverables of any project as release 2 stuff. Define a subset of those deliverables as a project-internal release 1, and then carry out a full project life cycle on that subset with a delivery date that leaves at least a third of the project calendar. Then use that last third to expand release 1 into the final project deliverables. One of the most important things this does is work out all the communication and support snags with the outside personnel and departments whose cooperation is required to successfully start, continue, and complete the project.
If this is just that tired old horse "iterative development", then I'll just have to say that in the course of a long career, and projects with many different companies, I've never seen it actually used.
Forty years Tom? Me too. That's a long time.
And a long time ago, Fred Brooks devoted a whole chapter of "The Mythical Man-Month", called "Plan to Throw One Away", to suggesting exactly this. But we don't learn, it seems.
The two-cycle release makes sense from the "Orders of Ignorance" perspective. We start a project with things we know (Zero Order Ignorance, or 0OI), things we know we don't know (1OI), and things we don't know we don't know (2OI). It takes one pass to convert 2OI into 1OI, and a second pass to convert 1OI into 0OI, which is the provably correct knowledge now resident in executable form.
This is also what happens in testing which generally consists of two stages:
1. find a bug (2OI->1OI = become aware of what you don't know) and
2. fix a bug (1OI->0OI = resolve the answer to the now-clear questions).
How did you come up with the 2/3:1/3 division between release 1 and 2?