Software development is always about acquiring knowledge. How we get that knowledge can vary, but it is never enhanced by being lazy in thought or deed, and there are capabilities in modern systems development that can encourage laziness. It was not so years ago...
When I started programming professionally in the early 1970s, the only available computer was a large mainframe. When I say large, I mean physically; in memory size it was minuscule. The computer's primary function was running a steelworks, and any software development task requiring computer resources had to wait until the machine was not engaged in more important work. That would generally be around 2:00 A.M.
I can just imagine the outrage and opposition to the suggestion of introducing deliberate delays! Would the developers know that these delays were being imposed, and would they really use them to think a little harder about the problem at hand? I find the topic fascinating, and I propose a different solution. My suggestion is that the "delay" should take the form of an automated inspection tool running in the background. Every so often, the developer would be alerted to the results of the inspection (i.e., specific PC-lint errors were found, or some modules exceed the acceptable McCabe complexity or nesting levels, or whatever the chosen coding violations may be). This form of "delay" actually makes use of the computing power rather than just having the computer "wait" for the developer to "think." It could also push programmers to work smarter if they know they will be faced with inspection results that they will then have to spend time fixing before moving on. A good implementation of this proposed solution might actually equate to faster=faster!
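[Ed. note: as a minimal sketch of what such a background inspector might check, here is a rough McCabe-style complexity estimate built on Python's standard `ast` module. The counting rule, the default limit, and the message format are illustrative inventions, not those of PC-lint or any particular tool.]

```python
import ast

# Branching constructs that each add one to a McCabe-style complexity estimate.
BRANCH_NODES = (ast.If, ast.For, ast.While, ast.Try, ast.BoolOp)

def complexity(func: ast.FunctionDef) -> int:
    """Rough cyclomatic complexity: 1 plus one per branch point in the function."""
    return 1 + sum(isinstance(node, BRANCH_NODES) for node in ast.walk(func))

def inspect_source(source: str, limit: int = 10) -> list[str]:
    """Return one warning per function whose estimated complexity exceeds limit."""
    warnings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.FunctionDef):
            score = complexity(node)
            if score > limit:
                warnings.append(f"{node.name}: complexity {score} exceeds {limit}")
    return warnings
```

In practice a background process or editor plug-in would run something like this (or a real tool) over recently changed files and surface the warnings periodically, so that the "delay" does useful work instead of merely idling.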
That's a really good idea: it makes use of the delay rather than simply, well, delaying.
There is quite a bit of evidence that much of our truly innovative thinking occurs in slack time, when we are not "working"; a delay that demands any reaction from us would somewhat negate that benefit.
Also, I recall working on IBM mainframes under TSO back in the 1980s, which typically imposed a most irritating delay: each VT terminal interaction might take 2 to 30 seconds to respond to a simple key entry. The duration of the delay did not allow any other work (or even thinking) to occur, while the uncertainty of the response time prevented you from leaving. So there might be characteristics of the delay that would work and others that wouldn't. It might better be described as "batch mode" processing, where a period of activity is followed by a period of inactivity (in which the thinking occurs), rather than a "delay," which might just mean doing things slower.
A project team could easily set up a work schedule or cadence that achieves this without the need to insert artificial slowdowns. Worth thinking about.
Great article. It's hard to get developers to think, because they think they don't have to. And it's hard to convince clients that thinking is important when they hear that you should be able to build a new system, any system, almost overnight. Just grab a bunch of JS frameworks, fire up some virtual servers, and use MySQL.
Part of the problem is tools and languages that don't reward thinking. The dead ends we go down on our own projects are local versions of what the industry does on a large scale with paradigms.
Ms. Jutras's suggestion points to one of our major issues, complexity. We need to teach people to fight complexity at every turn.
When I read the note about PLAN, it made me think of Logo and "playing turtle". When I saw your joke about making a game, it reminded me that I saw one: http://www.kickstarter.com/projects/danshapiro/robot-turtles-the-board-game-for-little-programmer?ref=search
I'm not affiliated with it in any way, but it looks like there are geeks out there who want to teach their kids this way.
Great article! I came from a similar background, I think, except that I had been spoilt rotten by working on light-pen graphics development on a Univac 1108 supercomputer in 1967 (albeit at the assembler level, and it even had drum storage) before joining ICL in Putney, London, on a 1900 computer using PLAN assembler. That was certainly a slow-down. Since then I have used a huge variety of machines, including many microprocessors and now Arduino, and I certainly appreciate a responsive environment. But I can't help thinking that there is a new class of people who are way into the red part of Phillip Armour's graph, given the current trend towards the Sprint methodology.
Even a two-level Git source-control system, which mandates passing a code test before the second level places the changed code into the global project space, does little more than verify that "something runs," as distinct from validating that it does something useful.
Perhaps all that is needed as a useful slowdown is a code walk-through between the first and second stages of introducing changes into the system.
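[Ed. note: a toy model of that two-stage gate, sketched in Python. The `Change` record and the `promote` check are hypothetical illustrations of the policy described above, not any real Git hook or workflow.]

```python
from dataclasses import dataclass

@dataclass
class Change:
    author: str
    tests_pass: bool        # verification: "something runs"
    walkthrough_done: bool  # validation: a human walked through the change

def promote(change: Change) -> bool:
    """Admit a change into the shared project space only if both gates pass:
    the automated tests (first stage) and the code walk-through (second stage)."""
    return change.tests_pass and change.walkthrough_done
```

In a real setup the walk-through flag would come from a recorded review approval, and the check would live in a server-side hook or CI job between the two repository levels.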
Thanks for a great article, Phillip. Maybe I even met you in London in the period 1967 to 1970!