Why Rigorous Processes, Specifications and Estimations are a Waste of Time


It borders on the ridiculous when we see so many software development efforts managed on the premise of a heavyweight process (a.k.a. the Waterfall model) that attempts to reduce risk and tries to guarantee that no requirements slip through the cracks. It can be extremely frustrating trying to argue the fallacies of their beliefs to someone trained in CMMI or even RUP. After all, the rigor, precision and deterministic nature of heavyweight[1] methodologies certainly have their appeal for people trained in analytic endeavors.

The complexities of software development are not uniform; they vary considerably with the context in which software is developed. These contexts can be ordered in increasing scale of complexity:

  1. Internal In-house software development.
  2. Software Consulting/Services.
  3. Software Product Development.

The complexity also varies tremendously between maintenance work and start-from-scratch (a.k.a. green-field) development. I have witnessed utter failures[2] when management takes an organization successful at maintenance work and asks it to develop new software. The organizational structure and processes for maintenance work are simply non-transferable to developing new software. The same dire consequences are bound to happen when you take a consulting organization and try to develop a software product. A software consulting organization scales linearly at best, and exponentially at worst, with the amount of work. That is a characteristic that is simply not viable in highly competitive marketplaces.

So to be clear, when I talk about software development in this article, I really mean developing software products in highly competitive marketplaces. The other two contexts, one may argue, can be served adequately by existing heavyweight processes. The viability of in-house development rests on justifying a budget. The viability of software services rests on justifying the amount of services provided. Efficiency and productivity do not necessarily have to correlate with achieving revenue.

The problem is best summarized by Ross Mayfield when he writes on the failure of heavyweight processes:

Organizations are trapped in a spiral of declining innovation led by the false promise of efficiency. Workers are given firm guidelines and are trained to only draw within them. Managers have the false belief engineered process and hoarding information is a substitute for good leadership. Processes fail and silos persist despite dysfunctional matrices. Executives are so far removed from exceptions and objections that all they get are carefully packaged reports of good news and numbers that reveal the bad when it’s too late.

In worlds where efficiency and productivity are of critical importance, one cannot afford the warm fuzzy feeling brought about by a Cargo Cult process. The reality is that one cannot overlay a deterministic methodology on top of an intrinsically empirical endeavor. One may argue the importance of detailed functional specifications, only to realize that their sole purpose is to support an “illusion of agreement”:

Functional specs are often appeasement documents. They’re about making people feel involved. But, there’s very little reality attached to anything you do when the builders aren’t building, the designers aren’t designing, and the people aren’t using. We think reality builds better products than appeasement.

Not only are these “appeasement documents” of less value than traditionally thought, the cost of achieving appeasement all too often spirals into endless debate. “Why should I care what color the bikeshed is?” is an excellent metaphor for this problem:

Parkinson shows how you can go into the board of directors and get approval for building a multi-million or even billion dollar atomic power plant, but if you want to build a bike shed you will be tangled up in endless discussions.

Think of it as “analysis paralysis” at the requirements-gathering level. This kind of group behavior consumes time exponentially as you move from requirements to analysis and then to design.

But to be fair, there is always value in how you “elicit” requirements and how you arrive at a consensus or agreement. Unfortunately, methodologists have a tendency to propose a one-size-fits-all recipe for how this is done. I have observed that the effectiveness of a rigorous requirements-gathering activity degrades in proportion to the technical complexity of the subject matter. This was best expressed by Linus Torvalds, who wrote:

So there’s two MAJOR reasons to avoid specs: they’re dangerously wrong. Reality is different, and anybody who thinks specs matter over reality should get out of kernel programming NOW. When reality and specs clash, the spec has zero meaning. Zilch. Nada. None. It’s like real science: if you have a theory that doesn’t match experiments, it doesn’t matter how much you like that theory. It’s wrong. You can use it as an approximation, but you MUST keep in mind that it’s an approximation. Specs have an inevitable tendency to try to introduce abstraction levels and wording and documentation policies that make sense for a written spec. Trying to implement actual code off the spec leads to the code looking and working like crap.

So if the simplest of subject matters provoke too much debate, and the more complex subjects are too far divorced from reality, then where does one find that effective middle ground? One really needs “just enough requirements and specifications,” without getting hung up on them, and then to move quickly on. Unfortunately, it takes a lot of experience to recognize what’s good enough.

However, the heavyweight methodologist will also argue that one simply cannot expend (and potentially waste) resources without having a detailed plan derived from the functional specification. Unfortunately, a detailed plan requires effort focused on creating accurate estimates. The question must be asked: how much effort does one place on estimating before the law of diminishing returns takes effect? David Anderson writes:

So I ask you, what do you think your customer would prefer? – a promise which says, “we can deliver approximately 100 Features this month, plus or minus 20″ (i.e. 80 to 120), or a promise which says, “Well we estimated everything carefully and we are confident that we can deliver 63 Features this month?” (and we are almost completely certain that we will break this promise because this isn’t an exact science).
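Anderson’s point about range promises can be made concrete with a quick simulation. The sketch below is purely illustrative (the feature count, per-feature completion probability and interval width are invented assumptions, not figures from the article): even modest per-feature uncertainty means an honest promise is a range, not a single number.

```python
import random

random.seed(42)  # deterministic run for reproducibility

def simulate_month(n_features=100, p_done=0.85, trials=10_000):
    """Monte Carlo sketch: each of n_features independently lands
    with probability p_done; return an approximate 90% interval
    for how many features actually ship in a month."""
    totals = sorted(
        sum(random.random() < p_done for _ in range(n_features))
        for _ in range(trials)
    )
    k = trials // 20  # ~5th and ~95th percentiles
    return totals[k], totals[-k]

low, high = simulate_month()
print(f"~90% of simulated months deliver between {low} and {high} features")
```

No amount of extra up-front estimating effort removes the spread; it only narrows it slightly, which is precisely where diminishing returns set in.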

The point I’m trying to make is not to throw out process, functional specifications and planning completely. Rather, I am arguing that they aren’t worth the amount of time and effort traditionally spent on these activities. The usefulness of these activities tends to drop considerably and quickly over time. Unfortunately, heavyweight methodologies place too much weight on these three activities, ignoring the reality of the law of diminishing returns. The belief is that by doing extensive upfront work one will be less prone to risk, avoid potentially costly detours and even (by some stretch of the imagination) do less real development work.

Yes, traditional deterministic methodologies work when building a bridge or performing maintenance work on known software. Unfortunately, the methodology breaks apart in an environment where innovation and time to market are of the essence. It can take an inordinate amount of time to rigorously spec out what your customers are dreaming about; unfortunately, by the time you’ve arrived there, the market itself will certainly have moved.

One’s best choice then is to rely on something bordering on clairvoyance, that is, guesstimates and gut feelings. It requires the ability to react quickly on partial information and execute efficiently with the most minimal of resources. It requires a gut understanding of the pain points and the imagination to define an innovative solution where one previously did not exist. This calls for a development process with “less mass”:

A quick look at physics makes this obvious. The more massive a moving object the more energy it takes to change its direction. What applies to the physical world also applies to the business world.

and ultimately taking to heart some counterintuitive rules for software success:

Program Less. There are lots of things we can program, but just because we can doesn’t mean we should.

In the next article, I’ll highlight some emerging ideas for software development that contain “less mass”: ideas that rely more on choreographed, market-driven development than on Soviet-style central command-driven development.

1. To help define what I mean by “heavyweight,” one should read Scott Ambler’s “Glacial Methodology for Software Development.”


2. In a past life, I was involved in a CMMI project that consumed over a year before eventually realizing that a version control system was absolutely critical.

