Imagine waking up one morning to discover that one of the core beliefs of the civilised world has been challenged.
And as for that so-called genius Sir Isaac Newton, who claims to have mathematically proved Galileo and Copernicus were right all along, what does he know?
Ever tried to climb up a Waterfall?
Let's transport ourselves forward in time, from the heady heights of astronomy into the heavenly world of software development. It's 2001 and the launch of the Agile Manifesto. Living in more enlightened times, no one is threatening to burn this group of self-appointed industry experts or have them imprisoned. But it's right to be sceptical. After all, why change what we've been doing for years? We have the Waterfall methodology.
You gather requirements, analyse, design, code, test, release. Alright, so quite often good communication is lacking. It doesn't accommodate change particularly well. It's hard to plan accurately, and poor old test always seems to have its time squeezed, but after all, it's only test. The finished product rarely matches the original business vision, and delivery is frequently over budget and late. Better not tell the shareholders.
But should it have ever seen the light of day?
What if the adoption of Waterfall was built on shaky foundations? What if its sequential model was never the intention?
With ever-improving technology, software engineering, as a new industry, had to find ways to manage the development of larger and larger systems.
As a contribution to the debate around this problem, in 1970 Dr. Winston Royce, now credited with Waterfall's introduction, issued the white paper 'Managing the Development of Large Software Systems'.1
Dr. Royce lists each stage of the process:
- System requirements
- Software requirements
- Analysis
- Program design
- Coding
- Testing
- Operations
We can all recognise this familiar model.
'I believe in this concept,' writes Dr. Royce, then immediately goes on to say, 'but the implementation above is risky and invites failure.' He argues that by leaving the testing phase to the end, there is little or no opportunity for change should an error be discovered in a previous step. He concludes his premise by stating:
'Either the requirements must be modified, or a substantial change in the design is required. In effect the development process has returned to the origin and one can expect up to a 100-percent overrun in schedule and/or costs.'
To address this issue Dr. Royce proposes a series of elaborations iterating around the process, to capture errors and avoid a calamity during the testing phase. The end result is a complex process diagram which, when compared to the basic waterfall model, so the story goes, led a Pentagon official to adopt the simple route instead of Dr. Royce's complex elaboration. And we've been living with the consequences ever since.
An earlier example of a fledgling Waterfall process in action appears in the reissued study 'Production of Large Computer Programs'2 by Herbert D. Benington, originally published in 1956. The paper describes the broadly successive steps undertaken to develop a large system and the per-instruction costs of its production.
If it's broken, fix it
Reflecting in 1983 on what he would have done differently, Benington concludes:
'I would have worked to test and evolve a system. I estimate that this evolving approach would have reduced our overall software development costs by 50 percent.'
Both Dr. Royce and Herbert Benington understood the flaws in the sequential model, not from any academic standpoint but from hard experience.
In 1992 Pope John Paul II officially recognised that Galileo had been right all along, 350 years after the event. It would be a pity if the software industry took as long to recognise the broken software development methodology still in use to this day, instead of promoting the practice of iteration and early testing espoused by those heretics who launched the Agile Manifesto in 2001.