Programming is the discipline of talking pure logic to a machine that will unfortunately take everything you say literally and act upon it.
A process is a continuum of actions, or mini-processes, that has inputs and outputs and is repeatable.
The program is a process, and programming too is a process. It is therefore not surprising that many old-school programmers simply can’t accept the principles that make agile development successful. Are they right?
The Fagan method of code review is a milestone in the thinking that developed around software engineering in the seventies. Invented at no less an authority than IBM, the same organisation that taught me agile fifteen years later, the Fagan method set out, through code review and manufacturing-strength measurement, to eliminate errors and produce perfect code.
This was not Six Sigma, but it did measure enough opportunities to eliminate rework at the earliest possible stage. IBM claimed substantial improvement in published case studies. The impact on software success rates was not, however, significant, and I, at least, am not shocked by that revelation.
Here’s my take:
Software is built by trial and error, simple as that. Anyone who doesn’t know that has never written code, and, needless to say, few people who have written code ever get involved with lofty ideas like quality improvement.
1. Treating something like software development as a continuous process that can be accurately measured and improved is a theory that has its uses but is limited in scope.
2. The problem with software is not and has never been the quality of code.
3. The definition of quality is invariably wildly inaccurate when it comes to software products.
Measurement and process improvement
I have had mixed success in the past with Fagan-like measurement of exit criteria from tasks or products, and with analysing the scope and quantity of bugs by programmer, designer, functional area and other criteria. Luckily I was doing this in an organisation where I was surrounded by professional researchers and statisticians, and despite my best efforts to gain and follow their advice, not a single test I could devise was able to satisfy their strict criteria for reliable measurement.
The thing that did pay dividends was creating, over time, a culture of following strict guidelines and strategies that reduced risk at the design, documentation, communication and process levels.
The real problem with software
Any time you analyse the causes of failure in software projects, the causes are found in just a few places:
1. The commonest cause of failure is that the system delivered does not meet the needs or expectations of the users or stakeholders. Bad requirements, bad communication, bad UAT, delivery that took so long the needs had changed, etc., etc.
2. The business case was built on flawed assumptions and the project should never have been completed, but pulling the plug is seen as failure, while a poorly performing, or even stifling, system is not.
3. Too little time was spent on the core needs and too much on the flora and fauna around the edges.
The definition of quality
I’m not going to quote anyone, but will bravely state that the proof of the pudding is in the eating.
The system must satisfy users by helping them to perform to the expected level or exceed it, and that includes keeping them motivated, or better still re-motivated. It must also satisfy stakeholders that it is delivering an acceptable return on investment.
Not a definition, I know, but definitions are not all they are cracked up to be.
What’s the verdict then?
Well, my verdict is that we should use process intelligently, whether at a detailed level or a framework level, bearing in mind the power of a motivated workforce; but we should not try to drive a square peg into a round hole by regimenting something that depends on intense, accurate human interaction between professions that share little understanding of each other’s worlds. Whatever delivers the goods, or gets the job done, is what delivers quality.
At IBM, not so long after the Fagan era, I learned to my astonishment that, in direct contradiction of Fagan’s earlier paper, the cheapest way to fix bugs was not to fix them until you had to. This was measured very simply on the basis of whether the bug was preventing testing from continuing. If not, the developer could decide to ignore it, and usually did. Once a week we all got round a table with a huge projector screen and analysed the bugs together, drawing simple dependency links as we predicted the relationships between them. Every programmer knows how one bug causes several, and sometimes the one causing them is not manifesting itself in any other way.
A group of switched-on brains, after the first coffee of the day, could generally choose one or two bugs each to work on, which resulted in most of the remainder disappearing and not returning.
If these bugs had been tracked down doggedly by tired individual programmers we would have lost cumulative weeks chasing shadows and ignoring the power of collective minds over large areas of complexity.
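The weekly triage described above can be sketched as a small dependency-graph calculation: treat each suspected cause-to-symptom link as a directed edge, then rank bugs by how many others they are suspected of causing, directly or transitively. The bug IDs and links below are entirely hypothetical, and this is only an illustrative sketch of the idea, not the process IBM actually used:

```python
from collections import defaultdict

# Hypothetical suspected cause -> symptom links, of the kind the team
# would draw on the projector screen.
links = [
    ("B1", "B3"), ("B1", "B4"),
    ("B2", "B4"), ("B2", "B5"),
    ("B3", "B6"),
]

def cascade_sizes(links):
    """For each bug, count how many other bugs it is suspected of
    causing, directly or transitively."""
    graph = defaultdict(set)
    bugs = set()
    for cause, symptom in links:
        graph[cause].add(symptom)
        bugs.update((cause, symptom))

    def reachable(bug, seen=None):
        # Depth-first walk collecting every bug downstream of `bug`.
        seen = set() if seen is None else seen
        for child in graph[bug]:
            if child not in seen:
                seen.add(child)
                reachable(child, seen)
        return seen

    return {bug: len(reachable(bug)) for bug in bugs}

# Fixing the bugs with the largest cascades first should make most of
# the remainder disappear without being chased individually.
sizes = cascade_sizes(links)
print(sorted(sizes.items(), key=lambda kv: -kv[1]))
```

A real triage session adds judgment the graph cannot capture, but even this crude ranking shows why one or two well-chosen fixes can make most of the bug list evaporate.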
My belief is that well-managed agile approaches to software improve communication immeasurably and eliminate the chance of a system not being what was expected. I think this already applies to other processes, from having a suit made to furnishing your home.
I also believe that quality in delivery is as much about motivation, peer review and recognition as it is about process, testing, logging and measuring. Real quality is built in and can’t be applied afterwards. In my view, well-managed agile creates a vibrant team environment where programmers have direct contact with end users and stakeholders and receive regular feedback and recognition.
How quality should be measured
Quality is measured very simply by how the software performs for the user and delivers for the stakeholder, and the two go hand in hand. The quality controller is mostly the user, with input from the stakeholder and a technical reviewer. Perfect software with nil defects is not only unnecessary, it is unwanted in a world changing so fast that it will most likely be discarded within a short time. Fit for purpose is the ultimate measure, once purpose has been agreed.