Software Development for the 21st Century
Historically, computer software development involved a great deal of work before you started building anything, and rarely involved the people who would actually use the software. There were good reasons for this: computers were scarce and expensive, people were cheap by comparison, and most programs simply took in data, processed it and produced results, with no interactive users at all.
These things resulted in an approach to serious computing which was entirely logical in the 1950s-1970s. Decide what your software should do (functional requirements - write down on paper exactly what it should do), decide how it should do it (specification - work out on paper every possible situation and what to do when it occurs), then have programmers build it and start using it. There was no testing, because you were expected to get the requirements and specification right in the first place. There was no user involvement, because users were cheaper than machines, so you could train them on whatever the computer did. Most programs did not have users anyway - they took data, processed it and produced results.
Once you understand that context, it is clear why the software design and specification approaches of the time evolved as they did. At the most extreme it could take many months and hundreds of pages of documentation to produce 5 lines of code. Whole methodologies grew up around this way of working.
In the late 1970s and early 1980s everything changed. Computers got faster and cheaper - you began to see Personal Computers (PCs), cheap enough (about £1,000) that a person could have one in their own house - though there was no internet, so these small computers stood alone. Xerox invented the graphical user interface - Windows, Icons, Menus, Pointers (e.g. the mouse) - on the Xerox Star. Then Apple, IBM and Microsoft 'borrowed' the ideas. In less than 10 years programming changed completely. At the beginning of the 1970s about 1% of the code in a program was concerned with users and interaction. By the middle of the 1980s over 70% of the code was about users and the user experience. These days it can easily be 90%. Users can interact with programs and change things dynamically. More than that, computers everywhere mean that they can be used by anyone, with no training, so ideas such as ease of use, reversibility of actions and WYSIWYG (What You See Is What You Get) came to the fore.
So the availability and power of computers changed. Where they were and who used them changed. But the ways we approached the design and development of software did not. By the 1980s traditional software development approaches were already outmoded for most of the software in the world. They still worked, and were still used, by governments, big organisations (e.g. banks) and anyone whose mindset had not moved on. And of course, once handheld and mobile connected devices arrived things changed again - but that is not the point here.
Many designers and developers of software reacted against the 'traditional' requirements, specification, build model. They became more flexible, realised that the user was an important part of the process, and worked in quicker design and build cycles. Faster computers meant it was economically viable to make something, try it out and modify it if it did not work. User expectations changed too, so a more supportive environment that did not require people to learn anything became essential. By the 1990s users would take a program and just click everything in sight to see what happened. That worked because the software was well designed and built, but it created an expectation that a system would fit the user rather than the user fitting the system. Big organisations did not realise what had happened. They persisted with old models of software design and development - ignoring users.
As this was happening, user expectations changed and a new discipline was born. Since the 1950s there has been a field called 'ergonomics' - the study of work. It led to things like standard heights for desks and chairs, appropriate lighting for offices, and so on. In the late 1970s and early 1980s people realised that computers were part of this work process. Work was done on screen colours (though screens were still text-only), the height a monitor should be at, and so on. Then someone realised that making something suitable to work with is not just physical - making a computer good to work with became an issue in its own right. Originally this was called 'Cognitive Ergonomics' - making the computer fit with how you thought. Not physical concerns, but designing software to fit with the way people did their jobs and thought about them.
Computer software development became anarchic. Big corporations kept using the old models, but most programmers in the real world were designing for small, fast systems, consumer markets and rapidly developing technologies. They had to be faster, leaner and more responsive to their users.
Eventually, some people recognised this change and decided to make it explicit. This is Agile. Lots of people and groups had the same realisation and developed their own ways of dealing with it. The Agile Manifesto successfully captured the core of many of these ideas.
Agile was a reaction against the old systems. It produced new ways of designing and developing software which are more in tune with the way the world is now. It is a big step forward.
BUT - Agile is not a magic solution, and in many cases it is not used correctly. You will find plenty of sites and books that tell you about Agile. Sometimes you will see Agile evangelists who tell you that extremely Agile processes are the only way to go. They are wrong. As with many dichotomies, it is convenient to set Agile up as the alternative to traditional design and development methods, but it is not that simple. The truth lies in between.
The answer is to think in an Agile way - and to use any technique or method that is appropriate. Sometimes the old ones are the best. Sometimes spending 2 months writing a proof of the correctness of 1 line of code is best (OK - in very rare circumstances). The point is that you must think in an Agile way - just following one methodology is not going to help. Learning and following the rituals is not Agile - you have to embrace the philosophy.