Considering Agile, deciding against it
Some time ago, while managing a distributed software project, I spoke to a developer who was thinking about joining our team. This individual was a big proponent of Agile development. When he found out that in our particular project we were not using an Agile development model, he began to lose interest. He compared my position there to that of the Tom Smykowski character from the movie Office Space. When I recently came across this blog post by Shannon JJ Behrens, I was reminded of that particular incident...
In the movie Office Space, Tom is an older employee at Initech, whose job it is to “take requirement documents from the customers and bring them to the developers”. Not surprisingly, he is characterized as 'useless' and let go. Similarly, in Agile development, direct communication between customers and developers is seen as the way to ensure that the deliverables align with customer requirements. No middle man is needed. Instead of extensive up-front design, requirements are captured in short, concise user stories.
I have to admit, I was somewhat taken aback by the comparison to the Smykowski caricature from the movie, but this episode certainly made me think about why we had chosen a non-Agile style for our project.
Let's try to answer this. Take a look at the graphic here (I don't remember where I saw it first, but it can be found on many different sites, click to see the larger version):
What do we not see anywhere? Design and architecture. The definition of the 'backlog' and the 'release plan' is probably the closest we get, but even that is feature focused. And in Agile, features always seem to be driven by customers.
For me it boils down to this: I believe that Agile is applicable in projects where not only a clear vision of the project has been communicated to the developers, but also where certain conventions and structures have already been established and where we are dealing mostly with explicit requirements rather than implicit ones. Projects where the 'how' is already known, and we are just dealing with the 'what'.
For example, to achieve a consistent UI you require some up-front planning, possibly a UI czar: one person - or at most a small number of people - who establishes and downright dictates what the UI shall look and feel like. It may stifle the creativity of the developers, but it will ensure a much more consistent look for the UI. Imagine you have user stories describing the requirements for various dialogs in the application's interface, and different developers take on the implementation of those user stories. Unless there is an established, over-arching strategy and - dare I say it? - an up-front design of the UI, the results are likely to appear disjointed or piecemeal.
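To make that concrete, here is a minimal sketch in Python (the module, names, and values are invented for illustration, not taken from any real project) of how such an up-front UI decision can be captured in one place so that every developer's dialog draws on the same source:

```python
# ui_style.py -- hypothetical module holding the UI czar's decisions.
# Every dialog imports this instead of choosing its own fonts and spacing.

FONT_FAMILY = "Segoe UI"         # one font family across the application
FONT_SIZE_PT = 10                # one base font size
PADDING_PX = 8                   # one standard padding between controls
BUTTON_ORDER = ("OK", "Cancel")  # fixed wording and order for primary buttons

def apply_standard_layout(dialog):
    """Apply the agreed fonts and spacing to a dialog object.

    The dialog is assumed to expose set_font() and set_padding();
    the actual interface depends on the UI toolkit in use.
    """
    dialog.set_font(FONT_FAMILY, FONT_SIZE_PT)
    dialog.set_padding(PADDING_PX)
```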
Another example is error handling in the application. I remember one of my software engineering instructors from a long time ago, who insisted that a project should have an error czar: someone who establishes the standard by which errors should be handled within the code and who then diligently ensures that all new code complies with this standard. If this is not established ahead of time, how are the different developers supposed to produce consistent code that adheres to those standards and handles errors in a unified manner?
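As a purely illustrative sketch (the module and class names are made up, not from any particular project), an error czar's standard might be codified in a small module that all new code is required to use:

```python
# errors.py -- hypothetical project-wide error-handling standard.
# The error czar decides once how failures are classified, logged, and
# reported; all modules raise these exceptions instead of ad-hoc ones.
import logging

log = logging.getLogger("myapp")

class AppError(Exception):
    """Base class for all application errors; carries a user-facing message."""
    def __init__(self, user_message, detail=None):
        super().__init__(user_message)
        self.user_message = user_message
        self.detail = detail

class ConfigError(AppError):
    """Raised for bad or missing configuration."""

class ExternalServiceError(AppError):
    """Raised when a dependency (database, remote API) fails."""

def report(err):
    """The single place that turns an AppError into a log entry and a
    user-facing message, so every module handles failures the same way."""
    log.error("%s (detail: %s)", err.user_message, err.detail)
    return err.user_message
```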
It appears to me that if you are developing an application for an existing framework then many of these up-front 'infrastructure' decisions have been made for you already. Consequently, you can focus more readily on explicit customer requirements and Agile is a much more worthwhile development strategy.
For example, if you develop a web app in a framework such as Django or Rails, then there are specific ways in which your database is going to be laid out. You also have specific conventions for how errors are handled, and so on and so forth. Thus, the design and architecture of much of your application is pre-determined by your choice of framework; you don't have to make those decisions any more. And the more of the application has already been written, the more of those underlying decisions will have been made. Agile works well in those situations. Since big-picture architectural decisions no longer have to be made, we can focus on the addition of small features (or larger features broken down into many smaller ones), while all developers benefit from the established infrastructure of the project. The 'grab-bag' approach with its focus on the individual user story becomes feasible.
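For instance, in Django (a rough sketch; the model and view below are illustrative, not a complete application), merely declaring a model determines the database layout, and the framework's own exception and 404 machinery gives you a ready-made error-handling convention:

```python
from django.db import models
from django.http import Http404, JsonResponse

class Invoice(models.Model):
    # Django derives the table name, column types, and primary key from this
    # declaration -- the database layout is a framework decision, not ours.
    number = models.CharField(max_length=20, unique=True)
    issued = models.DateField()
    total = models.DecimalField(max_digits=10, decimal_places=2)

def invoice_detail(request, number):
    # The lookup-and-404 pattern is the framework's standard error path.
    try:
        invoice = Invoice.objects.get(number=number)
    except Invoice.DoesNotExist:
        raise Http404("Invoice not found")
    return JsonResponse({"number": invoice.number, "total": str(invoice.total)})
```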
However, if a vision for the project still has to be conjured up, the basic architecture has to be created, the UI look and feel needs to be designed, and error handling or other standards have to be established... well, then Agile does not seem ideally suited for that. I have seen design and architecture by committee, and design by chaos (everyone going off to implement things before anything was established). Neither one is a pretty sight. The best results are achieved if the architectural direction and basic design can be established by one or just a few good individuals. Up-front. Well, somewhere after establishing the vision and collecting initial requirements, of course.
Realistically, I'm sure that most Agile projects will include some sort of up-front design and architecture, whether the participants call it that or not. Heck, even questions like: “Should we use Java or Python for this?” need to be answered ahead of time and are part of these implied up-front activities.
The project that the developer from the beginning of this article interviewed for was not yet at a stage where an Agile approach would really have worked. There were still too many unknowns at that time, which required up-front consideration. Thus, our project wasn't run as an Agile project. There is a time and place for everything, and our project was not at a time or place for Agile, yet.
However, projects can be transitioned from a more conventional approach to an Agile approach over time. The more we know, the more suitable Agile becomes. The fewer architecture and design decisions have to be made, the better for Agile. But at the start of a brand new project, with nothing established, some quite conventional up-front design and architecture is needed.
You should follow me on twitter here.
Labels: agile, architecture, design, development, error czar, error handling, requirements, software, ui czar
7 Comments:
Very good article. I have run into many of these "Prophets of Agile" over my developer years, and I always found this to be one of the biggest downfalls that none of them seem to realize until it's too late.
I think running forward without even an overall idea of how the system should be architected is just asking for bloat and obscurity in your final product.
By Anonymous, At July 14, 2009 at 11:24 AM
Exactly.
No Architecture.
No Design.
No Clue.
Please don't let the children play in the software.
By Mike, At July 14, 2009 at 12:28 PM
I've never seen an agile project that actually has a logical and coherent process at all and I've seen dozens. The dirty truth is that agile is mostly hot air and hand waving. In practice it breaks down to the same random hacking people have been doing since the invention of the computer, it's just been relabelled as a process for marketing purposes.
By Anonymous, At July 14, 2009 at 12:53 PM
I find that my project-management structure is like an anonymous agile method where I fall back to a kind of waterfall model when major design iterations are required. Even if it's only me working on a project, I do find that a waterfall-like cascade of requirements, design, and classic design documentation that precedes any coding on the next stage of the project is essential.
Sometimes, I find that writing documentation and help files can be the best way to lead me back to a structured design review, where particular "aspects" of the software are focused on.
- How good are the trace-logs and error messages output by the application? Do users find the messages readable? Can developers use the results to find problems? Is there a lot of signal, or a lot of noise?
- Is the application well structured for internationalization?
- Is the user interface capable of handling changes in user display DPI and screen size, or standard themes, and font sizes, in the various platforms or operating system versions it is tested on?
You name it, there's an endless number of potential 'czars'. Sometimes I delegate a particular role to a particular person, but I don't find that it's necessary to have a single developer attached to a role like UI design in order to achieve consistency. I don't let developers who can't follow user-interface guidelines touch the UI. My rules would take up about 4 to 6 pages, and could be summarized as (a) do well the things that the MS Office UI, and the Windows UI itself, does well, and (b) don't make the mistakes that the Office and Windows UI people make, and I have about five pages of those. The Microsoft UI standards documents are good, and merely need a "corporate overrides" document added to them for most corporate Windows desktop software development. If you're writing for Mac OS X, then you should probably read the Apple style guides.
Anyways, I prefer explicit documentation to czar-like control, combined with code review and a bit of collaborative (loose-knit pair) programming.
So I cherry-pick a bit from the Agile people while eschewing the label or the overall manifestos.
There ain't no silver bullets, and there ain't no such thing as a free lunch.
Warren
By Warren, At July 14, 2009 at 1:34 PM
Rather than throwing the baby out with the bathwater, consider that these elements can also be introduced into a project. Stories aren't just vertical slices of functionality; they are everything a project needs doing, including things like the choice of language. When you widen the scope of what can be planned with agile to everything, you end up with what you need.
Agile is about principles, not process. So if your project has a technical need that must be addressed before you can go ahead and do something else, then that needs doing, and in an early iteration. Technical requirements are just as important as user-based work and need to be on the work list as well. What your project then boils down to is that in every iteration you aim to finish something fully, and once you have executable software you release it frequently and fully tested.
By Paul Keeble, At July 14, 2009 at 11:41 PM
The question is whether "agile" means "process" or only "being efficient". For me it is the latter. So, adding some architecture and design to describe the direction of a project is useful for reaching the goal: to be as efficient as possible.
My experience with developers shows that design and architecture are things they don't know and don't focus on. As far as I remember, all the agile stuff is an idea that was born among developers. So, this could be why you don't find design/architecture explicitly in those images.
I still think that you need a design before you implement a feature, instead of trying to develop a design through refactoring. This way you don't waste time. But I never had the problem that the code didn't execute in the end. One aim of all the agile ideas was to have executable code on a daily basis, and I never had a problem reaching that. So, the motivation behind all the agile ideas is important for understanding why those "processes" look the way they do today. Nevertheless, we shouldn't stop optimizing those ideas. No one has the wisdom today to describe the ideal software development process, so everyone should keep thinking about how to find a better answer.
By rainwebs, At July 16, 2009 at 1:30 PM
Good post.
I've been reading "The RSpec Book", and they cover this issue. They say to dive in with no design is reckless and irresponsible. Yes, that was written by guys who believe in BDD ;) Rather, they propose "just enough design" to get going. I can agree with that, at least. I've never been one to try to nail down every last detail in the design anyway.
I completely agree with your earlier point. You do have to pick a language. You do have to decide if this is a Web app or a Microsoft Word macro. Once you've picked Rails, a lot of the decisions are made for you, but not everything in this life is a Rails app.
By jjinux, At July 20, 2009 at 6:50 PM