Agility and Analysis

As developers, analysts, administrators, etc., I think most of us tend toward perfectionism. We want to do the job right, and we want to do it right the first time around.

Agility means being able to move forward with less than perfect information. Sometimes it means moving forward with information that is known to be incomplete or even potentially inaccurate.

So what does that mean for analysis? Analysis, by definition, is the separation of individual components from the whole. The goal, at least in my universe of activities, is to understand a business process’s basic feature set and the relationships between those features.

To understand the components, you must first understand the whole. How is that possible when you are time-boxed to 7 or 10 or 20 or whatever number of days for a sprint?

When developing screens, or reports, or other already componentized feature sets, an agile methodology works well. Screens and reports are ideal for iterative development.

Where I think agile methods need work is in handling things like data modeling. I am a huge believer in not over-analyzing and not over-modeling.

An example of that is a recent blog entry I wrote, Modeling Addresses. In the comments, it was suggested that we could model out the address lines (although that would only be accurate for a subset of addresses), or we could even model out to the point of geocoding.

The fact is, the stated requirement showed that either of those would take that particular instance of a model beyond what it should be. Rather than a “good enough” model, we can end up over-modeling.

But, and this is a large but, how do you know when you get to “good enough” without having fully analyzed a requirement? This applies especially to data models.

Time boxing and other “hurry up” limitations in analysis will actually increase time spent on development and raise the cost of the final product.

Success might be achieved because each sprint is regarded individually, ignoring the whole. Analysis, though, is all about the whole. You win the battle and lose the war.

This is where the one-size-fits-all developer argument (which seems to be a major requirement of agile) fails. It takes experience to know when you are at “good enough” and can take the next step. That is not generic experience; that is specific experience in a particular skill set and possibly even a particular industry.

So what is my point here? I have spent the last 18 months in an agile environment. There would be a 1-week analysis event followed by a 20-day sprint.

My point is that a better approach would be to remove the analysis from the sprint altogether. Business creates a backlog of user stories and prioritizes them. An analysis team performs triage: some things can enter a sprint immediately, some get passed to a team for further business analysis, and some get passed off to architecture/infrastructure.

BAs and architects stay on the sprint teams. At times, stories may need to be delayed for a future sprint because of new discoveries.

In my mind, this is sort of a hybrid of agile and traditional iterative planning. Call it hyper-responsive iterative development. The business is always in the loop, whether it is a development sprint or an analysis activity.

Feedback can be incorporated immediately, and as soon as it is possible to deliver something that won’t be total crap, it will be delivered. Business gets what they want and need, but IT is not busy coding themselves into an unanalyzed, unmanageable, unmaintainable nightmare of poopware.

So what do you think? Is agile analysis perfect the way it is? Is it the most evil invention ever? Or is there some kind of compromise we can make to stay responsive to the business without painting ourselves into a corner?

LewisC
