Last year I wrote about how we do planning all wrong, or rather, how we focus on the wrong things when we plan. We obsess over stories and story points and estimation, because that's what we've been taught to do.
Once upon a time there was a lady in a taxi. The journey to her destination took so long that she went to the town hall and complained to the man from the council. The man from the council wanted to find out why the taxi journey was so slow, so he placed cameras at all the traffic lights in the town to measure how many cars went past, and how quickly. The cameras would click every time a car passed the lights.
Business people want estimates. They want to know how much it’s going to cost them to get a solution, and they want to know how likely it is to come in on time and on budget. And of course quality is not negotiable.
During the past few years, service-oriented architecture (SOA) has become a mainstream approach for designing and developing enterprise software. As with most new technologies, the adoption of SOA has been led by software vendors and shaped by their particular tools and sales targets. Naturally it is in the vendors’ interest to emphasize the complexity of SOA and then provide a timely and profitable solution. This leads many systems architects into a technology-centric view of SOA, when, in fact, the most important criteria for a service-oriented architect—before tackling the technology—should be a keen understanding of the business.
Behaviour-driven development is an “outside-in” methodology. It starts at the outside by identifying business outcomes, and then drills down into the feature set that will achieve those outcomes. Each feature is captured as a “story”, which defines the scope of the feature along with its acceptance criteria. This article introduces the BDD approach to defining and identifying stories and their acceptance criteria.
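As an illustration of the shape such a story takes, here is a sketch in the narrative-plus-scenarios style commonly used for BDD stories (the ATM feature and its wording are invented for this example):

```
Story: Account holder withdraws cash

As an account holder
I want to withdraw cash from an ATM
So that I can get money when the bank is closed

Scenario: Account has sufficient funds
  Given the account balance is $100
  And the machine contains enough money
  When the account holder requests $20
  Then the ATM should dispense $20
  And the account balance should be $80
```

The narrative at the top ties the feature to a business outcome, and each scenario is an acceptance criterion: when every scenario passes, the story is done.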
I had a problem. While using and teaching agile practices like test-driven development (TDD) on projects in different environments, I kept coming across the same confusion and misunderstandings. Programmers wanted to know where to start, what to test and what not to test, how much to test in one go, what to call their tests, and how to understand why a test fails.
The deeper I got into TDD, the more I felt that my own journey had been less of a wax-on, wax-off process of gradual mastery than a series of blind alleys. I remember thinking “If only someone had told me that!” far more often than I thought “Wow, a door has opened.” I decided it must be possible to present TDD in a way that gets straight to the good stuff and avoids all the pitfalls.
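One answer to the "what to call their tests" question is to name each test as a sentence describing a behaviour. A minimal sketch in Python, using a hypothetical `Account` class invented purely for illustration:

```python
class Account:
    """Hypothetical account class, invented only to illustrate test naming."""

    def __init__(self, balance=0):
        self.balance = balance

    def deposit(self, amount):
        # Reject non-positive deposits so the behaviour below has a failure mode.
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self.balance += amount


# Each test name reads as a sentence about behaviour, so a failing test
# tells you which behaviour is broken rather than which method was called.
def should_increase_the_balance_when_a_deposit_is_made():
    account = Account(balance=100)
    account.deposit(50)
    assert account.balance == 150


def should_reject_a_non_positive_deposit():
    account = Account()
    try:
        account.deposit(0)
        assert False, "expected ValueError"
    except ValueError:
        pass


if __name__ == "__main__":
    should_increase_the_balance_when_a_deposit_is_made()
    should_reject_a_non_positive_deposit()
    print("all behaviours pass")
```

Naming tests this way also answers "where to start" and "what to test": each test exists because some behaviour matters, not because some method happens to exist.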