Blink Estimation

Experienced delivery folks can have surprisingly good instincts for macro-level estimation, as long as we are careful to manage blind spots and cognitive biases. This can be an important tool in early project investment discussions, and can remove roadblocks where people are uncomfortable or unwilling to provide estimates.

Prologue

Back in 2003 I was working at an Internet bank with the remarkable Dave Leigh-Fellows. I could tell you many stories about Dave. One of my favourites involves roller skates. On this occasion we were running a pre-planning session for a major new product that would integrate all your bank accounts into one place. You could see at a glance your mortgage, savings, loans, current accounts, credit cards, everything, from any number of banks or financial institutions. It was quite an audacious plan, and understandably there were many risks and integration points. Before we even got into release planning it was decided we would spend a day thinking about how that planning should work. It was like the United Nations, with 18 people sitting around a large, U-shaped table, representing technology, the business, senior management, operations, legal and compliance, and no doubt various other folks.

We were about two hours into the day and things were moving like treacle. The sticking point was scope and project size. How big was the box? How many people and how much time would it all take? We kept going round the same meta-discussions: What could we do to start to discover how much effort would be involved? Exhausted and frustrated, we stopped for a break. In the hallway Dave said something to me. He was convinced all the people in the room already knew how big the problem was, but no-one wanted to go out on a limb and suggest the scope for fear of getting shot down. So he came up with an idea. As the external consultant, I was to be the fall guy. I would suggest Dave’s plan as though it were my own—no-one would have listened to Dave because he was “just” a delivery lead and there was too much at stake. Once we had all sat back down I suggested an exercise. We had been talking for over two hours now and everyone had a sense of the project we were taking on and the parameters involved. We had confidence in a third-party vendor who would supply the links into the various external banking platforms, so the real product would be the application on top of all this to wow our customers.

I asked everyone to take an index card and to write on it, based on nothing but their gut instinct and what they had heard so far, how many people and how much time they thought the project would involve. I suspect the number of raised eyebrows outnumbered the people in the room, but I said that the whole exercise would only take five minutes and asked them to humour me. You can imagine what happened. On a count of three we all raised our cards, and everyone’s estimate was broadly the same: around 6-8 people for around 6-8 months. Some were towards the higher end and some lower, but no-one was out by an order of magnitude and everyone had an opinion.

“Well,” said Dave, laughing, “it looks like we’ve got our project estimate. What’s next?” Indeed we had a consensus around the room that 8 people for 8 months should be able to take a reasonable-sized chunk out of the problem. Eight months later, a team of eight people delivered the first release of Money Manager and it was a great product.

The second time

I remember being completely blown away by how well this exercise had worked. And then I promptly forgot about it for five years. The intervening years were filled with ever more detailed and sophisticated planning techniques. The zenith of these was a multi-column spreadsheet detailing, story by story, minimum, likely and maximum effort, numbers for clarity and volatility, and other things I didn’t really understand. I’ve written about this before, but I didn’t tell you what happened next.

I came out of one such planning exercise—two days of gruelling micro-planning and estimation that had generated in excess of 400 stories in amongst some genuinely useful conversations—and was greeted by one of my colleagues who asked why I was looking so ragged. I told him we’d been in a two day planning workshop for a business website. He asked what our final rolled-up estimate was for the project so I told him.

“What??” he exploded. “My grandmother could do it for that much!” And sadly he was right. This was the shot in the arm I needed to realise that a) micro-planning is fractal: the more detail you go into, the more the rolled-up result expands, and that b) there must be a better way. The fact that everyone else in the agile world was doing it this way didn’t make me feel any better about it.

A couple of days later we got an inbound call from a TV company looking to rebuild their web platform ahead of a new season of a reality show. The site traffic would be very spiky: after each episode there would be a two-hour window during which several million people would register a vote for the person to be ejected from the show. The rest of the time it would be just a regular media site with forums, news updates and suchlike. I decided to try an experiment, and as I look back I realise it must have been informed by my earlier Leigh-Fellows experience. I took a small number of people into a room. The group included a couple of senior developers, a tester, a business analyst and a project manager. I seem to recall the analyst had some previous media website experience. I described the project brief to them and we discussed it for a while, maybe 20 minutes. What kind of traffic could we expect during the voting window? What sort of resilience and redundancy should we plan for? What about tying in with their media style guide and design constraints? What kinds of technologies were at our disposal? How much access would we have to the client, and how frequently?

Then I did the index card thing again: I asked each of them to write down roughly how long it would take and how big a team they would need. Extra credit if they had specific skills or people in mind for the team. Again there was a surprising degree of consensus, so I called the client and gave him the project estimate.

“How did you come up with that number?” he asked.

I could have bluffed it but instead I said “I got a group of really smart people in a room, with at least 10 years’ experience each, and asked them.”

After a pause he replied: “Sounds like a pretty sensible approach to me. You’ve got the gig.”

Sometimes you just don’t know

Of course now I had a problem. I had stumbled upon a fast, accurate way to estimate at project scale, and I was pretty confident no-one would believe me. It simply didn’t have enough voodoo. I tried it a couple more times and each time it seemed to just work. I wasn’t allowed to run free with this, of course. The pseudo-scientific micro-planning would still take place in parallel. But my blink estimates, as I had taken to calling them, were at least acting as a foil to project inflation. “It just doesn’t feel like a one-year project” was often enough to push back on some of the more bonkers estimates.

Then I hit a stumbling block. A client had asked us to estimate a piece of work and I had pulled a group of people into a room. They knew the drill by now and we had the discussion and did the estimation piece, and we all held up our cards. They all had a number of people and a number of months, except one card that the business analyst was holding up. Hers just had a large question mark written on it.

“What does that mean?” I asked. “It means we can’t possibly know, and here’s why.” And she proceeded to list half a dozen reasons why we couldn’t possibly know. As she was speaking, we all lowered our cards, feeling increasingly silly. She had driven a truck through any possible blink estimate. So I went back to the client and told him: we couldn’t give him an estimate, but we could describe where we saw the uncertainty, and we suggested he spend a couple of weeks investigating some of these unknowns in order to come up with some kind of sizing.

The client was quiet for a minute or two. Then he said: “That’s interesting, and I believe you. So now I’m left wondering how the other suppliers I asked were able to come up with an estimate. They must have been just guessing, because surely they have the same holes as the rest of us.” On the back of this he gave us the investigative piece, and we went on to win the follow-up work once we had given him an idea of the size of the main delivery.

I’ve been teaching Blink Estimation in my Accelerated Agile class and introducing it in my consulting work for about a year now, and a number of people have been trying it. I usually recommend running it in parallel: do it alongside whatever you currently do, and if the results differ significantly stick with your current method, but maybe question its accuracy. The results have been surprisingly positive. One of my clients told me they used blink estimation to break a planning deadlock where, for various reasons, people were uncomfortable about giving estimates he knew they had in their heads.

There are a small number of “rules” to increase the likelihood of it working, and a number of caveats. First, the rules. You need three things to do blink estimation:

  1. Experts estimating
  2. Expert messenger
  3. Expert customer

Experts estimating

This should be obvious. Blink estimation is a comparison exercise that draws on the context you’ve built up in becoming an expert, in the Dreyfus sense. It says: based on all the projects you’ve delivered in the past, and given what you know of the current context and constraints, how big do you think this new one is? Without the experience of many previous projects, both successful and unsuccessful, you simply don’t have a suitable frame of reference for estimating this one.

Your role as facilitator is to gather people from different disciplines: testers, analysts, project managers, programmers, architects, operations and support engineers, among others. Choose disciplines that are likely to have overlapping but distinct areas of expertise. You frame the discussion using neutral language and open questions, and try to dissuade the participants from over-thinking things. Using words like “roughly” or “about” frees the participants from feeling committed or constrained by their estimates. And remember, these are just estimates.

You want to minimise the likelihood of shared blind spots, which is why you want people from different backgrounds discussing the potential risks and pitfalls. You’ll be able to sense when people are moving towards a decision point. At that point you ask everyone to estimate size and duration, and then if there are large variances you can explore the assumptions behind the outliers.

This is just the same as planning poker, but at project scale. The security expert might see an online store project as much harder than a developer does, because she knows the enormous variability that PCI audits, injection attack testing and other hardening activities can introduce. The developer can then put her mind at rest by suggesting they use a third-party pass-through credit card processor to remove the PCI risk, and by asking her to be available at each mini planning session to advise on where the security dragons might emerge. In practice I’ve not seen macro-level estimation take more than a couple of rounds, with diverse conversations and discussions at each round, to reach a reasonable consensus. It’s worth repeating: this is an estimation session. We want an order of magnitude, not a precise answer.

Expert messenger

Being an expert messenger is about knowing how to frame the output of the exercise so it makes sense to the customer. You need to be comfortable describing how experts rely on instinct and intuition, and defending that as a legitimate basis for estimation. You should also consider the various cognitive biases I describe below, such as anchoring, priming and environmental effects, that can impair the estimate. One counter-intuitive outcome of blink estimation is that it doesn’t require the delivery team to estimate their own work. As long as the experts are aware of who will likely be delivering, especially if this materially affects their estimate, and as long as the team trusts the people doing the estimating, it still works.

Expert customer

Having an expert customer is the condition least likely to be under your control. I was lucky the first time, having Dave in the room, and the second time with the media client who just “got it”. Back then I doubt I would have been able to convince him otherwise: if he’d demanded the full story breakdown I would probably have just caved and spent two days building the spreadsheet. Nowadays I recommend investing effort into educating your customer or client. Once they are on board with the idea of trusting the delivery team’s experience, things become a lot smoother. I’ve even done blink estimation sessions with the customer as part of the group. It’s pretty powerful!

So where do these expert estimators come from, and how do we grow the next generation of them? Once a team is used to the dynamics of blink estimation you can use it as a learning workshop for the non-experts. The non-expert observers are just that: observers. They are not allowed to ask questions in the blink estimation session, because their questions can unintentionally lead to priming or anchoring biases, but they can take whatever notes they like and afterwards ask as many questions as they want. One that often comes up is “Why didn’t they discuss X?”, where X is someone’s pet topic, say security or testing. It usually transpires that everyone was aware of X, but no-one thought it would affect the estimate so they didn’t bother raising the subject. This is the entire point of blink estimation: the estimators only discuss issues they feel could materially affect the estimate. Everything else, however important a topic in its own right, can be safely deferred until the project is under way.

Programmed to fail

The huge elephant in the room with blink estimation, and the reason I recommend trialling it alongside whatever you currently do, is that any instinctual exercise is subject to any number of cognitive biases. From the story you read in the newspaper on the way to work, to the temperature in the room, to the last number you heard—however irrelevant it might be—your instinctual self is subject to unexpected recalibration at a moment’s notice. Daniel Kahneman won a Nobel Prize for showing us how easily fooled we can be, and how utterly confident we are that it only happens to Other People. I highly recommend his book Thinking, Fast and Slow. As a facilitator of blink estimation your job is to try to manage those biases. For example, having everyone show their estimates at the same time is one countermeasure: if we went round one at a time, the earlier estimates would act as an unconscious anchor to the later ones.

Estimation as an investment decision

When someone asks me for an estimate I try to invert the conversation to talk about investment. When they ask how long I think it will take, they are really asking: how much should I expect to invest in this in order to get a decent return from it? If you can surface this question instead, you can help them explore the expected return, and it becomes a straightforward return-on-investment discussion rather than the traditional estimation-as-contract negotiation.
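
To make the inversion concrete, here is a minimal Python sketch of that return-on-investment arithmetic. Every figure in it is hypothetical, invented purely for illustration rather than taken from any project described above:

    # Is an 8-person, 8-month investment worth making? (all figures hypothetical)
    team_size = 8                       # people, from the blink estimate
    duration_months = 8                 # months, from the blink estimate
    cost_per_person_month = 12_000      # fully loaded monthly cost, invented
    expected_annual_return = 1_500_000  # expected business benefit per year, invented

    investment = team_size * duration_months * cost_per_person_month
    years_to_break_even = investment / expected_annual_return

    print(f"Investment: {investment:,}")                       # 768,000
    print(f"Years to break even: {years_to_break_even:.1f}")   # 0.5

The point is not the invented numbers but the shape of the conversation: once investment and expected return sit side by side, “is this worth doing?” replaces “how long will it take?”.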

Once you have agreed an investment in terms of people and time and other resources, you can choose to deliver to that timescale. By attempting to surface uncertainty early on and keeping an eye on the goal of the project, you simply work towards the date and declare victory when you get there. In reality you can often declare mini-victories on the way through, with interim releases to demonstrate progress and validate your assumptions. It’s like firing an arrow and then painting the target around it—you start hitting a lot of bulls-eyes! Focusing on uncertainty early increases the likelihood of surfacing assumptions that can have a material impact—either good or bad—on your original project estimate. This allows you to provide your delivery assurance and governance functions with information to support effective steering.

So what’s wrong with story-level estimation?

Every project contains uncertainty. If it doesn’t, you shouldn’t be doing it: it’s already been solved! At a macro level you can have an educated opinion about how much space you should allow for dragons, those unknown unknowns that are waiting to derail your project. With sufficient experience and the right mix of people in the room you can reasonably assess how this project is similar to, and different from, previous ones, and how much you should invest to realise the business impact associated with it.

Breaking things down to story level is one way of exploring a space, and can lead to some good discoveries, but it is far from the best approach to surfacing uncertainty. If it were, no agile project would contain surprises, because they would all have emerged during backlog creation. Except that means we’ve just reinvented big, up-front analysis and design! We can’t know which individual stories will surface a particular dragon, not least because many surprises are cross-project rather than specific to a particular story: unexpectedly unavailable hardware, say, or unexpected organisational constraints. It’s pretty random which story will be first to the line. At delivery time, one three-point story might take five times as long as another, for very good macro-level reasons. Where, then, is the value in estimating each story?

On the other hand, a group of people with complementary experience and skills, having a conversation whose sole intent is to identify gaps and surface assumptions, can quickly reach a strong consensus on how big the box really needs to be.


Colophon

This article has been translated into Russian by Denis Oleynik.