Why we can’t help jumping to conclusions

Daniel Kahneman's book "Thinking, Fast And Slow" sits on the arm of a sofa (Photo by Mukul Joshi on Unsplash)

I gave a talk recently about how I’ve been using data and analytics to guide my decisions in product management. I’ve edited the transcript a little and split it into bite-size parts for your entertainment.

This bit is about why we can’t help jumping to conclusions. The last bit was about expensive and risky assumptions (and why you should check them).

Two systems of thinking #

Daniel Kahneman (Photo by Buster Benson on Flickr)

This chap is called Daniel Kahneman, and he is a recipient of the Nobel prize for economics. He tells us in his book, Thinking, Fast And Slow, that our brains actually have two systems of thinking: a fast, intuitive system that works from minimal information, minimal data; and a much slower, analytical system that does much more of the heavy lifting.

The fast system is designed to jump to conclusions very quickly. Back in ancient history, if we thought there was a tiger hiding in the grass, we’d probably want to run first, analyse later. So that’s our fast system of thinking.

However, it’s also very prone to errors. I certainly make them – I occasionally mistake a tomato stalk for a spider and jump out of my skin. That’s our fast system jumping to conclusions.

We engage our slower system when we undertake a more complex task like counting the number of letter ‘e’s on a page. It’s not something we can just jump to a conclusion about, it takes more effort, and so this slower, analytical system does the job properly.

So here’s the thing: because this slower system takes more effort to get going, our fast system always keeps jumping in first. It causes us to create a plausible narrative based on very little data – like my tomato stalk / spider mix-up.

Assuming without evidence #

Now, in product management terms, this makes it very easy and tempting for us to convince ourselves that we understand what our users need, even though we have very little evidence to support that. Off our fast system goes and says, oh yes, I’ve got relatively little evidence, but I can make a plausible narrative for what users need. And it’s that kind of assumption, without any evidence, that leads us astray.

And yet I’m sure we’ve all experienced those lightbulb moments of realisation when we do actually go out and talk to our real users. Suddenly, when we get one of those lightbulb moments, all of our assumptions are turned on their heads – all it takes is just a little bit of extra data or evidence to flip around what we thought to be the case, and then we suddenly realise we had it all backwards.

(Credit Volvo/YouTube)

Don’t drive blindfolded #

It’s a little bit like driving blindfolded. When you want to drive somewhere, if you were to jump in the car and drive off with a blindfold on, there would be quite a high likelihood of coming to grief. You wouldn’t be able to see where you were going, you wouldn’t be able to react to things happening around you, you wouldn’t be able to see the pedestrians or the other cars around you as you drive along.

So why would you take the same approach when you’re plotting the course for your product? Without taking in the information around you about your product and reacting to it, you’re effectively increasing your risk and likelihood of failure.

So it’s a really good idea to open your eyes and use the information around you when you’re deciding what to do with your product – why would you want to do it any other way?

Reduce risk – check your assumptions #

Another way of looking at this is that checking your assumptions reduces your risk. One way to do this is to eliminate as much uncertainty as you can as you go along, by learning from your users as quickly and frequently as possible. When you’re at your most uncertain, right at the very beginning, your main job should be to learn as much as possible, apply that learning to your product, and challenge all those assumptions you hold.

This graph, by a chap called Roman Pichler – another great product manager, who blogs and teaches and so on – illustrates what I’m trying to get at here.

(Credit Roman Pichler)

Next time: how UK government digital services gather and use evidence
