Error, Human

Much market analysis operates on the assumption that more data is better: more data leads to more accurate results. Processing all that data may demand ever more complex models and ever greater computing power, but the underlying principle is that more is better.

Out in the real world, however, we don’t have the luxury of this kind of analysis. That leads to errors, which we sometimes call biases. Surprisingly, though, it also often leads to better results. It may just be that we make so many mistakes because we’re trying to process too much information, not because we’re naturally error prone.

Math Good, Instinct Bad

There’s a type of snobbery that’s grown up around thought processes: logical or statistical analysis is good, gut instinct is bad. But by and large, we’ve managed to be a fairly successful species without the majority of us being able to figure out the math behind, well, anything. The reality of everyday life is that we don’t have the luxury of carefully computing every decision, so the fact that we have ways of making quick and snappy choices isn’t accidental; it’s deliberate: we’re designed that way.

Now, the idea that we don’t always make analyzed choices based on all the available information is often presented as sub-optimal behavior and labeled as cognitive bias. We’re biased, and illogical, because we don’t perform the type of analysis that a bunch of academics think we should. And, of course, the word “bias” is pejorative: it implies that we’re doing something wrong, the idiot bunch of badly shaved apes that we are.

But this is a ridiculous standard to hold people to. We make thousands of decisions every day; it simply doesn’t make sense to carefully analyze every single one of them. We need shortcuts to lighten the load, and we call these heuristics: basic rules of thumb that allow us to make good-enough decisions to get through the days and weeks and months that constitute our lives.

Good-Enough Not Good Enough?

This split between “good” logical analysis and “bad” biased heuristics can be traced back to the founders of research into behavioral bias. On one hand we had Amos Tversky and Daniel Kahneman, propounding a logical standard of behavior that later morphed into the two-system theory of reasoning: the intuitive System 1 and the deliberative, logical System 2. On the other we had Herb Simon asking not how we should reason, but how we actually do reason – a concept we now refer to as bounded rationality.
