Monday, August 11, 2008

Uncommon sense

A business unit manager and I were talking last week about a process change his team had put in place. The team is convinced their change has improved productivity. Unfortunately, no metric supports that -- in fact, the data say that the process change is much less productive than the previous standard. So where's the disconnect?

The team leader is convinced that his 'common sense' approach will yield improvements. I'm reading the book Sway: The Irresistible Pull of Irrational Behavior, and it's given me some insight into the dynamic at play. Regardless of education or profession, people are swayed more by behavior and perceived value than by quantifiable, objective data when making decisions. This is at once obvious and stunning.

I think about the HR assessment tools that profile how a person makes decisions: fact-based, feelings-based, or a balanced mix? It's dawning on me that we're probably kidding ourselves, given that safety experts, scientists, and medical doctors (fact-based professions if ever there were any) have a clear track record of throwing facts to the wind in life-or-death decisions. This is so hard-wired into the human brain that it feels quixotic to tilt against it.

Yet acknowledging the sway of 'common sense' is the first step toward understanding what it takes to instill the uncommon sense of fact-based decisions. Uncommon sense is artificial rather than natural, but we can still value it over the chaotic natural order of things. Rare things are often more valuable, after all.

1 comment:

Anonymous said...

I'm going to have to have a look at that book. I think the common sense we often talk about is a quick-assessment faculty built into the species at a primordial level, to allow decisions to be made using complex and messy information-processing equipment (the human brain) without getting bogged down too much. The problem with such a system is that it evolved to be right most of the time while operating in environments with very few rules and limited positive outcomes. For instance: are you going to let that saber-toothed cat eat you, or are you going to chuck a rock at it? This is not an operational environment in which you want to go too far into the question of whether you're going to chuck the pretty brown rock or the dull gray one. If you do survive, you can debrief yourself and decide later whether or not you picked the right rock - whether your mate is going to hit you on the head with a mammoth bone for throwing away the rock your mother-in-law gave you as a cave-warming gift.

On the other hand, this system does not help us much when we need to decide whether to commit resources to a complex project. Whether, for instance, we are going to use ailerons or a flex-wing when we're trying to decide on control surfaces for a next-gen passenger jet that will be carrying 500 passengers over thousands of miles of water, with little bitty life jackets under their seats that wouldn't float a donut for more than a few minutes, let alone your mother-in-law (who is flying in, of course, to present you with a pretty rock/door-stopper as a condo-warming gift). This is an environment in which common sense and gut feelings need to be intentionally restrained.
I don't know that anyone has ever figured out whether there's a balance between common sense and purely rational decision making, a balance that would yield good outcomes better than pure luck, but I suspect that in today's complex environments that balance, if it exists, is weighted pretty heavily toward rationality at the expense of common sense.

Cheers