September 21, 2011
I often wonder whether over-analyzing anything is a positive or a negative. Obviously, second-guessing yourself is dangerous, but in the context of gathering as much information from a patient as possible, do we set ourselves up to fail? The sheer volume of information offered from a patient's perspective could conceivably bias our decision-making process and set us on the wrong road. So, for your perusal, something to ponder:
One of the stories I tell in “Blink” is about the Emergency Room doctors
at Cook County Hospital in Chicago. That’s the big public hospital in Chicago, and a few years ago they changed the way they diagnosed heart attacks. They instructed their doctors to gather less
information on their patients: they encouraged them to zero in on just a
few critical pieces of information about patients suffering from chest pain–like
blood pressure and the ECG–while ignoring everything else, like the patient’s
age and weight and medical history. And what happened? Cook County is now one
of the best places in the United States at diagnosing chest pain.
Not surprisingly, it was
really hard to convince the physicians at Cook County to go along with the
plan, because, like all of us, they
were committed to the idea that more information is always better. But I
describe lots of cases in “Blink” where that simply isn’t true.
There’s a wonderful phrase in psychology–“the power of thin
slicing”–which says that as human beings we are capable of making sense
of situations based on the thinnest slice of experience. I have an entire
chapter in “Blink” on how unbelievably powerful our thin-slicing
skills are. I have to say that I still find some of the examples in that
chapter hard to believe.
Finally, in another telling experiment, the psychologist Paul Slovic
asked bookmakers to select from eighty-eight variables in past horse races
those that they found useful in computing the odds. These variables included
all manner of statistical information about past performances. The bookmakers
were given the ten most useful variables, then asked to predict the outcome of
races. Then they were given ten more and asked to predict again. The increase
in the information set did not lead to an increase in their accuracy; their
confidence in their choices, on the other hand, went up markedly. Information proved to be toxic. I’ve
struggled much of my life with the common middlebrow belief that “more is
better”–more is sometimes, but not always, better. This toxicity of
knowledge will show in our investigation of the so-called expert.