The I-Knew-It-All-Along Phenomenon
One problem with commonsense explanations is that we tend to invoke them after we know the facts. Events are far more “obvious” and predictable in hindsight than beforehand. As Baruch Fischhoff and his colleagues (Slovic & Fischhoff, 1977; Wood, 1979) have demonstrated many times, our recollection of what outcomes we would have expected from some experiment or historical situation is instantly distorted once we know what really did happen.
When people are told the outcome of an experiment, the outcome suddenly seems less surprising to them than it is to people who are simply told about the experimental procedure and its possible outcomes. In one of Fischhoff’s experiments, Israeli students estimated the likelihood of various possible outcomes of President Richard Nixon’s forthcoming trips to Peking and Moscow (Fischhoff & Beyth, 1975). When, after his visits, the students were asked unexpectedly to remember their predictions, they mistakenly remembered them as coinciding closely with what they now knew had happened. Finding out that something had happened made it seem more inevitable.
Likewise, in everyday life we often do not expect something to happen until it does. We then suddenly see clearly the forces that brought it about and thus seldom feel surprised. We say we really “knew all along that he was going to act that way.” As the Danish philosopher-theologian Søren Kierkegaard surmised, “Life is lived forwards, but understood backwards.”
If the I-knew-it-all-along phenomenon is pervasive, you may now be feeling that you already knew about it. Indeed, almost any conceivable result of a psychological experiment can seem like common sense—after you know the result. The phenomenon can be crudely demonstrated by giving half of a group some purported psychological finding and the other half the opposite result. For example:
Social psychologists have found that, whether choosing friends or falling in love, we are most attracted to people whose traits are different from our own. There seems to be wisdom in the old saying, “Opposites attract.”
Social psychologists have found that, whether choosing friends or falling in love, we are most attracted to people whose traits are similar to our own. There seems to be wisdom in the old saying, “Birds of a feather flock together.”
It is my experience that when fifty people are given one of these findings and fifty the opposite finding, and all are asked to “explain” the result and then indicate whether it is “surprising” or “not surprising,” virtually all will find whichever result they were given “not surprising.”
As these examples indicate, we can draw upon the stockpile of ancient proverbs to make almost any result seem commonsensical. Nearly every possible outcome is conceivable, so there are proverbs for almost all occasions. Shall we say with John Donne, “No man is an island,” or with Thomas Wolfe, “Every man is an island”? Does “haste make waste,” or is “he who hesitates lost”? Is “A penny saved is a penny earned” true, or is it “Penny wise, pound foolish”? If a social psychologist reports that separation intensifies romantic attraction, someone is sure to reply, “Of course, ‘Absence makes the heart grow fonder.’” Should it turn out the reverse, the same person may remind us, “Out of sight, out of mind.” No matter what happens, there will be someone who knew it would.
This hindsight bias creates a problem for many psychology students. When you read the results of experiments in your textbooks, the material often seems easy, even commonsensical. When you subsequently take a multiple-choice test on which you must choose among several plausible outcomes of an experiment, the task may become surprisingly difficult. “I don’t know what happened,” the befuddled student later bemoans. “I thought I knew the material.”
The I-knew-it-all-along phenomenon also affects our assessments of our knowledge. If what we learn does not surprise us, then we are inclined to overestimate how much we already knew. Consider this question (to which 1 is the correct answer): “Which is longer, (1) the Suez Canal, or (2) the Panama Canal?” What is the likelihood you could have answered this question correctly if I had not told you the answer? Fischhoff found that University of Oregon students who were not told the answers to such questions tended to rate them as toss-ups; those who had been told the correct answers thought they probably would have gotten most right.
Now that you and I know about this tendency to overestimate our past wisdom, will we be as vulnerable to it as these Oregon students? Fischhoff (1977) wondered about this, too. He forewarned some more Oregon students that on these questions people
exaggerate how much they have known without being told the answer. You might call this an I-knew-it-all-along effect... In completing the present questionnaire, please do everything you can to avoid this bias. One reason why it happens is that people who are told the correct answer find it hard to imagine how they ever could have believed in the incorrect one. In answering, make certain that you haven’t forgotten any reasons that you might have thought of in favor of the wrong answer—had you not been told it was wrong.
How much effect do you think these “debiasing instructions” had? Incredibly, they had none. Being fully forewarned about the hindsight bias did not reduce it at all! (Surely, though, now that you and I know the result of this experiment...)
Is there no way to reduce the hindsight bias? With Paul Slovic, Fischhoff did find one way (Slovic & Fischhoff, 1977). People were told the results of several experiments. Some were then asked, “Had the study worked out the other way, how would you explain it?” These people perceived the result as much less inevitable than did those who had not imagined an opposite result.
The I-knew-it-all-along phenomenon can have pernicious social and personal consequences. It is conducive to arrogance—overestimation of our own intellectual powers and of the perceptiveness of our after-the-fact explanations. Moreover, since outcomes seem as if they should have been foreseeable, we are more likely to blame decision makers for what are, in retrospect, their “obvious” bad choices than to praise them for their good choices, since these, too, were “obvious.” Thus, after the Japanese attack on Pearl Harbor, Monday-morning historians could read the signs and see the “inevitability” of what had happened. Likewise, we sometimes chastise ourselves for our “stupid mistakes”—for not having better handled a situation or a person, for example. Looking back now, we see how we obviously should have handled it. But sometimes we are too hard on ourselves. We forget that what is now obvious to us was not nearly so obvious at the time.
The conclusion to be drawn is not that common sense is usually wrong. My hunch is that most conventional wisdom does apply—under certain conditions. After all, both amateur and professional social psychologists observe and form theories about the same human nature. The point is that our common sense is often after the fact—it describes events more easily than it predicts them—and we therefore easily deceive ourselves into thinking that we know and knew more than we do and did.
(Source: Myers, David G. 1983. Social Psychology. McGraw-Hill.)