Not Liking Uncertainty

The idea I want to talk about here is that people don't like uncertainty—or, to put it another way, that there's something about uncertainty that makes it an unpleasant experience for people, much like cognitive dissonance. The idea isn't just my own, but unfortunately I can't remember where else I've seen it.
Of course, by “uncertainty” I don't mean just any old uncertainty. I might be uncertain whether the mail has arrived, but I can always go check, and anyway it's no big deal one way or the other. To be unpleasant, the uncertainty has to satisfy certain conditions.
As a nice impersonal example, consider the uncertainty following the presidential election in 2000. It was important, indefinite, and, for most of us, unresolvable. Was it unpleasant? Yes, but there's more to it than that. For me, at least, uncertainty is unpleasant in a very specific way: I get fixated on wanting the uncertainty to be resolved, and I don't want to think about anything else in the meantime.
Although I tend to think of it that way, uncertainty doesn't have to be only about facts and outcomes; it can also be about (for example) reasons—why someone did something, why something happened. That kind of uncertainty, unfortunately, doesn't always get resolved in the end. Sometimes, in fact, things happen for essentially no reason at all. Then, I think, not liking uncertainty pushes the mind to invent reasons—religion among them.
The following passage, from Why People Believe Weird Things, is interesting partly because it's the only reference to the idea of not liking uncertainty that I could find, but also because it makes the same connection between explanations (reasons) and certainty.
Most of us, most of the time, want certainty, want to control our environment, and want nice, neat, simple explanations. All this may have some evolutionary basis, but in a multifarious society with complex problems, these characteristics can radically oversimplify reality and interfere with critical thinking and problem solving.
Finally, here is one more thought I've had. If you're mathematically inclined, like me, you might be tempted to take the following view of things.
Uncertainty is really just a matter of probabilities and outcomes; not liking uncertainty is nothing but the fact that a small probability of a bad outcome is still relatively bad, i.e., has a negative expectation value.
I can see two problems with that view. For one thing, nothing I've said about uncertainty implies a bad outcome! We could equally well be talking about a small probability of a good outcome, which ought to be a good thing. And perhaps it is, but to me it seems diminished by the presence of uncertainty. That leads to a different view.
Not liking uncertainty is the fact that there is a small cost or penalty that modifies the expectation value.
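To make the two views concrete, here is a toy sketch in Python. The probabilities, payoffs, and the size of the penalty are all invented for illustration: the first view identifies the dislike of uncertainty with the expectation value itself, while the second subtracts a flat cost merely for the presence of uncertainty.

```python
# Toy sketch of the two views; all numbers here are invented.

def expected_value(outcomes):
    """Sum of probability * payoff over (probability, payoff) pairs."""
    return sum(p * v for p, v in outcomes)

# First view: a small chance of a good payoff, otherwise nothing.
prospect = [(0.25, 40.0), (0.75, 0.0)]
ev = expected_value(prospect)  # 10.0

# Second view: living with the uncertainty exacts a flat cost,
# so the prospect feels worth less than its expectation value.
penalty = 3.0
ev_felt = ev - penalty  # 7.0
```

On the second view, even a prospect with a positive expectation value feels diminished, which matches the observation above about good outcomes.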
The real problem (with either view) is that expectation values are a tool for rational thought, and not liking uncertainty is essentially irrational. It is a quirk of the mind, which of course is why I'm writing about it here. If I'm troubled by an important, unresolvable uncertainty, you can adjust the expectation value all you like—by, say, paying me money—and I'll still be troubled.