Quick to Listen to Reality

Nostalgebraist writes about Bayesian updating:

Subjectively, I feel like I’m only capable of a fairly small discrete set of “degrees of belief.”  I think I can distinguish between, say, things I am 90% confident of and things I am only 60% confident of, but I don’t think I can distinguish between being 60% confident in something and 65% confident in it.  Those both just fall under some big mental category called “a bit more likely to be true than false.”  (I’m sure psychologists have studied this, and I don’t know anything about their findings.  This is just what seems likely to me based on introspection.)

I’ve talked before about whether Bayesian updating makes sense as an ideal for how reasoning should work.  Suppose for now that it is a good ideal.  The “perfect” Bayesian reasoner would have a whole continuum of degrees of belief.  They would typically respond to new evidence by changing some of their degrees of beliefs, although for “weak” or “unconvincing” evidence, the change might be very small.  But since they have a whole continuum of degrees, they can make arbitrarily small changes.

Often when the Bayesian ideal is distilled down to principles that mere humans can follow, one of the principles seems to be “when you learn something new, modify your degrees of belief.”  This sounds nice, and accords with common sense ideas about being open-minded and changing your mind when it is warranted.

However, this principle can easily be read as implying: “if you learn something new, don’t not modify your degrees of belief.”  Leaving your degrees of belief the same as they were before is what irrational, closed-minded, closed-eyed people do.  (One sometimes hears Bayesians responding to each other’s arguments by saying things like “I have updated in the direction of [your position],” as though they feel that this demonstrates that they are thinking in a responsible manner.  Wouldn’t want to be caught not updating when you learn something new!)

The problem here is not that hard to see.  If you only have, say, 10 different possible degrees of belief, then your smallest possible updates are (on average) going to be jumps of 10% at once.  If you agree to always update in response to new information, no matter how weak it is, then seeing ten pieces of very weak evidence in favor of P will ramp your confidence in P up to the maximum.

In each case, the perfect Bayesian might update by only a very small amount, say 0.01%.  Clearly, if you have the choice between changing by 0% and changing by 10%, the former is closer to the “perfect” choice of 0.01%.  But if you have trained yourself to feel like changing by 0% (i.e. not updating) is irrational and bad, you will keep making 10% jumps until you and the perfect Bayesian are very far apart.

This means that Bayesians – in the sense of “people who follow the norm I’m talking about” – will tend to over-respond to weak but frequently presented evidence.  This will make them tend to be overconfident of ideas that are favored within the communities they belong to, since they’ll be frequently exposed to arguments for those ideas, although those arguments will be of varying quality.
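To make the quoted argument concrete, here is a minimal sketch in Python. The starting probability of 0.5 and the likelihood ratio of 1.002 per piece of weak evidence are illustrative assumptions, not numbers from the original post:

```python
def bayes_update(p, likelihood_ratio):
    """Exact Bayesian update in odds form: posterior odds = LR * prior odds."""
    odds = p / (1 - p)
    odds *= likelihood_ratio
    return odds / (1 + odds)

p_ideal = 0.5    # the "perfect" reasoner: a continuum of degrees of belief
p_coarse = 0.5   # the coarse reasoner: smallest possible move is 10%
STEP = 0.10

for _ in range(10):                        # ten pieces of very weak evidence for P
    p_ideal = bayes_update(p_ideal, 1.002)
    p_coarse = min(p_coarse + STEP, 1.0)   # the "always update" norm forces a jump

print(f"ideal Bayesian: {p_ideal:.4f}")    # about 0.5050 -- barely moved
print(f"coarse updater: {p_coarse:.4f}")   # 1.0000 -- maximal confidence
```

The ideal reasoner ends almost exactly where it started, while the coarse reasoner who refuses ever to leave a belief unchanged is driven to maximal confidence, just as the quoted passage describes.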

“Overconfident of ideas that are favored within the communities they belong to” is a description of nearly everyone, not only of people who accept the norm he is talking about. So even if this effect is real, it is not much of an objection: it leaves such people no worse off than people in general.

Nonetheless, Nostalgebraist misunderstands the idea of Bayesian updating as applied in real life. Bayes’ theorem is a theorem of probability theory describing how a numerical probability is updated upon receiving new evidence, and probability theory in general is a formalization of degrees of belief. Precisely because it is a formalization, it is not expected to be a literal description of real life: people do not typically assign an exact numerical probability to a belief. There is nonetheless a reasonable way to respond to evidence, and it corresponds in substance to Bayes’ theorem, even though it is not a literal numerical calculation.
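For reference, here is the theorem in question, in its standard form and in the equivalent odds form:

```latex
P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)},
\qquad
\frac{P(H \mid E)}{P(\neg H \mid E)}
  = \frac{P(E \mid H)}{P(E \mid \neg H)} \cdot \frac{P(H)}{P(\neg H)}
```

The odds form makes the point about weak evidence explicit: evidence whose likelihood ratio is close to 1 multiplies the prior odds by almost nothing, so the rational change in belief can be arbitrarily small.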

Nostalgebraist’s objection is that there are only a limited number of ways that it is possible to feel about a proposition. He is probably right that for an untrained person the number is less than ten. Just as people can acquire perfect pitch by training, however, it is likely that someone could learn to distinguish many more than ten degrees of certainty. But this is not an adequate response to his argument, because even if someone were calibrated to a precision of 1%, the objection would still be valid. A person assigning numerical probabilities could not change them by a full 1% every time he heard a new argument, or it would be easy for an opponent to move him to absolute certainty of nearly anything.
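To put illustrative numbers on this (the starting point of 60% and the likelihood ratio of 1.01 per argument are assumptions for the sake of the example, not figures from either post): hearing forty weak arguments and adding a flat 1% for each one ends in certainty, while the exact Bayesian update over the same forty arguments barely moves:

```latex
0.60 + 40 \times 0.01 = 1.00,
\qquad
\frac{0.60}{0.40} \times (1.01)^{40} \approx 1.5 \times 1.49 \approx 2.23
\;\Rightarrow\;
P \approx \frac{2.23}{1 + 2.23} \approx 0.69
```

Forty repetitions of nearly worthless evidence take the flat updater from 60% to 100%, but take the exact Bayesian only from 60% to about 69%.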

The real answer is that he is looking in the wrong place for a person’s degree of belief. A belief is not how one happens to feel about a statement. A belief is a voluntary act or habit, and adjusting one’s degree of belief would mean adjusting that habit. The feeling he is talking about, on the other hand, is not in general voluntary, which means that the norm he is discussing, applied in the way he suggests, is literally impossible to follow consistently. One cannot simply choose to feel more certain about something. It is true that voluntary actions may be able to affect that feeling, in the same way that voluntary actions can affect anger or fear. But we do not directly choose to be angry or afraid, and we do not directly choose to feel certain or uncertain.

What we can affect, however, is the way we think, speak, and act, and we can change our habits by choosing particular acts of thinking, speaking, and acting. This is where our subjective degree of belief is found: in our pattern of behavior. Such a pattern can vary in an unlimited number of ways and degrees, so his objection does not apply to updating understood in this way. Updating on evidence, then, means adjusting our pattern of behavior, and failing to update means failing to adjust it. The adjustment begins with the simple recognition that something is new evidence. Saying “I have updated in the direction of your position” would then mean acknowledging that one has been presented with new evidence, with an implicit commitment to let that evidence affect one’s behavior in the future: by not simply forgetting the new argument, by having more respect for people who hold that position, and so on in any number of ways.

Of course, it may be that in practice people cannot even do this consistently, or at least not without sometimes adjusting excessively. But the same is true of every human virtue: consistently hitting the precise mean of virtue is impossible. That does not mean we should adopt the norm of ignoring virtue, which is Nostalgebraist’s implicit suggestion.

This is related to the suggestion of St. James that one should be quick to hear and slow to speak. Being quick to hear implies, among other things, this kind of updating based on the arguments and positions that one hears from others. But the same thing applies to evidence in general, whether it is received from other persons or in other ways. One should be quick to listen to reality.
