Biases Affecting Information Processing

The information that comes into our lives is seldom received in an objective or neutral way. Instead, there are several biases which affect the degree of influence a given amount of information has on our knowledge, beliefs, and decision making. Here, briefly, are some of those biases.

1. Availability. We are more likely to be influenced by information that is already present, handy, or easy to find than by information that requires effort to locate. The definitions in the dictionary at your elbow are more likely to be used (and to affect your knowledge) than are those in the dictionary in the next room. The problem is that the unavailable information, or the information that is harder to get, may actually be the more reliable or the more definitive for making good decisions. The information does not have to be (but could be) trade or state secrets; it might just be a more accurate appraisal of a situation. It is easy to be lazy and just take what is handy. Many of us are already finding that we will take what we can find on the Internet rather than supplement it with what we can find in printed resources. Just because it is handy does not mean that it is dandy.

2. Familiarity. We tend to believe whatever is most familiar, or whatever is an extension of the familiar. What we have long known tends to get by with little scrutiny. And what is made familiar enters our beliefs with less examination than it would if it were strange. Advertising slogans are repeated over and over because advertisers know that by such repetition we come to believe the slogans. Repetition and its resultant familiarity can be a more important factor in our belief than any kind of analysis or evidence. Unfortunately, “common knowledge” is often incorrect. And if someone tells you something false a thousand times, the thing is still false. Unfamiliarity, or strangeness, affects us, too. We tend to reject, to disbelieve, the unfamiliar. Strange things, for some reason, just cannot be true. But once they become familiar, they are suddenly true.

3. Memorability. Information that a person finds interesting and memorable is much more likely to influence a judgment about the probability of its occurrence. That is, people who can recall specific examples of an event may believe that event occurs more often than it really does. Similarly, when examples cannot be brought to mind, the frequency of the event is judged to be lower. Chance or coincidental events that are highly memorable are thus often judged to be regular or common. And, of course, information you cannot remember is not going to be used in making a decision.

4. Recency. We all have so much information flowing into our consciousness every day that we just cannot keep track of all of it. New information pushes out old. We therefore tend to favor the most recently gained information and to base our decisions (and change our minds) on this “new” information. Had the information arrived in a different order, our decision would have been different. The cure for this bias is to take good notes and review them regularly, so that newer information will not automatically enjoy a privileged position of influence.

5. Sequence. The two periods of greatest attention, whether in moments or months, are at the very beginning and the very end. Therefore, information presented or received first and last in a problem, research project, class session, or meeting will be remembered better and given more importance than it otherwise deserves. Similarly, information arriving in the middle of a project may be unfairly discounted or ignored, simply because of its position in the data stream. When information is received a little at a time, a processing bias toward the first part of the information is established, and later information receives less attention. When information is received all at once, a different and perhaps better selection of what to process can be made. When all the information is together, it is easier to see what is important and what is not.

6. Sparkle. Lively, immediate, personal experience overwhelms theory and generalization. Many people base their personal behavior and values on generalizations formed from single personal experiences, even when those generalizations conflict with much better established facts based on thorough empirical investigation. Abstract truths, detailed statistics, even moral values: all can be ignored when a strong personal experience points to a different conclusion. It is no wonder so many people seem bent on giving us the “razzle-dazzle” instead of cogent arguments or facts. Additionally, people who can be convinced that they have experienced a truth, or something that they see as evidence toward a truth, will often not listen to any arguments to the contrary. The emotion attached to the experience, the reality of it all, and the feeling of being an eyewitness are too much to confute. But if you have watched any magic acts, product demonstrations, or suave Romeos, you know that substance and style are not necessarily the same.

7. Compatibility. A stable and sane personality requires a pretty firm idea of what reality is like, so we tend to reject ideas that do not conform to our sense of how things are. We accept ideas that agree with our own beliefs and reject those that conflict with them. Of course, when we are wrong, we continue to reject what is true and continue to build a false world. Thus it is recommended that we examine our biases once a year, and always entertain the idea that we might be wrong in our beliefs.

8. Preconception. Our current concerns tend to control our perceptions and interpretations of incoming information. If we believe the company is having financial trouble, we will interpret ambiguous data as support for that conclusion. If we believe someone likes or dislikes us, we will interpret that person’s acts in a way consistent with our expectations. When you buy a blue car, you suddenly see blue cars everywhere. This selective perception can change a person’s view of reality: (1) What people expect or wish to see, they will see. (2) People seek and give weight to information that supports or harmonizes with information they already believe. (3) People depreciate or reject information that conflicts with beliefs or conclusions already held. The facts do not speak for themselves. Most information is ambiguous enough to allow more than one interpretation to be put on it, and our interpretations are substantially controlled by what we already know and expect to find. However, if we are aware of these prejudices of interpretation, we can prevent them from misleading us.

9. Privilege. Information we are convinced is scarce, secret, special, or restricted automatically takes on greater value and appears more credible than information that just anybody can obtain. (This is why most efforts at censorship fail: banning or censoring a book or film makes people think it is better and more desirable than ever.) You can imagine that the manipulators have long ago caught on to this particular bias, and use it to make us want something that otherwise would make us yawn. The next version of this article will be called “Exclusive Secrets of Information Bias.”

10. Visual Presentation. Information presented visually often influences us more than information presented textually. Visual items are immediate, graphic, and colorful, and they do not require the symbolic processing that reading demands. Hence the Chinese proverb, “A picture is worth a thousand words.” But what if the text version is the more accurate one, while the picture deceives?

11. Mental Effort. Information that is easy to understand, presented clearly and simply, described in exact and graspable terms, is much more likely to influence us than difficult, tedious, or ambiguous information. This fact may explain why so many people are more persuaded by anecdotes and stories than by facts. A good story beats a table of data any day. And yet, it is the truth we need, not an entertaining story.

In addition to the above biases, several factors can hamper the best use of information.

Hasty Generalization. Many people formulate generalizations on the basis of very small samples, often just one, two, or three instances. The first two or three examples of something (especially if experiential; see Sparkle above) are judged to be representative, though they usually aren’t. Generalizing from one’s own limited experience, and then interpreting subsequent events to fit that generalization, is a major problem in life itself as well as in information processing.

Inconsistency. Most people have trouble applying consistent judgmental and evaluative strategies in similar cases. Information from one source will receive more favorable treatment than information from another. Information received, say, in the morning will be viewed more favorably or more critically than similar information received in the afternoon.

Pressure. Under pressure, information tends to be processed using shortcuts, simplification, and superficial analysis. Techniques that most good analytical thinkers would condemn–such as stereotyping, pigeonholing, quick impressions, skimming, and so on–will be used simply as a means of coping with time or action constraints.

Contrast. Two differing items considered closely together in time tend to appear more different than they really are. The mind exaggerates differences, perhaps as a means of distinguishing the items. This tendency may result in part from our nasty habit of wanting to see the world in black-and-white terms. If this report is not so good and that one is pretty good, why, one must be perfect and the other terrible.