Confirmation bias

From New World Encyclopedia
Confirmation bias has been described as an internal "yes man," echoing back a person's beliefs like Charles Dickens' character Uriah Heep.[1]

Confirmation bias is the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one's prior beliefs or values. People display this bias when they select information that supports their views, ignoring contrary information, or when they interpret ambiguous evidence as supporting their existing attitudes. The effect is strongest for desired outcomes, for emotionally charged issues, and for deeply entrenched beliefs. This bias has the unfortunate consequence of people holding on to beliefs that are contradicted by evidence. It can also lead to polarization of opinions, in which a disagreement becomes more extreme, sometimes with tragic results.

Flawed decisions due to confirmation bias have been found in a wide range of political, organizational, financial, and scientific contexts. These biases contribute to overconfidence in personal beliefs and can maintain or strengthen beliefs in the face of contrary evidence. For example, confirmation bias produces systematic errors in scientific research based on inductive reasoning (the gradual accumulation of supportive evidence). Similarly, a police detective may identify a suspect early in an investigation, but then may only seek confirming rather than disconfirming evidence. A medical practitioner may prematurely focus on a particular disorder early in a diagnostic session, and then seek only confirming evidence. Confirmation bias is amplified by the use of filter bubbles, or "algorithmic editing" commonly used in social media and search engines, which display to individuals only information they are likely to agree with, while excluding opposing views.

Definition

Confirmation bias is a term coined by English psychologist Peter Wason to describe the tendency for people to immediately favor information that validates their preconceptions, hypotheses, and personal beliefs, regardless of whether they are true. It also includes the tendency to strive to prove one's hypothesis rather than to disprove it.[2]

Confirmation bias (or confirmatory bias) has also been termed myside bias, a term suggested by David Perkins, a professor and researcher at the Harvard Graduate School of Education. This reflects the bias as a preference for "my" side of an issue.[3]

Confirmation biases differ from what is sometimes called the behavioral confirmation effect, commonly known as self-fulfilling prophecy, in which a person's expectations influence their own behavior, bringing about the expected result.

Discovery

The phenomenon that has come to be known as "confirmation bias" was observed throughout history. The Greek historian Thucydides (c. 460 B.C.E. - c. 400 B.C.E.) wrote of misguided reasoning in his History of the Peloponnesian War: "... for it is a habit of mankind to entrust to careless hope what they long for, and to use sovereign reason to thrust aside what they do not fancy."[4] Italian poet Dante Alighieri (1265–1321) noted it in The Divine Comedy, in which St. Thomas Aquinas cautions Dante upon meeting in Paradise, "opinion—hasty—often can incline to the wrong side, and then affection for one's own opinion binds, confines the mind."[5] Ibn Khaldun noticed the same effect in his Muqaddimah:

Untruth naturally afflicts historical information. There are various reasons that make this unavoidable. One of them is partisanship for opinions and schools. ... if the soul is infected with partisanship for a particular opinion or sect, it accepts without a moment's hesitation the information that is agreeable to it. Prejudice and partisanship obscure the critical faculty and preclude critical investigation. The result is that falsehoods are accepted and transmitted.[6]

In the Novum Organum, English philosopher and scientist Francis Bacon (1561–1626) noted that biased assessment of evidence drove "all superstitions, whether in astrology, dreams, omens, divine judgments or the like."[7] He wrote:

The human understanding when it has once adopted an opinion (either as being the received opinion or as being agreeable to itself) draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects or despises, or else by some distinction sets aside or rejects.[7]

In the second volume of his The World as Will and Representation (1844), German philosopher Arthur Schopenhauer observed that "An adopted hypothesis gives us lynx-eyes for everything that confirms it and makes us blind to everything that contradicts it."[8]

In his 1894 essay The Kingdom of God Is Within You, Russian novelist Leo Tolstoy wrote:

The most difficult subjects can be explained to the most slow-witted man if he has not formed any idea of them already; but the simplest thing cannot be made clear to the most intelligent man if he is firmly persuaded that he knows already, without a shadow of doubt, what is laid before him.[9]

And in his 1897 essay What Is Art?, Tolstoy wrote:

I know that most men—not only those considered clever, but even those who are very clever, and capable of understanding most difficult scientific, mathematical, or philosophic problems—can very seldom discern even the simplest and most obvious truth if it be such as to oblige them to admit the falsity of conclusions they have formed, perhaps with much difficulty—conclusions of which they are proud, which they have taught to others, and on which they have built their lives.[10]

In the 1960s, psychologists gathered experimental data to support such observations that people are biased toward confirming their existing beliefs.[2] The initial experiment that generated the term "confirmation bias" was published by Peter Wason in 1960 (although that published article does not mention the term "confirmation bias").[11] Wason repeatedly challenged participants to identify a rule applying to triples of numbers. They were told that (2,4,6) fits the rule. They generated triples, and the experimenter told them whether each triple conformed to the rule. The actual rule was simply "any ascending sequence," but participants had great difficulty in finding it, often announcing rules that were far more specific, such as "the middle number is the average of the first and last." The participants seemed to test only positive examples—triples that obeyed their hypothesized rule. For example, if they thought the rule was, "Each number is two greater than its predecessor," they would offer a triple that fitted (confirmed) this rule, such as (11,13,15) rather than a triple that violated (falsified) it, such as (11,12,19).[11]
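
The dynamic Wason observed can be made concrete with a small simulation. The following is a minimal sketch (in Python, with toy triples chosen purely for illustration, not Wason's materials): because every triple that fits the narrow conjecture also fits the broader true rule, positive tests always come back "fits the rule" and can never expose the error; only a guess-violating triple could do that.

```python
def true_rule(t):
    """The experimenter's actual rule: any ascending sequence."""
    return t[0] < t[1] < t[2]

def guessed_rule(t):
    """A participant's narrower conjecture: each number is two greater than its predecessor."""
    return t[1] - t[0] == 2 and t[2] - t[1] == 2

positive_tests = [(11, 13, 15), (20, 22, 24), (1, 3, 5)]  # all obey the guess
negative_test = (11, 12, 19)                              # violates the guess

for t in positive_tests + [negative_test]:
    # The conjecture is refuted only when its prediction disagrees with the feedback.
    print(t, "conjecture predicts:", guessed_rule(t), "| experimenter says:", true_rule(t))

# The three positive tests agree with the feedback (True/True), so the wrong conjecture
# survives; only the guess-violating triple (False/True) exposes it.
```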

Wason interpreted his results as showing a preference for confirmation over falsification, hence he coined the term "confirmation bias" or "verification bias."[12] He used this bias to explain the results of his selection task experiment.[13] Participants repeatedly performed badly on various forms of this test, in most cases ignoring information that could potentially refute (falsify) the specified rule.[14]

Types of confirmation bias

Biased search for information

It has been found repeatedly that people tend to test hypotheses in a one-sided way, by searching for evidence consistent with their current hypothesis.[15][16] Rather than searching through all the relevant evidence, they phrase questions to receive an affirmative answer that supports their theory. They look for the consequences that they would expect if their hypothesis was true, rather than what would happen if it was false.[3] For example, someone using yes/no questions to find a number they suspect to be the number 3 might ask, "Is it an odd number?" People prefer this type of question, called a "positive test," even when a negative test such as "Is it an even number?" would yield exactly the same information.[17]
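
To see why the two phrasings are informationally equivalent, consider a toy version of the number-guessing game. The sketch below (Python, with an assumed uniform prior over the numbers 1 to 10) computes the expected reduction in uncertainty from each question; both questions partition the candidates into the same two sets, so the expected information gain is identical.

```python
import math

candidates = set(range(1, 11))  # hypothetical setup: the secret number is 1-10, uniform prior

def entropy(n):
    """Entropy (in bits) of a uniform distribution over n equally likely candidates."""
    return math.log2(n) if n > 0 else 0.0

def expected_information(question):
    """Expected entropy reduction from asking a yes/no question."""
    yes = {x for x in candidates if question(x)}
    no = candidates - yes
    p_yes = len(yes) / len(candidates)
    remaining = p_yes * entropy(len(yes)) + (1 - p_yes) * entropy(len(no))
    return entropy(len(candidates)) - remaining

print(expected_information(lambda x: x % 2 == 1))  # "Is it an odd number?"  -> 1.0 bit
print(expected_information(lambda x: x % 2 == 0))  # "Is it an even number?" -> 1.0 bit
```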

The preference for positive tests in itself is not a bias, since positive tests can be highly informative.[18] However, in combination with other effects, this strategy can confirm existing beliefs or assumptions, independently of whether they are true. In real-world situations, evidence is often complex and mixed. For example, various contradictory ideas about someone could each be supported by concentrating on one aspect of his or her behavior. Thus any search for evidence in favor of a hypothesis is likely to succeed. One illustration of this is the way the phrasing of a question can significantly change the answer.[16] For example, people who are asked, "Are you happy with your social life?" report greater satisfaction than those asked, "Are you unhappy with your social life?"[19]

The biased search for information may be reduced by a preference for genuinely diagnostic tests.[16] For example, in one study participants were asked to rate a person on the introversion–extroversion personality dimension on the basis of an interview, choosing their interview questions from a given list. When the interviewee was introduced as an introvert, the participants chose questions that presumed introversion, such as, "What do you find unpleasant about noisy parties?" When the interviewee was described as extroverted, they mostly asked questions that presumed extroversion, such as, "What would you do to liven up a dull party?" These loaded questions gave the interviewees little or no opportunity to falsify the hypothesis about them. However, when given less presumptive questions to choose from, such as, "Do you shy away from social interactions?" participants preferred these more diagnostic questions, showing only a weak bias towards positive tests.[12]

Biased interpretation

Confirmation biases are not limited to the collection of evidence. Even if two individuals have the same information, the way they interpret it can be biased:

People who hold strong opinions on complex social issues are likely to examine relevant empirical evidence in a biased manner. They are apt to accept "confirming" evidence at face value while subjecting "disconfirming" evidence to critical evaluation, and, as a result, draw undue support for their initial positions from mixed or random empirical findings. Thus, the result of exposing contending factions in a social dispute to an identical body of relevant empirical evidence may be not a narrowing of disagreement but rather an increase in polarization.[20]

For example, people who felt strongly about capital punishment, half in favor and half against it, were given descriptions of two studies: a comparison of U.S. states with and without the death penalty, and a comparison of murder rates in a state before and after the introduction of the death penalty. After reading a quick description of each study, the participants were asked whether their opinions had changed. Then, they read a more detailed account of each study's procedure and had to rate whether the research was well-conducted and convincing. In fact, the studies were fictional. Half the participants were told that one kind of study supported the deterrent effect and the other undermined it, while for other participants the conclusions were reversed.[3]

The participants, whether supporters or opponents, reported shifting their attitudes slightly in the direction of the first study they read. Once they read the more detailed descriptions of the two studies, they almost all returned to their original belief regardless of the evidence provided, pointing to details that supported their viewpoint and disregarding anything contrary. Participants described studies supporting their pre-existing view as superior to those that contradicted it, in detailed and specific ways: Writing about a study that seemed to undermine the deterrence effect, a death penalty proponent wrote, "The research didn't cover a long enough period of time," while an opponent's comment on the same study said, "No strong evidence to contradict the researchers has been presented."[20] The results illustrated that people set higher standards of evidence for hypotheses that go against their current expectations. This effect, known as "disconfirmation bias," has been supported by other experiments.[21]

Another study of biased interpretation involved participants who reported having strong feelings about the candidates in the 2004 United States presidential election. They were shown apparently contradictory pairs of statements, either from Republican candidate George W. Bush, Democratic candidate John Kerry, or a politically neutral public figure. They were also given further statements that made the apparent contradiction seem reasonable. From these three pieces of information, they had to decide whether each individual's statements were inconsistent. There were strong differences in these evaluations, with participants much more likely to interpret statements from the candidate they opposed as contradictory. The participants made their judgments while in a magnetic resonance imaging (MRI) scanner which monitored their brain activity. As participants evaluated contradictory statements by their favored candidate, emotional centers of their brains were aroused. This did not happen with the statements by the other figures. The experimenters inferred that the different responses to the statements were not due to passive reasoning errors. Instead, the participants were actively reducing the cognitive dissonance induced by reading about their favored candidate's irrational or hypocritical behavior.[22]

Biased memory recall

People may remember evidence selectively to reinforce their expectations, even if they gather and interpret evidence in a neutral manner. This effect may be called "selective recall," "confirmatory memory," or "access-biased memory."[23]

Studies have shown, for example, that emotional memories are reconstructed by current emotional states. When widows and widowers rated the intensity of their experienced grief six months and five years after the deaths of their spouses, they reported a higher intensity of grief at six months than at five years. Yet, when the participants were asked after five years how they had felt six months after the death of their significant other, the intensity of grief participants recalled was highly correlated with their current level of grief. Individuals appear to use their current emotional states to analyze how they must have felt when experiencing past events.[24]

A selective memory effect has also been shown in experiments that manipulate the desirability of personality types.[16] For example, a group of participants were shown evidence that extroverted people are more successful than introverts. Another group were told the opposite. In a subsequent, apparently unrelated study, participants were asked to recall events from their lives in which they had been either introverted or extroverted. Each group of participants provided more memories connecting themselves with the more desirable personality type, and recalled those memories more quickly.[16]

Another study showed how selective memory can maintain belief in extrasensory perception (ESP). Believers and disbelievers were each shown descriptions of ESP experiments. Half of each group were told that the experimental results supported the existence of ESP, while the others were told they did not. In a subsequent test, participants recalled the material accurately, apart from believers who had read the non-supportive evidence. This group remembered significantly less information and some of them incorrectly remembered the results as supporting ESP.[25]

Explanations

Confirmation bias was once believed to be correlated with intelligence; however, as Michael Shermer observed, "Smart people believe weird things because they are skilled at defending beliefs they arrived at for non-smart reasons."[17] It appears that this bias can cause an inability to effectively and logically evaluate the opposite side of an argument. In other words, it is an absence of "active open-mindedness," meaning the active search for why an initial idea may be wrong, rather than a lack of intelligence per se.[3]

How people view "what makes a good argument" can influence the way a person formulates their own arguments. In a study investigating individual differences in argumentation schema, participants were asked to write essays either for or against their preferred side of an argument. They were given research instructions that took either a balanced or an unrestricted approach. The balanced-research instructions directed participants to create a "balanced" argument, i.e., one that included both pros and cons; the unrestricted-research instructions included nothing on how to create the argument.[26]

Overall, the results revealed that the balanced-research instructions significantly increased the incidence of opposing information in arguments. These data also revealed that personal belief was not a source of myside bias. This evidence is consistent with Baron's understanding—that people's opinions about what makes good thinking can influence how arguments are generated.[26]

Explanations for confirmation bias also include wishful thinking and the limited human capacity to process information. Another possibility is that people show confirmation bias because they are pragmatically assessing the costs of being wrong, rather than investigating in a neutral, scientific way.

Positive test strategy

Klayman and Ha argued that the Wason experiments do not actually demonstrate a bias towards confirmation, but instead a tendency to make tests consistent with the working hypothesis.[18] They called this the "positive test strategy."[16] This strategy is an example of a heuristic: a reasoning shortcut that is imperfect but easy to compute.[2] Klayman and Ha used Bayesian probability and information theory as their standard of hypothesis-testing, rather than the falsificationism used by Wason. According to these ideas, each answer to a question yields a different amount of information, which depends on the person's prior beliefs. Thus a scientific test of a hypothesis is one that is expected to produce the most information. Since the information content depends on initial probabilities, a positive test can either be highly informative or uninformative. Klayman and Ha argued that when people think about realistic problems, they are looking for a specific answer with a small initial probability. In this case, positive tests are usually more informative than negative tests.[18] However, in Wason's rule discovery task the answer—three numbers in ascending order—is very broad, so positive tests are unlikely to yield informative answers. Klayman and Ha supported their analysis by citing an experiment that used the labels "DAX" and "MED" in place of "fits the rule" and "doesn't fit the rule." This avoided implying that the aim was to find a low-probability rule. Participants had much more success with this version of the experiment.[27]
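
Klayman and Ha's point can be illustrated with a small simulation (a sketch in Python; the two rules and the 1-15 number range are assumptions made for illustration, not their analysis). It counts how often a positive test versus a negative test would contradict the conjecture "the rule is my hypothesis" under two scenarios: Wason's, where the true rule is broader than the hypothesis, and the case Klayman and Ha considered typical, where the sought-after rule is narrow.

```python
import itertools

def ascending(t):
    return t[0] < t[1] < t[2]

def step_of_two(t):
    return t[1] - t[0] == 2 and t[2] - t[1] == 2

# A small universe of triples, used only to make the proportions computable.
domain = list(itertools.product(range(1, 16), repeat=3))

def falsification_rates(hypothesis, true_rule):
    """Fraction of positive and negative tests whose feedback contradicts the
    conjecture that the rule is `hypothesis`, given that the rule is `true_rule`."""
    positives = [t for t in domain if hypothesis(t)]
    negatives = [t for t in domain if not hypothesis(t)]
    disagree = lambda t: hypothesis(t) != true_rule(t)
    return (sum(map(disagree, positives)) / len(positives),
            sum(map(disagree, negatives)) / len(negatives))

# Wason's setup: narrow hypothesis, broad true rule -> positive tests never falsify.
print(falsification_rates(step_of_two, ascending))   # (0.0, ~0.13)

# A narrow ("rare") true rule with a broader hypothesis ->
# positive tests falsify almost every time, negative tests essentially never.
print(falsification_rates(ascending, step_of_two))   # (~0.98, 0.0)
```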

Information processing explanations

There are several information processing explanations of confirmation bias.

Cognitive versus motivational

Explanations for biased evidence processing include cognitive and motivational mechanisms.

Cognitive explanations for confirmation bias are based on limitations in people's ability to handle complex tasks, and the shortcuts, or heuristics, that they use. For example, people may judge the reliability of evidence by using the "availability heuristic," that is, how readily a particular idea comes to mind.[16] It is also possible that people can only focus on one thought at a time, and so find it difficult to test alternative hypotheses in parallel.[15] Another heuristic is the positive test strategy identified by Klayman and Ha, in which people test a hypothesis by examining cases where they expect a property or event to occur. This heuristic avoids the difficult or impossible task of working out how diagnostic each possible question will be. However, it is not universally reliable, so people can overlook challenges to their existing beliefs.[18][15]

Motivational explanations involve an effect of desire on belief.[15][3] People prefer positive thoughts over negative ones, the so-called "Pollyanna principle."[28] Applied to arguments or sources of evidence, this could explain why desired conclusions are more likely to be believed true. Although consistency is a desirable feature of attitudes, an excessive drive for consistency is another potential source of bias because it may prevent people from neutrally evaluating new, surprising information.

Social psychologist Ziva Kunda combined the cognitive and motivational theories, arguing that motivation creates the bias, but cognitive factors determine the size of the effect.[15]

Cost-benefit

Explanations in terms of cost-benefit analysis assume that people do not just test hypotheses in a disinterested way, but assess the costs of different errors.[29] Yaacov Trope and Akiva Liberman suggested that people compare the two different kinds of error: accepting a false hypothesis or rejecting a true hypothesis when seeking evidence. For instance, someone who underestimates a friend's honesty might treat him or her suspiciously and so undermine the friendship. Overestimating the friend's honesty may also be costly, but less so. In this case, it would be rational to seek, evaluate, or remember evidence of their honesty in a biased way.[30]

In this way, confirmation bias can be viewed as a social skill.[31] For example, when someone gives an initial impression of being introverted or extroverted, questions that match that impression come across as more empathic. This suggests that when talking to someone who seems to be an introvert, it is a sign of better social skills to ask, "Do you feel awkward in social situations?" rather than, "Do you like noisy parties?" The connection between confirmation bias and social skills was corroborated by a study of how college students get to know other people. Highly self-monitoring students, who are more sensitive to their environment and to social norms, asked more matching questions when interviewing a high-status staff member than when getting to know fellow students.[31]

Exploratory versus confirmatory

Psychologists Jennifer Lerner and Philip Tetlock distinguished two different kinds of thinking process. "Exploratory thought" neutrally considers multiple points of view and tries to anticipate all possible objections to a particular position, while "confirmatory thought" seeks to justify a specific point of view, namely a confirmation bias. They suggest that when people expect to justify their position to others whose views they already know, they will tend to adopt a similar position to those people, and then use confirmatory thought to bolster their own credibility. However, if the external parties are overly aggressive or critical, people will disengage from thought altogether, and simply assert their personal opinions without justification. Lerner and Tetlock argued that people only push themselves to think critically and logically when they know in advance they will need to explain themselves to others who are well-informed, genuinely interested in the truth, and whose views they do not already know. Because those conditions rarely exist, most people use confirmatory thought most of the time.[32]

Make-believe

Developmental psychologist Eve Whitmore has argued that beliefs and biases involved in confirmation bias have their roots in childhood coping through make-believe:

From the beginning, parents reinforce to their children the skill of pretending in order to cope with the realities inherent in culture and society. Children’s learning about make-believe and mastery of it becomes the basis for more complex forms of self-deception and illusion into adulthood.[33]

The friction that arises when adolescents with developing critical thinking begin to question what they have learned can lead to the rationalization of false beliefs, and the habit of such rationalization can become unconscious over the years.

Real-world effects

There are numerous real-life situations in which confirmation bias affects people's decision-making. A striking illustration of confirmation bias in the real world is numerological pyramidology: the practice of finding meaning in the proportions of the Egyptian pyramids. There are many different length measurements that can be made of, for example, the Great Pyramid of Giza, and many ways to combine or manipulate them. Hence it is almost inevitable that people who look at these numbers selectively will find superficially impressive correspondences, for example with the dimensions of the Earth.[15]
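
The arithmetic behind this "almost inevitable" claim can be sketched with a toy simulation (Python; the random "measurements" and the 1 percent tolerance are arbitrary assumptions, not actual pyramid data). With a few dozen lengths, a large set of possible ratios, and several candidate "meaningful" constants, some coincidental matches are to be expected purely by chance.

```python
import itertools
import math
import random

# Twenty arbitrary "measurements" standing in for lengths one might take from a monument.
measurements = [random.uniform(1.0, 500.0) for _ in range(20)]

# Constants a motivated observer might treat as "meaningful."
targets = {"pi": math.pi, "golden ratio": (1 + 5 ** 0.5) / 2, "e": math.e}

matches = []
for a, b in itertools.permutations(measurements, 2):
    for name, value in targets.items():
        if abs(a / b - value) / value < 0.01:   # within 1% counts as a "correspondence"
            matches.append((round(a, 1), round(b, 1), name))

print(len(matches), "coincidental 'correspondences' out of",
      len(measurements) * (len(measurements) - 1) * len(targets), "ratio comparisons")
# Typically several matches appear, none of which mean anything.
```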

Confirmation bias is not only widespread, but can lead to unfortunate consequences:

If one were to attempt to identify a single problematic aspect of human reasoning that deserves attention above all others, the confirmation bias would have to be among the candidates for consideration. Many have written about this bias, and it appears to be sufficiently strong and pervasive that one is led to wonder whether the bias, by itself, might account for a significant fraction of the disputes, altercations, and misunderstandings that occur among individuals, groups, and nations.[34]

Attempts have been made to discover ways to overcome, or at least attenuate, the effects of confirmation bias in some common situations.

Conflict and law

Mock trials allow researchers to examine confirmation biases in a realistic setting.

Confirmation bias can be a factor in creating or extending conflicts, from emotionally charged debates to wars: by interpreting the evidence in their favor, each opposing party can become overconfident that it is in the stronger position.[3] On the other hand, confirmation bias can result in people ignoring or misinterpreting the signs of an imminent or incipient conflict. For example, psychologists Stuart Sutherland and Thomas Kida have each argued that U.S. Navy Admiral Husband E. Kimmel showed confirmation bias when playing down the first signs of the Japanese attack on Pearl Harbor.[14][17] In police investigations, a detective may identify a suspect early on but then seek mainly supporting or confirming evidence, ignoring or downplaying falsifying evidence.

Reasoning in judicial and political contexts is sometimes subconsciously biased, favoring conclusions that judges, juries, or governments have already committed to.[15] Since the evidence in a jury trial can be complex, and jurors often reach decisions about the verdict early on, it is reasonable to expect an attitude polarization effect. The prediction that jurors will become more extreme in their views as they see more evidence has been borne out in experiments with mock trials.[35]

Finance

Confirmation bias can lead investors to be overconfident, ignoring evidence that their strategies will lose money.[1][36] To combat the effect of confirmation bias, investors can try to adopt a contrary viewpoint "for the sake of argument."[37] In one technique, they imagine that their investments have collapsed and ask themselves why this might happen.[1]

Mass delusions

Confirmation bias can play a key role in the propagation of mass delusions. Witch trials are frequently cited as an example.[38]

Medicine and health

Confirmation bias has significant impact on clinical decision-making by medical general practitioners (GPs) and medical specialists. A GP may make a diagnosis early on during an examination, and then seek confirming evidence rather than falsifying evidence. In emergency medicine, because of time pressure, there is a high density of decision-making, and shortcuts are frequently applied.

Raymond Nickerson, a psychologist, blames confirmation bias for the ineffective medical procedures that were used for centuries before the arrival of scientific medicine.[15] If a patient recovered, medical authorities counted the treatment as successful, rather than looking for alternative explanations such as that the disease had run its natural course. Biased assimilation is a factor in the modern appeal of alternative medicine, whose proponents are swayed by positive anecdotal evidence but treat scientific evidence hyper-critically.[39]

Paranormal beliefs

One factor in the appeal of alleged psychic readings is that listeners apply a confirmation bias which fits the psychic's statements to their own lives. By making a large number of ambiguous statements in each sitting, the psychic gives the client more opportunities to find a match. This is one of the techniques of cold reading, with which a psychic can deliver a subjectively impressive reading without any prior information about the client.[40]

Scientific research

A distinguishing feature of scientific thinking is the search for confirming or supportive evidence (inductive reasoning) as well as falsifying evidence (deductive reasoning). Inductive research in particular is susceptible to confirmation bias.

Many times in the history of science, scientists have resisted new discoveries by selectively interpreting or ignoring unfavorable data. In practice, researchers may misunderstand, misinterpret, or simply not read studies that contradict their preconceptions, or wrongly cite them as if they actually supported their claims. Further, confirmation biases can sustain scientific theories or research programs in the face of inadequate or even contradictory evidence.[14] The discipline of parapsychology is often cited as an example in the context of whether it is a pseudoscience:

Some of the worst examples of confirmation bias are in research on parapsychology ... Arguably, there is a whole field here with no powerful confirming data at all. But people want to believe, and so they find ways to believe.[41]

An experimenter's confirmation bias can potentially affect which data are reported. Data that conflict with the experimenter's expectations may be more readily discarded as unreliable, producing the so-called file drawer effect. To combat this tendency, scientific training teaches ways to prevent bias. For example, experimental design of randomized controlled trials (coupled with their systematic review) aims to minimize sources of bias.[41]

The social process of peer review aims to mitigate the effect of individual scientists' biases, even though the peer review process itself may be susceptible to such biases.[42] Confirmation bias may thus be especially harmful to objective evaluations regarding nonconforming results, since biased individuals may regard opposing evidence as weak in principle and give little serious thought to revising their beliefs. Scientific innovators often meet with resistance from the scientific community, and research presenting controversial results frequently receives harsh peer review.[43]

Social media and searches

In social media and personalized searches by internet search engines such as Google and Bing, confirmation bias is amplified by the use of filter bubbles, or "algorithmic editing," which displays to individuals only information they are likely to agree with, while excluding opposing views. A suggested consequence is the degrading of democracy given that this "algorithmic editing" removes diverse viewpoints and information, and that unless filter bubble algorithms are removed voters will be unable to make fully informed political decisions.[44]

The rise of social media has contributed greatly to the rapid spread of fake news, that is, false and misleading information that is presented as credible news from a seemingly reliable source. Confirmation bias in the form of selecting or reinterpreting evidence to support one's beliefs is one of the main hurdles cited as to why critical thinking goes astray in these circumstances:

The key to people's accepting fake news as true, despite evidence to the contrary, is a phenomenon known as confirmation bias, or the tendency for people to seek and accept information that confirms their existing beliefs while rejecting or ignoring that which contradicts those beliefs. ... one could say the brain is hardwired to accept, reject, misremember or distort information based on whether it is viewed as accepting of or threatening to existing beliefs.[33]

In combating the spread of fake news, social media sites have considered turning toward "digital nudging." This includes nudging of information and nudging of presentation. Nudging of information entails social media sites providing a disclaimer or label questioning or warning users about the validity of the source, while nudging of presentation involves exposing users to new information that they may not have sought out but that could introduce them to viewpoints countering their own confirmation biases.[45]

Associated effects

Confirmation bias has been invoked to explain four specific effects:

  • Attitude polarization (when a disagreement becomes more extreme even though the different parties are exposed to the same evidence)
  • Belief perseverance (when beliefs persist after the evidence for them is shown to be false)
  • Irrational primacy effect (a greater reliance on information encountered early in a series)
  • Illusory correlation (when people falsely perceive an association between two events or situations).

Polarization of opinion

Attitude polarization, also known as belief polarization, is a phenomenon in which a disagreement becomes more extreme as the different parties consider evidence on the issue. It is one of the effects of confirmation bias: the tendency of people to search for and interpret evidence selectively, to reinforce their current beliefs or attitudes.[19] When people encounter ambiguous evidence, this bias can result in each of them interpreting it as support for their existing attitudes, widening rather than narrowing the disagreement between them.[20]

The related backfire effect refers to the way people may hold even more strongly onto their beliefs when shown contradictory evidence, which they reject. The phrase was coined by Brendan Nyhan and Jason Reifler in 2010.[46]

Persistence of discredited beliefs

Beliefs can survive potent logical or empirical challenges. They can survive and even be bolstered by evidence that most uncommitted observers would agree logically demands some weakening of such beliefs. They can even survive the total destruction of their original evidential bases.
—Lee Ross and Craig Anderson[47]

Confirmation bias provides one plausible explanation for the persistence of beliefs when the initial evidence for them is removed or when they have been sharply contradicted.[15] This belief perseverance effect was first demonstrated experimentally by Festinger and colleagues, who described the effect as cognitive dissonance. These psychologists spent time with a cult whose members were convinced that the world would end on December 21, 1954. After the prediction failed, most believers still clung to their faith. Their book describing this research is aptly named When Prophecy Fails.[48]

The term belief perseverance was coined in a series of experiments using what is called the "debriefing paradigm": participants read fake evidence for a hypothesis, their attitude change is measured, then the fakery is exposed in detail. Their attitudes are then measured once more to see if their belief returns to its previous level.[47] A common finding is that at least some of the initial belief remains even after a full debriefing.[16]

Preference for early information

Experiments have shown that information is weighted more strongly when it appears early in a series, even when the order is unimportant. For example, people form a more positive impression of someone described as "intelligent, industrious, impulsive, critical, stubborn, envious" than when they are given the same words in reverse order. This irrational primacy effect is independent of the primacy effect in memory in which the earlier items in a series leave a stronger memory trace.[3] Biased interpretation offers an explanation for this effect: seeing the initial evidence, people form a working hypothesis that affects how they interpret the rest of the information.[15]

Illusory association between events

Illusory correlation is the tendency to see non-existent correlations in a set of data. This tendency was first demonstrated in a series of experiments in the late 1960s. In one experiment, participants read a set of psychiatric case studies, including responses to the Rorschach inkblot test. The participants reported that the homosexual men in the set were more likely to report seeing buttocks or sexually ambiguous figures in the inkblots. In fact the fictional case studies had been constructed so that the homosexual men were no more likely to report this imagery or, in one version of the experiment, were less likely to report it than heterosexual men. In a survey, a group of experienced psychoanalysts reported the same set of illusory associations with homosexuality.[19][2]

Another study recorded the symptoms experienced by arthritic patients, along with weather conditions over a 15-month period. Nearly all the patients reported that their pains were correlated with weather conditions, although the real correlation was zero.[49]

Example (number of days in each category):
              Rain   No rain
Arthritis       14         6
No arthritis     7         2

In judging whether the two events, illness and bad weather, were correlated, participants relied heavily on the number of positive-positive cases: in this example, instances of both pain and bad weather. They paid relatively little attention to the other kinds of observation (of no pain and/or good weather).[2] This parallels the reliance on positive tests in hypothesis testing. It may also reflect selective recall, in that people may have a sense that two events are correlated because it is easier to recall times when they happened together.[16]
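
The strength of association in a 2×2 table like the one above can be checked directly with the phi coefficient. The sketch below (Python, using the illustrative counts from the example table) shows that the overall correlation is close to zero even though the fourteen "pain and rain" days feel persuasive on their own.

```python
import math

# Counts of days from the example table above.
pain_rain, pain_dry = 14, 6        # days with arthritis pain: rainy, dry
nopain_rain, nopain_dry = 7, 2     # days without pain: rainy, dry

def phi(a, b, c, d):
    """Phi coefficient: the correlation for a 2x2 contingency table [[a, b], [c, d]]."""
    return (a * d - b * c) / math.sqrt((a + b) * (c + d) * (a + c) * (b + d))

print(round(phi(pain_rain, pain_dry, nopain_rain, nopain_dry), 2))
# -0.08: judged over all four cells, pain and rain are essentially uncorrelated.
```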

Notes

  1. 1.0 1.1 1.2 Jason Zweig, How to ignore the yes-man in your head The Wall Street Journal, November 19, 2009. Retrieved May 4, 2023.
  2. 2.0 2.1 2.2 2.3 2.4 Scott Plous, The Psychology of Judgment and Decision Making (McGraw-Hill, 1993, ISBN 978-0070504776).
  3. 3.0 3.1 3.2 3.3 3.4 3.5 3.6 Jonathan Baron, Thinking and Deciding (New York: Cambridge University Press, 2000, ISBN 978-0521650304).
  4. Thucydides, History of the Peloponnesian War (Penguin Classic, 1972 (original 431 B.C.E.), ISBN 978-0140440393).
  5. Dante Alighieri, trans. Allen Mandelbaum, The Divine Comedy: Inferno; Purgatorio; Paradiso (Everyman's Library, 1995, ISBN 978-0679433132), "Paradiso" canto XIII: 118–120.
  6. Ibn Khaldun, The Muqadimmah (Pantheon Books, 1958, ISBN 978-0710001955).
  7. 7.0 7.1 Francis Bacon, Novum Organum (Legare Street Press, 2022 (original 1620), ISBN 978-1015466555).
  8. Arthur Schopenhauer, The World as Will and Presentation Volume 2 (Routledge, 2010, ISBN 0321355806).
  9. Leo Tolstoy, trans. Constance Garnett, The Kingdom of God Is Within You (Wentworth Press, 2016 (original 1894), ISBN 978-1371256289).
  10. Leo Tolstoy, trans. Aylmer Maude, What Is Art? (Hackett Publishing Company, Inc., 1996 (original 1897), ISBN 978-0872202955).
  11. 11.0 11.1 Peter C. Wason, On the Failure to Eliminate Hypotheses in a Conceptual Task Quarterly Journal of Experimental Psychology 12(3) (1960): 129–140. Retrieved May 6, 2023.
  12. 12.0 12.1 Fenna H. Poletiek, Hypothesis-testing Behaviour (Psychology Press, 2000, ISBN 978-1841691596).
  13. Peter C. Wason, Reasoning about a rule Quarterly Journal of Experimental Psychology 20(3) (1968):273–281. Retrieved May 6, 2023.
  14. 14.0 14.1 14.2 Stuart Sutherland, Irrationality: The Enemy within (Pinter & Martin, 2013, ISBN 978-1780660257).
  15. 15.0 15.1 15.2 15.3 15.4 15.5 15.6 15.7 15.8 15.9 Raymond S. Nickerson, Argumentation (Cambridge University Press, 2020, ISBN 978-1108799874).
  16. 16.0 16.1 16.2 16.3 16.4 16.5 16.6 16.7 16.8 Ziva Kunda, Social Cognition: Making Sense of People (Bradford Book, 1999, ISBN 0262611430).
  17. 17.0 17.1 17.2 Thomas E. Kida, Don't Believe Everything You Think: The 6 Basic Mistakes We Make in Thinking (Prometheus, 2006, ISBN 1591024080).
  18. 18.0 18.1 18.2 18.3 Joshua Klayman and Young-Won Ha, Confirmation, Disconfirmation, and Information in Hypothesis Testing Psychological Review 94(2) (1987): 211-228. Retrieved May 4, 2023.
  19. 19.0 19.1 19.2 Cordelia Fine, A Mind of Its Own : How Your Brain Distorts and Deceives (Icon Books, 2005, ISBN 978-1840466782).
  20. 20.0 20.1 20.2 Charles G. Lord, Lee Ross, and Mark R. Lepper, Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence Journal of Personality and Social Psychology 37(11) (1979):2098–2109. Retrieved May 9, 2023.
  21. Kari Edwards and Edward E. Smith, A Disconfirmation Bias in the Evaluation of Arguments Journal of Personality and Social Psychology 71(1) (1996):5-24. Retrieved May 5, 2023.
  22. Drew Westen, Pavel S. Blagov, Keith Harenski, Clint Kilts, and Stephan Hamann, Neural bases of motivated reasoning: An fMRI study of emotional constraints on partisan political judgment in the 2004 U.S. Presidential election, Journal of Cognitive Neuroscience 18(11) (2006): 1947–1958.
  23. David L. Hamilton (ed.), Social Cognition: Key Readings (Psychology Press, 2005, ISBN 978-0863775918).
  24. Linda J. Levine, Vincent Prohaska, Stewart L. Burgess, John A. Rice, and Tracy M. Laulhere, Remembering past emotions: The role of current appraisals, Cognition and Emotion 15(4) (2001):393–417.
  25. Stuart A. Vyse, Believing in Magic: The Psychology of Superstition (Oxford University Press, 2000, ISBN 978-0195136340).
  26. 26.0 26.1 Christopher Wolfe and Anne Britt, "The locus of the myside bias in written argumentation" Thinking & Reasoning 14 (2008): 1–27.
  27. Maria Lewicka, "Confirmation bias: Cognitive error or adaptive strategy of action control?" in Mirosław Kofta, Gifford Weary, and Grzegorz Sedek (eds.), Personal Control in Action: Cognitive and Motivational Mechanisms (Springer, 1998, ISBN 978-0306457203), 233–255.
  28. Margaret W. Matlin, "Pollyanna Principle" in Rüdiger F. Pohl (ed.), Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory (Psychology Press, 2012, ISBN 978-0415646758).
  29. Margit E. Oswald and Stefan Grosjean "Confirmation bias" in Rüdiger F. Pohl (ed.), Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory (Psychology Press, 2012, ISBN 978-0415646758).
  30. Y. Trope and A. Liberman, "Social hypothesis testing: Cognitive and motivational mechanisms," in E. Tory Higgins and Arie W. Kruglanski (eds.), Social Psychology: Handbook of Basic Principles (The Guilford Press, 1996, ISBN 978-1572301009).
  31. 31.0 31.1 Benoit Dardenne and Jacques-Philippe Leyens, Confirmation bias as a social skill Personality and Social Psychology Bulletin 21(11) (1995): 1229–1239. Retrieved May 8, 2023.
  32. Jonathan Haidt, The Righteous Mind: Why good people are divided by politics and religion (Penguin Books Ltd, 2013, ISBN 978-0141039169).
  33. 33.0 33.1 Why we're susceptible to fake news – and how to defend against it American Psychological Association, August 10, 2018. Retrieved May 8, 2023.
  34. Raymond S. Nickerson, Confirmation Bias: A Ubiquitous Phenomenon in Many Guises Review of General Psychology 2(2) (1998): 175-220. Retrieved May 10, 2023.
  35. Diane F. Halpern, Critical Thinking Across the Curriculum: A Brief Edition of Thought and Knowledge (Routledge, 1997, ISBN 978-0805827316).
  36. Michael M. Pompian, Behavioral Finance and Wealth Management: How to Build Optimal Portfolios That Account for Investor Biases (John Wiley and Sons, 2006, ISBN 0471745170).
  37. David Krueger and John David Mann, The Secret Language of Money: How to Make Smarter Financial Decisions and Live a Richer Life (McGraw Hill, 2009, ISBN 978-0071623391).
  38. Hugh R. Trevor-Roper, The European witch-craze of the Sixteenth and Seventeenth Centuries (Penguin Books, 1991, ISBN 978-0140137187).
  39. Ben Goldacre, Bad Science (Fourth Estate, 2008, ISBN 978-0007240197).
  40. Jonathan C. Smith, Pseudoscience and Extraordinary Claims of the Paranormal: A Critical Thinker's Toolkit (Wiley-Blackwell, 2009, ISBN 978-1405181228).
  41. 41.0 41.1 Robert J. Sternberg, Henry L. Roediger III, and Diane F. Halpern (eds.), Critical Thinking in Psychology (Cambridge University Press, 2006, ISBN 978-0521608343).
  42. Steven James Bartlett, "The psychology of abuse in publishing: Peer review and editorial bias," Normality Does Not Equal Mental Health: The Need to Look Elsewhere for Standards of Good Psychological Health (Santa Barbara, CA: Praeger, 2011, ISBN 978-0313399312), 147–177.
  43. David F. Horrobin, The philosophical basis of peer review and the suppression of innovation Journal of the American Medical Association 263(10) (1990):1438–1441. Retrieved May 9, 2023.
  44. Eli Pariser, Beware online "filter bubbles" TED, May 2, 2011. Retrieved May 9, 2023.
  45. Calum Thornhill, Quentin Meeus, Jeroen Peperkamp, and Bettina Berendt, "A digital nudge to counter confirmation bias," Frontiers in Big Data 2 (2019):11.
  46. Brendan Nyhan and Jason Reifler, When corrections fail: The persistence of political misperceptions, Political Behavior 32 (2010): 303–320.
  47. 47.0 47.1 Lee Ross and Craig A. Anderson, "Judgment under uncertainty: Heuristics and biases" in Daniel Kahneman, Paul Slovic, and Amos Tversky (eds.), Judgment Under Uncertainty: Heuristics and Biases (Cambridge University Press, 1982, ISBN 978-0521284141), 129-152.
  48. Leon Festinger, Henry W. Riecken, and Stanley Schachter, When Prophecy Fails: A Social and Psychological Study of a Modern Group That Predicted the Destruction of the World (Harper-Torchbooks, 1956, ISBN 0061311324).
  49. D.A. Redelmeier and Amos Tversky, On the belief that arthritis pain is related to the weather, Proceedings of the National Academy of Sciences 93(7) (1996): 2895–2896. Retrieved May 10, 2023.

References

  • Alighieri, Dante, trans. Allen Mandelbaum. The Divine Comedy: Inferno; Purgatorio; Paradiso. Everyman's Library, 1995. ISBN 978-0679433132
  • Bacon, Francis. Novum Organum. Legare Street Press, 2022 (original 1620). ISBN 978-1015466555
  • Baron, Jonathan. Thinking and Deciding. New York: Cambridge University Press, 2000. ISBN 978-0521650304
  • Bartlett, Steven James. Normality Does Not Equal Mental Health: The Need to Look Elsewhere for Standards of Good Psychological Health. Santa Barbara, CA: Praeger, 2011. ISBN 978-0313399312
  • Festinger, Leon, Henry W. Riecken, and Stanley Schachter. When Prophecy Fails: A Social and Psychological Study of a Modern Group That Predicted the Destruction of the World. Harper-Torchbooks, 1956. ISBN 0061311324
  • Fine, Cordelia. A Mind of Its Own : How Your Brain Distorts and Deceives. Icon Books, 2005. ISBN 978-1840466782
  • Goldacre, Ben. Bad Science. Fourth Estate, 2008. ISBN 978-0007240197
  • Haidt, Jonathan. The Righteous Mind: Why good people are divided by politics and religion. Penguin Books Ltd, 2013. ISBN 978-0141039169
  • Halpern, Diane F. Critical Thinking Across the Curriculum: A Brief Edition of Thought and Knowledge. Routledge, 1997. ISBN 978-0805827316
  • Hamilton, David L. (ed.). Social Cognition: Key Readings. Psychology Press, 2005. ISBN 978-0863775918
  • Higgins, E. Tory, and Arie W. Kruglanski (eds.). Social Psychology: Handbook of Basic Principles. The Guilford Press, 1996. ISBN 978-1572301009
  • Kahneman, Daniel, Paul Slovic, and Amos Tversky (eds.). Judgment Under Uncertainty: Heuristics and Biases. Cambridge University Press, 1982. ISBN 978-0521284141
  • Khaldun, Ibn. The Muqadimmah. Pantheon Books, 1958. ISBN 978-0710001955
  • Kida, Thomas E. Don't Believe Everything You Think: The 6 Basic Mistakes We Make in Thinking. Prometheus, 2006. ISBN 1591024080
  • Kofta, Mirosław, Gifford Weary, and Grzegorz Sedek. Personal Control in Action: Cognitive and Motivational Mechanisms. Springer, 1998. ISBN 978-0306457203
  • Krueger, David, and John David Mann. The Secret Language of Money: How to Make Smarter Financial Decisions and Live a Richer Life. McGraw Hill, 2009. ISBN 978-0071623391
  • Kunda, Ziva. Social Cognition: Making Sense of People. Bradford Book, 1999. ISBN 0262611430
  • Nickerson, Raymond S. Argumentation. Cambridge University Press, 2020. ISBN 978-1108799874
  • Plous, Scott. The Psychology of Judgment and Decision Making. McGraw-Hill, 1993. ISBN 978-0070504776
  • Pohl, Rüdiger F. (ed.). Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory. Psychology Press, 2012. ISBN 978-0415646758
  • Poletiek, Fenna H. Hypothesis-testing Behaviour. Psychology Press, 2000. ISBN 978-1841691596
  • Pompian, Michael M. Behavioral Finance and Wealth Management: How to Build Optimal Portfolios That Account for Investor Biases. John Wiley and Sons, 2006. ISBN 0471745170
  • Schopenhauer, Arthur. The World as Will and Presentation Volume 2. Routledge, 2010. ISBN 0321355806
  • Smith, Jonathan C. Pseudoscience and Extraordinary Claims of the Paranormal: A Critical Thinker's Toolkit. Wiley-Blackwell, 2009. ISBN 978-1405181228
  • Sternberg, Robert J., Henry L. Roediger III, and Diane F. Halpern (eds.). Critical Thinking in Psychology. Cambridge University Press, 2006. ISBN 978-0521608343
  • Sutherland, Stuart. Irrationality: The Enemy within. Pinter & Martin, 2013. ISBN 978-1780660257
  • Thucydides. History of the Peloponnesian War. Penguin Classic, 1972 (original 431 B.C.E.). ISBN 978-0140440393
  • Tolstoy, Leo, trans. Constance Garnett. The Kingdom of God Is Within You. Wentworth Press, 2016 (original 1894). ISBN 978-1371256289
  • Tolstoy, Leo, trans. Aylmer Maude. What Is Art? Hackett Publishing Company, Inc., 1996 (original 1897). ISBN 978-0872202955
  • Vyse, Stuart A. Believing in Magic: The Psychology of Superstition. Oxford University Press, 2000. ISBN 978-0195136340

Credits

New World Encyclopedia writers and editors rewrote and completed the Wikipedia article in accordance with New World Encyclopedia standards. This article abides by the terms of the Creative Commons CC-by-sa 3.0 License (CC-by-sa), which may be used and disseminated with proper attribution. Credit is due under the terms of this license, which may reference both the New World Encyclopedia contributors and the selfless volunteer contributors of the Wikimedia Foundation.


Note: Some restrictions may apply to use of individual images which are separately licensed.