Difference between revisions of "Confirmation bias" - New World Encyclopedia

From New World Encyclopedia
m (Protected "Confirmation bias": Completed ([Edit=Allow only administrators] (indefinite) [Move=Allow only administrators] (indefinite)))
 
(160 intermediate revisions by the same user not shown)
Line 1: Line 1:
 +
{{Images OK}}{{Submitted}}{{Approved}}{{Copyedited}}
 
[[File:Fred Barnard07.jpg|thumb|right|300px|Confirmation bias has been described as an internal "yes man," echoing back a person's beliefs like [[Charles Dickens]]' character [[Uriah Heep (character)|Uriah Heep]].<ref name="WSJ">Jason Zweig, [https://www.wsj.com/articles/SB10001424052748703811604574533680037778184 How to ignore the yes-man in your head] ''The Wall Street Journal'', November 19, 2009. Retrieved May 4, 2023.</ref>]]
 
[[File:Fred Barnard07.jpg|thumb|right|300px|Confirmation bias has been described as an internal "yes man," echoing back a person's beliefs like [[Charles Dickens]]' character [[Uriah Heep (character)|Uriah Heep]].<ref name="WSJ">Jason Zweig, [https://www.wsj.com/articles/SB10001424052748703811604574533680037778184 How to ignore the yes-man in your head] ''The Wall Street Journal'', November 19, 2009. Retrieved May 4, 2023.</ref>]]
'''Confirmation bias''' is the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one's prior [[belief]]s or values. People display this bias when they select information that supports their views, ignoring contrary information, or when they interpret ambiguous evidence as supporting their existing attitudes. The effect is strongest for desired outcomes, for [[emotion]]ally charged issues, and for deeply entrenched beliefs. Confirmation bias cannot be eliminated, but it can be managed, for example, by education and training in [[critical thinking]] skills.
+
'''Confirmation bias''' is the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one's prior [[belief]]s or values. People display this bias when they select information that supports their views, ignoring contrary information, or when they interpret ambiguous evidence as supporting their existing [[attitude]]s. The effect is strongest for desired outcomes, for [[emotion]]ally charged issues, and for deeply entrenched beliefs. This bias has the unfortunate consequence of people holding on to beliefs that are contradicted by evidence. This can lead to polarization of opinions, in which a disagreement becomes more extreme and has the possibility of a tragic outcome.  
 
+
{{toc}}
A series of [[Experimental psychology|psychological experiments]] in the 1960s suggested that people are biased toward confirming their existing beliefs. Later work re-interpreted these results as a tendency to test ideas in a one-sided way, focusing on one possibility and ignoring alternatives. Explanations for the observed biases include [[wishful thinking]] and the limited human capacity to process information. Another proposal is that people show confirmation bias because they are pragmatically assessing the costs of being wrong, rather than investigating in a neutral, scientific way.
+
Flawed [[decision making|decisions]] due to confirmation bias have been found in a wide range of political, organizational, financial, and scientific contexts. These biases contribute to [[overconfidence effect|overconfidence]] in personal beliefs and can maintain or strengthen beliefs in the face of contrary evidence. For example, confirmation bias produces systematic errors in scientific research based on [[inductive reasoning]] (the gradual accumulation of supportive evidence). Similarly, a police detective may identify a suspect early in an investigation, but then may only seek confirming rather than disconfirming evidence. A medical practitioner may prematurely focus on a particular disorder early in a diagnostic session, and then seek only confirming evidence. Confirmation bias is amplified by the use of [[filter bubble]]s, or "algorithmic editing" commonly used in [[social media]] and [[search engine]]s, which display to individuals only information they are likely to agree with, while excluding opposing views.  
 
 
Flawed [[decision making|decisions]] due to confirmation bias have been found in a wide range of political, organizational, financial and scientific contexts. These biases contribute to [[overconfidence effect|overconfidence]] in personal beliefs and can maintain or strengthen beliefs in the face of contrary evidence. For example, confirmation bias produces systematic errors in scientific research based on [[inductive reasoning]] (the gradual accumulation of supportive evidence). Similarly, a police detective may identify a suspect early in an investigation, but then may only seek confirming rather than disconfirming evidence. A medical practitioner may prematurely focus on a particular disorder early in a diagnostic session, and then seek only confirming evidence. In [[social media]], confirmation bias is amplified by the use of [[filter bubble]]s, or "algorithmic editing", which display to individuals only information they are likely to agree with, while excluding opposing views.
 
  
 
== Definition ==
 
== Definition ==
 
Confirmation bias is a term coined by English psychologist [[Peter Wason]], to describe the tendency for people to immediately favor information that validates their preconceptions, hypotheses, and personal beliefs regardless of whether they are true or not. It also includes the tendency to strive toward proving one’s hypothesis instead of disproving it.<ref name=Plous> Scott Plous, ''The Psychology of Judgment and Decision Making'' (McGraw-Hill, 1993, ISBN 978-0070504776).</ref>  
 
Confirmation bias is a term coined by English psychologist [[Peter Wason]], to describe the tendency for people to immediately favor information that validates their preconceptions, hypotheses, and personal beliefs regardless of whether they are true or not. It also includes the tendency to strive toward proving one’s hypothesis instead of disproving it.<ref name=Plous> Scott Plous, ''The Psychology of Judgment and Decision Making'' (McGraw-Hill, 1993, ISBN 978-0070504776).</ref>  
  
Confirmation bias (or confirmatory bias) has also been termed '''myside bias''', a term suggested by [[David Perkins (geneticist)|David Perkins]], a professor and researcher at the Harvard Graduate School of Education. This reflects the bias as a preference for "my" side of an issue.<ref>Jonathan Baron, ''Thinking and Deciding'' (New York: Cambridge University Press, 2000, ISBN 978-0521650304). </ref>  
+
Confirmation bias (or confirmatory bias) has also been termed '''myside bias''', a term suggested by [[David Perkins (geneticist)|David Perkins]], a professor and researcher at the Harvard Graduate School of Education. This reflects the bias as a preference for "my" side of an issue.<ref name=Baron>Jonathan Baron, ''Thinking and Deciding'' (New York: Cambridge University Press, 2000, ISBN 978-0521650304). </ref>  
  
 
Confirmation biases differ from what is sometimes called the ''[[Behavioral confirmation|behavioral confirmation effect]]'', commonly known as ''[[self-fulfilling prophecy]]'', in which a person's expectations influence their own behavior, bringing about the expected result.
 
Confirmation biases differ from what is sometimes called the ''[[Behavioral confirmation|behavioral confirmation effect]]'', commonly known as ''[[self-fulfilling prophecy]]'', in which a person's expectations influence their own behavior, bringing about the expected result.
  
== Types of confirmation bias ==
+
== Discovery ==
=== Biased search for information ===
+
[[File:Somer Francis Bacon.jpg|thumb|300px|[[Francis Bacon]]]]
 +
The phenomenon that has come to be known as "confirmation bias" was observed throughout history. Beginning with the Greek historian [[Thucydides]] (c. 460 B.C.E. - c. 400 B.C.E.) wrote of misguided reason in ''History of the Peloponnesian War'': "...&nbsp;for it is a habit of mankind to entrust to careless hope what they long for, and to use sovereign reason to thrust aside what they do not fancy."<ref>Thucydides, ''History of the Peloponnesian War'' (Penguin Classic, 1972 (original 431 B.C.E.), ISBN 978-0140440393).</ref> Italian poet [[Dante Alighieri]] (1265–1321) noted it in ''[[The Divine Comedy]]'', in which [[St. Thomas Aquinas]] cautions Dante upon meeting in Paradise, "opinion—hasty—often can incline to the wrong side, and then affection for one's own opinion binds, confines the mind."<ref>Dante Alighieri, trans. Allen Mandelbaum, ''The Divine Comedy: Inferno; Purgatorio; Paradiso'' (Everyman's Library, 1995, ISBN 978-0679433132), "Paradiso" canto XIII: 118–120.</ref> [[Ibn Khaldun]] noticed the same effect in his ''[[Muqaddimah]]'':
 +
<blockquote>Untruth naturally afflicts historical information. There are various reasons that make this unavoidable. One of them is partisanship for opinions and schools. ... if the soul is infected with partisanship for a particular opinion or sect, it accepts without a moment's hesitation the information that is agreeable to it. Prejudice and partisanship obscure the critical faculty and preclude critical investigation. The result is that falsehoods are accepted and transmitted.<ref>Ibn Khaldun, ''The Muqadimmah'' (Pantheon Books, 1958, ISBN 978-0710001955).</ref></blockquote>
  
Experiments have found repeatedly that people tend to test hypotheses in a one-sided way, by searching for evidence consistent with their current [[hypothesis]].<ref name="nickerson">{{harvnb|Nickerson|1998|pp=175–220}}</ref>{{rp|177–178}}<ref name="kunda112" /> Rather than searching through all the relevant evidence, they phrase questions to receive an affirmative answer that supports their theory.<ref name="baron162" /> They look for the consequences that they would expect if their hypothesis was true, rather than what would happen if it was false.<ref name="baron162">{{Harvnb|Baron|2000 |pp=162–64}}</ref> For example, someone using yes/no questions to find a number they suspect to be the number 3 might ask, "Is it an [[odd number]]?" People prefer this type of question, called a "positive test", even when a negative test such as "Is it an even number?" would yield exactly the same information.<ref>{{Harvnb|Kida|2006|pp=162–65}}</ref> However, this does not mean that people seek tests that guarantee a positive answer. In studies where subjects could select either such pseudo-tests or genuinely diagnostic ones, they favored the genuinely diagnostic.<ref>{{Citation |last1=Devine |first1=Patricia G. |first2=Edward R. |last2=Hirt |first3=Elizabeth M. |last3=Gehrke |year=1990 |title=Diagnostic and confirmation strategies in trait hypothesis testing |journal=Journal of Personality and Social Psychology |volume=58 |issue=6 |pages=952–963 |issn=1939-1315 |doi=10.1037/0022-3514.58.6.952}}</ref><ref>{{Citation |last1=Trope |first1=Yaacov |first2=Miriam |last2=Bassok |year=1982 |title=Confirmatory and diagnosing strategies in social information gathering |journal=Journal of Personality and Social Psychology |volume=43 |issue=1 |pages=22–34 |issn=1939-1315 |doi=10.1037/0022-3514.43.1.22}}</ref>
+
In the ''[[Novum Organum]]'', English philosopher and scientist [[Francis Bacon]] (1561–1626) noted that biased assessment of evidence drove "all superstitions, whether in astrology, dreams, omens, divine judgments or the like."<ref name=Bacon>Francis Bacon, ''Novum Organum'' (Legare Street Press, 2022 (original 1620), ISBN 978-1015466555).</ref> He wrote:
 +
<blockquote>The human understanding when it has once adopted an opinion (either as being the received opinion or as being agreeable to itself) draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects or despises, or else by some distinction sets aside or rejects.<ref name=Bacon/></blockquote>
  
The preference for positive tests in itself is not a bias, since positive tests can be highly informative.<ref name="klaymanha" /> However, in combination with other effects, this strategy can confirm existing beliefs or assumptions, independently of whether they are true.<ref name="oswald82">{{Harvnb|Oswald|Grosjean|2004|pp=82–83}}</ref> In real-world situations, evidence is often complex and mixed. For example, various contradictory ideas about someone could each be supported by concentrating on one aspect of his or her behavior.<ref name="kunda112" /> Thus any search for evidence in favor of a hypothesis is likely to succeed.<ref name=oswald82 /> One illustration of this is the way the phrasing of a question can significantly change the answer.<ref name="kunda112">{{Harvnb|Kunda|1999|pp=112–115}}</ref> For example, people who are asked, "Are you happy with your social life?" report greater satisfaction than those asked, "Are you ''un''happy with your social life?"<ref>{{Citation |last1=Kunda |first1=Ziva |first2=G.T. |last2 =Fong |first3=R. |last3=Sanitoso |first4=E. |last4=Reber |year=1993 |title=Directional questions direct self-conceptions |journal=[[Journal of Experimental Social Psychology]] |volume=29 |pages=62–63 |issn=0022-1031|doi=10.1006/jesp.1993.1004 }} via {{Harvnb|Fine|2006|pp=63–65}}</ref>
+
In the second volume of his ''[[The World as Will and Representation]]'' (1844), German philosopher [[Arthur Schopenhauer]] observed that "An adopted hypothesis gives us lynx-eyes for everything that confirms it and makes us blind to everything that contradicts it."<ref>Arthur Schopenhauer, ''The World as Will and Presentation Volume 2'' (Routledge, 2010, ISBN 0321355806).</ref>
  
Even a small change in a question's wording can affect how people search through available information, and hence the conclusions they reach. This was shown using a fictional child custody case.<ref name="shafir" /> Participants read that Parent A was moderately suitable to be the guardian in multiple ways. Parent B had a mix of salient positive and negative qualities: a close relationship with the child but a job that would take them away for long periods of time. When asked, "Which parent should have custody of the child?" the majority of participants chose Parent B, looking mainly for positive attributes. However, when asked, "Which parent should be denied custody of the child?" they looked for negative attributes and the majority answered that Parent B should be denied custody, implying that Parent A should have custody.<ref name="shafir">{{Citation |last=Shafir |first=E. |year=1993 |title=Choosing versus rejecting: why some options are both better and worse than others |journal=Memory and Cognition |volume=21 |pages=546–556 |pmid= 8350746 |issue=4 |doi=10.3758/bf03197186|doi-access=free }} via {{Harvnb|Fine|2006|pp=63–65}}</ref>
+
In his essay (1894) ''[[The Kingdom of God Is Within You]]'', Russian novelist [[Leo Tolstoy]] wrote:
 +
<blockquote>The most difficult subjects can be explained to the most slow-witted man if he has not formed any idea of them already; but the simplest thing cannot be made clear to the most intelligent man if he is firmly persuaded that he knows already, without a shadow of doubt, what is laid before him.<ref>Leo Tolstoy, trans. Constance Garnett, ''The Kingdom of God Is Within You'' Wentworth Press, 2016 (original 1894), ISBN 978-1371256289).</ref></blockquote>
  
Similar studies have demonstrated how people engage in a biased search for information, but also that this phenomenon may be limited by a preference for genuine diagnostic tests. In an initial experiment, participants rated another person on the [[extroversion and introversion|introversion–extroversion]] personality dimension on the basis of an interview. They chose the interview questions from a given list. When the interviewee was introduced as an introvert, the participants chose questions that presumed introversion, such as, "What do you find unpleasant about noisy parties?" When the interviewee was described as extroverted, almost all the questions presumed extroversion, such as, "What would you do to liven up a dull party?" These [[loaded question]]s gave the interviewees little or no opportunity to falsify the hypothesis about them.<ref>{{Citation |last1=Snyder |first1=Mark | first2=William B. Jr. | last2=Swann |year=1978 |title=Hypothesis-testing processes in social interaction |journal=[[Journal of Personality and Social Psychology]] |volume=36 |issue=11 |pages=1202–1212 |doi=10.1037/0022-3514.36.11.1202}} via {{Harvnb|Poletiek|2001|p=131}}</ref> A later version of the experiment gave the participants less presumptive questions to choose from, such as, "Do you shy away from social interactions?"<ref name="kunda117" /> Participants preferred to ask these more diagnostic questions, showing only a weak bias towards positive tests. This pattern, of a main preference for diagnostic tests and a weaker preference for positive tests, has been replicated in other studies.<ref name="kunda117">{{Harvnb|Kunda|1999|pp=117–18}}</ref>
+
And in his essay (1897) ''[[What Is Art?]]'', Tolstoy wrote:
 +
<blockquote>I know that most men—not only those considered clever, but even those who are very clever, and capable of understanding most difficult scientific, mathematical, or philosophic problems—can very seldom discern even the simplest and most obvious truth if it be such as to oblige them to admit the falsity of conclusions they have formed, perhaps with much difficulty—conclusions of which they are proud, which they have taught to others, and on which they have built their lives.<ref>Leo Tolstoy, trans. Aylmer Maude, ''What Is Art?'' (Hackett Publishing Company, Inc., 1996 (original 1897), ISBN 978-0872202955).</ref></blockquote>
  
Personality traits influence and interact with biased search processes.<ref name=albarracin>{{Citation|last=Albarracin|first=D.|author2=Mitchell, A.L.|title=The role of defensive confidence in preference for proattitudinal information: How believing that one is strong can sometimes be a defensive weakness|journal=Personality and Social Psychology Bulletin|year=2004|volume=30|issue=12|pages=1565–1584|doi=10.1177/0146167204271180|pmid=15536240|pmc=4803283}}</ref> Individuals vary in their abilities to defend their attitudes from external attacks in relation to [[selective exposure theory|selective exposure]]. Selective exposure occurs when individuals search for information that is consistent, rather than inconsistent, with their personal beliefs.<ref>{{Citation|last=Fischer|first=P.|author2=Fischer, Julia K. |author3=Aydin, Nilüfer |author4= Frey, Dieter |title=Physically attractive social information sources lead to increased selective exposure to information|journal=Basic and Applied Social Psychology|year=2010|volume=32|issue=4|pages=340–347|doi=10.1080/01973533.2010.519208|s2cid=143133082}}</ref> An experiment examined the extent to which individuals could refute arguments that contradicted their personal beliefs.<ref name=albarracin /> People with high [[confidence]] levels more readily seek out contradictory information to their personal position to form an argument. This can take the form of an ''oppositional news consumption'', where individuals seek opposing partisan news in order to counterargue.<ref>{{cite book |last1=Dahlgren |first1=Peter M. |title=Media Echo Chambers: Selective Exposure and Confirmation Bias in Media Use, and its Consequences for Political Polarization |date=2020 |publisher=University of Gothenburg |location=Gothenburg |isbn=978-91-88212-95-5 |url=https://gupea.ub.gu.se/handle/2077/67023?locale=en |mode=cs2}}</ref> Individuals with low confidence levels do not seek out contradictory information and prefer information that supports their personal position. People generate and evaluate evidence in arguments that are biased towards their own beliefs and opinions.<ref name="stanovich"/> Heightened confidence levels decrease preference for information that supports individuals' personal beliefs.
+
In the 1960s, [[psychologist]]s gathered experimental data to support such observations that people are biased toward confirming their existing beliefs.<ref name=Plous/> The initial experiment that generated the term "confirmation bias" was published by Peter Wason in 1960 (although that published article does not mention the term "confirmation bias").<ref name=Wason>Peter C. Wason, [https://bear.warrington.ufl.edu/brenner/mar7588/Papers/wason-qjep1960.pdf On the Failure to Eliminate Hypotheses in a Conceptual Task] ''Quarterly Journal of Experimental Psychology'' 12(3) (1960): 129–140. Retrieved May 6, 2023.</ref> Wason repeatedly challenged participants to identify a rule applying to triples of numbers. They were told that (2,4,6) fits the rule. They generated triples, and the experimenter told them whether each triple conformed to the rule. The actual rule was simply "any ascending sequence," but participants had great difficulty in finding it, often announcing rules that were far more specific, such as "the middle number is the average of the first and last." The participants seemed to test only positive examples—triples that obeyed their hypothesized rule. For example, if they thought the rule was, "Each number is two greater than its predecessor," they would offer a triple that fitted (confirmed) this rule, such as (11,13,15) rather than a triple that violated (falsified) it, such as (11,12,19).<ref name=Wason/>
  
Another experiment gave participants a complex rule-discovery task that involved moving objects simulated by a computer.<ref name="mynatt1978">{{Citation |last1=Mynatt |first1=Clifford R. |first2=Michael E. |last2=Doherty |first3=Ryan D. |last3=Tweney |year=1978 |title=Consequences of confirmation and disconfirmation in a simulated research environment |journal=Quarterly Journal of Experimental Psychology |volume=30 |issue=3 |pages=395–406 |doi =10.1080/00335557843000007|s2cid=145419628 }}</ref> Objects on the computer screen followed specific laws, which the participants had to figure out. So, participants could "fire" objects across the screen to test their hypotheses. Despite making many attempts over a ten-hour session, none of the participants figured out the rules of the system. They typically attempted to confirm rather than falsify their hypotheses, and were reluctant to consider alternatives. Even after seeing objective evidence that refuted their working hypotheses, they frequently continued doing the same tests. Some of the participants were taught proper hypothesis-testing, but these instructions had almost no effect.<ref name="mynatt1978" />
+
Wason interpreted his results as showing a preference for confirmation over falsification, hence he coined the term "confirmation bias" or "verification bias."<ref name=Poletiek/> He used this bias to explain the results of his selection task experiment.<ref>Peter C. Wason, [https://psycnet.apa.org/record/1969-00332-001 Reasoning about a rule] ''Quarterly Journal of Experimental Psychology'' 20(3) (1968):273–281. Retrieved May 6, 2023.</ref> Participants repeatedly performed badly on various forms of this test, in most cases ignoring information that could potentially refute (falsify) the specified rule.<ref name=Sutherland>Stuart Sutherland, ''Irrationality: The Enemy within'' (Pinter & Martin, 2013, ISBN 978-1780660257).</ref>
  
=== Biased interpretation of information ===
+
== Types of confirmation bias ==
Confirmation biases are not limited to the collection of evidence. Even if two individuals have the same information, the way they interpret it can be biased.
+
=== Biased search for information ===
 +
It has been found repeatedly that people tend to test hypotheses in a one-sided way, by searching for evidence consistent with their current [[hypothesis]].<ref name=Nickerson>Raymond S. Nickerson, ''Argumentation'' (Cambridge University Press, 2020, ISBN 978-1108799874).</ref><ref name=Kunda>Ziva Kunda, ''Social Cognition: Making Sense of People'' (Bradford Book, 1999, ISBN 0262611430).</ref> Rather than searching through all the relevant evidence, they phrase questions to receive an affirmative answer that supports their theory. They look for the consequences that they would expect if their hypothesis was true, rather than what would happen if it was false.<ref name=Baron/> For example, someone using yes/no questions to find a number they suspect to be the number 3 might ask, "Is it an [[odd number]]?" People prefer this type of question, called a "positive test," even when a negative test such as "Is it an even number?" would yield exactly the same information.<ref name=Kida>Thomas E. Kida, ''Don't Believe Everything You Think: The 6 Basic Mistakes We Make in Thinking'' (Prometheus, 2006, ISBN 1591024080).</ref>
  
A team at [[Stanford University]] conducted an experiment involving participants who felt strongly about capital punishment, with half in favor and half against it.<ref name="lord1979" /><ref name=baron201 /> Each participant read descriptions of two studies: a comparison of [[U.S. state]]s with and without the death penalty, and a comparison of murder rates in a state before and after the introduction of the death penalty. After reading a quick description of each study, the participants were asked whether their opinions had changed. Then, they read a more detailed account of each study's procedure and had to rate whether the research was well-conducted and convincing.<ref name="lord1979" /> In fact, the studies were fictional. Half the participants were told that one kind of study supported the [[deterrence (psychology)|deterrent]] effect and the other undermined it, while for other participants the conclusions were swapped.<ref name="lord1979" /><ref name="baron201">{{Harvnb|Baron|2000|pp=201–202}}</ref>
+
The preference for positive tests in itself is not a bias, since positive tests can be highly informative.<ref name=Klaymanha>Joshua Klayman and Young-Won Ha, [http://www.stats.org.uk/statistical-inference/KlaymanHa1987.pdf Confirmation, Disconfirmation, and Information in Hypothesis Testing] ''Psychological Review'' 94(2) (1987): 211-228. Retrieved May 4, 2023.</ref>  However, in combination with other effects, this strategy can confirm existing beliefs or assumptions, independently of whether they are true. In real-world situations, evidence is often complex and mixed. For example, various contradictory ideas about someone could each be supported by concentrating on one aspect of his or her behavior. Thus any search for evidence in favor of a hypothesis is likely to succeed. One illustration of this is the way the phrasing of a question can significantly change the answer.<ref name=Kunda/> For example, people who are asked, "Are you happy with your social life?" report greater satisfaction than those asked, "Are you ''un''happy with your social life?"<ref name=Fine> Cordelia Fine, ''A Mind of Its Own : How Your Brain Distorts and Deceives '' (Icon Books, 2005, ISBN 978-1840466782).</ref>
  
The participants, whether supporters or opponents, reported shifting their attitudes slightly in the direction of the first study they read. Once they read the more detailed descriptions of the two studies, they almost all returned to their original belief regardless of the evidence provided, pointing to details that supported their viewpoint and disregarding anything contrary. Participants described studies supporting their pre-existing view as superior to those that contradicted it, in detailed and specific ways.<ref name="lord1979" /><ref name="vyse122">{{Harvnb|Vyse|1997|p=122}}</ref> Writing about a study that seemed to undermine the deterrence effect, a death penalty proponent wrote, "The research didn't cover a long enough period of time," while an opponent's comment on the same study said, "No strong evidence to contradict the researchers has been presented."<ref name="lord1979" /> The results illustrated that people set higher standards of evidence for hypotheses that go against their current expectations. This effect, known as "disconfirmation bias", has been supported by other experiments.<ref name="taber_political" />
+
The biased search for information may be reduced by a preference for genuine diagnostic tests.<ref name=Kunda/> For example, when asked to rate a person on the [[Introversion and extroversion|introversion–extroversion]] personality dimension on the basis of an interview, if the interviewee was introduced as an introvert, the participants chose questions from the list that presumed introversion, such as, "What do you find unpleasant about noisy parties?" When the interviewee was described as extroverted, they mostly asked questions that presumed extroversion, such as, "What would you do to liven up a dull party?" These [[loaded question]]s gave the interviewees little or no opportunity to falsify the hypothesis about them. However, when given less presumptive questions to choose from, such as, "Do you shy away from social interactions?" participants preferred to ask these more diagnostic questions, showing only a weak bias towards positive tests.<ref name=Poletiek> Fenna H. Poletiek, ''Hypothesis-testing Behaviour'' (Psychology Press, 2000, ISBN 978-1841691596).</ref>
  
Another study of biased interpretation occurred during the [[2004 United States presidential election|2004 U.S. presidential election]] and involved participants who reported having strong feelings about the candidates. They were shown apparently contradictory pairs of statements, either from Republican candidate [[George W. Bush]], Democratic candidate [[John Kerry]] or a politically neutral public figure. They were also given further statements that made the apparent contradiction seem reasonable. From these three pieces of information, they had to decide whether each individual's statements were inconsistent.<ref name="westen2006">{{Citation|last1=Westen |first1=Drew |first2=Pavel S. |last2=Blagov |first3=Keith |last3=Harenski |first4=Clint |last4=Kilts |first5=Stephan |last5=Hamann |year=2006 |title=Neural bases of motivated reasoning: An fMRI study of emotional constraints on partisan political judgment in the 2004 U.S. Presidential election |journal=[[Journal of Cognitive Neuroscience]] |volume=18 |issue=11 |pages=1947–1958 |doi=10.1162/jocn.2006.18.11.1947 |pmid=17069484|citeseerx=10.1.1.578.8097 |s2cid=8625992 }}</ref>{{rp|1948}} There were strong differences in these evaluations, with participants much more likely to interpret statements from the candidate they opposed as contradictory.<ref name="westen2006" />{{rp|1951}}
+
=== Biased interpretation ===
 +
Confirmation biases are not limited to the collection of evidence. Even if two individuals have the same information, the way they interpret it can be biased:
 +
<blockquote>People who hold strong opinions on complex social issues are likely to examine relevant empirical evidence in a biased manner. They are apt to accept "confirming" evidence at face value while subjecting "disconfirming" evidence to critical evaluation, and, as a result, draw undue support for their initial positions from mixed or random empirical findings. Thus, the result of exposing contending factions in a social dispute to an identical body of relevant empirical evidence may be not a narrowing of disagreement but rather an increase in polarization.<ref name="lord1979" /></blockquote>  
  
In this experiment, the participants made their judgments while in a [[magnetic resonance imaging]] (MRI) scanner which monitored their brain activity. As participants evaluated contradictory statements by their favored candidate, [[emotion]]al centers of their brains were aroused. This did not happen with the statements by the other figures. The experimenters inferred that the different responses to the statements were not due to passive reasoning errors. Instead, the participants were actively reducing the [[cognitive dissonance]] induced by reading about their favored candidate's irrational or [[Hypocrisy|hypocritical]] behavior.<ref name="westen2006" />{{rp|1956}}
+
For example, people who felt strongly about [[capital punishment]], half in favor and half against it, were given descriptions of two studies: a comparison of [[U.S. state]]s with and without the death penalty, and a comparison of [[murder]] rates in a state before and after the introduction of the death penalty. After reading a quick description of each study, the participants were asked whether their opinions had changed. Then, they read a more detailed account of each study's procedure and had to rate whether the research was well-conducted and convincing. In fact, the studies were fictional. Half the participants were told that one kind of study supported the [[deterrence (psychology)|deterrent]] effect and the other undermined it, while for other participants the conclusions were reversed.<ref name=Baron/>
  
Biased interpretation is not restricted to emotionally significant topics. In another experiment, participants were told a story about a theft. They had to rate the evidential importance of statements arguing either for or against a particular character being responsible. When they hypothesized that character's guilt, they rated statements supporting that hypothesis as more important than conflicting statements.<ref>{{Citation |last1=Gadenne |first1=V. |first2=M. |last2=Oswald |year=1986 |title=Entstehung und Veränderung von Bestätigungstendenzen beim Testen von Hypothesen [Formation and alteration of confirmatory tendencies during the testing of hypotheses] |journal=Zeitschrift für Experimentelle und Angewandte Psychologie |volume=33 |pages=360–374}} via {{Harvnb|Oswald|Grosjean|2004|p=89}}</ref>
+
The participants, whether supporters or opponents, reported shifting their attitudes slightly in the direction of the first study they read. Once they read the more detailed descriptions of the two studies, they almost all returned to their original belief regardless of the evidence provided, pointing to details that supported their viewpoint and disregarding anything contrary. Participants described studies supporting their pre-existing view as superior to those that contradicted it, in detailed and specific ways: Writing about a study that seemed to undermine the deterrence effect, a death penalty proponent wrote, "The research didn't cover a long enough period of time," while an opponent's comment on the same study said, "No strong evidence to contradict the researchers has been presented."<ref name="lord1979" /> The results illustrated that people set higher standards of evidence for hypotheses that go against their current expectations. This effect, known as "disconfirmation bias," has been supported by other experiments.<ref>Kari Edwards and Edward E. Smith, [https://fbaum.unc.edu/teaching/articles/JSPS-1996-Edwards.pdf A Disconfirmation Bias in the Evaluation of Arguments] ''Journal of Personality and Social Psychology'' 71(1) (1996):5-24. Retrieved May 5, 2023.</ref>
  
=== Biased memory recall of information ===
+
Another study of biased interpretation involved participants who reported having strong feelings about the candidates in the 2004 United States presidential election. They were shown apparently contradictory pairs of statements, either from Republican candidate [[George W. Bush]], Democratic candidate [[John Kerry]], or a politically neutral public figure. They were also given further statements that made the apparent contradiction seem reasonable. From these three pieces of information, they had to decide whether each individual's statements were inconsistent. There were strong differences in these evaluations, with participants much more likely to interpret statements from the candidate they opposed as contradictory.The participants made their judgments while in a [[magnetic resonance imaging]] (MRI) scanner which monitored their brain activity. As participants evaluated contradictory statements by their favored candidate, [[emotion]]al centers of their brains were aroused. This did not happen with the statements by the other figures. The experimenters inferred that the different responses to the statements were not due to passive reasoning errors. Instead, the participants were actively reducing the [[cognitive dissonance]] induced by reading about their favored candidate's irrational or [[Hypocrisy|hypocritical]] behavior.<ref>Drew Westen, Pavel S. Blagov, Keith Harenski, Clint Kilts, and Stephan Hamann, Neural bases of motivated reasoning: An fMRI study of emotional constraints on partisan political judgment in the 2004 U.S. Presidential election, ''Journal of Cognitive Neuroscience'' 18(11) (2006): 1947–1958.</ref>
People may remember evidence selectively to reinforce their expectations, even if they gather and interpret evidence in a neutral manner. This effect is called "selective recall", "confirmatory memory", or "access-biased memory".<ref>{{Citation |last1=Hastie |first1=Reid |first2=Bernadette |last2=Park |chapter=The relationship between memory and judgment depends on whether the judgment task is memory-based or on-line |title=Social cognition: key readings |editor-first=David L. |editor-last=Hamilton |publisher=Psychology Press |location=New York |year=2005 |page=394 |isbn=978-0-86377-591-8 |oclc=55078722}}</ref> Psychological theories differ in their predictions about selective recall. [[Schema (psychology)|Schema theory]] predicts that information matching prior expectations will be more easily stored and recalled than information that does not match.<ref name=oswald88 /> Some alternative approaches say that surprising information stands out and so is memorable.<ref name="oswald88">{{Harvnb|Oswald|Grosjean|2004|pp=88–89}}</ref> Predictions from both these theories have been confirmed in different experimental contexts, with no theory winning outright.<ref>{{Citation |last1=Stangor |first1=Charles |first2=David |last2=McMillan |year=1992 |title= Memory for expectancy-congruent and expectancy-incongruent information: A review of the social and social developmental literatures |journal=Psychological Bulletin |volume=111 |issue=1 |pages=42–61 |doi=10.1037/0033-2909.111.1.42}}</ref>
 
  
In one study, participants read a profile of a woman which described a mix of introverted and extroverted behaviors.<ref name="snydercantor" /> They later had to recall examples of her introversion and extroversion. One group was told this was to assess the woman for a job as a librarian, while a second group were told it was for a job in real estate sales. There was a significant difference between what these two groups recalled, with the "librarian" group recalling more examples of introversion and the "sales" groups recalling more extroverted behavior.<ref name="snydercantor">{{Citation |last1= Snyder |first1=M. |first2=N. |last2=Cantor |year=1979 |title=Testing hypotheses about other people: the use of historical knowledge |journal=Journal of Experimental Social Psychology |volume=15 |pages=330–342 |doi=10.1016/0022-1031(79)90042-8 |issue= 4}} via {{Harvnb|Goldacre|2008|p=231}}</ref> A selective memory effect has also been shown in experiments that manipulate the desirability of personality types.<ref name=oswald88 /><ref>{{Harvnb|Kunda|1999|pp=225–232}}</ref> In one of these, a group of participants were shown evidence that extroverted people are more successful than introverts. Another group were told the opposite. In a subsequent, apparently unrelated study, participants were asked to recall events from their lives in which they had been either introverted or extroverted. Each group of participants provided more memories connecting themselves with the more desirable personality type, and recalled those memories more quickly.<ref>{{Citation |last1=Sanitioso |first1=Rasyid |first2=Ziva |last2=Kunda |first3=G.T. |last3=Fong |year=1990 |title= Motivated recruitment of autobiographical memories |journal=Journal of Personality and Social Psychology |issn=0022-3514 |volume=59 |issue=2 |pages=229–241 |doi= 10.1037/0022-3514.59.2.229 |pmid=2213492}}</ref>
+
=== Biased memory recall ===
 +
People may remember evidence selectively to reinforce their expectations, even if they gather and interpret evidence in a neutral manner. This effect may be called "selective recall," "confirmatory memory," or "access-biased memory."<ref> David L. Hamilton (ed.), ''Social Cognition: Key Readings'' (Psychology Press, 2005, ISBN 978-0863775918).</ref>  
  
Changes in emotional states can also influence memory recall.<ref name=levine>{{Citation|last=Levine|first=L.|author2=Prohaska, V. |author3=Burgess, S.L. |author4=Rice, J.A. |author5=Laulhere, T.M. |title=Remembering past emotions: The role of current appraisals|journal=Cognition and Emotion|year=2001|volume=15|issue=4|pages=393–417|doi=10.1080/02699930125955|s2cid=22743423}}</ref><ref name=safer>{{Citation|last=Safer|first=M.A.|author2=Bonanno, G.A. |author3=Field, N. |title=It was never that bad: Biased recall of grief and long-term adjustment to the death of a spouse|journal=Memory|year=2001|volume=9|issue=3|pages=195–203|doi=10.1080/09658210143000065|pmid=11469313|s2cid=24729233}}</ref> Participants rated how they felt when they had first learned that [[O. J. Simpson]] had been acquitted of murder charges.<ref name="levine"/> They described their emotional reactions and confidence regarding the verdict one week, two months, and one year after the trial. Results indicated that participants' assessments for Simpson's guilt changed over time. The more that participants' opinion of the verdict had changed, the less stable were the participant's memories regarding their initial emotional reactions. When participants recalled their initial emotional reactions two months and a year later, past appraisals closely resembled current appraisals of emotion. People demonstrate sizable myside bias when discussing their opinions on controversial topics.<ref name="stanovich"/> Memory recall and construction of experiences undergo revision in relation to corresponding emotional states.
+
Studies have shown, for example, that emotional memories are reconstructed by current emotional states. When widows and widowers rated the intensity of their experienced grief six months and five years after the deaths of their spouses, they noted a higher experience of grief at six months rather than at five years. Yet, when the participants were asked after five years how they had felt six months after the death of their significant other, the intensity of grief participants recalled was highly [[correlation|correlated]] with their current level of grief. Individuals appear to utilize their current emotional states to analyze how they must have felt when experiencing past events.<ref name=levine>Linda J. Levine, Vincent Prohaska, Stewart L. Burgess, John A. Rice, and Tracy M. Laulhere, Remembering past emotions: The role of current appraisals, ''Cognition and Emotion'' 15(4) (2001):393–417.</ref>
  
Myside bias has been shown to influence the accuracy of memory recall.<ref name="safer"/> In an experiment, widows and widowers rated the intensity of their experienced grief six months and five years after the deaths of their spouses. Participants noted a higher experience of grief at six months rather than at five years. Yet, when the participants were asked after five years how they had felt six months after the death of their significant other, the intensity of grief participants recalled was highly [[correlation|correlated]] with their current level of grief. Individuals appear to utilize their current emotional states to analyze how they must have felt when experiencing past events.<ref name=levine /> Emotional memories are reconstructed by current emotional states.
+
A selective memory effect has also been shown in experiments that manipulate the desirability of [[personality]] types.<ref name=Kunda/> For example, a group of participants were shown evidence that extroverted people are more successful than introverts. Another group were told the opposite. In a subsequent, apparently unrelated study, participants were asked to recall events from their lives in which they had been either introverted or extroverted. Each group of participants provided more memories connecting themselves with the more desirable personality type, and recalled those memories more quickly.<ref name=Kunda/>
  
One study showed how selective memory can maintain belief in [[extrasensory perception]] (ESP).<ref name="russell_jones">{{Citation |last1=Russell |first1=Dan |first2=Warren H. |last2=Jones |year=1980 |title=When superstition fails: Reactions to disconfirmation of paranormal beliefs |journal=Personality and Social Psychology Bulletin |volume=6 |issue=1 |pages= 83–88 |issn=1552-7433 |doi=10.1177/014616728061012|s2cid=145060971 }} via {{Harvnb|Vyse|1997|p=121}}</ref> Believers and disbelievers were each shown descriptions of ESP experiments. Half of each group were told that the experimental results supported the existence of ESP, while the others were told they did not. In a subsequent test, participants recalled the material accurately, apart from believers who had read the non-supportive evidence. This group remembered significantly less information and some of them incorrectly remembered the results as supporting ESP.<ref name="russell_jones" />
+
Another study showed how selective memory can maintain belief in [[extrasensory perception]] (ESP). Believers and disbelievers were each shown descriptions of ESP experiments. Half of each group were told that the experimental results supported the existence of ESP, while the others were told they did not. In a subsequent test, participants recalled the material accurately, apart from believers who had read the non-supportive evidence. This group remembered significantly less information and some of them incorrectly remembered the results as supporting ESP.<ref> Stuart A. Vyse, ''Believing in Magic: The Psychology of Superstition'' (Oxford University Press, 2000, ISBN 978-0195136340).</ref>
  
== Individual differences ==
+
==Explanations==
Myside bias was once believed to be correlated with intelligence; however, studies have shown that myside bias can be more influenced by ability to rationally think as opposed to level of intelligence.<ref name = "stanovich" /> Myside bias can cause an inability to effectively and logically evaluate the opposite side of an argument. Studies have stated that myside bias is an absence of "active open-mindedness", meaning the active search for why an initial idea may be wrong.<ref name="Baron 1995">{{Citation|last=Baron|first=Jonathan|title=Myside bias in thinking about abortion.|journal=Thinking & Reasoning|volume=1|issue=3|year=1995|pages=221–235|doi=10.1080/13546789508256909|citeseerx=10.1.1.112.1603}}</ref> Typically, myside bias is operationalized in empirical studies as the quantity of evidence used in support of their side in comparison to the opposite side.<ref name="Wolfe 2008">{{Citation|last=Wolfe|first=Christopher|author2=Anne Britt|title=The locus of the myside bias in written argumentation|journal=Thinking & Reasoning|year=2008|volume=14|pages=1–27|url=http://think.psy.muohio.edu/home/WolfePublications/Wolfe_Locus_My%20Side%20Bias_2008.pdf|doi=10.1080/13546780701527674|s2cid=40527220}}</ref>
+
Confirmation bias was once believed to be correlated with [[intelligence]]; however, as Michael Shermer observed, "Smart people believe weird things because they are skilled at defending beliefs they arrived at for non-smart reasons."<ref name=Kida/> It appears that this bias can cause an inability to effectively and logically evaluate the opposite side of an argument. In other words, it is an absence of "active open-mindedness," meaning the active search for why an initial idea may be wrong, rather than a lack of intelligence per se.<ref name=Baron/>  
  
A study has found individual differences in myside bias. This study investigates individual differences that are acquired through learning in a cultural context and are mutable. The researcher found important individual difference in argumentation. Studies have suggested that individual differences such as deductive reasoning ability, ability to overcome belief bias, epistemological understanding, and thinking disposition are significant predictors of the reasoning and generating arguments, counterarguments, and rebuttals.<ref name="Mason 2006">{{Citation|last=Mason|first=Lucia|author2=Scirica, Fabio|title=Prediction of students' argumentation skills about controversial topics by epistemological understanding|journal=Learning and Instruction|date=October 2006|volume=16|issue=5|pages=492–509|doi=10.1016/j.learninstruc.2006.09.007}}</ref><ref name="Weinstock 2009">{{Citation|last=Weinstock|first=Michael|title=Relative expertise in an everyday reasoning task: Epistemic understanding, problem representation, and reasoning competence|journal=Learning and Individual Differences|date=2009|volume=19|issue=4|pages=423–434|doi=10.1016/j.lindif.2009.03.003}}</ref><ref name="Weinstock 2004">{{Citation|last=Weinstock|first=Michael|author2=Neuman, Yair |author3=Tabak, Iris |title=Missing the point or missing the norms? Epistemological norms as predictors of students' ability to identify fallacious arguments|journal=[[Contemporary Educational Psychology]]|date=2004|volume=29|issue=1|pages=77–94|doi=10.1016/S0361-476X(03)00024-9}}</ref>
+
How people view "what makes a good argument" can influence the way a person formulates their own arguments. In a study investigating individual differences of argumentation schema, participants were asked to write essays either for or against their preferred side of an argument. They were given research instructions that took either a balanced or an unrestricted approach. The balanced-research instructions directed participants to create a "balanced" argument, i.e., that included both pros and cons; the unrestricted-research instructions included nothing on how to create the argument.<ref name="Wolfe 2008">Christopher Wolfe and Anne Britt, "The locus of the myside bias in written argumentation" ''Thinking & Reasoning'' 14 (2008): 1–27.</ref>
  
A study by Christopher Wolfe and Anne Britt also investigated how participants' views of "what makes a good argument?" can be a source of myside bias that influences the way a person formulates their own arguments.<ref name="Wolfe 2008" /> The study investigated individual differences of argumentation schema and asked participants to write essays. The participants were randomly assigned to write essays either for or against their preferred side of an argument and were given research instructions that took either a balanced or an unrestricted approach. The balanced-research instructions directed participants to create a "balanced" argument, i.e., that included both pros and cons; the unrestricted-research instructions included nothing on how to create the argument.<ref name="Wolfe 2008" />
+
Overall, the results revealed that the balanced-research instructions significantly increased the incidence of opposing information in arguments. These data also revealed that personal belief was not a ''source'' of myside bias. This evidence is consistent with Baron's understanding—that people's opinions about what makes good thinking can influence how arguments are generated.<ref name="Wolfe 2008" />
  
Overall, the results revealed that the balanced-research instructions significantly increased the incidence of opposing information in arguments. These data also reveal that personal belief is not a ''source'' of myside bias; however, that those participants, who believe that a good argument is one that is based on facts, are more likely to exhibit myside bias than other participants. This evidence is consistent with the claims proposed in Baron's article—that people's opinions about what makes good thinking can influence how arguments are generated.<ref name="Wolfe 2008" />
+
Explanations for confirmation bias also include [[wishful thinking]] and the limited human capacity to process information. Another possibility is that people show confirmation bias because they are pragmatically assessing the costs of being wrong, rather than investigating in a neutral, scientific way.
  
== Discovery ==
+
=== Positive test strategy ===
=== Informal observations ===
+
Klayman and Ha argued that the Wason experiments do not actually demonstrate a bias towards confirmation, but instead a tendency to make tests consistent with the working hypothesis.<ref name=Klaymanha/> They called this the "positive test strategy."<ref name=Kunda/> This strategy is an example of a [[heuristic]]: a reasoning shortcut that is imperfect but easy to compute.<ref name=Plous/> Klayman and Ha used [[Bayesian probability]] and [[information theory]] as their standard of hypothesis-testing, rather than the falsificationism used by Wason. According to these ideas, each answer to a question yields a different amount of information, which depends on the person's prior beliefs. Thus a scientific test of a hypothesis is one that is expected to produce the most information. Since the information content depends on initial probabilities, a positive test can either be highly informative or uninformative. Klayman and Ha argued that when people think about realistic problems, they are looking for a specific answer with a small initial probability. In this case, positive tests are usually more informative than negative tests.<ref name=Klaymanha/> However, in Wason's rule discovery task the answer—three numbers in ascending order—is very broad, so positive tests are unlikely to yield informative answers. Klayman and Ha supported their analysis by citing an experiment that used the labels "DAX" and "MED" in place of "fits the rule" and "doesn't fit the rule." This avoided implying that the aim was to find a low-probability rule. Participants had much more success with this version of the experiment.<ref>Maria Lewicka, "Confirmation bias: Cognitive error or adaptive strategy of action control?" in Mirosław Kofta, Gifford Weary, and Grzegorz Sedek (eds.), ''Personal Control in Action: Cognitive and Motivational Mechanisms'' (Springer, 1998, ISBN 978-0306457203), 233–255.</ref>
[[File:Somer Francis Bacon.jpg|thumb|300px|[[Francis Bacon]]]]
 
Before psychological research on confirmation bias, the phenomenon had been observed throughout history. Beginning with the Greek historian [[Thucydides]] ({{circa|460&nbsp;BC}}&nbsp;– {{circa|395&nbsp;BC}}), who wrote of misguided reason in ''[[History of the Peloponnesian War|The Peloponnesian War]]''; "...&nbsp;for it is a habit of mankind to entrust to careless hope what they long for, and to use sovereign reason to thrust aside what they do not fancy".<ref>{{Thucydides|en|4|108|4|shortref}}.</ref> Italian poet [[Dante Alighieri]] (1265–1321) noted it in the ''[[Divine Comedy]]'', in which [[St. Thomas Aquinas]] cautions Dante upon meeting in Paradise, "opinion—hasty—often can incline to the wrong side, and then affection for one's own opinion binds, confines the mind".<ref>Alighieri, Dante. ''Paradiso'' canto XIII: 118–120. Trans. Allen Mandelbaum.</ref> [[Ibn Khaldun]] noticed the same effect in his ''[[Muqaddimah]]'':<ref>{{Citation |title=The Muqadimmah |author=Ibn Khaldun |publisher=[[Princeton University Press]] |location=Princeton, NJ |year=1958 |page=71}}.</ref>
 
{{blockquote|Untruth naturally afflicts historical information. There are various reasons that make this unavoidable. One of them is partisanship for opinions and schools. ... if the soul is infected with partisanship for a particular opinion or sect, it accepts without a moment's hesitation the information that is agreeable to it. Prejudice and partisanship obscure the critical faculty and preclude critical investigation. The result is that falsehoods are accepted and transmitted.}} In the ''[[Novum Organum]]'', English philosopher and scientist [[Francis Bacon]] (1561–1626)<ref name="baron195">{{Harvnb|Baron|2000|pp=195–196}}.</ref> noted that biased assessment of evidence drove "all superstitions, whether in astrology, dreams, omens, divine judgments or the like".<ref name="bacon">Bacon, Francis (1620). ''Novum Organum''. reprinted in {{Citation |title=The English philosophers from Bacon to Mill |editor-first=E.&nbsp;A. |editor-last=Burtt |publisher=[[Random House]] |location=New York |year=1939 |page=36}} via {{Harvnb|Nickerson|1998|p=176}}.</ref> He wrote:<ref name="bacon"/>
 
{{blockquote|The human understanding when it has once adopted an opinion&nbsp;...<!--"(either as being the received opinion or as being agreeable to itself)" omitted for space—> draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects or despises, or else by some distinction sets aside or rejects[.]}}
 
In the second volume of his ''[[The World as Will and Representation]]'' (1844), German philosopher [[Arthur Schopenhauer]] observed that "An adopted hypothesis gives us lynx-eyes for everything that confirms it and makes us blind to everything that contradicts it."<ref>{{Citation|last=Schopenhauer |first=Arthur |title=''The World as Will and Presentation'' |volume=2 |editor1-first=David |editor1-last=Carus |editor2-first=Richard E. |editor2-last=Aquila |location=New York |publisher=[[Routledge]] |year=2011 |orig-year=1844 |page=246}}.</ref>
 
 
 
In his essay (1897) ''[[What Is Art?]]'', Russian novelist [[Leo Tolstoy]] wrote:<ref name=":1">Tolstoy, Leo (1896). ''What Is Art?'' ch. 14 [https://www.gutenberg.org/files/43302/43302-h/43302-h.htm p. 143]. Translated from Russian by Aylmer Maude, New York, 1904. [https://www.gutenberg.org/files/64908/64908-h/64908-h.htm Project Gutenberg edition] released 23 March 2021. Retrieved 17 August 2021.</ref>
 
{{blockquote|I know that most men—not only those considered clever, but even those who are very clever, and capable of understanding most difficult scientific, mathematical, or philosophic problems—can very seldom discern even the simplest and most obvious truth if it be such as to oblige them to admit the falsity of conclusions they have formed, perhaps with much difficulty—conclusions of which they are proud, which they have taught to others, and on which they have built their lives.}} In his essay (1894) ''[[The Kingdom of God Is Within You]]'', Tolstoy had earlier written:<ref name=":2">Tolstoy, Leo (1894). ''The Kingdom of God Is Within You'' [https://www.gutenberg.org/files/43302/43302-h/43302-h.htm p. 49]. Translated from Russian by Constance Garnett, New York, 1894. [https://www.gutenberg.org/files/43302/43302-h/43302-h.htm Project Gutenberg edition] released 26 July 2013. Retrieved 17 August 2021.</ref>
 
{{Blockquote|text=The most difficult subjects can be explained to the most slow-witted man if he has not formed any idea of them already; but the simplest thing cannot be made clear to the most intelligent man if he is firmly persuaded that he knows already, without a shadow of doubt, what is laid before him.|author=|title=|source=}}
 
 
 
=== Falsification===
 
In Peter Wason's initial experiment published in 1960 (which does not mention the term "confirmation bias"), he repeatedly challenged participants to identify a rule applying to triples of numbers. They were told that (2,4,6) fits the rule. They generated triples, and the experimenter told them whether each triple conformed to the rule.<ref name="nickerson"/>{{rp|179}}
 
 
 
The actual rule was simply "any ascending sequence", but participants had great difficulty in finding it, often announcing rules that were far more specific, such as "the middle number is the average of the first and last".<ref>{{Harvnb|Wason|1960}}</ref> The participants seemed to test only positive examples—triples that obeyed their hypothesized rule. For example, if they thought the rule was, "Each number is two greater than its predecessor," they would offer a triple that fitted (confirmed) this rule, such as (11,13,15) rather than a triple that violated (falsified) it, such as (11,12,19).<ref>{{Harvnb|Lewicka|1998|page=238}}</ref>
 
  
Wason interpreted his results as showing a preference for confirmation over falsification, hence he coined the term "confirmation bias". Wason also used the term "verification bias".<ref>Poletiek, Fenna (2001), Hypothesis-testing behaviour, Hove, UK: Psychology Press, ISBN 978-1-84169-159-6</ref><ref name="oswald">{{Harvnb|Oswald|Grosjean|2004|pp=79–96}}</ref> Wason also used confirmation bias to explain the results of his [[Wason selection task|selection task]] experiment.<ref>{{Citation |last=Wason |first=Peter C. |year=1968 |title=Reasoning about a rule |journal=Quarterly Journal of Experimental Psychology |issn= 1747-0226 |volume=20 |issue=3 |pages=273–278 |doi=10.1080/14640746808400161 |pmid=5683766|s2cid=1212273 }}</ref> Participants repeatedly performed badly on various forms of this test, in most cases ignoring information that could potentially refute (falsify) the specified rule.<ref name="sutherland" /><ref>{{Citation |last1=Barkow |first1=Jerome H. |first2=Leda |last2=Cosmides |first3=John |last3=Tooby |title=The adapted mind: evolutionary psychology and the generation of culture |publisher=[[Oxford University Press]] US |year=1995 |pages=[https://archive.org/details/adaptedmindevolu0000unse/page/181 181–184] |isbn=978-0-19-510107-2 |oclc=33832963 |url=https://archive.org/details/adaptedmindevolu0000unse/page/181 }}</ref>
+
=== Information processing explanations ===
 +
There are several [[information processing]] explanations of confirmation bias.
  
=== Positive test strategy ===
+
==== Cognitive versus motivational ====
Klayman and Ha's 1987 paper argues that the Wason experiments do not actually demonstrate a bias towards confirmation, but instead a tendency to make tests consistent with the working hypothesis.<ref name="klaymanha" /><ref>{{Harvnb|Oswald|Grosjean|2004|pp=81–82, 86–87}}</ref> They called this the "positive test strategy".<ref name=kunda112 /> This strategy is an example of a [[heuristics in judgment and decision making|heuristic]]: a reasoning shortcut that is imperfect but easy to compute.<ref name=Plous/> Klayman and Ha used [[Bayesian probability]] and [[information theory]] as their standard of hypothesis-testing, rather than the falsificationism used by Wason. According to these ideas, each answer to a question yields a different amount of information, which depends on the person's prior beliefs. Thus a scientific test of a hypothesis is one that is expected to produce the most information. Since the information content depends on initial probabilities, a positive test can either be highly informative or uninformative. Klayman and Ha argued that when people think about realistic problems, they are looking for a specific answer with a small initial probability. In this case, positive tests are usually more informative than negative tests.<ref name="klaymanha">{{Citation |last1=Klayman |first1= Joshua |first2=Young-Won |last2=Ha |year=1987 |title=Confirmation, disconfirmation and information in hypothesis testing |journal=[[Psychological Review]] |volume=94 |issue=2 |pages=211–228 |issn=0033-295X |url=http://www.stats.org.uk/statistical-inference/KlaymanHa1987.pdf |access-date=14 August 2009 |doi=10.1037/0033-295X.94.2.211|citeseerx= 10.1.1.174.5232 }}</ref> However, in Wason's rule discovery task the answer—three numbers in ascending order—is very broad, so positive tests are unlikely to yield informative answers. Klayman and Ha supported their analysis by citing an experiment that used the labels "DAX" and "MED" in place of "fits the rule" and "doesn't fit the rule". This avoided implying that the aim was to find a low-probability rule. Participants had much more success with this version of the experiment.<ref>{{Harvnb|Lewicka|1998|page=239}}</ref><ref>{{Citation |last1=Tweney |first1=Ryan D. |first2=Michael E. |last2=Doherty |year=1980 |title=Strategies of rule discovery in an inference task |journal=[[The Quarterly Journal of Experimental Psychology]]|issn=1747-0226 |volume=32 |issue=1 |pages= 109–123 |doi=10.1080/00335558008248237|s2cid=143148831 }} (Experiment&nbsp;IV)</ref>
+
Explanations for biased evidence processing include cognitive and motivational mechanisms.
  
{| style="margin:auto"
+
Cognitive explanations for confirmation bias are based on limitations in people's ability to handle complex tasks, and the shortcuts, or heuristics, that they use. For example, people may judge the reliability of evidence by using the "[[availability heuristic]]" that is, how readily a particular idea comes to mind.<ref name=Kunda/> It is also possible that people can only focus on one thought at a time, so find it difficult to test alternative hypotheses in parallel.<ref name=Nickerson/> Another heuristic is the positive test strategy identified by Klayman and Ha, in which people test a hypothesis by examining cases where they expect a property or event to occur. This heuristic avoids the difficult or impossible task of working out how diagnostic each possible question will be. However, it is not universally reliable, so people can overlook challenges to their existing beliefs.<ref name=Klaymanha/><ref name=Nickerson/>
|-valign="top"
 
| [[Image:Klayman Ha1.svg|thumb|alt=Within the universe of all possible triples, those that fit the true rule are shown schematically as a circle. The hypothesized rule is a smaller circle enclosed within it. |If the true rule (T) encompasses the current hypothesis (H), then positive tests (examining an H to see if it is T) will not show that the hypothesis is false.]]
 
| [[Image:Klayman Ha2.svg|thumb|alt=Two overlapping circles represent the true rule and the hypothesized rule. Any observation falling in the non-overlapping parts of the circles shows that the two rules are not exactly the same. In other words, those observations falsify the hypothesis.|If the true rule (T) ''overlaps'' the current hypothesis (H), then either a negative test or a positive test can potentially falsify H.]]
 
| [[Image:Klayman ha3 annotations.svg|thumb|alt=The triples fitting the hypothesis are represented as a circle within the universe of all triples. The true rule is a smaller circle within this.|When the working hypothesis (H) includes the true rule (T) then positive tests are the ''only'' way to falsify H.]]
 
|}
 
  
In light of this and other critiques, the focus of research moved away from confirmation versus falsification of an hypothesis, to examining whether people test hypotheses in an informative way, or an uninformative but positive way. The search for "true" confirmation bias led psychologists to look at a wider range of effects in how people process information.<ref>{{Harvnb|Oswald|Grosjean|2004|pp=86–89}}</ref>
+
Motivational explanations involve an effect of [[desire (emotion)|desire]] on [[belief]].<ref name=Nickerson/><ref name=Baron/> People prefer positive thoughts over negative ones, the so-called the "[[Pollyanna principle]]."<ref>Margaret W. Matilin, "Pollyanna Principle" in Rüdiger F. Pohl (ed.), ''Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory'' (Psychology Press, 2012, ISBN 978-0415646758).</ref> Applied to [[argument]]s or sources of [[evidence]], this could explain why desired conclusions are more likely to be believed true. Although [[consistency]] is a desirable feature of [[attitude]]s, an excessive drive for consistency is another potential source of bias because it may prevent people from neutrally evaluating new, surprising information.  
  
== Information processing explanations ==
+
Social psychologist [[Ziva Kunda]] combined the cognitive and motivational theories, arguing that motivation creates the bias, but cognitive factors determine the size of the effect.<ref name=Nickerson/>
There are currently three main [[information processing]] explanations of confirmation bias, plus a recent addition.
 
  
=== Cognitive versus motivational ===
+
==== Cost-benefit ====
According to [[Robert MacCoun]], most biased evidence processing occurs through a combination of "cold" (cognitive) and "hot" (motivated) mechanisms.<ref>{{Harvnb|MacCoun|1998}}</ref>
+
Explanations in terms of [[cost-benefit analysis]] assume that people do not just test hypotheses in a disinterested way, but assess the costs of different errors.<ref>Margit E. Oswald and Stefan Grosjean "Confirmation bias" in Rüdiger F. Pohl (ed.), ''Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory'' (Psychology Press, 2012, ISBN 978-0415646758).</ref> [[Yaacov Trope]] and Akiva Liberman suggested that people compare the two different kinds of error: accepting a false hypothesis or rejecting a true hypothesis when seeking evidence. For instance, someone who underestimates a friend's honesty might treat him or her suspiciously and so undermine the friendship. Overestimating the friend's honesty may also be costly, but less so. In this case, it would be rational to seek, evaluate, or remember evidence of their honesty in a biased way.<ref>Y. Trope and A. Liberman, "Social hypothesis testing: Cognitive and motivational mechanisms in E. Tory Higgins and Arie W. Kruglanski (eds.), ''Social Psychology: Handbook of Basic Principles'' (The Guilford Press, 1996, ISBN 978-1572301009).</ref>  
  
Cognitive explanations for confirmation bias are based on limitations in people's ability to handle complex tasks, and the shortcuts, called ''[[heuristics in judgment and decision making|heuristics]]'', that they use.<ref>{{Harvnb|Friedrich|1993|p=298}}</ref> For example, people may judge the reliability of evidence by using the ''[[availability heuristic]]'' that is, how readily a particular idea comes to mind.<ref>{{Harvnb|Kunda|1999|p=94}}</ref> It is also possible that people can only focus on one thought at a time, so find it difficult to test alternative hypotheses in parallel.<ref name ="nickerson"/>{{rp|198–199}} Another heuristic is the positive test strategy identified by Klayman and Ha, in which people test a hypothesis by examining cases where they expect a property or event to occur. This heuristic avoids the difficult or impossible task of working out how diagnostic each possible question will be. However, it is not universally reliable, so people can overlook challenges to their existing beliefs.<ref name="klaymanha" /><ref name ="nickerson"/>{{rp|200}}
+
In this way, confirmation bias an be viewed as a social skill.<ref name=Dardenne>Benoit Dardenne and Jacques-Philippe Leyens, [https://orbi.uliege.be/bitstream/2268/28639/1/dardenne%26leyens_pspb_95.pdf Confirmation bias as a social skill] ''Personality and Social Psychology Bulletin'' 21(11) (1995): 1229–1239. Retrieved May 8, 2023.</ref> For example, when someone gives an initial impression of being introverted or extroverted, questions that match that impression come across as more [[empathic]]. This suggests that when talking to someone who seems to be an introvert, it is a sign of better [[social skills]] to ask, "Do you feel awkward in social situations?" rather than, "Do you like noisy parties?" The connection between confirmation bias and social skills was corroborated by a study of how college students get to know other people. Highly [[self-monitoring]] students, who are more sensitive to their environment and to [[social norm]]s, asked more matching questions when interviewing a high-status staff member than when getting to know fellow students.<ref name=Dardenne/>
  
Motivational explanations involve an effect of [[desire (emotion)|desire]] on [[belief]].<ref name ="nickerson"/>{{rp|197}}<ref>{{Harvnb|Baron|2000|p=206}}</ref> It is known that people prefer positive thoughts over negative ones in a number of ways: this is called the "[[Pollyanna principle]]".<ref>{{Citation |last=Matlin |first=Margaret W. |title=Cognitive illusions: A handbook on fallacies and biases in thinking, judgement and memory |editor-first=Rüdiger F. |editor-last=Pohl |publisher=[[Psychology Press]] |location=Hove, UK |year=2004 |pages=[https://archive.org/details/cognitiveillusio0000unse/page/255 255–272] |chapter=Pollyanna Principle |isbn=978-1-84169-351-4 |oclc=55124398 |chapter-url=https://archive.org/details/cognitiveillusio0000unse/page/255}}</ref> Applied to [[argument]]s or sources of [[evidence]], this could explain why desired conclusions are more likely to be believed true. According to experiments that manipulate the desirability of the conclusion, people demand a high standard of evidence for unpalatable ideas and a low standard for preferred ideas. In other words, they ask, "Can I believe this?" for some suggestions and, "Must I believe this?" for others.<ref>{{Citation|last1=Dawson |first1=Erica |first2=Thomas |last2=Gilovich |first3=Dennis T. |last3=Regan |date=October 2002 |title=Motivated reasoning and performance on the Wason Selection Task |journal=Personality and Social Psychology Bulletin |volume=28 |issue=10 |pages=1379–1387 |doi=10.1177/014616702236869|s2cid=143957893 }}</ref><ref>{{Citation |last1=Ditto |first1=Peter H. |first2= David F. |last2=Lopez |year=1992 |title=Motivated skepticism: Use of differential decision criteria for preferred and nonpreferred conclusions |journal=[[Journal of Personality and Social Psychology]] |volume=63 |issue=4 |pages=568–584 |issn=0022-3514 |doi=10.1037/0022-3514.63.4.568}}</ref> Although [[consistency]] is a desirable feature of attitudes, an excessive drive for consistency is another potential source of bias because it may prevent people from neutrally evaluating new, surprising information. Social psychologist [[Ziva Kunda]] combines the cognitive and motivational theories, arguing that motivation creates the bias, but cognitive factors determine the size of the effect.<ref name ="nickerson"/>{{rp|198}}
+
==== Exploratory versus confirmatory ====
 +
Psychologists [[Jennifer Lerner]] and [[Philip Tetlock]] distinguished two different kinds of thinking process. "[[Exploratory thought]]" neutrally considers multiple points of view and tries to anticipate all possible objections to a particular position, while "confirmatory thought" seeks to justify a specific point of view, namely a confirmation bias. They suggest that when people expect to justify their position to others whose views they already know, they will tend to adopt a similar position to those people, and then use confirmatory thought to bolster their own credibility. However, if the external parties are overly aggressive or critical, people will disengage from thought altogether, and simply assert their personal opinions without justification. Lerner and Tetlock argued that people only push themselves to think critically and logically when they know in advance they will need to explain themselves to others who are well-informed, genuinely interested in the truth, and whose views they do not already know. Because those conditions rarely exist, most people use confirmatory thought most of the time.<ref>Jonathan Haidt, ''The Righteous Mind: Why good people are divided by politics and religion'' (Penguin Books Ltd, 2013, ISBN 978-0141039169).</ref>
  
=== Cost-benefit ===
+
==== Make-believe ====
Explanations in terms of [[cost-benefit analysis]] assume that people do not just test hypotheses in a disinterested way, but assess the costs of different errors.<ref>{{Harvnb|Oswald|Grosjean|2004|pp=91–93}}</ref> Using ideas from [[evolutionary psychology]], James Friedrich suggests that people do not primarily aim at [[truth]] in testing hypotheses, but try to avoid the most costly errors. For example, employers might ask one-sided questions in job interviews because they are focused on weeding out unsuitable candidates.<ref>{{Harvnb|Friedrich|1993|pp=299, 316–317}}</ref> [[Yaacov Trope]] and Akiva Liberman's refinement of this theory assumes that people compare the two different kinds of error: accepting a false hypothesis or rejecting a true hypothesis. For instance, someone who underestimates a friend's honesty might treat him or her suspiciously and so undermine the friendship. Overestimating the friend's honesty may also be costly, but less so. In this case, it would be rational to seek, evaluate or remember evidence of their honesty in a biased way.<ref>{{Citation |last1=Trope |first1=Y. |first2=A. |last2=Liberman |title=Social psychology: Handbook of basic principles |editor1-first=E. Tory |editor1-last=Higgins |editor2-first=Arie W. |editor2-last=Kruglanski |publisher=Guilford Press |location=New York |year=1996 |chapter=Social hypothesis testing: Cognitive and motivational mechanisms |isbn=978-1-57230-100-9 |oclc=34731629}} via {{Harvnb |Oswald|Grosjean|2004|pp=91–93}}</ref> When someone gives an initial impression of being introverted or extroverted, questions that match that impression come across as more [[empathic]].<ref name=dardenne/> This suggests that when talking to someone who seems to be an introvert, it is a sign of better [[social skills]] to ask, "Do you feel awkward in social situations?" rather than, "Do you like noisy parties?" The connection between confirmation bias and social skills was corroborated by a study of how college students get to know other people. Highly [[self-monitoring]] students, who are more sensitive to their environment and to [[social norms]], asked more matching questions when interviewing a high-status staff member than when getting to know fellow students.<ref name="dardenne">{{Citation |last1=Dardenne |first1=Benoit |first2=Jacques-Philippe |last2=Leyens |title=Confirmation bias as a social skill |journal=[[Personality and Social Psychology Bulletin]] |year=1995 |volume=21 |issue=11 |pages=1229–1239 |doi=10.1177/01461672952111011 |s2cid=146709087 |issn=1552-7433|url=https://orbi.uliege.be/bitstream/2268/28639/1/dardenne%26leyens_pspb_95.pdf }}</ref>
+
[[Developmental psychology|Developmental psychologist]] Eve Whitmore has argued that beliefs and biases involved in confirmation bias have their roots in childhood coping through make-believe:  
 +
<blockquote>From the beginning, parents reinforce to their children the skill of pretending in order to cope with the realities inherent in culture and society. Children’s learning about make-believe and mastery of it becomes the basis for more complex forms of self-deception and illusion into adulthood.<ref name =APA>[https://www.apa.org/news/press/releases/2018/08/fake-news Why we're susceptible to fake news – and how to defend against it] ''American Psychological Association'', August 10, 2018. Retrieved May 8, 2023.</ref></blockquote>  
  
=== Exploratory versus confirmatory ===
+
The friction brought on by questioning as an adolescent with developing critical thinking can lead to the rationalization of false beliefs, and the habit of such rationalization can become unconscious over the years.
Psychologists [[Jennifer Lerner]] and [[Philip Tetlock]] distinguish two different kinds of thinking process. ''[[Exploratory thought]]'' neutrally considers multiple points of view and tries to anticipate all possible objections to a particular position, while ''confirmatory thought'' seeks to justify a specific point of view. Lerner and Tetlock say that when people expect to justify their position to others whose views they already know, they will tend to adopt a similar position to those people, and then use confirmatory thought to bolster their own credibility. However, if the external parties are overly aggressive or critical, people will disengage from thought altogether, and simply assert their personal opinions without justification. Lerner and Tetlock say that people only push themselves to think critically and logically when they know in advance they will need to explain themselves to others who are well-informed, genuinely interested in the truth, and whose views they do not already know. Because those conditions rarely exist, they argue, most people are using confirmatory thought most of the time.<ref>{{Citation|editor= Sandra L. Schneider|title=Emerging perspectives on judgment and decision research|year=2003|publisher=[[Cambridge University Press]]|location=Cambridge [u.&nbsp;a.]|isbn=978-0-521-52718-7|page=445|author=Shanteau, James}}</ref><ref>{{Citation|last=Haidt|first=Jonathan|title=The righteous mind: Why good people are divided by politics and religion|year=2013|publisher=Penguin Books|location=London|isbn=978-0-14-103916-9|pages=87–88}}</ref><ref>{{Citation|editor1-first=Susan T.|editor1-last=Fiske|editor2-first=Daniel T.|editor2-last=Gilbert|editor3-first=Gardner|editor3-last=Lindzey|title=The handbook of social psychology|year=2010|publisher=[[Wiley (publisher)|Wiley]]|location=Hoboken, NJ|isbn=978-0-470-13749-9|page=[https://archive.org/details/handbookofsocial5ed2unse/page/811 811]|edition=5th|url=https://archive.org/details/handbookofsocial5ed2unse/page/811}}</ref>
 
 
 
=== Make-believe ===
 
Developmental psychologist Eve Whitmore has argued that beliefs and biases involved in confirmation bias have their roots in childhood coping through make-believe, which becomes "the basis for more complex forms of self-deception and illusion into adulthood." The friction brought on by questioning as an adolescent with developing critical thinking can lead to the rationalization of false beliefs, and the habit of such rationalization can become unconscious over the years.<ref name = "apa">{{cite journal |author1=American Psychological Association |title=Why we're susceptible to fake news – and how to defend against it |journal=[[Skeptical Inquirer]] |date=2018 |volume=42 |issue=6 |pages=8–9 |mode=cs2}}</ref>
 
  
 
== Real-world effects ==
 
== Real-world effects ==
=== Social media ===
+
There are numerous real life situations in which confirmation bias affects people's decision-making. A striking illustration of confirmation bias in the real world is numerological [[pyramidology]]: the practice of finding meaning in the proportions of the Egyptian pyramids. There are many different length measurements that can be made of, for example, the [[Great Pyramid of Giza]] and many ways to combine or manipulate them. Hence it is almost inevitable that people who look at these numbers selectively will find superficially impressive correspondences, for example with the dimensions of the Earth.<ref name=Nickerson/>  
In [[social media]], confirmation bias is amplified by the use of [[filter bubble]]s, or "algorithmic editing", which displays to individuals only information they are likely to agree with, while excluding opposing views.<ref name=":0">{{Citation|url=https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles|title=Ted talk: Beware online "filter bubbles"|last=Pariser|first=Eli|date=2 May 2011|website=TED: Ideas Worth Spreading|access-date=1 October 2017}}</ref> Some have argued that confirmation bias is the reason why society can never escape from filter bubbles, because individuals are psychologically hardwired to seek information that agrees with their preexisting values and beliefs.<ref>{{Citation|url=https://www.newstatesman.com/science-tech/social-media/2016/11/forget-fake-news-facebook-real-filter-bubble-you|title=Forget fake news on Facebook – the real filter bubble is you|last=Self|first=Will|date=28 November 2016|website=NewStatesman|access-date=24 October 2017}}</ref> Others have further argued that the mixture of the two is degrading [[democracy]]—claiming that this "algorithmic editing" removes diverse viewpoints and information—and that unless filter bubble algorithms are removed, voters will be unable to make fully informed political decisions.<ref>{{Citation|url=https://www.wired.com/2015/05/did-facebooks-big-study-kill-my-filter-bubble-thesis/|title=Did Facebook's big study kill my filter bubble thesis?|last=Pariser|first=Eli|date=7 May 2015|magazine=Wired|access-date=24 October 2017}}</ref><ref name=":0" />
 
  
The rise of social media has contributed greatly to the rapid spread of [[fake news]], that is, false and misleading information that is presented as credible news from a seemingly reliable source. Confirmation bias (selecting or reinterpreting evidence to support one's beliefs) is one of three main hurdles cited as to why critical thinking goes astray in these circumstances. The other two are shortcut heuristics (when overwhelmed or short of time, people rely on simple rules such as group consensus or trusting an expert or role model) and social goals (social motivation or peer pressure can interfere with objective analysis of facts at hand).<ref>{{Citation |last1=Kendrick |first1=Douglas T. |first2=Adam B. |last2=Cohen |first3= Steven L. |last3=Neuberg | first4= Robert B. |last4=Cialdini |title=The science of anti-science thinking |journal=Scientific American |year=2020 |volume=29 |issue=4, Fall, Special Issue |pages=84–89}}</ref>
+
Confirmation bias is not only widespread, but can lead to unfortunate consequences:
 +
<blockquote>If one were to attempt to identify a single problematic aspect of human reasoning that deserves attention above all others, the confirmation bias would have to be among the candidates for consideration. Many have written about this bias, and it appears to be sufficiently strong and pervasive that one is led to wonder whether the bias, by itself, might account for a significant fraction of the disputes, altercations, and misunderstandings that occur among individuals, groups, and nations.<ref>Raymond S. Nickerson, [https://pages.ucsd.edu/~mckenzie/nickersonConfirmationBias.pdf Confirmation Bias: A Ubiquitous Phenomenon in Many Guises] ''Review of General Psychology'' 2(2) (1998): 175-220. Retrieved May 10, 2023.</ref></blockquote>
  
In combating the spread of fake news, social media sites have considered turning toward "digital nudging".<ref>{{Cite journal|last1=Weinmann|first1=Markus|last2=Schneider|first2=Christoph|last3=vom Brocke|first3=Jan|date=2015|title=Digital nudging|journal=SSRN|location=Rochester, NY|ssrn=2708250|doi=10.2139/ssrn.2708250|s2cid=219380211|mode=cs2}}</ref> This can currently be done in two different forms of nudging. This includes nudging of information and nudging of presentation. Nudging of information entails social media sites providing a disclaimer or label questioning or warning users of the validity of the source while nudging of presentation includes exposing users to new information which they may not have sought out but could introduce them to viewpoints that may combat their own confirmation biases.<ref>{{Cite journal|last1=Thornhill|first1=Calum|last2=Meeus|first2=Quentin|last3=Peperkamp|first3=Jeroen|last4=Berendt|first4=Bettina|date=2019|title=A digital nudge to counter confirmation bias|journal=Frontiers in Big Data|volume=2|page=11|doi=10.3389/fdata.2019.00011|pmid=33693334|pmc=7931917|issn=2624-909X|doi-access=free|mode=cs2}}</ref>
+
Attempts have been made to discover ways to overcome, or at least attenuate, the effects of confirmation bias in some common situation.
  
=== Science and scientific research ===
+
=== Conflict and law ===
{{See also|Planck's principle|Replication crisis}}
+
[[File:Witness impeachment.jpg|thumb|right|400px|[[Mock trial]]s allow researchers to examine confirmation biases in a realistic setting.]]
 +
Confirmation bias can be a factor in creating or extending conflicts, from emotionally charged debates to [[war]]s: by interpreting the evidence in their favor, each opposing party can become overconfident that it is in the stronger position.<ref name=Baron/> On the other hand, confirmation bias can result in people ignoring or misinterpreting the signs of an imminent or incipient conflict. For example, psychologists [[Stuart Sutherland]] and Thomas Kida have each argued that [[U.S. Navy]] Admiral [[Husband E. Kimmel]] showed confirmation bias when playing down the first signs of the Japanese [[attack on Pearl Harbor]].<ref name=Sutherland/><ref name=Kida/> In [[police]] investigations, a detective may identify a suspect early in an investigation, but then sometimes largely seek supporting or confirming evidence, ignoring or downplaying falsifying evidence.
  
A distinguishing feature of [[science|scientific thinking]] is the search for confirming or supportive evidence ([[inductive reasoning]]) as well as falsifying evidence ([[deductive reasoning]]). Inductive research in particular can have a serious problem with confirmation bias.<ref>{{Citation|journal=Cognitive Therapy and Research|volume=1|issue=3|pages=229–238|title=Psychology of the scientist: An analysis of problem-solving bias|first1=Michael J.|last1=Mahoney |first2=B.G.|last2=DeMonbreun |year=1977|doi=10.1007/BF01186796|s2cid=9703186}}</ref><ref>{{Citation|title=Norms and counter-norms in a select group of the Apollo moon scientists: A case study of the ambivalence of scientists|first=I. I.|last=Mitroff|journal=American Sociological Review|year=1974|volume=39|issue=4|pages=579–395|jstor=2094423|doi=10.2307/2094423}}
+
Reasoning in judicial and political contexts is sometimes subconsciously biased, favoring conclusions that judges, [[jury|juries]], or governments have already committed to.<ref name=Nickerson/> Since the evidence in a jury [[trial]] can be complex, and jurors often reach decisions about the verdict early on, it is reasonable to expect an attitude polarization effect. The prediction that jurors will become more extreme in their views as they see more evidence has been borne out in experiments with [[mock trial]]s.<ref> Diane F. Halpern, ''Critical Thinking Across the Curriculum: A Brief Edition of Thought and Knowledge'' (Routledge, 1997, ISBN 978-0805827316).</ref>
</ref>
 
  
Many times in the [[history of science]], scientists have resisted new discoveries by selectively interpreting or ignoring unfavorable data.<ref name ="nickerson"/>{{rp|192–194}} Several studies have shown that scientists rate studies that report findings consistent with their prior beliefs more favorably than studies reporting findings inconsistent with their previous beliefs.<ref name="Koehler 1993">{{Harvnb|Koehler|1993}}</ref><ref name="Mahoney 1977">{{Harvnb|Mahoney|1977}}</ref>
+
=== Finance ===
 +
Confirmation bias can lead investors to be overconfident, ignoring evidence that their strategies will lose money.<ref name=WSJ /><ref>Michael M. Pompian, ''Behavioral Finance and Wealth Management: How to Build Optimal Portfolios That Account for Investor Biases'' (John Wiley and Sons, 2006, ISBN 0471745170).</ref> To combat the effect of confirmation bias, investors can try to adopt a contrary viewpoint "for the sake of argument."<ref>David Krueger and John David Mann, ''The Secret Language of Money: How to Make Smarter Financial Decisions and Live a Richer Life'' (McGraw Hill, 2009, ISBN 978-0071623391).</ref> In one technique, they imagine that their investments have collapsed and ask themselves why this might happen.<ref name=WSJ />
  
However, assuming that the research question is relevant, the experimental design adequate and the data are clearly and comprehensively described, the empirical data obtained should be important to the scientific community and should not be viewed prejudicially, regardless of whether they conform to current theoretical predictions.<ref name="Mahoney 1977"/> In practice, researchers may misunderstand, misinterpret, or not read at all studies that contradict their preconceptions, or wrongly cite them anyway as if they actually supported their claims.<ref name="Letrud & Hernes 2019">{{cite journal|last1=Letrud|first1=Kåre|last2=Hernes|first2=Sigbjørn|title=Affirmative citation bias in scientific myth debunking: A three-in-one case study|journal=PLOS ONE|volume=14|issue=9|year=2019|pages=e0222213|doi=10.1371/journal.pone.0222213|pmid=31498834|pmc=6733478|bibcode=2019PLoSO..1422213L|doi-access=free|mode=cs2}}</ref>
+
=== Mass delusions ===
 
+
Confirmation bias can play a key role in the propagation of [[mass delusion]]s. [[Witch trial]]s are frequently cited as an example.<ref>Hugh R. Trevor-Roper, ''The European witch-craze of the Sixteenth and Seventeenth Centuries'' (Penguin Books, 1991, ISBN 978-0140137187).</ref>
Further, confirmation biases can sustain scientific theories or research programs in the face of inadequate or even contradictory evidence.<ref name="sutherland">{{Citation |last=Sutherland |first=Stuart |title=Irrationality |edition=2nd |publisher= Pinter and Martin |location=London |year=2007 |pages=95–103 |isbn=978-1-905177-07-3 |oclc=72151566}}</ref><ref>{{cite web | url =http://nautil.us/issue/24/error/the-trouble-with-scientists | title =The trouble with scientists: How one psychologist is tackling human biases in science | last =Ball | first =Phillip | date =14 May 2015 | website =Nautilus | access-date =6 October 2019 | mode =cs2 | archive-date =7 October 2019 | archive-url =https://web.archive.org/web/20191007124310/http://nautil.us/issue/24/error/the-trouble-with-scientists | url-status =dead }}</ref> The discipline of [[parapsychology]] is often cited as an example in the context of whether it is a [[protoscience]] or a pseudoscience.<ref>{{Citation |last=Sternberg |first=Robert J. |editor1-first=Robert J. |editor1-last=Sternberg |editor2-first=Henry L. |editor2-last=Roediger III |editor3-first=Diane F. |editor3-last=Halpern |title=Critical thinking in psychology |year=2007 |publisher=Cambridge University Press |isbn=978-0-521-60834-3 |oclc=69423179 |page=292 |chapter=Critical thinking in psychology: It really is critical |quote=Some of the worst examples of confirmation bias are in research on parapsychology ... Arguably, there is a whole field here with no powerful confirming data at all. But people want to believe, and so they find ways to believe.}}</ref>
 
 
 
An experimenter's confirmation bias can potentially affect which data are reported. Data that conflict with the experimenter's expectations may be more readily discarded as unreliable, producing the so-called [[publication bias|file drawer effect]]. To combat this tendency, scientific training teaches ways to prevent bias.<ref name="shadish">{{Citation |last=Shadish |first=William R. |title= Critical Thinking in Psychology |editor1-first=Robert J. |editor1-last=Sternberg |editor2-first=Henry L. |editor2-last=Roediger III |editor3-first=Diane F. |editor3-last=Halpern |publisher=Cambridge University Press |year=2007 |page=49 |chapter=Critical thinking in quasi-experimentation |isbn=978-0-521-60834-3}}</ref> For example, [[Design of experiments|experimental design]] of [[randomized controlled trial]]s (coupled with their [[systematic review]]) aims to minimize sources of bias.<ref name="shadish" /><ref>{{Citation | doi = 10.1136/bmj.323.7303.42 | last1 = Jüni | first1 = P.| last2 = Altman | first2 = D.G.| last3 = Egger | first3 = M.| title = Systematic reviews in health care: Assessing the quality of controlled clinical trials | journal = BMJ (Clinical Research Ed.)| volume = 323| issue = 7303| pages = 42–46| year = 2001| pmid = 11440947 | pmc = 1120670}}</ref>
 
 
 
The social process of [[peer review]] aims to mitigate the effect of individual scientists' biases, even though the peer review process itself may be susceptible to such biases<ref>{{Citation|last1=Lee|first1=C.J.|last2=Sugimoto|first2=C.R.|author2-link=Cassidy Sugimoto|last3=Zhang|first3=G.|last4=Cronin|first4=B.|year=2013|title=Bias in peer review|journal=Journal of the Association for Information Science and Technology|volume=64|pages=2–17|doi=10.1002/asi.22784}}</ref><ref>{{Citation |last=Shermer |first=Michael |author-link=Michael Shermer|date=July 2006 |title=The political brain: A recent brain-imaging study shows that our political predilections are a product of unconscious confirmation bias |journal=[[Scientific American]] |volume=295 |issue=1 |pages=36 |issn=0036-8733 |bibcode=2006SciAm.295a..36S |doi=10.1038/scientificamerican0706-36 |pmid=16830675 }}</ref><ref name="Mahoney 1977" /><ref>{{Citation | last1 = Emerson | first1 = G.B.| last2 = Warme | first2 = W.J.| last3 = Wolf | first3 = F.M.| last4 = Heckman | first4 = J.D.| last5 = Brand | first5 = R.A.| last6 = Leopold | first6 = S.S.| doi = 10.1001/archinternmed.2010.406 | title = Testing for the presence of positive-outcome bias in peer review: A randomized controlled trial | journal = [[Archives of Internal Medicine]] | volume = 170 | issue = 21| pages = 1934–1339 | year = 2010 | pmid = 21098355 | doi-access = free }}</ref><ref name="Bartlett 2011">Bartlett, Steven James, "The psychology of abuse in publishing: Peer review and editorial bias," Chap. 7, pp. 147–177, in [[Steven James Bartlett]], ''Normality does not equal mental health: The need to look elsewhere for standards of good psychological health''. Santa Barbara, CA: Praeger, 2011.</ref> Confirmation bias may thus be especially harmful to objective evaluations regarding nonconforming results since biased individuals may regard opposing evidence to be weak in principle and give little serious thought to revising their beliefs.<ref name="Koehler 1993" /> Scientific innovators often meet with resistance from the scientific community, and research presenting controversial results frequently receives harsh peer review.<ref>{{Citation |doi=10.1001/jama.263.10.1438 |last=Horrobin |first=David F. |year=1990 |title=The philosophical basis of peer review and the suppression of innovation |journal=[[Journal of the American Medical Association]] |pmid=2304222 |volume=263 |issue=10 |pages=1438–1441 }}</ref>
 
 
 
=== Finance ===
 
Confirmation bias can lead investors to be overconfident, ignoring evidence that their strategies will lose money.<ref name=WSJ /><ref>{{Citation |title=Behavioral finance and wealth management: how to build optimal portfolios that account for investor biases |first=Michael M. |last=Pompian |publisher=[[John Wiley and Sons]] |isbn=978-0-471-74517-4 |oclc=61864118 |year=2006 |pages=187–190}}</ref> In studies of [[election stock market|political stock markets]], investors made more profit when they resisted bias. For example, participants who interpreted a candidate's debate performance in a neutral rather than partisan way were more likely to profit.<ref>{{Citation |last=Hilton |first=Denis J. |journal=Journal of Behavioral Finance |year=2001 |title=The psychology of financial decision-making: Applications to trading, dealing, and investment analysis |volume=2 |issue=1 |doi=10.1207/S15327760JPFM0201_4 |issn=1542-7579 |pages=37–39|s2cid=153379653 }}</ref> To combat the effect of confirmation bias, investors can try to adopt a contrary viewpoint "for the sake of argument".<ref>{{Citation |first1=David |last1= Krueger |first2=John David |last2=Mann |title=The secret language of money: How to make smarter financial decisions and live a richer life |isbn=978-0-07-162339-1 |oclc=277205993 |year=2009 |publisher=[[McGraw Hill Professional]] |pages=112–113}}</ref> In one technique, they imagine that their investments have collapsed and ask themselves why this might happen.<ref name=WSJ />
 
  
 
=== Medicine and health ===
 
=== Medicine and health ===
Cognitive biases are important variables in clinical decision-making by medical general practitioners (GPs) and medical specialists. Two important ones are confirmation bias and the overlapping availability bias. A GP may make a diagnosis early on during an examination, and then seek confirming evidence rather than falsifying evidence. This cognitive error is partly caused by the availability of evidence about the supposed disorder being diagnosed. For example, the client may have mentioned the disorder, or the GP may have recently read a much-discussed paper about the disorder. The basis of this cognitive shortcut or heuristic (termed anchoring) is that the doctor does not consider multiple possibilities based on evidence, but prematurely latches on (or anchors to) a single cause.<ref>{{Citation | last=Groopman | first= Jerome | title=How doctor's think | publisher=Melbourne: Scribe Publications | year=2007 | pages=64–66 | isbn=978-1-921215-69-8}}</ref> In emergency medicine, because of time pressure, there is a high density of decision-making, and shortcuts are frequently applied. The potential failure rate of these cognitive decisions needs to be managed by education about the 30 or more cognitive biases that can occur, so as to set in place proper debiasing strategies.<ref name="croskerry">{{Citation | last1=Croskerry | first1=Pat | title=Achieving quality in clinical decision making: Cognitive strategies and detection of bias | journal= Academic Emergency Medicine | date=2002 | volume=9 | issue=11 | pages=1184–1204 | doi=10.1197/aemj.9.11.1184 | pmid=12414468 }}.</ref> Confirmation bias may also cause doctors to perform unnecessary medical procedures due to pressure from adamant patients.<ref name="hospitalbias">{{Citation|last1=Pang |first1=Dominic|last2=Bleetman|first2=Anthony|last3=Bleetman| first3=David|last4=Wynne |first4=Max|title=The foreign body that never was: the effects of confirmation bias|journal=British Journal of Hospital Medicine|date=2 June 2017|volume=78|issue=6|pages=350–351|doi=10.12968/hmed.2017.78.6.350|pmid=28614014}}</ref>
+
Confirmation bias has significant impact on clinical decision-making by medical general practitioners (GPs) and medical specialists. A GP may make a diagnosis early on during an examination, and then seek confirming evidence rather than falsifying evidence. In emergency medicine, because of time pressure, there is a high density of decision-making, and shortcuts are frequently applied.  
  
Raymond Nickerson, a psychologist, blames confirmation bias for the ineffective medical procedures that were used for centuries before the [[History of medicine|arrival of scientific medicine]].<ref name ="nickerson"/>{{rp|192}} If a patient recovered, medical authorities counted the treatment as successful, rather than looking for alternative explanations such as that the disease had run its natural course. Biased assimilation is a factor in the modern appeal of [[alternative medicine]], whose proponents are swayed by positive [[anecdotal evidence]] but treat [[scientific evidence]] hyper-critically.<ref>{{Harvnb|Goldacre|2008|p=233}}</ref><ref>{{Citation |last1=Singh |first1=Simon | author-link=Simon Singh |first2=Edzard |last2=Ernst | author-link2=Edzard Ernst |title=Trick or treatment?: Alternative medicine on trial |publisher= Bantam |location=London |year=2008 |isbn=978-0-593-06129-9 |pages=287–288}}</ref><ref>{{Citation |last=Atwood |first=Kimball |year=2004 |title=Naturopathy, pseudoscience, and medicine: Myths and fallacies vs truth |journal=[[Medscape General Medicine]] |volume=6 |issue=1 |page=33|pmc=1140750 |pmid=15208545 }}</ref>
+
Raymond Nickerson, a psychologist, blames confirmation bias for the ineffective medical procedures that were used for centuries before the [[History of medicine|arrival of scientific medicine]].<ref name=Nickerson/> If a patient recovered, medical authorities counted the treatment as successful, rather than looking for alternative explanations such as that the disease had run its natural course. Biased assimilation is a factor in the modern appeal of [[alternative medicine]], whose proponents are swayed by positive [[anecdotal evidence]] but treat [[scientific evidence]] hyper-critically.<ref> Ben Goldacre, ''Bad Science'' (Fourth Estate, 2008, ISBN 978-0007240197).</ref>
  
[[Cognitive therapy]] was developed by [[Aaron T. Beck]] in the early 1960s and has become a popular approach.<ref>{{Citation |first1=Michael |last1=Neenan |first2=Windy |last2=Dryden |year=2004 |title=Cognitive therapy: 100 key points and techniques |publisher=Psychology Press |isbn=978-1-58391-858-6 |oclc=474568621 |page=ix}}</ref> According to Beck, biased information processing is a factor in [[depression (mood)|depression]].<ref>{{Citation |first1=Ivy-Marie |last1=Blackburn |first2=Kate M. |last2=Davidson |year=1995 |title=Cognitive therapy for depression & anxiety: a practitioner's guide |publisher=Wiley-Blackwell |isbn=978-0-632-03986-9 |oclc=32699443 |edition=2 |page=19}}</ref> His approach teaches people to treat evidence impartially, rather than selectively reinforcing negative outlooks.<ref name="baron195" /> [[Phobias]] and [[hypochondria]] have also been shown to involve confirmation bias for threatening information.<ref>{{Citation |first1=Allison G. |last1=Harvey |first2=Edward |last2=Watkins |first3= Warren |last3=Mansell |year=2004 |title=Cognitive behavioural processes across psychological disorders: a transdiagnostic approach to research and treatment |publisher=Oxford University Press |isbn=978-0-19-852888-3 |oclc=602015097 |pages=172–173, 176}}</ref>
+
=== Paranormal beliefs ===
 +
One factor in the appeal of alleged [[psychic]] readings is that listeners apply a confirmation bias which fits the psychic's statements to their own lives. By making a large number of ambiguous statements in each sitting, the psychic gives the client more opportunities to find a match. This is one of the techniques of [[cold reading]], with which a psychic can deliver a subjectively impressive reading without any prior information about the client.<ref>Jonathan C. Smith, ''Pseudoscience and Extraordinary Claims of the Paranormal: A Critical Thinker's Toolkit'' (Wiley-Blackwell, 2009, ISBN 978-1405181228).</ref>
  
=== Politics, law and policing ===
+
=== Scientific research ===
[[File:Witness impeachment.jpg|thumb|right|400px|[[Mock trial]]s allow researchers to examine confirmation biases in a realistic setting.]]
+
A distinguishing feature of [[Scientific method|scientific thinking]] is the search for confirming or supportive evidence ([[inductive reasoning]]) as well as falsifying evidence ([[deductive reasoning]]). Inductive research in particular is susceptible to confirmation bias.
Nickerson argues that reasoning in judicial and political contexts is sometimes subconsciously biased, favoring conclusions that judges, juries or governments have already committed to.<ref name ="nickerson"/>{{rp|191–193}} Since the evidence in a jury trial can be complex, and jurors often reach decisions about the verdict early on, it is reasonable to expect an attitude polarization effect. The prediction that jurors will become more extreme in their views as they see more evidence has been borne out in experiments with [[mock trial]]s.<ref>{{Citation |last1=Myers |first1=D.G. |first2=H. |last2=Lamm |year=1976 |title=The group polarization phenomenon |journal=Psychological Bulletin |volume=83 |pages=602–527 |doi=10.1037/0033-2909.83.4.602 |issue=4}} via {{Harvnb|Nickerson|1998|pp=193–194}}</ref><ref name="halpern">{{Citation |last=Halpern |first=Diane F. |title=Critical thinking across the curriculum: A brief edition of thought and knowledge |publisher=Lawrence Erlbaum Associates |year=1987 |page=194 |isbn=978-0-8058-2731-6 |oclc=37180929}}</ref> Both [[Inquisitorial system|inquisitorial]] and [[Adversarial system|adversarial]] criminal justice systems are affected by confirmation bias.<ref>{{Citation |last=Roach |first= Kent |ssrn=1619124 |title=Wrongful convictions: Adversarial and inquisitorial themes |journal= North Carolina Journal of International Law and Commercial Regulation |volume=35 | year=2010 | pages=387–446 | quote=Quote: Both adversarial and inquisitorial systems seem subject to the dangers of tunnel vision or confirmation bias.}}</ref>
 
  
Confirmation bias can be a factor in creating or extending conflicts, from emotionally charged debates to wars: by interpreting the evidence in their favor, each opposing party can become overconfident that it is in the stronger position.<ref name="baron191">{{Harvnb|Baron|2000|pp=191, 195}}</ref> On the other hand, confirmation bias can result in people ignoring or misinterpreting the signs of an imminent or incipient conflict. For example, psychologists [[Stuart Sutherland]] and Thomas Kida have each argued that [[U.S. Navy]] Admiral [[Husband E. Kimmel]] showed confirmation bias when playing down the first signs of the Japanese [[attack on Pearl Harbor]].<ref name="sutherland" /><ref>{{Harvnb|Kida|2006|p=155}}</ref>
+
Many times in the [[history of science]], scientists have resisted new discoveries by selectively interpreting or ignoring unfavorable data. In practice, researchers may misunderstand, misinterpret, or not read at all studies that contradict their preconceptions, or wrongly cite them anyway as if they actually supported their claims. Further, confirmation biases can sustain scientific theories or research programs in the face of inadequate or even contradictory evidence.<ref name=Sutherland/> The discipline of [[parapsychology]] is often cited as an example in the context of whether it is a pseudoscience:
 +
<blockquote>Some of the worst examples of confirmation bias are in research on parapsychology ... Arguably, there is a whole field here with no powerful confirming data at all. But people want to believe, and so they find ways to believe.<ref name=Critical>Robert J. Sternberg, Henry L. Roediger III, and Diane F. Halpern (eds.), ''Critical Thinking in Psychology'' (Cambridge University Press, 2006, ISBN 978-0521608343).</ref></blockquote>
  
A two-decade study of political pundits by [[Philip E. Tetlock]] found that, on the whole, their predictions were not much better than chance. Tetlock divided experts into "foxes" who maintained multiple hypotheses, and "hedgehogs" who were more dogmatic. In general, the hedgehogs were much less accurate. Tetlock blamed their failure on confirmation bias, and specifically on their inability to make use of new information that contradicted their existing theories.<ref>{{Citation |last=Tetlock |first=Philip E. |title=Expert political judgment: How good is it? How can we know? |publisher=Princeton University Press |location=Princeton, NJ |year=2005 |isbn=978-0-691-12302-8 |oclc=56825108 |pages=125–128}}</ref>
+
An experimenter's confirmation bias can potentially affect which data are reported. Data that conflict with the experimenter's expectations may be more readily discarded as unreliable, producing the so-called [[publication bias|file drawer effect]]. To combat this tendency, scientific training teaches ways to prevent bias. For example, [[Design of experiments|experimental design]] of [[randomized controlled trial]]s (coupled with their [[systematic review]]) aims to minimize sources of bias.<ref name=Critical/>
  
In police investigations, a detective may identify a suspect early in an investigation, but then sometimes largely seek supporting or confirming evidence, ignoring or downplaying falsifying evidence.<ref>{{Citation|last=O'Brien| first=B. | title=Prime suspect: An examination of factors that aggravate and counteract confirmation bias in criminal investigations | journal=Psychology, Public Policy, and Law | date=2009 |volume=15 | issue=4 | pages=315–334 |doi=10.1037/a0017881}}</ref>
+
The social process of [[peer review]] aims to mitigate the effect of individual scientists' biases, even though the peer review process itself may be susceptible to such biases<ref>Steven James Bartlett, "The psychology of abuse in publishing: Peer review and editorial bias," ''Normality Does Not Equal Mental Health: The Need to Look Elsewhere for Standards of Good Psychological Health'' (Santa Barbara, CA: Praeger, 2011, ISBN 978-0313399312), 147–177.</ref> Confirmation bias may thus be especially harmful to objective evaluations regarding nonconforming results since biased individuals may regard opposing evidence to be weak in principle and give little serious thought to revising their beliefs. Scientific innovators often meet with resistance from the scientific community, and research presenting controversial results frequently receives harsh peer review.<ref>David F. Horrobin, [https://jamanetwork.com/journals/jama/article-abstract/380984 The philosophical basis of peer review and the suppression of innovation] ''Journal of the American Medical Association'' 263(10) (1990):1438–1441. Retrieved May 9, 2023.</ref>
  
=== Social psychology ===
+
=== Social media and searches===
Social psychologists have identified two tendencies in the way people seek or interpret information about themselves. ''[[Self-verification]]'' is the drive to reinforce the existing [[self-image]] and ''[[self-enhancement]]'' is the drive to seek positive feedback. Both are served by confirmation biases.<ref name="reconciling">{{Citation |last1=Swann |first1=William B. |first2=Brett W. |last2=Pelham |first3= Douglas S. |last3=Krull |title=Agreeable fancy or disagreeable truth? Reconciling self-enhancement and self-verification |journal=Journal of Personality and Social Psychology |year=1989 |volume=57 |issue=5 |pages=782–791 |issn=0022-3514 |pmid=2810025 |doi=10.1037/0022-3514.57.5.782}}</ref> In experiments where people are given feedback that conflicts with their self-image, they are less likely to attend to it or remember it than when given self-verifying feedback.<ref name="swannread_jesp" /><ref>{{Citation |last=Story |first=Amber L. |title=Self-esteem and memory for favorable and unfavorable personality feedback |journal=Personality and Social Psychology Bulletin |year=1998 |volume=24 |issue=1 |pages= 51–64 |doi=10.1177/0146167298241004 |s2cid=144945319 |issn=1552-7433}}</ref><ref>{{Citation |last1=White |first1=Michael J. |first2=Daniel R. |last2 =Brockett |first3=Belinda G. |last3=Overstreet |title=Confirmatory bias in evaluating personality test information: Am I really that kind of person? |journal=[[Journal of Counseling Psychology]] |year=1993 |volume= 40 |issue=1 |pages=120–126 |doi=10.1037/0022-0167.40.1.120 |issn=0022-0167}}</ref> They reduce the impact of such information by interpreting it as unreliable.<ref name="swannread_jesp">{{Citation |last1=Swann |first1=William B. |first2=Stephen J. |last2=Read |title=Self-verification processes: How we sustain our self-conceptions |journal=[[Journal of Experimental Social Psychology]] |year=1981 |volume=17 |issue=4 |pages=351–372 |issn=0022-1031 |doi=10.1016/0022-1031(81)90043-3}}</ref><ref name="swannread_jpsp">{{Citation |last1=Swann |first1=William B. |first2=Stephen J. |last2=Read |title=Acquiring self-knowledge: The search for feedback that fits |journal=Journal of Personality and Social Psychology |year=1981 |volume=41 |issue=6 |pages=1119–1328 |issn=0022-3514 |doi=10.1037/0022-3514.41.6.1119|citeseerx=10.1.1.537.2324 }}</ref><ref>{{Citation |last1=Shrauger |first1=J. Sidney |first2=Adrian K. |last2=Lund |title=Self-evaluation and reactions to evaluations from others |journal=[[Journal of Personality]] |year=1975 |volume=43 |issue=1 |pmid=1142062 |pages=94–108 |doi=10.1111/j.1467-6494.1975.tb00574.x }}</ref> Similar experiments have found a preference for positive feedback, and the people who give it, over negative feedback.<ref name="reconciling"/>
+
In [[social media]] and personalized searches by internet search engines such as Google and Bing, confirmation bias is amplified by the use of [[filter bubble]]s, or "algorithmic editing," which displays to individuals only information they are likely to agree with, while excluding opposing views. A suggested consequence is the degrading of [[democracy]] given that this "algorithmic editing" removes diverse viewpoints and information, and that unless filter bubble algorithms are removed voters will be unable to make fully informed political decisions.<ref>Eli Pariser, [https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles Beware online "filter bubbles"] ''TED'', May 2, 2011. Retrieved May 9, 2023.</ref>
  
=== Mass delusions ===
+
The rise of social media has contributed greatly to the rapid spread of [[fake news]], that is, false and misleading information that is presented as credible news from a seemingly reliable source. Confirmation bias in the form of selecting or reinterpreting evidence to support one's beliefs is one of the main hurdles cited as to why critical thinking goes astray in these circumstances:
Confirmation bias can play a key role in the propagation of [[mass delusion]]s. [[Witch trial]]s are frequently cited as an example.<ref>{{cite thesis |last=Lidén |first=Moa |date=2018 |title=Confirmation bias in criminal cases |chapter=3.2.4.1 |publisher=Department of Law, Uppsala University |chapter-url=http://www.diva-portal.org/smash/get/diva2:1237959/FULLTEXT01.pdf |access-date=20 February 2020}}</ref><ref>{{cite book |last=Trevor-Roper |first=H.R. |date=1969 |title=The European witch-craze of the sixteenth and seventeenth centuries and other essays |publisher=London: HarperCollins}} {{ISBN?}}</ref>
+
<blockquote>The key to people’s accepting fake news as true, despite evidence to the contrary, is a phenomenon known as confirmation bias, or the tendency for people to seek and accept information that confirms their existing beliefs while rejecting or ignoring that which contradicts those beliefs. ... one could say the brain is hardwired to accept, reject, misremember or distort information based on whether it is viewed as accepting of or threatening to existing beliefs.”<ref name=APA/></blockquote>
  
For another example, in the [[Seattle windshield pitting epidemic]], there seemed to be a "pitting epidemic" in which windshields were damaged due to an unknown cause. As news of the apparent wave of damage spread, more and more people checked their windshields, discovered that their windshields too had been damaged, thus confirming belief in the supposed epidemic. In fact, the windshields were previously damaged, but the damage went unnoticed until people checked their windshields as the delusion spread.<ref>{{cite podcast |url=https://www.stitcher.com/podcast/the-constant/e/64112747 |title=The constant: A history of getting things wrong |website=constantpodcast.com |host=Chrisler, Mark |date=24 September 2019 |access-date=19 February 2020 |mode=cs2}}</ref>
+
In combating the spread of fake news, social media sites have considered turning toward "digital nudging." This includes nudging of information and nudging of presentation. Nudging of information entails social media sites providing a disclaimer or label questioning or warning users of the validity of the source while nudging of presentation includes exposing users to new information which they may not have sought out but could introduce them to viewpoints that may combat their own confirmation biases.<ref>Calum Thornhill, Quentin Meeus, Jeroen Peperkamp, and Bettina Berendt, "A digital nudge to counter confirmation bias," ''Frontiers in Big Data'' 2 (2019):11.</ref>
  
=== Paranormal beliefs ===
+
== Associated effects ==
One factor in the appeal of alleged [[psychic]] readings is that listeners apply a confirmation bias which fits the psychic's statements to their own lives.<ref name="toolkit">{{Citation |last=Smith |first= Jonathan C. |title=Pseudoscience and extraordinary claims of the paranormal: A critical thinker's toolkit |publisher=London: Wiley-Blackwell |year=2009 |pages=149–151 |isbn=978-1-4051-8122-8 |oclc=319499491}}</ref> By making a large number of ambiguous statements in each sitting, the psychic gives the client more opportunities to find a match. This is one of the techniques of [[cold reading]], with which a psychic can deliver a subjectively impressive reading without any prior information about the client.<ref name="toolkit" /> Investigator [[James Randi]] compared the transcript of a reading to the client's report of what the psychic had said, and found that the client showed a strong selective recall of the "hits".<ref>{{Citation |last=Randi |first=James |title=James Randi: Psychic investigator |publisher=London: Boxtree |year=1991 |isbn=978-1-85283-144-8 |oclc= 26359284 |pages=58–62}}</ref>
+
Confirmation bias has been invoked to explain four specific effects:
  
As a striking illustration of confirmation bias in the real world, Nickerson mentions numerological [[pyramidology]]: the practice of finding meaning in the proportions of the Egyptian pyramids.<ref name ="nickerson"/>{{rp|190}} There are many different length measurements that can be made of, for example, the [[Great Pyramid of Giza]] and many ways to combine or manipulate them. Hence it is almost inevitable that people who look at these numbers selectively will find superficially impressive correspondences, for example with the dimensions of the Earth.<ref name ="nickerson"/>{{rp|190}}
+
*Attitude polarization (when a disagreement becomes more extreme even though the different parties are exposed to the same evidence)
 +
*Belief perseverance (when beliefs persist after the evidence for them is shown to be false)
 +
*Irrational primacy effect (a greater reliance on information encountered early in a series)
 +
*Illusory correlation (when people falsely perceive an association between two events or situations).
  
=== Recruitment and selection ===
 
Unconscious cognitive bias (including confirmation bias) in [[recruitment|job recruitment]] affects hiring decisions and can potentially prohibit a diverse and inclusive workplace. There are a variety of unconscious biases that affects recruitment decisions but confirmation bias is one of the major ones, especially during the interview stage.<ref>{{Citation |title= Here is how bias can affect recruitment in your organization | first=Dr Pragva | last= Agarwal | newspaper=[[Forbes]] | date=19 October 2018 | url=https://www.forbes.com/sites/pragyaagarwaleurope/2018/10/19/how-can-bias-during-interviews-affect-recruitment-in-your-organisation/ | access-date=31 July 2019}}</ref> The interviewer will often select a candidate that confirms their own beliefs, even though other candidates are equally or better qualified.
 
 
== Associated effects and outcomes ==
 
 
=== Polarization of opinion ===
 
=== Polarization of opinion ===
{{Main|Attitude polarization}}
+
'''Attitude polarization''', also known as '''belief polarization''', is a phenomenon in which a disagreement becomes more extreme as the different parties consider evidence on the issue. It is one of the effects of confirmation bias: the tendency of people to search for and interpret evidence selectively, to reinforce their current beliefs or attitudes.<ref name=Fine/> When people encounter ambiguous evidence, this bias can result in each of them interpreting it as support for their existing attitudes, widening rather than narrowing the disagreement between them.<ref name="lord1979">Charles G. Lord, Lee Ross, and Mark R. Lepper, [https://psycnet.apa.org/doiLanding?doi=10.1037%2F0022-3514.37.11.2098 Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence] ''Journal of Personality and Social Psychology'' 37(11) (1979):2098–2109. Retrieved May 9, 2023. </ref>
When people with opposing views interpret new information in a biased way, their views can move even further apart. This is called "attitude polarization".<ref name="kuhn_lao" /> The effect was demonstrated by an experiment that involved drawing a series of red and black balls from one of two concealed "bingo baskets". Participants knew that one basket contained 60 percent black and 40 percent red balls; the other, 40 percent black and 60 percent red. The experimenters looked at what happened when balls of alternating color were drawn in turn, a sequence that does not favor either basket. After each ball was drawn, participants in one group were asked to state out loud their judgments of the probability that the balls were being drawn from one or the other basket. These participants tended to grow more confident with each successive draw—whether they initially thought the basket with 60 percent black balls or the one with 60 percent red balls was the more likely source, their estimate of the probability increased. Another group of participants were asked to state probability estimates only at the end of a sequence of drawn balls, rather than after each ball. They did not show the polarization effect, suggesting that it does not necessarily occur when people simply hold opposing positions, but rather when they openly commit to them.<ref>{{Harvnb|Baron|2000|p=201}}</ref>
 
  
A less abstract study was the Stanford biased interpretation experiment, in which participants with strong opinions about the death penalty read about mixed experimental evidence. Twenty-three percent of the participants reported that their views had become more extreme, and this self-reported shift correlated strongly with their initial attitudes.<ref name="lord1979">{{Citation |last1=Lord |first1=Charles G. |first2=Lee |last2 =Ross |first3=Mark R. |last3=Lepper |year=1979 |title=Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence |journal=Journal of Personality and Social Psychology |volume=37 |issue=11 |pages=2098–2109 |issn=0022-3514 |doi=10.1037/0022-3514.37.11.2098|citeseerx=10.1.1.372.1743 }}</ref> In later experiments, participants also reported their opinions becoming more extreme in response to ambiguous information. However, comparisons of their attitudes before and after the new evidence showed no significant change, suggesting that the self-reported changes might not be real.<ref name="taber_political">{{Citation |last1=Taber |first1=Charles S. |first2=Milton |last2=Lodge |date=July 2006 |title=Motivated skepticism in the evaluation of political beliefs |journal=American Journal of Political Science |volume=50 |issue=3 |pages=755–769 |issn=0092-5853 |doi=10.1111/j.1540-5907.2006.00214.x|citeseerx=10.1.1.472.7064 }}</ref><ref name="kuhn_lao">{{Citation |last1=Kuhn |first1=Deanna |first2=Joseph |last2=Lao |date=March 1996 |title=Effects of evidence on attitudes: Is polarization the norm? |journal=Psychological Science |volume=7 |issue=2 |pages=115–120 |doi=10.1111/j.1467-9280.1996.tb00340.x|s2cid=145659040 }}</ref><ref>{{Citation |last1=Miller |first1=A.G.|first2=J.W. |last2=McHoskey |first3=C.M. |last3=Bane |first4=T.G. |last4=Dowd |s2cid=14102789|year=1993 |title=The attitude polarization phenomenon: Role of response measure, attitude extremity, and behavioral consequences of reported attitude change |journal=Journal of Personality and Social Psychology |volume=64 |pages=561–574 |doi=10.1037/0022-3514.64.4.561 |issue=4}}</ref> Based on these experiments, Deanna Kuhn and Joseph Lao concluded that polarization is a real phenomenon but far from inevitable, only happening in a small minority of cases, and it was prompted not only by considering mixed evidence, but by merely thinking about the topic.<ref name="kuhn_lao"/>
+
The related '''backfire effect''' refers to the way people may hold even more strongly onto their beliefs when shown contradictory evidence, which they reject. The phrase was coined by [[Brendan Nyhan]] and Jason Reifler in 2010.<ref>Brendan Nyhan and Jason Reifler, When corrections fail: The persistence of political misperceptions, ''Political Behavior'' 32 (2010): 303–320.</ref>
 
 
Charles Taber and Milton Lodge argued that the Stanford team's result had been hard to replicate because the arguments used in later experiments were too abstract or confusing to evoke an emotional response. The Taber and Lodge study used the emotionally charged topics of [[gun politics|gun control]] and [[affirmative action]].<ref name="taber_political" /> They measured the attitudes of their participants towards these issues before and after reading arguments on each side of the debate. Two groups of participants showed attitude polarization: those with strong prior opinions and those who were politically knowledgeable. In part of this study, participants chose which information sources to read, from a list prepared by the experimenters. For example, they could read the [[National Rifle Association]]'s and the [[Brady Campaign|Brady Anti-Handgun Coalition]]'s arguments on gun control. Even when instructed to be even-handed, participants were more likely to read arguments that supported their existing attitudes than arguments that did not. This biased search for information correlated well with the polarization effect.<ref name="taber_political" />
 
 
 
The '''{{vanchor|backfire effect}}''' is a name for the finding that given evidence against their beliefs, people can reject the evidence and believe even more strongly.<ref>{{Citation|url=http://www.skepdic.com/backfireeffect.html|title=Backfire effect|work=[[The Skeptic's Dictionary]]|access-date=26 April 2012}}</ref><ref name="CJR backfire">{{Citation| url = https://www.cjr.org/behind_the_news/the_backfire_effect.php | title = The backfire effect | access-date = 1 May 2012 | last = Silverman | first = Craig | date = 17 June 2011 | work = Columbia Journalism Review | quote = When your deepest convictions are challenged by contradictory evidence, your beliefs get stronger.}}</ref> The phrase was coined by [[Brendan Nyhan]] and Jason Reifler in 2010.<ref>Nyhan, B. & Reifler, J. (2010). 'When corrections fail: The persistence of political misperceptions". ''Political Behavior'', 32, 303–320</ref> However, subsequent research has since failed to replicate findings supporting the backfire effect.<ref>{{Cite news|url=https://educationblog.oup.com/theory-of-knowledge/facts-matter-after-all-rejecting-the-backfire-effect|title=Facts matter after all: rejecting the "backfire effect"|date=12 March 2018|work=Oxford Education Blog|access-date=23 October 2018|language=en-GB}}</ref> One study conducted out of the Ohio State University and George Washington University studied 10,100 participants with 52 different issues expected to trigger a backfire effect. While the findings did conclude that individuals are reluctant to embrace facts that contradict their already held ideology, no cases of backfire were detected.<ref name ="wood">{{Cite journal|last1=Wood|first1=Thomas|last2=Porter|first2=Ethan|date=2019|title=The elusive backfire effect: Mass attitudes' steadfast factual adherence|journal=Political Behavior| volume=41 | pages = 135–163 |doi=10.2139/ssrn.2819073|issn=1556-5068|mode=cs2 }}</ref> The backfire effect has since been noted to be a rare phenomenon rather than a common occurrence<ref>{{Cite web|url=https://www.poynter.org/news/fact-checking-doesnt-backfire-new-study-suggests|title=Fact-checking doesn't 'backfire,' new study suggests|website=Poynter|language=en|access-date=23 October 2018|date=2 November 2016|mode=cs2}}</ref> (compare the [[Boomerang effect (psychology)|boomerang effect]]).
 
  
 
=== Persistence of discredited beliefs ===
 
=== Persistence of discredited beliefs ===
{{main|Belief perseverance}}
 
{{see also|Monty Hall problem}}
 
  
{{Quote box |quote=Beliefs can survive potent logical or empirical challenges. They can survive and even be bolstered by evidence that most uncommitted observers would agree logically demands some weakening of such beliefs. They can even survive the total destruction of their original evidential bases. |source=—Lee Ross and Craig Anderson<ref name="shortcomings"/> |width=30% |align=right}}
+
{{Quote box |quote=Beliefs can survive potent logical or empirical challenges. They can survive and even be bolstered by evidence that most uncommitted observers would agree logically demands some weakening of such beliefs. They can even survive the total destruction of their original evidential bases.|source=—Lee Ross and Craig Anderson<ref name="shortcomings"/> |width=30% |align=right}}
Confirmation biases provide one plausible explanation for the persistence of beliefs when the initial evidence for them is removed or when they have been sharply contradicted.<ref name ="nickerson"/>{{rp|187}} This belief perseverance effect has been first demonstrated experimentally by [[Leon Festinger|Festinger]], Riecken, and Schachter. These psychologists [[Participant observation|spent time with]] a cult whose members were convinced that the world would [[When Prophecy Fails|end]] on 21 December 1954. After the prediction failed, most believers still clung to their faith. Their book describing this research is aptly named ''[[When Prophecy Fails]]''.<ref>{{Citation | last=Festinger | first=Leon | title=When prophecy fails: A social and psychological study of a modern group that predicted the destruction of the world | publisher=New York: Harper Torchbooks.| year=1956}}</ref>
+
Confirmation bias provides one plausible explanation for the persistence of beliefs when the initial evidence for them is removed or when they have been sharply contradicted.<ref name=Nickerson/> This belief perseverance effect was first demonstrated experimentally by [[Leon Festinger|Festinger]] and colleagues, who described the effect as [[Cognitive dissonance]]. These psychologists spent time with a [[cult]] whose members were convinced that the world would end on December 21, 1954. After the prediction failed, most believers still clung to their faith. Their book describing this research is aptly named ''[[When Prophecy Fails]]''.<ref> Leon Festinger, Henry W. Riecken, and Stanley Schachter, ''When Prophecy Fails: A Social and Psychological Study of a Modern Group That Predicted the Destruction of the World'' (Harper-Torchbooks, 1956, ISBN 0061311324).</ref>
  
The term ''belief perseverance'', however, was coined in a series of experiments using what is called the "debriefing paradigm": participants read fake evidence for a hypothesis, their [[attitude change]] is measured, then the fakery is exposed in detail. Their attitudes are then measured once more to see if their belief returns to its previous level.<ref name="shortcomings">{{Citation|last1=Ross |first1=Lee |first2=Craig A. |last2=Anderson |title=Judgment under uncertainty: Heuristics and biases |journal=Science |volume=185 |issue=4157 |pages=1124–1131 |bibcode=1974Sci...185.1124T |doi=10.1126/science.185.4157.1124 |year=1974 |pmid=17835457|s2cid=143452957 }}.<br />{{Citation|editor1-first=Daniel |editor1-last=Kahneman |editor2-first=Paul |editor2-last=Slovic |editor3-first=Amos |editor3-last=Tversky |publisher=Cambridge University Press |year=1982 |chapter=Shortcomings in the attribution process: On the origins and maintenance of erroneous social assessments |isbn=978-0-521-28414-1 |oclc=7578020|title=Judgment under uncertainty: Heuristics and biases}}</ref>
+
The term '''belief perseverance''' was coined in a series of experiments using what is called the "debriefing paradigm": participants read fake evidence for a hypothesis, their [[attitude]] change is measured, then the fakery is exposed in detail. Their attitudes are then measured once more to see if their belief returns to its previous level.<ref name="shortcomings"> Lee Ross and Craig A. Anderson, "Judgment under uncertainty: Heuristics and biases" in Daniel Kahneman, Paul Slovic, and Amos Tversky (eds.), ''Judgment Under Uncertainty: Heuristics and Biases'' (Cambridge University Press, 1982, ISBN 978-0521284141), 129-152.</ref> A common finding is that at least some of the initial belief remains even after a full debriefing.<ref name=Kunda/>
 
 
A common finding is that at least some of the initial belief remains even after a full debriefing.<ref name="kunda99">{{Harvnb|Kunda|1999|p=99}}</ref> In one experiment, participants had to distinguish between real and fake suicide notes. The feedback was random: some were told they had done well while others were told they had performed badly. Even after being fully debriefed, participants were still influenced by the feedback. They still thought they were better or worse than average at that kind of task, depending on what they had initially been told.<ref>{{Citation |last1=Ross |first1=Lee |first2=Mark R. |last2=Lepper |first3=Michael |last3=Hubbard |title=Perseverance in self-perception and social perception: Biased attributional processes in the debriefing paradigm |journal=Journal of Personality and Social Psychology |volume=32 |issn=0022-3514 |pages=880–is 892 |year=1975 |issue=5 |doi=10.1037/0022-3514.32.5.880 |pmid=1185517}} via {{Harvnb|Kunda|1999|p=99}}</ref>
 
 
 
In another study, participants read [[job performance]] ratings of two firefighters, along with their responses to a [[risk aversion]] test.<ref name="shortcomings" /> This fictional data was arranged to show either a negative or positive association: some participants were told that a risk-taking firefighter did better, while others were told they did less well than a risk-averse colleague.<ref name="socialperseverance" /> Even if these two case studies were true, they would have been scientifically poor evidence for a conclusion about firefighters in general. However, the participants found them subjectively persuasive.<ref name="socialperseverance">{{Citation |title=Perseverance of social theories: The role of explanation in the persistence of discredited information |first1=Craig A. |last1=Anderson |first2=Mark R. |last2=Lepper |first3=Lee |last3=Ross |journal=Journal of Personality and Social Psychology |year=1980 |volume=39 |issue=6 |pages=1037–1049 |issn=0022-3514 |doi=10.1037/h0077720|citeseerx=10.1.1.130.933 }}</ref> When the case studies were shown to be fictional, participants' belief in a link diminished, but around half of the original effect remained.<ref name="shortcomings" /> Follow-up interviews established that the participants had understood the debriefing and taken it seriously. Participants seemed to trust the debriefing, but regarded the discredited information as irrelevant to their personal belief.<ref name="socialperseverance" />
 
 
 
The [[continued influence effect]] is the tendency for misinformation to continue to influence memory and reasoning about an event, despite the misinformation having been retracted or corrected. This occurs even when the individual believes the correction.<ref>{{cite journal | last=Cacciatore | first=Michael A. | title=Misinformation and public opinion of science and health: Approaches, findings, and future directions | journal=Proceedings of the National Academy of Sciences | volume=118 | issue=15 | date=9 April 2021 | issn=0027-8424 | doi=10.1073/pnas.1912437117 | page=e1912437117 | pmid=33837143 | pmc=8053916 | bibcode=2021PNAS..11812437C | quote=The CIE refers to the tendency for information that is initially presented as true, but later revealed to be false, to continue to affect memory and reasoning | quote-page=4  | mode=cs2| doi-access=free }}</ref>
 
  
 
=== Preference for early information ===
 
=== Preference for early information ===
Experiments have shown that information is weighted more strongly when it appears early in a series, even when the order is unimportant. For example, people form a more positive impression of someone described as "intelligent, industrious, impulsive, critical, stubborn, envious" than when they are given the same words in reverse order.<ref name="baron197">{{Harvnb|Baron|2000|pp=197–200}}</ref> This ''irrational primacy effect'' is independent of the [[serial position effect|primacy effect in memory]] in which the earlier items in a series leave a stronger memory trace.<ref name="baron197"/> Biased interpretation offers an explanation for this effect: seeing the initial evidence, people form a working hypothesis that affects how they interpret the rest of the information.<ref name ="nickerson"/>{{rp|187}}
+
Experiments have shown that information is weighted more strongly when it appears early in a series, even when the order is unimportant. For example, people form a more positive impression of someone described as "intelligent, industrious, impulsive, critical, stubborn, envious" than when they are given the same words in reverse order. This ''irrational primacy effect'' is independent of the primacy effect in [[memory]] in which the earlier items in a series leave a stronger memory trace.<ref name=Baron/> Biased interpretation offers an explanation for this effect: seeing the initial evidence, people form a working hypothesis that affects how they interpret the rest of the information.<ref name=Nickerson/>
 
 
One demonstration of irrational primacy used colored chips supposedly drawn from two urns. Participants were told the color distributions of the urns, and had to estimate the probability of a chip being drawn from one of them.<ref name="baron197"/> In fact, the colors appeared in a prearranged order. The first thirty draws favored one urn and the next thirty favored the other.<ref name ="nickerson"/>{{rp|187}} The series as a whole was neutral, so rationally, the two urns were equally likely. However, after sixty draws, participants favored the urn suggested by the initial thirty.<ref name="baron197" />
 
 
 
Another experiment involved a slide show of a single object, seen as just a blur at first and in slightly better focus with each succeeding slide.<ref name="baron197"/> After each slide, participants had to state their best guess of what the object was. Participants whose early guesses were wrong persisted with those guesses, even when the picture was sufficiently in focus that the object was readily recognizable to other people.<ref name ="nickerson"/>{{rp|187}}
 
  
 
=== Illusory association between events ===
 
=== Illusory association between events ===
 
{{Main|Illusory correlation}}
 
{{Main|Illusory correlation}}
Illusory correlation is the tendency to see non-existent correlations in a set of data.<ref name=fine>{{Harvnb |Fine|2006|pp=66–70}}</ref> This tendency was first demonstrated in a series of experiments in the late 1960s.<ref name=Plous/> In one experiment, participants read a set of psychiatric case studies, including responses to the [[Rorschach inkblot test]]. The participants reported that the homosexual men in the set were more likely to report seeing buttocks, anuses or sexually ambiguous figures in the inkblots. In fact the fictional case studies had been constructed so that the homosexual men were no more likely to report this imagery or, in one version of the experiment, were less likely to report it than heterosexual men.<ref name=fine /> In a survey, a group of experienced psychoanalysts reported the same set of illusory associations with homosexuality.<ref name=fine /><ref name=Plous/>
+
Illusory correlation is the tendency to see non-existent correlations in a set of data. This tendency was first demonstrated in a series of experiments in the late 1960s. In one experiment, participants read a set of psychiatric case studies, including responses to the [[Rorschach inkblot test]]. The participants reported that the homosexual men in the set were more likely to report seeing buttocks or sexually ambiguous figures in the inkblots. In fact the fictional case studies had been constructed so that the homosexual men were no more likely to report this imagery or, in one version of the experiment, were less likely to report it than heterosexual men. In a survey, a group of experienced psychoanalysts reported the same set of illusory associations with homosexuality.<ref name=Fine/><ref name=Plous/>
  
Another study recorded the symptoms experienced by arthritic patients, along with weather conditions over a 15-month period. Nearly all the patients reported that their pains were correlated with weather conditions, although the real correlation was zero.<ref>{{Citation |last1=Redelmeir |first1=D.A. |first2=Amos |last2=Tversky |year=1996 |title=On the belief that arthritis pain is related to the weather |journal=Proceedings of the National Academy of Sciences |volume=93 |pages=2895–2896 |doi=10.1073/pnas.93.7.2895 |pmid=8610138 |issue=7|bibcode=1996PNAS...93.2895R |pmc=39730 |doi-access=free }} via {{Harvnb|Kunda|1999|p=127}}</ref>
+
Another study recorded the symptoms experienced by arthritic patients, along with weather conditions over a 15-month period. Nearly all the patients reported that their pains were correlated with weather conditions, although the real correlation was zero.<ref>D.A. Redelmeir and Amos Tversky, [https://www.pnas.org/doi/abs/10.1073/pnas.93.7.2895 On the belief that arthritis pain is related to the weather]''Proceedings of the National Academy of Sciences'' 93(7) (1996):2895–2896. Retrieved May 10, 2023.</ref>
 
{| class="wikitable" style="width:250px;text-align:center;margin: 1em auto 1em auto"
 
{| class="wikitable" style="width:250px;text-align:center;margin: 1em auto 1em auto"
 
|+ Example
 
|+ Example
Line 222: Line 184:
 
|}
 
|}
  
This effect is a kind of biased interpretation, in that objectively neutral or unfavorable evidence is interpreted to support existing beliefs. It is also related to biases in hypothesis-testing behavior.<ref name="kunda127">{{Harvnb|Kunda|1999|pp=127–130}}</ref> In judging whether two events, such as illness and bad weather, are correlated, people rely heavily on the number of ''positive-positive'' cases: in this example, instances of both pain and bad weather. They pay relatively little attention to the other kinds of observation (of no pain and/or good weather).<ref name=Plous/> This parallels the reliance on positive tests in hypothesis testing.<ref name="kunda127" /> It may also reflect selective recall, in that people may have a sense that two events are correlated because it is easier to recall times when they happened together.<ref name="kunda127" />
+
In judging whether the two events, illness and bad weather, were correlated, participants relied heavily on the number of ''positive-positive'' cases: in this example, instances of both pain and bad weather. They paid relatively little attention to the other kinds of observation (of no pain and/or good weather).<ref name=Plous/> This parallels the reliance on positive tests in hypothesis testing. It may also reflect selective recall, in that people may have a sense that two events are correlated because it is easier to recall times when they happened together.<ref name=Kunda/>
  
 
== Notes ==
 
== Notes ==
Line 228: Line 190:
  
 
== References ==
 
== References ==
 +
* Alighieri, Dante, trans. Allen Mandelbaum. ''The Divine Comedy: Inferno; Purgatorio; Paradiso''. Everyman's Library, 1995. ISBN 978-0679433132
 +
* Bacon, Francis. ''Novum Organum''. Legare Street Press, 2022 (original 1620). ISBN 978-1015466555)
 
* Baron, Jonathan. ''Thinking and Deciding''. New York: Cambridge University Press, 2000. ISBN 978-0521650304
 
* Baron, Jonathan. ''Thinking and Deciding''. New York: Cambridge University Press, 2000. ISBN 978-0521650304
 +
* Bartlett, Steven James. ''Normality Does Not Equal Mental Health: The Need to Look Elsewhere for Standards of Good Psychological Health''. Santa Barbara, CA: Praeger, 2011. ISBN 978-0313399312
 +
* Festinger, Leon, Henry W. Riecken, and Stanley Schachter. ''When Prophecy Fails: A Social and Psychological Study of a Modern Group That Predicted the Destruction of the World''. Harper-Torchbooks, 1956. ISBN 0061311324
 +
* Fine, Cordelia. ''A Mind of Its Own : How Your Brain Distorts and Deceives''. Icon Books, 2005. ISBN 978-1840466782
 +
* Goldacre, Ben. ''Bad Science''. Fourth Estate, 2008. ISBN 978-0007240197
 +
* Haidt, Jonathan. ''The Righteous Mind: Why good people are divided by politics and religion''. Penguin Books Ltd, 2013. ISBN 978-0141039169
 +
* Halpern, Diane F. ''Critical Thinking Across the Curriculum: A Brief Edition of Thought and Knowledge''. Routledge, 1997. ISBN 978-0805827316
 +
* Hamilton, David L. (ed.). ''Social Cognition: Key Readings''. Psychology Press, 2005. ISBN 978-0863775918
 +
* Higgins, E. Tory, and Arie W. Kruglanski (eds.). ''Social Psychology: Handbook of Basic Principles''. The Guilford Press, 1996. ISBN 978-1572301009
 +
* Kahneman, Daniel, Paul Slovic, and Amos Tversky (eds.). ''Judgment Under Uncertainty: Heuristics and Biases''. Cambridge University Press, 1982. ISBN 978-0521284141
 +
* Khaldun, Ibn. ''The Muqadimmah''. Pantheon Books, 1958. ISBN 978-0710001955
 +
* Kida, Thomas E. ''Don't Believe Everything You Think: The 6 Basic Mistakes We Make in Thinking''. Prometheus, 2006. ISBN 1591024080
 +
* Kofta, Mirosław, Gifford Weary, and Grzegorz Sedek. ''Personal Control in Action: Cognitive and Motivational Mechanisms''. Springer, 1998. ISBN 978-0306457203
 +
* Krueger, David, and John David Mann. ''The Secret Language of Money: How to Make Smarter Financial Decisions and Live a Richer Life''. McGraw Hill, 2009. ISBN 978-0071623391
 +
* Kunda, Ziva. ''Social Cognition: Making Sense of People''. Bradford Book, 1999. ISBN 0262611430
 +
* Nickerson, Raymond S. ''Argumentation''. Cambridge University Press, 2020. ISBN 978-1108799874
 
* Plous, Scott. ''The Psychology of Judgment and Decision Making''. McGraw-Hill, 1993. ISBN 978-0070504776
 
* Plous, Scott. ''The Psychology of Judgment and Decision Making''. McGraw-Hill, 1993. ISBN 978-0070504776
 
+
* Pohl, Rüdiger F. (ed.). ''Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory''. Psychology Press, 2012. ISBN 978-0415646758
 
+
* Poletiek, Fenna H. ''Hypothesis-testing Behaviour''. Psychology Press, 2000. ISBN 978-1841691596
 
+
* Pompian, Michael M. ''Behavioral Finance and Wealth Management: How to Build Optimal Portfolios That Account for Investor Biases.(ohn Wiley and Sons, 2006. ISBN 0471745170
* {{Citation |last=Fine |first=Cordelia |author-link=Cordelia Fine|title=A Mind of its Own: how your brain distorts and deceives |publisher=[[Icon Books]] |location=Cambridge, UK |year=2006 |isbn=978-1-84046-678-2 |oclc=60668289}}
+
* Schopenhauer, Arthur. ''The World as Will and Presentation Volume 2''. Routledge, 2010. ISBN 0321355806
* {{Citation |last=Friedrich |first=James |title=Primary error detection and minimization (PEDMIN) strategies in social cognition: a reinterpretation of confirmation bias phenomena |journal=Psychological Review |year=1993 |volume=100 |issue=2 |pages=298–319 |pmid=8483985 |issn=0033-295X | doi=10.1037/0033-295X.100.2.298 }}
+
* Smith, Jonathan C. ''Pseudoscience and Extraordinary Claims of the Paranormal: A Critical Thinker's Toolkit''. Wiley-Blackwell, 2009. ISBN 978-1405181228
* {{Citation |last=Goldacre |first=Ben |author-link=Ben Goldacre |title=Bad science |publisher=Fourth Estate |location=London |year=2008 |isbn=978-0-00-724019-7 |oclc=259713114}}
+
* Sternberg, Robert J., Henry L. Roediger III, and Diane F. Halpern (eds.). ''Critical Thinking in Psychology''. Cambridge University Press, 2006. ISBN 978-0521608343
* {{Citation |doi=10.1007/s12144-010-9087-5 |last1=Hergovich |first1=Andreas | last2=Schott |first2=Reinhard |last3=Burger |first3=Christoph |year=2010 |title=Biased evaluation of abstracts depending on topic and conclusion: Further evidence of a confirmation bias within scientific psychology |journal=Current Psychology |volume=29 |pages=188–209 |issue=3 |s2cid=145497196 }}
+
* Sutherland, Stuart. ''Irrationality: The Enemy within''. Pinter & Martin, 2013. ISBN 978-1780660257
* {{Citation |last=Kida |first=Thomas E. |year=2006 |title=Don't believe everything you think: The 6 basic mistakes we make in thinking |location=Amherst, NY |publisher=[[Prometheus Books]] |isbn=978-1-59102-408-8 |oclc=63297791 |url=https://archive.org/details/dontbelieveevery00kida }}
+
* Thucydides. ''History of the Peloponnesian War''. Penguin Classic, 1972 (original 431 B.C.E.). ISBN 978-0140440393
* {{Citation |doi=10.1006/obhd.1993.1044 |last=Koehler |first=Jonathan J. |year=1993 |title=The influence of prior beliefs on scientific judgments of evidence quality |journal=Organizational Behavior and Human Decision Processes |volume=56 |pages=28–55}}
+
* Tolstoy, Leo, trans. Constance Garnett. ''The Kingdom of God Is Within You''. Wentworth Press, 2016 (original 1894). ISBN 978-
* {{Citation |last=Kunda |first=Ziva |author-link=Ziva Kunda|title=Social cognition: Making sense of people |publisher=[[MIT Press]] |year=1999 |isbn=978-0-262-61143-5 |oclc=40618974}}
+
* Tolstoy, Leo, trans. Aylmer Maude, ''What Is Art?'' Hackett Publishing Company, Inc., 1996 (original 1897). ISBN 978-0872202955
* {{Citation |last=Lewicka |first=Maria |editor1-first=Mirosław |editor1-last=Kofta |editor2-first=Gifford |editor2-last=Weary |editor3-first= Grzegorz |editor3-last=Sedek |title=Personal control in action: Cognitive and motivational mechanisms |publisher=Springer |year=1998 |isbn=978-0-306-45720-3 |oclc=39002877 |chapter=Confirmation bias: Cognitive error or adaptive strategy of action control? |pages=233–255}}
+
* Vyse, Stuart A. ''Believing in Magic: The Psychology of Superstition''. Oxford University Press, 2000. ISBN 978-0195136340
* {{Citation |doi=10.1146/annurev.psych.49.1.259 |last=MacCoun |first=Robert J. |year=1998 |title=Biases in the interpretation and use of research results |journal=Annual Review of Psychology |pmid=15012470 |volume=49 |pages=259–287 | url=http://socrates.berkeley.edu/~maccoun/MacCoun_AnnualReview98.pdf}}
 
* {{Citation |doi=10.1007/BF01173636 |last=Mahoney |first=Michael J. |year=1977 |title=Publication prejudices: An experimental study of confirmatory bias in the peer review system |journal=Cognitive Therapy and Research |volume=1 |pages=161–175 |issue=2|s2cid=7350256 }}
 
* {{Citation|last=Nickerson|first=Raymond S.|title=Confirmation bias: A ubiquitous phenomenon in many guises|journal=Review of General Psychology|volume=2|issue=2|pages=175–220|year=1998|doi=10.1037/1089-2680.2.2.175|s2cid=8508954}}
 
* {{Citation |last1=Oswald |first1=Margit E. |first2=Stefan |last2=Grosjean |title=Cognitive illusions: A handbook on fallacies and biases in thinking, judgement and memory |editor-first=Rüdiger F. |editor-last=Pohl |publisher=Psychology Press |location=Hove, UK |year=2004 |chapter=Confirmation bias |isbn=978-1-84169-351-4 |oclc=55124398 |pages=[https://archive.org/details/cognitiveillusio0000unse/page/79 79–96] |chapter-url=https://archive.org/details/cognitiveillusio0000unse/page/79 }}
 
 
 
* {{Citation |last=Poletiek |first=Fenna |title= Hypothesis-testing behaviour |publisher=Psychology Press |location=Hove, UK |year=2001 |isbn=978-1-84169-159-6 |oclc=44683470}}
 
* {{Citation |last1=Risen |first1=Jane |first2=Thomas |last2= Gilovich |title=Critical thinking in psychology |editor1-first=Robert J. |editor1-last=Sternberg |editor2-first=Henry L. |editor2-last=Roediger III |editor3-first=Diane F. |editor3-last=Halpern |publisher= Cambridge University Press |year=2007 |pages=110–130 |chapter=Informal logical fallacies |isbn=978-0-521-60834-3 |oclc=69423179}}
 
* {{Citation |last=Vyse |first=Stuart A. |title=Believing in magic: The psychology of superstition |year=1997 |isbn=978-0-19-513634-0 |location=New York |publisher=Oxford University Press |oclc=35025826}}
 
* {{Citation |last=Wason |first=Peter C. |year=1960 |title=On the failure to eliminate hypotheses in a conceptual task |journal=[[Quarterly Journal of Experimental Psychology]] |volume=12 |issue=3 |issn=1747-0226 |pages=129–140 |doi=10.1080/17470216008416717|s2cid=19237642 }}
 
 
 
 
 
* {{Citation |last=Keohane|first=Joe|title=How facts backfire: Researchers discover a surprising threat to democracy: our brains|url=http://www.boston.com/bostonglobe/ideas/articles/2010/07/11/how_facts_backfire/?page=full|newspaper=[[Boston Globe]]|date=11 July 2010|publisher=[[The New York Times]]}}
 
* {{Citation|last=Leavitt|first=Fred|title=Dancing with absurdity: Your most cherished beliefs (and all your others) are probably wrong|year=2015|publisher=[[Peter Lang Publishers]]|isbn=}}{{ISBN?}}
 
* {{Citation |title=What intelligence tests miss: The psychology of rational thought |last=Stanovich |first=Keith |publisher=[[Yale University Press]] |location=New Haven (CT) |isbn=978-0-300-12385-2 |date=2009 |url=https://archive.org/details/whatintelligence00stan |type=Lay }}
 
* {{Citation |last=Westen |first=Drew |title=The political brain: The role of emotion in deciding the fate of the nation |publisher=[[PublicAffairs]] |year=2007 |isbn=978-1-58648-425-5 |oclc=86117725 |url=https://archive.org/details/politicalbrainro00west }}
 
  
 
== External links ==
 
== External links ==
All links retrieved  
+
All links retrieved May 10, 2023.
 
* [https://skepdic.com/confirmbias.html confirmation bias] ''Skeptic's Dictionary''  
 
* [https://skepdic.com/confirmbias.html confirmation bias] ''Skeptic's Dictionary''  
 
* [https://youarenotsosmart.com/2010/06/23/confirmation-bias/ Confirmation Bias] by David McRaney, ''You Are Not So Smart'', June 23, 2010
 
* [https://youarenotsosmart.com/2010/06/23/confirmation-bias/ Confirmation Bias] by David McRaney, ''You Are Not So Smart'', June 23, 2010
Line 265: Line 229:
 
* [https://www.psychologytoday.com/us/blog/science-choice/201504/what-is-confirmation-bias What Is Confirmation Bias?] by Shahram Heshmat, ''Psychology Today'', April 23, 2015
 
* [https://www.psychologytoday.com/us/blog/science-choice/201504/what-is-confirmation-bias What Is Confirmation Bias?] by Shahram Heshmat, ''Psychology Today'', April 23, 2015
 
* [https://thedecisionlab.com/biases/confirmation-bias Why do we favor our existing beliefs?] ''The Decision Lab''
 
* [https://thedecisionlab.com/biases/confirmation-bias Why do we favor our existing beliefs?] ''The Decision Lab''
 +
* [https://www.skepdic.com/backfireeffect.html backfire effect] ''Skeptic's Dictionary''
  
 
[[Category:Social sciences]]
 
[[Category:Social sciences]]
 
[[Category:Psychology]]
 
[[Category:Psychology]]
 
{{Credits|Confirmation_bias|1143215094}}
 
{{Credits|Confirmation_bias|1143215094}}

Latest revision as of 16:46, 10 May 2023

Confirmation bias has been described as an internal "yes man," echoing back a person's beliefs like Charles Dickens' character Uriah Heep.[1]

Confirmation bias is the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one's prior beliefs or values. People display this bias when they select information that supports their views, ignoring contrary information, or when they interpret ambiguous evidence as supporting their existing attitudes. The effect is strongest for desired outcomes, for emotionally charged issues, and for deeply entrenched beliefs. This bias has the unfortunate consequence of people holding on to beliefs that are contradicted by evidence. This can lead to polarization of opinions, in which a disagreement becomes more extreme and has the possibility of a tragic outcome.

Flawed decisions due to confirmation bias have been found in a wide range of political, organizational, financial, and scientific contexts. These biases contribute to overconfidence in personal beliefs and can maintain or strengthen beliefs in the face of contrary evidence. For example, confirmation bias produces systematic errors in scientific research based on inductive reasoning (the gradual accumulation of supportive evidence). Similarly, a police detective may identify a suspect early in an investigation, but then may only seek confirming rather than disconfirming evidence. A medical practitioner may prematurely focus on a particular disorder early in a diagnostic session, and then seek only confirming evidence. Confirmation bias is amplified by the use of filter bubbles, or "algorithmic editing" commonly used in social media and search engines, which display to individuals only information they are likely to agree with, while excluding opposing views.

Definition

Confirmation bias is a term coined by English psychologist Peter Wason, to describe the tendency for people to immediately favor information that validates their preconceptions, hypotheses, and personal beliefs regardless of whether they are true or not. It also includes the tendency to strive toward proving one’s hypothesis instead of disproving it.[2]

Confirmation bias (or confirmatory bias) has also been termed myside bias, a term suggested by David Perkins, a professor and researcher at the Harvard Graduate School of Education. This reflects the bias as a preference for "my" side of an issue.[3]

Confirmation biases differ from what is sometimes called the behavioral confirmation effect, commonly known as self-fulfilling prophecy, in which a person's expectations influence their own behavior, bringing about the expected result.

Discovery

The phenomenon that has come to be known as "confirmation bias" was observed throughout history. Beginning with the Greek historian Thucydides (c. 460 B.C.E. - c. 400 B.C.E.) wrote of misguided reason in History of the Peloponnesian War: "... for it is a habit of mankind to entrust to careless hope what they long for, and to use sovereign reason to thrust aside what they do not fancy."[4] Italian poet Dante Alighieri (1265–1321) noted it in The Divine Comedy, in which St. Thomas Aquinas cautions Dante upon meeting in Paradise, "opinion—hasty—often can incline to the wrong side, and then affection for one's own opinion binds, confines the mind."[5] Ibn Khaldun noticed the same effect in his Muqaddimah:

Untruth naturally afflicts historical information. There are various reasons that make this unavoidable. One of them is partisanship for opinions and schools. ... if the soul is infected with partisanship for a particular opinion or sect, it accepts without a moment's hesitation the information that is agreeable to it. Prejudice and partisanship obscure the critical faculty and preclude critical investigation. The result is that falsehoods are accepted and transmitted.[6]

In the Novum Organum, English philosopher and scientist Francis Bacon (1561–1626) noted that biased assessment of evidence drove "all superstitions, whether in astrology, dreams, omens, divine judgments or the like."[7] He wrote:

The human understanding when it has once adopted an opinion (either as being the received opinion or as being agreeable to itself) draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects or despises, or else by some distinction sets aside or rejects.[7]

In the second volume of his The World as Will and Representation (1844), German philosopher Arthur Schopenhauer observed that "An adopted hypothesis gives us lynx-eyes for everything that confirms it and makes us blind to everything that contradicts it."[8]

In his essay (1894) The Kingdom of God Is Within You, Russian novelist Leo Tolstoy wrote:

The most difficult subjects can be explained to the most slow-witted man if he has not formed any idea of them already; but the simplest thing cannot be made clear to the most intelligent man if he is firmly persuaded that he knows already, without a shadow of doubt, what is laid before him.[9]

And in his essay (1897) What Is Art?, Tolstoy wrote:

I know that most men—not only those considered clever, but even those who are very clever, and capable of understanding most difficult scientific, mathematical, or philosophic problems—can very seldom discern even the simplest and most obvious truth if it be such as to oblige them to admit the falsity of conclusions they have formed, perhaps with much difficulty—conclusions of which they are proud, which they have taught to others, and on which they have built their lives.[10]

In the 1960s, psychologists gathered experimental data to support such observations that people are biased toward confirming their existing beliefs.[2] The initial experiment that generated the term "confirmation bias" was published by Peter Wason in 1960 (although that published article does not mention the term "confirmation bias").[11] Wason repeatedly challenged participants to identify a rule applying to triples of numbers. They were told that (2,4,6) fits the rule. They generated triples, and the experimenter told them whether each triple conformed to the rule. The actual rule was simply "any ascending sequence," but participants had great difficulty in finding it, often announcing rules that were far more specific, such as "the middle number is the average of the first and last." The participants seemed to test only positive examples—triples that obeyed their hypothesized rule. For example, if they thought the rule was, "Each number is two greater than its predecessor," they would offer a triple that fitted (confirmed) this rule, such as (11,13,15) rather than a triple that violated (falsified) it, such as (11,12,19).[11]

Wason interpreted his results as showing a preference for confirmation over falsification, hence he coined the term "confirmation bias" or "verification bias."[12] He used this bias to explain the results of his selection task experiment.[13] Participants repeatedly performed badly on various forms of this test, in most cases ignoring information that could potentially refute (falsify) the specified rule.[14]

Types of confirmation bias

Biased search for information

It has been found repeatedly that people tend to test hypotheses in a one-sided way, by searching for evidence consistent with their current hypothesis.[15][16] Rather than searching through all the relevant evidence, they phrase questions to receive an affirmative answer that supports their theory. They look for the consequences that they would expect if their hypothesis was true, rather than what would happen if it was false.[3] For example, someone using yes/no questions to find a number they suspect to be the number 3 might ask, "Is it an odd number?" People prefer this type of question, called a "positive test," even when a negative test such as "Is it an even number?" would yield exactly the same information.[17]

The preference for positive tests in itself is not a bias, since positive tests can be highly informative.[18] However, in combination with other effects, this strategy can confirm existing beliefs or assumptions, independently of whether they are true. In real-world situations, evidence is often complex and mixed. For example, various contradictory ideas about someone could each be supported by concentrating on one aspect of his or her behavior. Thus any search for evidence in favor of a hypothesis is likely to succeed. One illustration of this is the way the phrasing of a question can significantly change the answer.[16] For example, people who are asked, "Are you happy with your social life?" report greater satisfaction than those asked, "Are you unhappy with your social life?"[19]

The biased search for information may be reduced by a preference for genuine diagnostic tests.[16] For example, when asked to rate a person on the introversion–extroversion personality dimension on the basis of an interview, if the interviewee was introduced as an introvert, the participants chose questions from the list that presumed introversion, such as, "What do you find unpleasant about noisy parties?" When the interviewee was described as extroverted, they mostly asked questions that presumed extroversion, such as, "What would you do to liven up a dull party?" These loaded questions gave the interviewees little or no opportunity to falsify the hypothesis about them. However, when given less presumptive questions to choose from, such as, "Do you shy away from social interactions?" participants preferred to ask these more diagnostic questions, showing only a weak bias towards positive tests.[12]

Biased interpretation

Confirmation biases are not limited to the collection of evidence. Even if two individuals have the same information, the way they interpret it can be biased:

People who hold strong opinions on complex social issues are likely to examine relevant empirical evidence in a biased manner. They are apt to accept "confirming" evidence at face value while subjecting "disconfirming" evidence to critical evaluation, and, as a result, draw undue support for their initial positions from mixed or random empirical findings. Thus, the result of exposing contending factions in a social dispute to an identical body of relevant empirical evidence may be not a narrowing of disagreement but rather an increase in polarization.[20]

For example, people who felt strongly about capital punishment, half in favor and half against it, were given descriptions of two studies: a comparison of U.S. states with and without the death penalty, and a comparison of murder rates in a state before and after the introduction of the death penalty. After reading a quick description of each study, the participants were asked whether their opinions had changed. Then, they read a more detailed account of each study's procedure and had to rate whether the research was well-conducted and convincing. In fact, the studies were fictional. Half the participants were told that one kind of study supported the deterrent effect and the other undermined it, while for other participants the conclusions were reversed.[3]

The participants, whether supporters or opponents, reported shifting their attitudes slightly in the direction of the first study they read. Once they read the more detailed descriptions of the two studies, they almost all returned to their original belief regardless of the evidence provided, pointing to details that supported their viewpoint and disregarding anything contrary. Participants described studies supporting their pre-existing view as superior to those that contradicted it, in detailed and specific ways: Writing about a study that seemed to undermine the deterrence effect, a death penalty proponent wrote, "The research didn't cover a long enough period of time," while an opponent's comment on the same study said, "No strong evidence to contradict the researchers has been presented."[20] The results illustrated that people set higher standards of evidence for hypotheses that go against their current expectations. This effect, known as "disconfirmation bias," has been supported by other experiments.[21]

Another study of biased interpretation involved participants who reported having strong feelings about the candidates in the 2004 United States presidential election. They were shown apparently contradictory pairs of statements, either from Republican candidate George W. Bush, Democratic candidate John Kerry, or a politically neutral public figure. They were also given further statements that made the apparent contradiction seem reasonable. From these three pieces of information, they had to decide whether each individual's statements were inconsistent. There were strong differences in these evaluations, with participants much more likely to interpret statements from the candidate they opposed as contradictory.The participants made their judgments while in a magnetic resonance imaging (MRI) scanner which monitored their brain activity. As participants evaluated contradictory statements by their favored candidate, emotional centers of their brains were aroused. This did not happen with the statements by the other figures. The experimenters inferred that the different responses to the statements were not due to passive reasoning errors. Instead, the participants were actively reducing the cognitive dissonance induced by reading about their favored candidate's irrational or hypocritical behavior.[22]

Biased memory recall

People may remember evidence selectively to reinforce their expectations, even if they gather and interpret evidence in a neutral manner. This effect may be called "selective recall," "confirmatory memory," or "access-biased memory."[23]

Studies have shown, for example, that emotional memories are reconstructed by current emotional states. When widows and widowers rated the intensity of their experienced grief six months and five years after the deaths of their spouses, they noted a higher experience of grief at six months rather than at five years. Yet, when the participants were asked after five years how they had felt six months after the death of their significant other, the intensity of grief participants recalled was highly correlated with their current level of grief. Individuals appear to utilize their current emotional states to analyze how they must have felt when experiencing past events.[24]

A selective memory effect has also been shown in experiments that manipulate the desirability of personality types.[16] For example, a group of participants were shown evidence that extroverted people are more successful than introverts. Another group were told the opposite. In a subsequent, apparently unrelated study, participants were asked to recall events from their lives in which they had been either introverted or extroverted. Each group of participants provided more memories connecting themselves with the more desirable personality type, and recalled those memories more quickly.[16]

Another study showed how selective memory can maintain belief in extrasensory perception (ESP). Believers and disbelievers were each shown descriptions of ESP experiments. Half of each group were told that the experimental results supported the existence of ESP, while the others were told they did not. In a subsequent test, participants recalled the material accurately, apart from believers who had read the non-supportive evidence. This group remembered significantly less information and some of them incorrectly remembered the results as supporting ESP.[25]

Explanations

Confirmation bias was once believed to be correlated with intelligence; however, as Michael Shermer observed, "Smart people believe weird things because they are skilled at defending beliefs they arrived at for non-smart reasons."[17] It appears that this bias can cause an inability to effectively and logically evaluate the opposite side of an argument. In other words, it is an absence of "active open-mindedness," meaning the active search for why an initial idea may be wrong, rather than a lack of intelligence per se.[3]

How people view "what makes a good argument" can influence the way a person formulates their own arguments. In a study investigating individual differences of argumentation schema, participants were asked to write essays either for or against their preferred side of an argument. They were given research instructions that took either a balanced or an unrestricted approach. The balanced-research instructions directed participants to create a "balanced" argument, i.e., that included both pros and cons; the unrestricted-research instructions included nothing on how to create the argument.[26]

Overall, the results revealed that the balanced-research instructions significantly increased the incidence of opposing information in arguments. These data also revealed that personal belief was not a source of myside bias. This evidence is consistent with Baron's understanding—that people's opinions about what makes good thinking can influence how arguments are generated.[26]

Explanations for confirmation bias also include wishful thinking and the limited human capacity to process information. Another possibility is that people show confirmation bias because they are pragmatically assessing the costs of being wrong, rather than investigating in a neutral, scientific way.

Positive test strategy

Klayman and Ha argued that the Wason experiments do not actually demonstrate a bias towards confirmation, but instead a tendency to make tests consistent with the working hypothesis.[18] They called this the "positive test strategy."[16] This strategy is an example of a heuristic: a reasoning shortcut that is imperfect but easy to compute.[2] Klayman and Ha used Bayesian probability and information theory as their standard of hypothesis-testing, rather than the falsificationism used by Wason. According to these ideas, each answer to a question yields a different amount of information, which depends on the person's prior beliefs. Thus a scientific test of a hypothesis is one that is expected to produce the most information. Since the information content depends on initial probabilities, a positive test can either be highly informative or uninformative. Klayman and Ha argued that when people think about realistic problems, they are looking for a specific answer with a small initial probability. In this case, positive tests are usually more informative than negative tests.[18] However, in Wason's rule discovery task the answer—three numbers in ascending order—is very broad, so positive tests are unlikely to yield informative answers. Klayman and Ha supported their analysis by citing an experiment that used the labels "DAX" and "MED" in place of "fits the rule" and "doesn't fit the rule." This avoided implying that the aim was to find a low-probability rule. Participants had much more success with this version of the experiment.[27]

Information processing explanations

There are several information processing explanations of confirmation bias.

Cognitive versus motivational

Explanations for biased evidence processing include cognitive and motivational mechanisms.

Cognitive explanations for confirmation bias are based on limitations in people's ability to handle complex tasks, and the shortcuts, or heuristics, that they use. For example, people may judge the reliability of evidence by using the "availability heuristic" that is, how readily a particular idea comes to mind.[16] It is also possible that people can only focus on one thought at a time, so find it difficult to test alternative hypotheses in parallel.[15] Another heuristic is the positive test strategy identified by Klayman and Ha, in which people test a hypothesis by examining cases where they expect a property or event to occur. This heuristic avoids the difficult or impossible task of working out how diagnostic each possible question will be. However, it is not universally reliable, so people can overlook challenges to their existing beliefs.[18][15]

Motivational explanations involve an effect of desire on belief.[15][3] People prefer positive thoughts over negative ones, the so-called the "Pollyanna principle."[28] Applied to arguments or sources of evidence, this could explain why desired conclusions are more likely to be believed true. Although consistency is a desirable feature of attitudes, an excessive drive for consistency is another potential source of bias because it may prevent people from neutrally evaluating new, surprising information.

Social psychologist Ziva Kunda combined the cognitive and motivational theories, arguing that motivation creates the bias, but cognitive factors determine the size of the effect.[15]

Cost-benefit

Explanations in terms of cost-benefit analysis assume that people do not just test hypotheses in a disinterested way, but assess the costs of different errors.[29] Yaacov Trope and Akiva Liberman suggested that people compare the two different kinds of error: accepting a false hypothesis or rejecting a true hypothesis when seeking evidence. For instance, someone who underestimates a friend's honesty might treat him or her suspiciously and so undermine the friendship. Overestimating the friend's honesty may also be costly, but less so. In this case, it would be rational to seek, evaluate, or remember evidence of their honesty in a biased way.[30]

In this way, confirmation bias an be viewed as a social skill.[31] For example, when someone gives an initial impression of being introverted or extroverted, questions that match that impression come across as more empathic. This suggests that when talking to someone who seems to be an introvert, it is a sign of better social skills to ask, "Do you feel awkward in social situations?" rather than, "Do you like noisy parties?" The connection between confirmation bias and social skills was corroborated by a study of how college students get to know other people. Highly self-monitoring students, who are more sensitive to their environment and to social norms, asked more matching questions when interviewing a high-status staff member than when getting to know fellow students.[31]

Exploratory versus confirmatory

Psychologists Jennifer Lerner and Philip Tetlock distinguished two different kinds of thinking process. "Exploratory thought" neutrally considers multiple points of view and tries to anticipate all possible objections to a particular position, while "confirmatory thought" seeks to justify a specific point of view, namely a confirmation bias. They suggest that when people expect to justify their position to others whose views they already know, they will tend to adopt a similar position to those people, and then use confirmatory thought to bolster their own credibility. However, if the external parties are overly aggressive or critical, people will disengage from thought altogether, and simply assert their personal opinions without justification. Lerner and Tetlock argued that people only push themselves to think critically and logically when they know in advance they will need to explain themselves to others who are well-informed, genuinely interested in the truth, and whose views they do not already know. Because those conditions rarely exist, most people use confirmatory thought most of the time.[32]

Make-believe

Developmental psychologist Eve Whitmore has argued that beliefs and biases involved in confirmation bias have their roots in childhood coping through make-believe:

From the beginning, parents reinforce to their children the skill of pretending in order to cope with the realities inherent in culture and society. Children’s learning about make-believe and mastery of it becomes the basis for more complex forms of self-deception and illusion into adulthood.[33]

The friction brought on by questioning as an adolescent with developing critical thinking can lead to the rationalization of false beliefs, and the habit of such rationalization can become unconscious over the years.

Real-world effects

There are numerous real life situations in which confirmation bias affects people's decision-making. A striking illustration of confirmation bias in the real world is numerological pyramidology: the practice of finding meaning in the proportions of the Egyptian pyramids. There are many different length measurements that can be made of, for example, the Great Pyramid of Giza and many ways to combine or manipulate them. Hence it is almost inevitable that people who look at these numbers selectively will find superficially impressive correspondences, for example with the dimensions of the Earth.[15]

Confirmation bias is not only widespread, but can lead to unfortunate consequences:

If one were to attempt to identify a single problematic aspect of human reasoning that deserves attention above all others, the confirmation bias would have to be among the candidates for consideration. Many have written about this bias, and it appears to be sufficiently strong and pervasive that one is led to wonder whether the bias, by itself, might account for a significant fraction of the disputes, altercations, and misunderstandings that occur among individuals, groups, and nations.[34]

Attempts have been made to discover ways to overcome, or at least attenuate, the effects of confirmation bias in some common situation.

Conflict and law

Mock trials allow researchers to examine confirmation biases in a realistic setting.

Confirmation bias can be a factor in creating or extending conflicts, from emotionally charged debates to wars: by interpreting the evidence in their favor, each opposing party can become overconfident that it is in the stronger position.[3] On the other hand, confirmation bias can result in people ignoring or misinterpreting the signs of an imminent or incipient conflict. For example, psychologists Stuart Sutherland and Thomas Kida have each argued that U.S. Navy Admiral Husband E. Kimmel showed confirmation bias when playing down the first signs of the Japanese attack on Pearl Harbor.[14][17] In police investigations, a detective may identify a suspect early in an investigation, but then sometimes largely seek supporting or confirming evidence, ignoring or downplaying falsifying evidence.

Reasoning in judicial and political contexts is sometimes subconsciously biased, favoring conclusions that judges, juries, or governments have already committed to.[15] Since the evidence in a jury trial can be complex, and jurors often reach decisions about the verdict early on, it is reasonable to expect an attitude polarization effect. The prediction that jurors will become more extreme in their views as they see more evidence has been borne out in experiments with mock trials.[35]

Finance

Confirmation bias can lead investors to be overconfident, ignoring evidence that their strategies will lose money.[1][36] To combat the effect of confirmation bias, investors can try to adopt a contrary viewpoint "for the sake of argument."[37] In one technique, they imagine that their investments have collapsed and ask themselves why this might happen.[1]

Mass delusions

Confirmation bias can play a key role in the propagation of mass delusions. Witch trials are frequently cited as an example.[38]

Medicine and health

Confirmation bias has significant impact on clinical decision-making by medical general practitioners (GPs) and medical specialists. A GP may make a diagnosis early on during an examination, and then seek confirming evidence rather than falsifying evidence. In emergency medicine, because of time pressure, there is a high density of decision-making, and shortcuts are frequently applied.

Raymond Nickerson, a psychologist, blames confirmation bias for the ineffective medical procedures that were used for centuries before the arrival of scientific medicine.[15] If a patient recovered, medical authorities counted the treatment as successful, rather than looking for alternative explanations such as that the disease had run its natural course. Biased assimilation is a factor in the modern appeal of alternative medicine, whose proponents are swayed by positive anecdotal evidence but treat scientific evidence hyper-critically.[39]

Paranormal beliefs

One factor in the appeal of alleged psychic readings is that listeners apply a confirmation bias which fits the psychic's statements to their own lives. By making a large number of ambiguous statements in each sitting, the psychic gives the client more opportunities to find a match. This is one of the techniques of cold reading, with which a psychic can deliver a subjectively impressive reading without any prior information about the client.[40]

Scientific research

A distinguishing feature of scientific thinking is the search for confirming or supportive evidence (inductive reasoning) as well as falsifying evidence (deductive reasoning). Inductive research in particular is susceptible to confirmation bias.

Many times in the history of science, scientists have resisted new discoveries by selectively interpreting or ignoring unfavorable data. In practice, researchers may misunderstand, misinterpret, or not read at all studies that contradict their preconceptions, or wrongly cite them anyway as if they actually supported their claims. Further, confirmation biases can sustain scientific theories or research programs in the face of inadequate or even contradictory evidence.[14] The discipline of parapsychology is often cited as an example in the context of whether it is a pseudoscience:

Some of the worst examples of confirmation bias are in research on parapsychology ... Arguably, there is a whole field here with no powerful confirming data at all. But people want to believe, and so they find ways to believe.[41]

An experimenter's confirmation bias can potentially affect which data are reported. Data that conflict with the experimenter's expectations may be more readily discarded as unreliable, producing the so-called file drawer effect. To combat this tendency, scientific training teaches ways to prevent bias. For example, experimental design of randomized controlled trials (coupled with their systematic review) aims to minimize sources of bias.[41]

The social process of peer review aims to mitigate the effect of individual scientists' biases, even though the peer review process itself may be susceptible to such biases[42] Confirmation bias may thus be especially harmful to objective evaluations regarding nonconforming results since biased individuals may regard opposing evidence to be weak in principle and give little serious thought to revising their beliefs. Scientific innovators often meet with resistance from the scientific community, and research presenting controversial results frequently receives harsh peer review.[43]

Social media and searches

In social media and personalized searches by internet search engines such as Google and Bing, confirmation bias is amplified by the use of filter bubbles, or "algorithmic editing," which displays to individuals only information they are likely to agree with, while excluding opposing views. A suggested consequence is the degrading of democracy given that this "algorithmic editing" removes diverse viewpoints and information, and that unless filter bubble algorithms are removed voters will be unable to make fully informed political decisions.[44]

The rise of social media has contributed greatly to the rapid spread of fake news, that is, false and misleading information that is presented as credible news from a seemingly reliable source. Confirmation bias in the form of selecting or reinterpreting evidence to support one's beliefs is one of the main hurdles cited as to why critical thinking goes astray in these circumstances:

The key to people’s accepting fake news as true, despite evidence to the contrary, is a phenomenon known as confirmation bias, or the tendency for people to seek and accept information that confirms their existing beliefs while rejecting or ignoring that which contradicts those beliefs. ... one could say the brain is hardwired to accept, reject, misremember or distort information based on whether it is viewed as accepting of or threatening to existing beliefs.”[33]

In combating the spread of fake news, social media sites have considered turning toward "digital nudging." This includes nudging of information and nudging of presentation. Nudging of information entails social media sites providing a disclaimer or label questioning or warning users of the validity of the source while nudging of presentation includes exposing users to new information which they may not have sought out but could introduce them to viewpoints that may combat their own confirmation biases.[45]

Associated effects

Confirmation bias has been invoked to explain four specific effects:

  • Attitude polarization (when a disagreement becomes more extreme even though the different parties are exposed to the same evidence)
  • Belief perseverance (when beliefs persist after the evidence for them is shown to be false)
  • Irrational primacy effect (a greater reliance on information encountered early in a series)
  • Illusory correlation (when people falsely perceive an association between two events or situations).

Polarization of opinion

Attitude polarization, also known as belief polarization, is a phenomenon in which a disagreement becomes more extreme as the different parties consider evidence on the issue. It is one of the effects of confirmation bias: the tendency of people to search for and interpret evidence selectively, to reinforce their current beliefs or attitudes.[19] When people encounter ambiguous evidence, this bias can result in each of them interpreting it as support for their existing attitudes, widening rather than narrowing the disagreement between them.[20]

The related backfire effect refers to the way people may hold even more strongly onto their beliefs when shown contradictory evidence, which they reject. The phrase was coined by Brendan Nyhan and Jason Reifler in 2010.[46]

Persistence of discredited beliefs

Beliefs can survive potent logical or empirical challenges. They can survive and even be bolstered by evidence that most uncommitted observers would agree logically demands some weakening of such beliefs. They can even survive the total destruction of their original evidential bases.
——Lee Ross and Craig Anderson[47]

Confirmation bias provides one plausible explanation for the persistence of beliefs when the initial evidence for them is removed or when they have been sharply contradicted.[15] This belief perseverance effect was first demonstrated experimentally by Festinger and colleagues, who described the effect as Cognitive dissonance. These psychologists spent time with a cult whose members were convinced that the world would end on December 21, 1954. After the prediction failed, most believers still clung to their faith. Their book describing this research is aptly named When Prophecy Fails.[48]

The term belief perseverance was coined in a series of experiments using what is called the "debriefing paradigm": participants read fake evidence for a hypothesis, their attitude change is measured, then the fakery is exposed in detail. Their attitudes are then measured once more to see if their belief returns to its previous level.[47] A common finding is that at least some of the initial belief remains even after a full debriefing.[16]

Preference for early information

Experiments have shown that information is weighted more strongly when it appears early in a series, even when the order is unimportant. For example, people form a more positive impression of someone described as "intelligent, industrious, impulsive, critical, stubborn, envious" than when they are given the same words in reverse order. This irrational primacy effect is independent of the primacy effect in memory in which the earlier items in a series leave a stronger memory trace.[3] Biased interpretation offers an explanation for this effect: seeing the initial evidence, people form a working hypothesis that affects how they interpret the rest of the information.[15]

Illusory association between events

Illusory correlation is the tendency to see non-existent correlations in a set of data. This tendency was first demonstrated in a series of experiments in the late 1960s. In one experiment, participants read a set of psychiatric case studies, including responses to the Rorschach inkblot test. The participants reported that the homosexual men in the set were more likely to report seeing buttocks or sexually ambiguous figures in the inkblots. In fact the fictional case studies had been constructed so that the homosexual men were no more likely to report this imagery or, in one version of the experiment, were less likely to report it than heterosexual men. In a survey, a group of experienced psychoanalysts reported the same set of illusory associations with homosexuality.[19][2]

Another study recorded the symptoms experienced by arthritic patients, along with weather conditions over a 15-month period. Nearly all the patients reported that their pains were correlated with weather conditions, although the real correlation was zero.[49]

Example
Days Rain No rain
Arthritis 14 6
No arthritis 7 2

In judging whether the two events, illness and bad weather, were correlated, participants relied heavily on the number of positive-positive cases: in this example, instances of both pain and bad weather. They paid relatively little attention to the other kinds of observation (of no pain and/or good weather).[2] This parallels the reliance on positive tests in hypothesis testing. It may also reflect selective recall, in that people may have a sense that two events are correlated because it is easier to recall times when they happened together.[16]

Notes

  1. 1.0 1.1 1.2 Jason Zweig, How to ignore the yes-man in your head The Wall Street Journal, November 19, 2009. Retrieved May 4, 2023.
  2. 2.0 2.1 2.2 2.3 2.4 Scott Plous, The Psychology of Judgment and Decision Making (McGraw-Hill, 1993, ISBN 978-0070504776).
  3. 3.0 3.1 3.2 3.3 3.4 3.5 3.6 Jonathan Baron, Thinking and Deciding (New York: Cambridge University Press, 2000, ISBN 978-0521650304).
  4. Thucydides, History of the Peloponnesian War (Penguin Classic, 1972 (original 431 B.C.E.), ISBN 978-0140440393).
  5. Dante Alighieri, trans. Allen Mandelbaum, The Divine Comedy: Inferno; Purgatorio; Paradiso (Everyman's Library, 1995, ISBN 978-0679433132), "Paradiso" canto XIII: 118–120.
  6. Ibn Khaldun, The Muqadimmah (Pantheon Books, 1958, ISBN 978-0710001955).
  7. 7.0 7.1 Francis Bacon, Novum Organum (Legare Street Press, 2022 (original 1620), ISBN 978-1015466555).
  8. Arthur Schopenhauer, The World as Will and Presentation Volume 2 (Routledge, 2010, ISBN 0321355806).
  9. Leo Tolstoy, trans. Constance Garnett, The Kingdom of God Is Within You Wentworth Press, 2016 (original 1894), ISBN 978-1371256289).
  10. Leo Tolstoy, trans. Aylmer Maude, What Is Art? (Hackett Publishing Company, Inc., 1996 (original 1897), ISBN 978-0872202955).
  11. 11.0 11.1 Peter C. Wason, On the Failure to Eliminate Hypotheses in a Conceptual Task Quarterly Journal of Experimental Psychology 12(3) (1960): 129–140. Retrieved May 6, 2023.
  12. 12.0 12.1 Fenna H. Poletiek, Hypothesis-testing Behaviour (Psychology Press, 2000, ISBN 978-1841691596).
  13. Peter C. Wason, Reasoning about a rule Quarterly Journal of Experimental Psychology 20(3) (1968):273–281. Retrieved May 6, 2023.
  14. 14.0 14.1 14.2 Stuart Sutherland, Irrationality: The Enemy within (Pinter & Martin, 2013, ISBN 978-1780660257).
  15. 15.0 15.1 15.2 15.3 15.4 15.5 15.6 15.7 15.8 15.9 Raymond S. Nickerson, Argumentation (Cambridge University Press, 2020, ISBN 978-1108799874).
  16. 16.0 16.1 16.2 16.3 16.4 16.5 16.6 16.7 16.8 Ziva Kunda, Social Cognition: Making Sense of People (Bradford Book, 1999, ISBN 0262611430).
  17. 17.0 17.1 17.2 Thomas E. Kida, Don't Believe Everything You Think: The 6 Basic Mistakes We Make in Thinking (Prometheus, 2006, ISBN 1591024080).
  18. 18.0 18.1 18.2 18.3 Joshua Klayman and Young-Won Ha, Confirmation, Disconfirmation, and Information in Hypothesis Testing Psychological Review 94(2) (1987): 211-228. Retrieved May 4, 2023.
  19. 19.0 19.1 19.2 Cordelia Fine, A Mind of Its Own : How Your Brain Distorts and Deceives (Icon Books, 2005, ISBN 978-1840466782).
  20. 20.0 20.1 20.2 Charles G. Lord, Lee Ross, and Mark R. Lepper, Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence Journal of Personality and Social Psychology 37(11) (1979):2098–2109. Retrieved May 9, 2023.
  21. Kari Edwards and Edward E. Smith, A Disconfirmation Bias in the Evaluation of Arguments Journal of Personality and Social Psychology 71(1) (1996):5-24. Retrieved May 5, 2023.
  22. Drew Westen, Pavel S. Blagov, Keith Harenski, Clint Kilts, and Stephan Hamann, Neural bases of motivated reasoning: An fMRI study of emotional constraints on partisan political judgment in the 2004 U.S. Presidential election, Journal of Cognitive Neuroscience 18(11) (2006): 1947–1958.
  23. David L. Hamilton (ed.), Social Cognition: Key Readings (Psychology Press, 2005, ISBN 978-0863775918).
  24. Linda J. Levine, Vincent Prohaska, Stewart L. Burgess, John A. Rice, and Tracy M. Laulhere, Remembering past emotions: The role of current appraisals, Cognition and Emotion 15(4) (2001):393–417.
  25. Stuart A. Vyse, Believing in Magic: The Psychology of Superstition (Oxford University Press, 2000, ISBN 978-0195136340).
  26. 26.0 26.1 Christopher Wolfe and Anne Britt, "The locus of the myside bias in written argumentation" Thinking & Reasoning 14 (2008): 1–27.
  27. Maria Lewicka, "Confirmation bias: Cognitive error or adaptive strategy of action control?" in Mirosław Kofta, Gifford Weary, and Grzegorz Sedek (eds.), Personal Control in Action: Cognitive and Motivational Mechanisms (Springer, 1998, ISBN 978-0306457203), 233–255.
  28. Margaret W. Matilin, "Pollyanna Principle" in Rüdiger F. Pohl (ed.), Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory (Psychology Press, 2012, ISBN 978-0415646758).
  29. Margit E. Oswald and Stefan Grosjean "Confirmation bias" in Rüdiger F. Pohl (ed.), Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory (Psychology Press, 2012, ISBN 978-0415646758).
  30. Y. Trope and A. Liberman, "Social hypothesis testing: Cognitive and motivational mechanisms in E. Tory Higgins and Arie W. Kruglanski (eds.), Social Psychology: Handbook of Basic Principles (The Guilford Press, 1996, ISBN 978-1572301009).
  31. 31.0 31.1 Benoit Dardenne and Jacques-Philippe Leyens, Confirmation bias as a social skill Personality and Social Psychology Bulletin 21(11) (1995): 1229–1239. Retrieved May 8, 2023.
  32. Jonathan Haidt, The Righteous Mind: Why good people are divided by politics and religion (Penguin Books Ltd, 2013, ISBN 978-0141039169).
  33. 33.0 33.1 Why we're susceptible to fake news – and how to defend against it American Psychological Association, August 10, 2018. Retrieved May 8, 2023.
  34. Raymond S. Nickerson, Confirmation Bias: A Ubiquitous Phenomenon in Many Guises Review of General Psychology 2(2) (1998): 175-220. Retrieved May 10, 2023.
  35. Diane F. Halpern, Critical Thinking Across the Curriculum: A Brief Edition of Thought and Knowledge (Routledge, 1997, ISBN 978-0805827316).
  36. Michael M. Pompian, Behavioral Finance and Wealth Management: How to Build Optimal Portfolios That Account for Investor Biases (John Wiley and Sons, 2006, ISBN 0471745170).
  37. David Krueger and John David Mann, The Secret Language of Money: How to Make Smarter Financial Decisions and Live a Richer Life (McGraw Hill, 2009, ISBN 978-0071623391).
  38. Hugh R. Trevor-Roper, The European witch-craze of the Sixteenth and Seventeenth Centuries (Penguin Books, 1991, ISBN 978-0140137187).
  39. Ben Goldacre, Bad Science (Fourth Estate, 2008, ISBN 978-0007240197).
  40. Jonathan C. Smith, Pseudoscience and Extraordinary Claims of the Paranormal: A Critical Thinker's Toolkit (Wiley-Blackwell, 2009, ISBN 978-1405181228).
  41. 41.0 41.1 Robert J. Sternberg, Henry L. Roediger III, and Diane F. Halpern (eds.), Critical Thinking in Psychology (Cambridge University Press, 2006, ISBN 978-0521608343).
  42. Steven James Bartlett, "The psychology of abuse in publishing: Peer review and editorial bias," Normality Does Not Equal Mental Health: The Need to Look Elsewhere for Standards of Good Psychological Health (Santa Barbara, CA: Praeger, 2011, ISBN 978-0313399312), 147–177.
  43. David F. Horrobin, The philosophical basis of peer review and the suppression of innovation Journal of the American Medical Association 263(10) (1990):1438–1441. Retrieved May 9, 2023.
  44. Eli Pariser, Beware online "filter bubbles" TED, May 2, 2011. Retrieved May 9, 2023.
  45. Calum Thornhill, Quentin Meeus, Jeroen Peperkamp, and Bettina Berendt, "A digital nudge to counter confirmation bias," Frontiers in Big Data 2 (2019):11.
  46. Brendan Nyhan and Jason Reifler, When corrections fail: The persistence of political misperceptions, Political Behavior 32 (2010): 303–320.
  47. 47.0 47.1 Lee Ross and Craig A. Anderson, "Judgment under uncertainty: Heuristics and biases" in Daniel Kahneman, Paul Slovic, and Amos Tversky (eds.), Judgment Under Uncertainty: Heuristics and Biases (Cambridge University Press, 1982, ISBN 978-0521284141), 129-152.
  48. Leon Festinger, Henry W. Riecken, and Stanley Schachter, When Prophecy Fails: A Social and Psychological Study of a Modern Group That Predicted the Destruction of the World (Harper-Torchbooks, 1956, ISBN 0061311324).
  49. D.A. Redelmeir and Amos Tversky, On the belief that arthritis pain is related to the weatherProceedings of the National Academy of Sciences 93(7) (1996):2895–2896. Retrieved May 10, 2023.

References
ISBN links support NWE through referral fees

  • Alighieri, Dante, trans. Allen Mandelbaum. The Divine Comedy: Inferno; Purgatorio; Paradiso. Everyman's Library, 1995. ISBN 978-0679433132
  • Bacon, Francis. Novum Organum. Legare Street Press, 2022 (original 1620). ISBN 978-1015466555)
  • Baron, Jonathan. Thinking and Deciding. New York: Cambridge University Press, 2000. ISBN 978-0521650304
  • Bartlett, Steven James. Normality Does Not Equal Mental Health: The Need to Look Elsewhere for Standards of Good Psychological Health. Santa Barbara, CA: Praeger, 2011. ISBN 978-0313399312
  • Festinger, Leon, Henry W. Riecken, and Stanley Schachter. When Prophecy Fails: A Social and Psychological Study of a Modern Group That Predicted the Destruction of the World. Harper-Torchbooks, 1956. ISBN 0061311324
  • Fine, Cordelia. A Mind of Its Own : How Your Brain Distorts and Deceives. Icon Books, 2005. ISBN 978-1840466782
  • Goldacre, Ben. Bad Science. Fourth Estate, 2008. ISBN 978-0007240197
  • Haidt, Jonathan. The Righteous Mind: Why good people are divided by politics and religion. Penguin Books Ltd, 2013. ISBN 978-0141039169
  • Halpern, Diane F. Critical Thinking Across the Curriculum: A Brief Edition of Thought and Knowledge. Routledge, 1997. ISBN 978-0805827316
  • Hamilton, David L. (ed.). Social Cognition: Key Readings. Psychology Press, 2005. ISBN 978-0863775918
  • Higgins, E. Tory, and Arie W. Kruglanski (eds.). Social Psychology: Handbook of Basic Principles. The Guilford Press, 1996. ISBN 978-1572301009
  • Kahneman, Daniel, Paul Slovic, and Amos Tversky (eds.). Judgment Under Uncertainty: Heuristics and Biases. Cambridge University Press, 1982. ISBN 978-0521284141
  • Khaldun, Ibn. The Muqadimmah. Pantheon Books, 1958. ISBN 978-0710001955
  • Kida, Thomas E. Don't Believe Everything You Think: The 6 Basic Mistakes We Make in Thinking. Prometheus, 2006. ISBN 1591024080
  • Kofta, Mirosław, Gifford Weary, and Grzegorz Sedek. Personal Control in Action: Cognitive and Motivational Mechanisms. Springer, 1998. ISBN 978-0306457203
  • Krueger, David, and John David Mann. The Secret Language of Money: How to Make Smarter Financial Decisions and Live a Richer Life. McGraw Hill, 2009. ISBN 978-0071623391
  • Kunda, Ziva. Social Cognition: Making Sense of People. Bradford Book, 1999. ISBN 0262611430
  • Nickerson, Raymond S. Argumentation. Cambridge University Press, 2020. ISBN 978-1108799874
  • Plous, Scott. The Psychology of Judgment and Decision Making. McGraw-Hill, 1993. ISBN 978-0070504776
  • Pohl, Rüdiger F. (ed.). Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory. Psychology Press, 2012. ISBN 978-0415646758
  • Poletiek, Fenna H. Hypothesis-testing Behaviour. Psychology Press, 2000. ISBN 978-1841691596
  • Pompian, Michael M. Behavioral Finance and Wealth Management: How to Build Optimal Portfolios That Account for Investor Biases.(ohn Wiley and Sons, 2006. ISBN 0471745170
  • Schopenhauer, Arthur. The World as Will and Presentation Volume 2. Routledge, 2010. ISBN 0321355806
  • Smith, Jonathan C. Pseudoscience and Extraordinary Claims of the Paranormal: A Critical Thinker's Toolkit. Wiley-Blackwell, 2009. ISBN 978-1405181228
  • Sternberg, Robert J., Henry L. Roediger III, and Diane F. Halpern (eds.). Critical Thinking in Psychology. Cambridge University Press, 2006. ISBN 978-0521608343
  • Sutherland, Stuart. Irrationality: The Enemy within. Pinter & Martin, 2013. ISBN 978-1780660257
  • Thucydides. History of the Peloponnesian War. Penguin Classic, 1972 (original 431 B.C.E.). ISBN 978-0140440393
  • Tolstoy, Leo, trans. Constance Garnett. The Kingdom of God Is Within You. Wentworth Press, 2016 (original 1894). ISBN 978-
  • Tolstoy, Leo, trans. Aylmer Maude, What Is Art? Hackett Publishing Company, Inc., 1996 (original 1897). ISBN 978-0872202955
  • Vyse, Stuart A. Believing in Magic: The Psychology of Superstition. Oxford University Press, 2000. ISBN 978-0195136340

External links

All links retrieved May 10, 2023.

Credits

New World Encyclopedia writers and editors rewrote and completed the Wikipedia article in accordance with New World Encyclopedia standards. This article abides by terms of the Creative Commons CC-by-sa 3.0 License (CC-by-sa), which may be used and disseminated with proper attribution. Credit is due under the terms of this license that can reference both the New World Encyclopedia contributors and the selfless volunteer contributors of the Wikimedia Foundation. To cite this article click here for a list of acceptable citing formats.The history of earlier contributions by wikipedians is accessible to researchers here:

The history of this article since it was imported to New World Encyclopedia:

Note: Some restrictions may apply to use of individual images which are separately licensed.