Talk:Confirmation bias/Archive 2

Recent change to definition
I've removed these two sentences you've added. Let's discuss them here. "The standard method of evaluating facts is to collect data, and then interpret the full set of data." It's not clear what this means. Is standard meant in a normative or descriptive sense? If it's normative but not what people do, then how is it "standard"? If it's descriptive, what is the evidence that this is how people evaluate facts? "However, if people have strong beliefs, they tend to cherry-pick only the facts that fit (confirm) their beliefs, and ignore the remainder of the data." The lead of the article is meant to summarise the article content, but this sentence contradicts it. The studies summarised in the article show people do not have this tendency. This is a folk misconception about Confirmation bias, not the scientific understanding described in the sources. MartinPoulter (talk) 18:23, 12 May 2020 (UTC)


 * Hi Martin! Confirmation bias is an important page on Wikipedia (because of the current polarisation in our rapidly changing society), and it is being viewed 2000 times per day. I appreciate your efforts to restructure the page in 2010. I share your view that critical thinking is important and that the initial summary should be a summary of the remainder of the article. However, I think this topic needs gradual restructuring again. Wikipedia is a tool for public education, and the concepts need to be explained simply, with technical parts perhaps removed to end notes (or some other solution). The current article has too much detail and is too technical in places, and most readers will not bother to read (or understand) much of it. Even my psychology students find it a hard slog - and what about the general public? This is a long-term interest of mine. We need to be more explicit about the role confirmation bias plays in scientific fraud, denialism, pseudoscience and conspiracy theories (Lee McIntyre's book "The Scientific Attitude" has been an inspiration for me about that). Kookaburra17 12:13, 13 May 2020 (UTC) — Preceding unsigned comment added by Kookaburra17 (talk • contribs)

This article is very well-formatted and was a great source for understanding confirmation bias. I love the detail that is put into each section, and the use of media was very helpful in understanding the material. There may be a few grammatical errors scattered throughout the article, but I plan to work through the entire article and see if I can find any to fix. Boggessh (talk) 01:17, 2 October 2020 (UTC)

Bias confirmed ...
I just knew this was going to be a great article ... Daniel Case (talk) 23:43, 1 December 2020 (UTC)

Citations needed
Multiple sources are missing: the "Definition and context" section (last paragraph), the "Biased memory recall of information" section (fourth paragraph, last sentence), the "Social media" section (second paragraph, last sentence), and the "Recruitment and selection" section (last sentence). These issues have to be addressed to keep the featured status of this article. Wretchskull (talk) 12:02, 8 February 2021 (UTC)

"Criticism" section
Moving this anonymously added text from the article to Talk because it's not appropriate in its current form, but some insights from it probably belong in the article. A good quality article shouldn't have a "Criticism" section; it should fairly represent the whole evidence about the topic. By the same token, if one publication argues that the academic consensus is wrong, it doesn't deserve equal representation with decades of academic consensus and highly replicated results. This text is in dubious English ("search for truth under climate change conditions"?) and the citations are ill-formed. Most relevantly, the things that are "criticised" in these arguments are not confirmation bias. Confirmation bias is not "the allegation that brains evolved to confirm beliefs" and it is not "the claim that anger at an argument that contradicts a belief would be proof of the contradiction of the belief being the cause of anger", so although there are probably legitimate arguments being described here, they don't connect with the core of the article.

Criticisms
One criticism of the allegation that brains evolved to confirm beliefs is that in any environment where it was possible to survive without knowing the truth, a brain that only made the same processing steps leading to the decision but skipping confirmation steps after the decision had been fixed would make the same decisions to a lower cost of energy than a brain searching for confirmation. It is therefore argued that while evolution can select for brains that search for truth under climate change conditions, and for brains that make simple instinctual decisions without rationalization under stable climate conditions, evolution can never select for brains that first make irrational decisions and then rationalize them. It is also argued that since the same hypothesis can and often do make a wide range of predictions with different implications in different contexts, evolutionary psychology's cost and benefit analyses of false hypotheses are misguided in their assumption that the implications of one false hypothesis could be classified as "mild" or "severe" as if one hypothesis only made one or two predictions. There are also criticisms of alleged evidence for confirmation bias, such as the claim that anger at an argument that contradicts a belief would be proof of the contradiction of the belief being the cause of anger which is not the case since there are other possible aspects of the argument that can be the cause. In this context, experiments that allege to prove confirmation bias are criticized for not ruling out error sources such as imprecision of the argument being the cause of anger.

MartinPoulter (talk) 09:00, 16 April 2021 (UTC)

Can only be managed, but not eliminated
Can we get some info to back up the sentence "Confirmation bias cannot be eliminated entirely, but it can be managed, for example, by education and training in critical thinking skills."? Where is this info coming from? This sentence is near the top of the page. 208.66.148.87 (talk) 21:07, 13 September 2021 (UTC)

Kahneman's use of the Term
In Thinking, Fast and Slow (2011) Daniel Kahneman refers to confirmation bias in three places (pp 80, 324, 33), mapping it onto his "System 1 / System 2" framework. On page 80 he describes it as follows: 'The operations of associative memory contribute to a general "confirmation bias". When asked "Is Sam friendly?" different instances of Sam's behaviour will come to mind than would if you had been asked "Is Sam unfriendly?" '

Thus Kahneman uses "confirmation bias" for a sort of desire, in our brains, to answer "yes" to the question, which leads, as he says, to "uncritical acceptance of suggestions and exaggeration of the likelihood of extreme and improbable events" or, more succinctly, "System 1 is gullible and biased to believe." This occurs even for topics where we have no pre-existing opinion (he uses the example, 'whitefish eat candy').

Kahneman traces this insight to Spinoza (1600s) and Gilbert (1990), but there is no indication that they used the term "confirmation bias." Mrdavenport (talk) 19:12, 14 October 2021 (UTC)