User:Tbere43/sandbox

= Intro =

Knowledge neglect refers to cases in which people fail to retrieve and apply previously stored knowledge to a current situation (Marsh & Umanath, 2014). The phrase “knowledge neglect” was coined by Elizabeth Marsh and Sharda Umanath of Duke University in a 2014 study, which examined people’s failure to notice errors they encounter that directly contradict their previously learned knowledge.

One of the most famous examples of knowledge neglect is the Moses illusion, which was first reported in 1981 by Erickson and Mattson (Bredart & Modolo, 1988). The Moses illusion is prompted when participants are asked the question, “How many animals of each kind did Moses take on the ark?” If a participant answers “two,” this is an example of knowledge neglect, because they have failed to apply their previously stored knowledge that Noah, not Moses, was the individual who built the ark and gathered the animals.

= Examples/Studies =
 * Marsh and Umanath

= Hypothesized Causes =

One plausible reason that people fall victim to knowledge neglect is the inherent truth bias: unless led to believe otherwise, people tend to believe what they are told. For this reason, individuals may fall victim to knowledge neglect simply because they are unaware that the information being presented to them is false. Another possible cause is that a reader's attention is fragmented and they do not expect the information they are reading to be incorrect. While reading stories or detecting and answering distorted questions, the participant is doing a lot and may not have the processing resources available to assess whether the information is true (Marsh & Umanath, 2014). For example, the reader of a story is processing a plot line, keeping track of characters, and, more generally, building a mental model of the text (e.g., Bower & Morrow, 1990; Johnson-Laird, 1983); catching contradictions with stored knowledge is thus not the reader's main focus (Marsh & Umanath, 2014).

Relatedly, Gilbert and colleagues have argued that people automatically believe information when they read it, and that a second processing step is required to “unbelieve” information and label it as false (see Gilbert, 1991, for a review).


= Importance =

= References =

Bottoms, H. C., Eslick, A. N., & Marsh, E. J. (2010). Memory and the Moses illusion: Failures to detect contradictions with stored knowledge yield negative memorial consequences. Memory, 18(6), 670-678.

Park, H., & Reder, L. M. (2004). Moses illusion: Implications for human cognition. In R. F. Pohl (Ed.), Cognitive illusions: A handbook on fallacies and biases in thinking, judgment, and memory (pp. 275-292). Hove, UK: Psychology Press.

Reder, L. M., & Kusbit, G. W. (1991). Locus of the Moses illusion: Imperfect encoding, retrieval, or match? Journal of Memory and Language, 30, 385-406.

Fazio, L. K., Brashier, N. M., Payne, B. K., & Marsh, E. J. (2015). Knowledge does not protect against illusory truth. Journal of Experimental Psychology: General, 144(5), 993-1002.

Cantor, A. D., & Marsh, E. J. (2016). Expertise effects in the Moses illusion: Detecting contradictions with stored knowledge. Memory. DOI: 10.1080/09658211.2016.1152377