List of cognitive biases



A cognitive bias is a pattern of deviation in judgement that occurs in particular situations (see also cognitive distortion and the lists of thinking-related topics). Implicit in the concept of a "pattern of deviation" is a standard of comparison; this may be the judgement of people outside those particular situations, or a set of independently verifiable facts. The existence of some of these cognitive biases has been verified empirically in the field of psychology; others are widespread beliefs that may themselves be a consequence of cognitive bias.

Cognitive biases are instances of evolved mental behaviour. Some are presumably adaptive, for example, because they lead to more effective actions or enable faster decisions. Others presumably result from a lack of appropriate mental mechanisms, or from the misapplication of a mechanism that is adaptive under different circumstances.

Decision-making and behavioral biases

Many of these biases are studied for how they affect belief formation, business decisions, and scientific research.

  • Bandwagon effect — the tendency to do (or believe) things because many other people do (or believe) the same. Related to groupthink, herd behaviour, and manias.
  • Base rate fallacy — ignoring available statistical (base rate) data in favor of particulars (a Bayes' rule sketch follows this list).
  • Bias blind spot — the tendency not to compensate for one's own cognitive biases.
  • Choice-supportive bias — the tendency to remember one's choices as better than they actually were.
  • Closed world assumption — the presumption that what is not currently known to be true is false.
  • Confirmation bias — the tendency to search for or interpret information in a way that confirms one's preconceptions.
  • Congruence bias — the tendency to test hypotheses exclusively through direct testing, in contrast to tests of possible alternative hypotheses.
  • Contrast effect — the enhancement or diminishment of a weight or other measurement when compared with a recently observed contrasting object.
  • Déformation professionnelle — the tendency to look at things according to the conventions of one's own profession, forgetting any broader point of view.
  • Distinction bias — the tendency to view two options as more dissimilar when evaluating them simultaneously than when evaluating them separately.[1]
  • Endowment effect — "the fact that people often demand much more to give up an object than they would be willing to pay to acquire it".[2]
  • Extreme aversion — the tendency to avoid extremes, being more likely to choose an option if it is the intermediate choice.
  • Focusing effect — a prediction bias that occurs when people place too much importance on one aspect of an event, causing errors in accurately predicting the utility of a future outcome.
  • Framing — using an overly narrow approach or description of the situation or issue. Also framing effect — drawing different conclusions based on how the same data are presented.
  • Hyperbolic discounting — the tendency for people to have a stronger preference for more immediate payoffs relative to later payoffs, the closer to the present both payoffs are (a worked discounting sketch follows this list).
  • Illusion of control — the tendency to believe one can control, or at least influence, outcomes that one clearly cannot.
  • Impact bias — the tendency for people to overestimate the length or the intensity of the impact of future feeling states.
  • Information bias — the tendency to seek information even when it cannot affect action.
  • Irrational escalation — the tendency to make irrational decisions based upon rational decisions in the past or to justify actions already taken.
  • Loss aversion — "the disutility of giving up an object is greater than the utility associated with acquiring it".[3] (see also sunk cost effects and Endowment effect).
  • Mere exposure effect — the tendency for people to express undue liking for things merely because they are familiar with them.
  • Moral credential effect — the tendency of a track record of non-prejudice to increase subsequent prejudice.
  • Need for closure — the need to reach a verdict in important matters; to have an answer and to escape the feeling of doubt and uncertainty. The personal context (time or social pressure) might increase this bias.[4]
  • Neglect of probability — the tendency to completely disregard probability when making a decision under uncertainty.
  • Omission bias — The tendency to judge harmful actions as worse, or less moral, than equally harmful omissions (inactions).
  • Open world assumption — the presumption that lack of knowledge does not imply falsity.
  • Outcome bias — the tendency to judge a decision by its eventual outcome instead of based on the quality of the decision at the time it was made.
  • Planning fallacy — the tendency to underestimate task-completion times.
  • Post-purchase rationalization — the tendency to persuade oneself through rational argument that a purchase was a good value.
  • Pseudocertainty effect — the tendency to make risk-averse choices if the expected outcome is positive, but make risk-seeking choices to avoid negative outcomes.
  • Reactance — the urge to do the opposite of what someone wants you to do, out of a need to resist a perceived attempt to constrain your freedom of choice.
  • Selective perception — the tendency for expectations to affect perception.
  • Status quo bias — the tendency for people to like things to stay relatively the same (see also Loss aversion and Endowment effect).[5]
  • Unit bias — the tendency to want to finish a given unit of a task or an item; this has strong effects on the consumption of food in particular.
  • Von Restorff effect — the tendency for an item that "stands out like a sore thumb" to be more likely to be remembered than other items.
  • Zero-risk bias — preference for reducing a small risk to zero over a greater reduction in a larger risk.
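
A minimal sketch of the arithmetic behind the base rate fallacy, in Python. The prevalence, sensitivity, and false-positive figures below are illustrative assumptions, not values from the literature; the point is that even a quite accurate test identifies the condition in only a small minority of positive cases when the condition itself is rare.

```python
# Bayes' theorem applied to a rare condition; all numbers are illustrative.

def posterior(prior, sensitivity, false_positive_rate):
    """P(condition | positive test) via Bayes' theorem."""
    true_pos = prior * sensitivity                   # P(positive and condition)
    false_pos = (1 - prior) * false_positive_rate    # P(positive and no condition)
    return true_pos / (true_pos + false_pos)

p = posterior(prior=0.01, sensitivity=0.99, false_positive_rate=0.05)
print(f"P(condition | positive) = {p:.3f}")  # ~0.167, not the intuitive ~0.99
```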
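
Hyperbolic discounting is standardly modelled as a subjective value V = A / (1 + kD) for an amount A delayed by D. The sketch below uses an assumed discount rate k = 1.0 and illustrative payoffs to show the preference reversal the definition above describes: the smaller, sooner payoff wins when it is immediate, but the larger, later payoff wins when both are distant, even though the two options remain one day apart. An exponential discounter (V = A·δ^D) would never reverse in this way.

```python
# Preference reversal under hyperbolic discounting; k and the payoffs are
# illustrative assumptions.

def hyperbolic_value(amount, delay, k=1.0):
    """Subjective present value: amount / (1 + k * delay)."""
    return amount / (1 + k * delay)

# $100 now vs $110 tomorrow: the immediate payoff is preferred...
print(hyperbolic_value(100, 0), hyperbolic_value(110, 1))    # 100.0 vs 55.0

# ...but $100 in 30 days vs $110 in 31 days: the larger payoff is preferred,
# although the two options are still exactly one day apart.
print(hyperbolic_value(100, 30), hyperbolic_value(110, 31))  # ~3.23 vs ~3.44
```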

Biases in probability and belief

Many of these biases are studied for how they affect business and economic decisions and how they affect experimental research.

  • Ambiguity effect — the avoidance of options for which missing information makes the probability seem "unknown".
  • Anchoring — the tendency to rely too heavily, or "anchor," on a past reference or on one trait or piece of information when making decisions.
  • Attentional bias — neglect of relevant data when making judgments of a correlation or association.
  • Availability heuristic — estimating what is more likely by what is more available in memory, which is biased toward vivid, unusual, or emotionally charged examples.
  • Availability cascade — a self-reinforcing process in which a collective belief gains more and more plausibility through its increasing repetition in public discourse (or "repeat something long enough and it will become true").
  • Clustering illusion — the tendency to see patterns where none actually exist.
  • Conjunction fallacy — the tendency to assume that specific conditions are more probable than general ones (a quick probability check follows this list).
  • Gambler's fallacy — the tendency to assume that individual random events are influenced by previous random events. For example, "I've flipped heads with this coin five times consecutively, so the chance of tails coming out on the sixth flip is much greater than heads" (see the coin-flip simulation after this list).
  • Hawthorne effect — the phenomenon whereby people observed during a research study temporarily change their behavior or performance (this can also be referred to as demand characteristics).
  • Hindsight bias — sometimes called the "I-knew-it-all-along" effect, the inclination to see past events as being predictable.
  • Illusory correlation — beliefs that inaccurately suppose a relationship between a certain type of action and an effect.
  • Ludic fallacy — the analysis of chance-related problems within the narrow frame of games, ignoring the complexity of reality and the non-Gaussian distribution of many things.
  • Neglect of prior base rates effect — the tendency to neglect known odds when reevaluating odds in light of weak evidence.
  • Observer-expectancy effect — when a researcher expects a given result and therefore unconsciously manipulates an experiment or misinterprets data in order to find it (see also subject-expectancy effect).
  • Optimism bias — the systematic tendency to be over-optimistic about the outcome of planned actions.
  • Overconfidence effect — the tendency to overestimate one's own abilities.
  • Positive outcome bias — the tendency, in prediction, to overestimate the probability of good things happening to oneself (see also wishful thinking, optimism bias and valence effect).
  • Primacy effect — the tendency to weigh initial events more than subsequent events.
  • Recency effect — the tendency to weigh recent events more than earlier events (see also peak-end rule).
  • Regression toward the mean (disregard of) — the tendency to expect extreme performance to continue (see the re-test simulation after this list).
  • Reminiscence bump — the effect that people tend to recall more personal events from adolescence and early adulthood than from other lifetime periods.
  • Repetition bias — a willingness to believe what we have been told most often and by the greatest number of different sources.
  • Rosy retrospection — the tendency to rate past events more positively than one actually rated them when they occurred.
  • Stereotyping — expecting a member of a group to have certain characteristics without having actual information about that individual.
  • Subadditivity effect — the tendency to judge the probability of the whole to be less than the probabilities of its parts.
  • Telescoping effect — the effect that recent events appear to have occurred more remotely and remote events appear to have occurred more recently.
  • Texas sharpshooter fallacy — the fallacy of selecting or adjusting a hypothesis after the data is collected, making it impossible to test the hypothesis fairly.
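
The rule the conjunction fallacy violates is simply that P(A and B) ≤ P(A) for any events A and B. Below is a quick Monte Carlo check, with arbitrary assumed rates loosely echoing the classic "Linda" bank-teller example:

```python
# A conjunction can never be more probable than either conjunct alone.
# The 0.30 and 0.60 rates are arbitrary illustrative assumptions.
import random

random.seed(0)
trials = 100_000
count_a = count_both = 0
for _ in range(trials):
    is_a = random.random() < 0.30   # e.g. "is a bank teller"
    is_b = random.random() < 0.60   # e.g. "is active in the feminist movement"
    count_a += is_a
    count_both += is_a and is_b

print(count_a / trials, count_both / trials)  # ~0.30 vs ~0.18: P(A and B) < P(A)
```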
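
A minimal simulation of the coin example under the gambler's fallacy entry: conditioning on a run of five heads, the sixth flip of a fair coin is still 50/50. The trial count is arbitrary.

```python
# Independent flips carry no memory of the preceding streak.
import random

random.seed(0)
sixth_flips = []
for _ in range(400_000):
    flips = [random.random() < 0.5 for _ in range(6)]  # True = heads
    if all(flips[:5]):                 # keep only sequences opening with 5 heads
        sixth_flips.append(flips[5])

print(sum(sixth_flips) / len(sixth_flips))  # ~0.5: tails is no more likely
```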
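
A minimal simulation of regression toward the mean, under assumed normal distributions for skill and measurement noise: the top scorers on a noisy first test score markedly closer to the population mean on an independent second test, even though nothing about them has changed.

```python
# Select extremes on one noisy measurement, re-measure, and watch the
# group average fall back toward the mean. All distributions are assumptions.
import random

random.seed(0)
n = 10_000
skill = [random.gauss(100, 10) for _ in range(n)]
test1 = [s + random.gauss(0, 10) for s in skill]   # skill + independent noise
test2 = [s + random.gauss(0, 10) for s in skill]   # same skill, fresh noise

top = sorted(range(n), key=lambda i: test1[i])[-500:]   # top 5% on test 1
print(sum(test1[i] for i in top) / 500)   # ~129: extreme on the first test
print(sum(test2[i] for i in top) / 500)   # ~115: regressed toward the mean of 100
```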

Social biases

Most of these biases are labeled as attributional biases.

  • Actor-observer bias — the tendency for explanations of other individuals' behaviors to overemphasize the influence of their personality and underemphasize the influence of their situation (see also fundamental attribution error). However, this is coupled with the opposite tendency for the self in that explanations for our own behaviors overemphasize the influence of our situation and underemphasize the influence of our own personality.
  • Dunning-Kruger effect — "...when people are incompetent in the strategies they adopt to achieve success and satisfaction, they suffer a dual burden: Not only do they reach erroneous conclusions and make unfortunate choices, but their incompetence robs them of the ability to realize it. Instead, ...they are left with the mistaken impression that they are doing just fine."[6] (see also Lake Wobegon effect and overconfidence effect).
  • Egocentric bias — occurs when people claim more responsibility for the results of a joint action than an outside observer would credit them with.
  • Forer effect (aka Barnum effect) — the tendency for people to give high accuracy ratings to descriptions of their personality that supposedly are tailored specifically for them, but are in fact vague and general enough to apply to a wide range of people. For example, horoscopes.
  • False consensus effect — the tendency for people to overestimate the degree to which others agree with them.
  • Fundamental attribution error — the tendency for people to over-emphasize personality-based explanations for behaviors observed in others while under-emphasizing the role and power of situational influences on the same behavior (see also actor-observer bias, group attribution error, positivity effect, and negativity effect).
  • Halo effect — the tendency for a person's positive or negative traits to "spill over" from one area of their personality to another in others' perceptions of them (see also physical attractiveness stereotype).
  • Herd instinct — Common tendency to adopt the opinions and follow the behaviors of the majority to feel safer and to avoid conflict.
  • Illusion of asymmetric insight — people perceive their knowledge of their peers to surpass their peers' knowledge of them.
  • Illusion of transparency — people overestimate others' ability to know them, and they also overestimate their ability to know others.
  • Ingroup bias — the tendency for people to give preferential treatment to others they perceive to be members of their own groups.
  • Just-world phenomenon — the tendency for people to believe that the world is "just" and therefore people "get what they deserve."
  • Lake Wobegon effect — the human tendency to report flattering beliefs about oneself and believe that one is above average (see also worse-than-average effect, and overconfidence effect).
  • Notational bias — a form of cultural bias in which a notation induces the appearance of a nonexistent natural law.
  • Outgroup homogeneity bias — individuals see members of their own group as being relatively more varied than members of other groups.
  • Projection bias — the tendency to unconsciously assume that others share the same or similar thoughts, beliefs, values, or positions.
  • Self-serving bias — the tendency to claim more responsibility for successes than failures. It may also manifest itself as a tendency for people to evaluate ambiguous information in a way beneficial to their interests (see also group-serving bias).
  • Self-fulfilling prophecy — the tendency to engage in behaviors that elicit results which will (consciously or not) confirm our beliefs.
  • System justification — the tendency to defend and bolster the status quo, i.e. existing social, economic, and political arrangements tend to be preferred and alternatives disparaged, sometimes even at the expense of individual and collective self-interest.
  • Trait ascription bias — the tendency for people to view themselves as relatively variable in terms of personality, behavior and mood while viewing others as much more predictable.

Memory errors

Further information: Memory bias
  • Beneffectance: perceiving oneself as responsible for desirable outcomes but not responsible for undesirable ones. (Term coined by Greenwald (1980))
  • Consistency bias: incorrectly remembering one's past attitudes and behaviour as resembling present attitudes and behaviour.
  • Cryptomnesia: a form of misattribution where a memory is mistaken for imagination.
  • Egocentric bias: recalling the past in a self-serving manner, e.g. remembering one's exam grades as being better than they were, or remembering a caught fish as being bigger than it was
  • False memory
  • Hindsight bias: filtering memory of past events through present knowledge, so that those events look more predictable than they actually were; also known as the 'I-knew-it-all-along effect'.
  • Selective memory
  • Suggestibility: a form of misattribution where ideas suggested by a questioner are mistaken for memory.

References

  • Baron, J. (2000). Thinking and Deciding (3rd ed.). New York: Cambridge University Press. ISBN 0-521-65030-5
  • Bishop, M. A. & Trout, J. D. (2004). Epistemology and the Psychology of Human Judgment. New York: Oxford University Press. ISBN 0-19-516229-3
  • Gilovich, T. (1993). How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life. New York: The Free Press. ISBN 0-02-911706-2
  • Gilovich, T., Griffin, D. & Kahneman, D. (Eds.). (2002). Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge, UK: Cambridge University Press. ISBN 0-521-79679-2
  • Greenwald, A. (1980). "The Totalitarian Ego: Fabrication and Revision of Personal History". American Psychologist, 35(7).
  • Kahneman, D., Slovic, P. & Tversky, A. (Eds.). (1982). Judgment under Uncertainty: Heuristics and Biases. Cambridge, UK: Cambridge University Press. ISBN 0-521-28414-7
  • Kahneman, D., Knetsch, J. L. & Thaler, R. H. (1991). "Anomalies: The Endowment Effect, Loss Aversion, and Status Quo Bias". The Journal of Economic Perspectives, 5(1), 193-206.
  • Plous, S. (1993). The Psychology of Judgment and Decision Making. New York: McGraw-Hill. ISBN 0-07-050477-6
  • Schacter, D. L. (1999). "The Seven Sins of Memory: Insights From Psychology and Cognitive Neuroscience". American Psychologist, 54(3), 182-203.
  • Tetlock, P. E. (2005). Expert Political Judgment: How Good Is It? How Can We Know?. Princeton: Princeton University Press. ISBN 978-0-691-12302-8
  • Virine, L. & Trumper, M. (2007). Project Decisions: The Art and Science. Vienna, VA: Management Concepts. ISBN 978-1567262179

Further reading

  • Haselton, M. G. & Funder, D. (in press). The evolution of accuracy and bias in social judgment. In M. Schaller, D. T. Kenrick, & J. A. Simpson (Eds.), Evolution and Social Psychology. New York: Psychology Press. [Volume to be published as part of the Frontiers of Social Psychology series.]
  • Haselton, M. G. (in press). Error management theory. In R. Baumeister & K. Vohs (Eds.), Encyclopedia of Social Psychology. Thousand Oaks, CA: Sage.
  • Haselton, M. G. & Buss, D. M. (2003). Biases in social judgment: Design flaws or design features? In J. Forgas, K. Williams, & B. von Hippel (Eds.), Responding to the Social World: Implicit and Explicit Processes in Social Judgments and Decisions. New York, NY: Cambridge.



This page uses Creative Commons Licensed content from Wikipedia.

