Cognitive bias
Agency detection bias: The tendency to overdetect agency and intentionality behind events, attributing them to the deliberate actions of agents, such as conspirators, rather than to chance or natural causes.
Anchoring bias: Relying too heavily on the first piece of information offered (the "anchor") when making decisions.
Attribution bias: The tendency to attribute the behavior of others to their character or disposition, while attributing one's own behavior to situational factors.
Availability bias: The tendency to overestimate the likelihood of events with greater "availability" in memory, judging events that are more easily recalled as more frequent or probable.
Bounded rationality: An alternative, proposed by Herbert Simon, to the classical economic theory of perfect rationality. When individuals make decisions, their rationality is limited by factors such as the available information, cognitive limitations, and time constraints. As a result, decision-makers often seek satisfactory ("satisficing") rather than optimal ("ideal") solutions.
Key aspects of bounded rationality:
Limited information: Decision-makers often lack access to all relevant information and may not have the ability to process and interpret all available data.
Cognitive constraints: The human brain has limited computational capacity and is subject to various biases and heuristics that can lead to suboptimal decisions.
Time and resource constraints: Decision-makers often face time pressures and may not have the resources to thoroughly analyze all possible alternatives.
Satisficing: Instead of seeking the optimal solution, people often choose the first satisfactory option that meets their minimum requirements.
Heuristics: To cope with the limitations of rationality, individuals often rely on mental shortcuts or rules of thumb to simplify complex decisions.
Adaptive behavior: Bounded rationality acknowledges that people learn from their experiences and adapt their decision-making strategies over time.
Implications of bounded rationality:
Realistic decision-making models: By acknowledging the limitations of human rationality, bounded rationality helps develop more realistic models of decision-making in various fields, such as economics, psychology, and political science.
Organizational decision-making: Bounded rationality has important implications for understanding how organizations make decisions, as they are composed of individuals with limited rationality and are subject to various organizational constraints.
Choice architecture: Insights from bounded rationality have led to the development of "nudges" and other strategies that aim to improve decision-making by altering the context in which choices are made.
Satisficing vs. optimizing: Bounded rationality suggests that in many situations, satisficing (choosing a good enough option) may be a more practical and efficient approach than optimizing (seeking the best possible option); see the sketch below.
Bounded rationality is a more realistic approach to understanding human decision-making that takes into account the various limitations and constraints that individuals face.
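To make the satisficing/optimizing contrast concrete, here is a minimal Python sketch; the utility scores and the aspiration threshold are invented for illustration:

```python
# Illustrative sketch: satisficing vs. optimizing over a list of options.
# The utility scores and the threshold are hypothetical.

options = [62, 71, 80, 95, 68]  # utility of each option, examined in order

def satisfice(options, threshold=70):
    """Return the first option whose utility meets the aspiration level."""
    for utility in options:
        if utility >= threshold:
            return utility  # stop searching: this option is "good enough"
    return None  # no option met the threshold

def optimize(options):
    """Examine every option and return the best one."""
    return max(options)

print(satisfice(options))  # 71 -- found on the second look, search stops
print(optimize(options))   # 95 -- requires scanning the entire list
```

The satisficer inspects fewer options and so spends less effort, at the cost of possibly missing the best alternative.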
Cognitive bias: A systematic pattern of deviation from norm or rationality in judgment, whereby inferences about other people and situations may be drawn in an illogical fashion. Individuals create their own "subjective reality" from their perception of the input. Cognitive biases may lead to perceptual distortion, inaccurate judgment, illogical interpretation, or what is broadly called irrationality. They often result from an individual's attempt to simplify information processing: rules of thumb that help us make sense of the world and reach decisions with relative speed. Some biases are related to memory; the way a person remembers an event may be biased for a number of reasons, and that in turn can lead to biased thinking and decision-making. Others are related to attention, as when information overload causes people to focus on only a subset of the available information.
Confirmation bias: The tendency to search for, interpret, favor, and recall information in a way that confirms or supports one's prior beliefs or values.
Conjunction fallacy: The tendency to think that specific conditions are more probable than more general ones. Judging the co-occurrence of two events as more likely than either event alone.
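A quick numerical illustration of why this is a fallacy (the probabilities below are invented, in the spirit of Tversky and Kahneman's "Linda" problem): for any two events, P(A and B) can never exceed P(A), even when the conjunction sounds more plausible.

```python
# The conjunction rule: P(A and B) <= P(A), for any events A and B.
# Hypothetical numbers for a Linda-style example.
p_bank_teller = 0.05            # P(A): Linda is a bank teller
p_feminist_given_teller = 0.60  # P(B | A), assumed for illustration

p_both = p_bank_teller * p_feminist_given_teller  # P(A and B)
assert p_both <= p_bank_teller  # the conjunction can never be more probable
print(round(p_both, 3), p_bank_teller)  # 0.03 0.05
```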
Economic rationality: Economic rationality is a fundamental concept in economics that describes the decision-making process of individuals or entities, assuming they are motivated by self-interest and aim to maximize their utility or benefit while minimizing costs. In other words, economic rationality posits that people make decisions based on logical reasoning and the available information to achieve the best possible outcome for themselves.
Key assumptions of economic rationality:
Consistency: Individuals have consistent preferences and make decisions that align with those preferences.
Perfect information: Decision-makers have access to all relevant information and can process it effectively.
Self-interest: People make choices that maximize their own utility or well-being.
Optimization: Individuals seek the best possible outcome given the constraints they face.
Criticisms of economic rationality:
Bounded rationality: Herbert Simon argued that humans have cognitive limitations that prevent them from always making optimal decisions. Instead, they often use heuristics or rules of thumb to make satisfactory, rather than optimal, choices.
Psychological factors: Behavioral economists like Daniel Kahneman and Amos Tversky demonstrated that psychological factors, such as emotions, biases, and heuristics, significantly influence decision-making, leading to deviations from perfect rationality.
Social and cultural influences: Critics argue that economic rationality neglects the role of social norms, cultural values, and interpersonal relationships in shaping individual preferences and decisions.
Altruism and cooperation: Economic rationality struggles to explain altruistic behavior or cooperation, as these actions may not always maximize an individual's self-interest in the short term.
Incomplete information: In reality, decision-makers rarely have access to perfect information, leading to suboptimal choices.
Adaptive preferences: People's preferences may change over time or adapt to their circumstances, challenging the assumption of consistent preferences.
Ethical considerations: Some argue that focusing solely on self-interest and utility maximization may lead to unethical or socially harmful decisions.
Despite these criticisms, economic rationality remains a useful model for understanding and predicting economic behavior in many situations. However, economists increasingly recognize the importance of incorporating insights from behavioral economics and other disciplines to develop more realistic and comprehensive models of decision-making.
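As a toy illustration of the optimization assumption, the following sketch brute-forces a consumer's utility-maximizing bundle under a budget constraint; the Cobb-Douglas utility function, prices, and budget are all invented for the example:

```python
# Toy model of economic rationality: choose the affordable bundle that
# maximizes utility. All numbers and the utility function are hypothetical.

def utility(x, y):
    return (x ** 0.5) * (y ** 0.5)  # Cobb-Douglas form, chosen for illustration

price_x, price_y, budget = 2.0, 1.0, 20.0

best_bundle, best_utility = None, float("-inf")
for x in range(0, 11):       # candidate quantities of good x
    for y in range(0, 21):   # candidate quantities of good y
        if price_x * x + price_y * y <= budget:  # affordability constraint
            u = utility(x, y)
            if u > best_utility:
                best_bundle, best_utility = (x, y), u

print(best_bundle, round(best_utility, 2))  # (5, 10) under these numbers
```

A boundedly rational agent, by contrast, would stop at the first bundle clearing some aspiration level rather than scanning every feasible one.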
Expectation bias: The inclination to perceive what we expect to perceive.
Framing bias: The tendency to draw different conclusions from the same information depending on how it is presented.
Funding bias: The conflict of interest problem of funding influencing the research process or outcomes to favor the sponsor's interests.
Groupthink: The tendency for members of a group to conform to the prevailing views and suppress dissenting opinions to maintain group harmony.
Heuristic: A heuristic is a mental shortcut or a rule of thumb that allows people to make judgments and decisions quickly and efficiently. Heuristics are often used when faced with complex problems or incomplete information, as they can help simplify the decision-making process by reducing the amount of mental effort required. While heuristics can be useful in many situations, they can also lead to cognitive biases and systematic errors in judgment.
Key characteristics of heuristics:
Efficiency: Heuristics enable individuals to make decisions quickly by focusing on the most relevant information and ignoring less essential details.
Cognitive economy: By using heuristics, people conserve mental resources and avoid the need for extensive information processing.
Implicit learning: Heuristics are often developed through experience and observation, rather than explicit instruction.
Adaptiveness: Heuristics can be adaptive, helping individuals navigate complex environments and make decisions under uncertainty.
Potential for bias: While heuristics are often useful, they can also lead to systematic errors and biases in judgment when applied inappropriately or when the simplified rule does not fit the situation.
Examples of common heuristics:
Availability heuristic: Judging the likelihood of an event based on how easily examples come to mind.
Representativeness heuristic: Making judgments based on how similar an object or event is to a typical case or stereotype.
Anchoring and adjustment: Relying heavily on the first piece of information encountered (the anchor) and making insufficient adjustments based on additional information.
Affect heuristic: Basing decisions on emotional responses rather than objective evaluations.
Recognition heuristic: Choosing the option that is most familiar or recognizable.
While heuristics are an essential part of human cognition and can be helpful in many situations, it is important to be aware of their potential limitations and biases. By understanding when and how heuristics are used, individuals can make more informed decisions and avoid common pitfalls in judgment. Additionally, researchers in fields such as psychology, economics, and decision science study heuristics to better understand human behavior and develop strategies for improving decision-making processes.
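As a concrete example of one such rule, here is a minimal sketch of the recognition heuristic applied to the classic which-city-is-larger task; the set of recognized cities is hypothetical:

```python
# Recognition heuristic: if exactly one of two options is recognized,
# infer that the recognized one scores higher (e.g., has the larger
# population). The recognized set below is hypothetical.

recognized = {"Munich", "Berlin", "Hamburg"}

def pick_larger_city(city_a, city_b):
    known_a, known_b = city_a in recognized, city_b in recognized
    if known_a and not known_b:
        return city_a
    if known_b and not known_a:
        return city_b
    return None  # heuristic does not apply: both or neither recognized

print(pick_larger_city("Munich", "Herne"))  # 'Munich' -- the only recognized option
```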
Hindsight bias: The inclination to see past events as more predictable than they actually were, often leading to an overestimation of one's ability to predict future events.
Illusory correlation: The tendency to perceive a relationship between variables even when no such relationship exists.
Jumping-to-conclusions bias (JTC): The tendency to make decisions based on limited information before gathering sufficient evidence; making judgments prematurely.
Mirror imaging: The assumption that others (such as political candidates) will act in a way consistent with one's own thought processes and values.
Observer bias: Results from researchers' subjective expectations influencing their observations or interpretations.
Omitted variable bias: Happens when important variables are left out of an analysis, leading to incorrect relationships being identified.
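The effect is easy to demonstrate with a small simulation (all coefficients and noise levels below are invented): when a confounder z drives both x and y but is left out of the regression, the estimated effect of x absorbs part of z's effect.

```python
import numpy as np

# Simulated data: z is a confounder affecting both x and y.
# True model: y = 2*x + 3*z + noise. All parameters are hypothetical.
rng = np.random.default_rng(0)
n = 10_000
z = rng.normal(size=n)
x = z + rng.normal(size=n)            # x is correlated with z
y = 2 * x + 3 * z + rng.normal(size=n)

# Regressing y on x alone (z omitted) biases the slope upward.
slope_biased = np.polyfit(x, y, 1)[0]

# Including z recovers the true coefficient of x (about 2).
X = np.column_stack([x, z, np.ones(n)])
slope_full = np.linalg.lstsq(X, y, rcond=None)[0][0]

print(round(slope_biased, 2), round(slope_full, 2))  # roughly 3.5 vs 2.0
```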
Overconfidence: Having excessive confidence in one's own answers to questions; overestimating one's own skills, abilities, and the accuracy of one's beliefs.
Proportionality bias: The belief that causes should resemble effects in size or magnitude. Large effects are assumed to have large causes.
Recall bias: Arises when participants do not remember past events accurately, skewing the data.
Representativeness heuristic: Judging the probability of an event by finding a 'comparable known' event and assuming that the probabilities will be similar. Assuming that two things that share characteristics are related.
Resistance to change: The tendency for perceptions and beliefs to resist change even when new evidence is presented.
Self-serving bias: The common habit of a person taking credit for positive events or outcomes, but blaming outside factors for negative events.
Survivorship bias: Involves focusing on surviving or existing data while overlooking data that does not make it past some point of the process.
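A small simulation (with invented return figures) shows how conditioning on survival inflates an estimate: funds that perform badly disappear from the dataset, so the surviving funds look better than the full population really was.

```python
import numpy as np

# Simulated fund returns; funds returning below -10% are closed and
# vanish from the observed dataset. All numbers are hypothetical.
rng = np.random.default_rng(1)
returns = rng.normal(loc=0.02, scale=0.10, size=100_000)

survivors = returns[returns > -0.10]  # only surviving funds remain visible

print(round(returns.mean(), 4))    # ~0.02 -- true average return
print(round(survivors.mean(), 4))  # ~0.04 -- survivors look better
```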
System 1 thinking: Fast, automatic, emotional, and subconscious style of thinking and judgment. Relies on heuristics and can produce systematic errors.
System 2 thinking: Slow, effortful, controlled and conscious style of reasoning and analysis. Logical and rule-based. Can overcome biases from System 1.
Vividness criterion: The tendency to give more weight to concrete, emotionally compelling information than to abstract or statistical information.