Top Ten Behavioral Biases in Project Management: An Overview
Abstract
Behavioral science has witnessed an explosion in the number of biases identified by behavioral scientists, to more than 200 at present. This article identifies the 10 most important behavioral biases for project management. First, we argue it is a mistake to equate behavioral bias with cognitive bias, as is common. Cognitive bias is half the story; political bias the other half. Second, we list the top 10 behavioral biases in project management: (1) strategic misrepresentation, (2) optimism bias, (3) uniqueness bias, (4) the planning fallacy, (5) overconfidence bias, (6) hindsight bias, (7) availability bias, (8) the base rate fallacy, (9) anchoring, and (10) escalation of commitment. Each bias is defined, and its impacts on project management are explained, with examples. Third, base rate neglect is identified as a primary reason that projects underperform. This is supported by presentation of the most comprehensive set of base rates that exist in project management scholarship, from 2,062 projects. Finally, recent findings of power law outcomes in project performance are identified as a possible first stage in discovering a general theory of project management, with more fundamental and more scientific explanations of project outcomes than found in conventional theory.
Introduction
Since the early work of Tversky and Kahneman (1974), the number of biases identified by behavioral scientists has exploded in what has been termed a behavioral revolution in economics, management, and across the social and human sciences. Today, Wikipedia’s list of cognitive biases contains more than 200 items (“List of cognitive biases,” 2021). The present article gives an overview of the most important behavioral biases in project planning and management, summarized in Table 1. They are the biases most likely to trip up project planners and managers and negatively impact project outcomes, if the biases are not identified and dealt with up front and during delivery.
Top 10 Behavioral Biases in Project Planning and Management
| Name of Bias | Description |
|---|---|
| 1. Strategic misrepresentation | The tendency to deliberately and systematically distort or misstate information for strategic purposes. Aka political bias, strategic bias, or power bias. |
| 2. Optimism bias | The tendency to be overly optimistic about the outcome of planned actions, including overestimation of the frequency and size of positive events and underestimation of the frequency and size of negative ones. |
| 3. Uniqueness bias | The tendency to see one’s project as more singular than it actually is. |
| 4. Planning fallacy (writ large) | The tendency to underestimate costs, schedule, and risk and overestimate benefits and opportunities. |
| 5. Overconfidence bias | The tendency to have excessive confidence in one’s own answers to questions. |
| 6. Hindsight bias | The tendency to see past events as being predictable at the time those events happened. Also known as the I-knew-it-all-along effect. |
| 7. Availability bias | The tendency to overestimate the likelihood of events with greater ease of retrieval (availability) in memory. |
| 8. Base rate fallacy | The tendency to ignore generic base rate information and focus on specific information pertaining to a certain case or small sample. |
| 9. Anchoring | The tendency to rely too heavily, or “anchor,” on one trait or piece of information when making decisions, typically the first piece of information acquired on the relevant subject. |
| 10. Escalation of commitment | The tendency to justify increased investment in a decision, based on the cumulative prior investment, despite new evidence suggesting the decision may be wrong. Also known as the sunk cost fallacy. |
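The base rate fallacy (item 8) lends itself to a simple numeric illustration via Bayes' rule. The sketch below is not from the article, and all the probabilities in it are hypothetical placeholders: a planner's optimistic "inside view" signal looks reassuring only until it is weighed against the base rate of overruns in comparable projects.

```python
# Illustrative only: hypothetical numbers, not from the article's data.
# A planner's "inside view" signal suggests a project will stay on budget.
# Ignoring the base rate of overruns overstates how reassuring that signal is.

def posterior_on_budget(base_rate_on_budget, p_signal_if_on_budget, p_signal_if_overrun):
    """P(on budget | optimistic signal) via Bayes' rule."""
    p_signal = (p_signal_if_on_budget * base_rate_on_budget
                + p_signal_if_overrun * (1 - base_rate_on_budget))
    return p_signal_if_on_budget * base_rate_on_budget / p_signal

# Hypothetical: only 3 in 10 comparable projects finish on budget (base rate);
# optimistic early signals appear for 80% of on-budget projects,
# but also for 40% of projects that later overrun.
p = posterior_on_budget(base_rate_on_budget=0.3,
                        p_signal_if_on_budget=0.8,
                        p_signal_if_overrun=0.4)
print(round(p, 2))  # ~0.46: far below the near-certainty the signal suggests
```

Even a fairly diagnostic positive signal leaves the on-budget probability below 50% once the (hypothetical) base rate is taken into account, which is exactly the information base rate neglect discards.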
Discussion
Scientific revolutions rarely happen without friction. So, too, for the behavioral revolution. It has been met with skepticism, including from parts of the project management community (Flyvbjerg et al., 2018). Some members prefer to stick with conventional explanations of project underperformance in terms of errors of scope, complexity, labor and materials prices, archaeology, geology, bad weather, ramp-up problems, demand fluctuations, and so forth (Cantarelli et al., 2010a).
Behavioral scientists would agree with the skeptics that scope changes, complexity, and so forth are relevant for understanding what goes on in projects but would not see them as root causes of outcomes. According to behavioral science, the root cause of, say, cost overrun is the well-documented fact that project planners and managers keep underestimating scope changes, complexity, and so forth in project after project.
Behavioral science is not perfect. We saw above how behavioral economics suffers from a “psychology bias,” in the sense that it tends to reduce behavioral biases to cognitive biases, ignoring political bias and thus committing the very sin it accuses conventional economics of: theory-induced blindness resulting in limited rationality. Gigerenzer (2018) goes further, criticizing behavioral economics for a “bias bias,” and he is right to call for conceptual clarification. Not all behavioral biases are well defined, or even well delineated: many large overlaps exist among different biases that need clarification, including among the 10 described above. Just as seriously, many biases have been documented only in simplified lab experiments yet are tacitly assumed to hold in real-life situations outside the lab, without sound demonstration that the assumption holds. Finally, the psychology used by behavioral economists is not considered cutting-edge by psychologists, a fact openly acknowledged by Thaler (2015, p. 180), who further admits it is often difficult to pin down which specific behavioral bias is causing outcomes in a given situation, or to rule out alternative explanations (Thaler, 2015, p. 295).
Nevertheless, the behavioral revolution seems to be here to stay, and it entails an important change of perspective for project management: The problem with project cost overruns and benefit shortfalls is not error but bias, and as long as we try to solve the problem as something it is not (error), we will not succeed. Estimates and decisions need to be debiased, which is fundamentally different from eliminating error. Furthermore, the problem is not even cost overruns or benefit shortfalls; it is cost underestimation and benefit overestimation. Overrun, for instance, is mainly a consequence of underestimation, with the latter happening upstream from overrun, for big projects often years before overruns manifest. Again, if we try to solve the problem as something it is not (cost overrun), we will fail. We need to solve the problem of upstream cost underestimation in order to solve the problem of downstream cost overrun. Once we understand these straightforward insights, we understand that we and our projects are better off with an understanding of behavioral science and behavioral bias than without it.
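One established way to do the debiasing called for above is reference class forecasting: anchoring an estimate in the distribution of outcomes for similar past projects (the outside view) rather than in the project's own optimistic numbers. A minimal sketch follows; the overrun ratios and the 80% certainty level are hypothetical placeholders, not the article's data or a prescribed method.

```python
# Minimal reference-class-forecasting sketch. The overrun ratios below are
# hypothetical placeholders, not drawn from the article's 2,062-project data.

def rcf_uplift(overrun_ratios, certainty=0.8):
    """Return the overrun ratio at the requested percentile of past outcomes.

    overrun_ratios: actual cost / estimated cost for past similar projects.
    certainty: share of past outcomes the adjusted budget should cover.
    """
    ranked = sorted(overrun_ratios)
    idx = min(int(certainty * len(ranked)), len(ranked) - 1)
    return ranked[idx]

# Hypothetical reference class: ratios of actual to estimated cost.
past_ratios = [0.9, 1.0, 1.1, 1.2, 1.3, 1.4, 1.5, 1.7, 2.0, 2.6]
estimate = 100.0  # planner's "inside view" estimate, in any currency unit
uplift = rcf_uplift(past_ratios, certainty=0.8)
print(estimate * uplift)  # a budget sized to cover ~80% of comparable outcomes
```

The point of the sketch is the direction of the fix: the adjustment happens upstream, to the estimate itself, rather than downstream, after overrun has already materialized.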
References
Source: Flyvbjerg B. (2021, December 14). Top ten behavioral biases in project management: An overview. https://journals.sagepub.com/doi/full/10.1177/87569728211049046
Anderson C., Galinsky A. D. (2006). Power, optimism, and risk-taking. European Journal of Social Psychology, 36, 511–536.
Ansar A., Flyvbjerg B., Budzier A., Lunn D. (2014). Should we build more large dams? The actual costs of hydropower megaproject development. Energy Policy, 69, 43–56.
Arkes H. R., Blumer C. (1985). The psychology of sunk cost. Organizational Behavior and Human Decision Processes, 35(1), 124–140.
Association of Project Management (APM). (2012). APM body of knowledge (6th Ed.) Retrieved from https://www.apm.org.uk/body-of-knowledge/context/governance/project-management/.
Bar-Hillel M. (1980). The base-rate fallacy in probability judgments. Acta Psychologica, 44(3), 211–233.
Barabási A.-L. (2014). Linked: How everything is connected to everything else and what it means for business, science, and everyday life. Basic Books.
Barabási A.-L., Albert R. (1999). Emergence of scaling in random networks. Science, 286(5439), 509–512.
Batselier J., Vanhoucke M. (2016). Practical application and empirical evaluation of reference class forecasting for project management. Project Management Journal, 47(5), 36–51.
Bizony P. (2006). The man who ran the moon: James Webb, JFK, and the secret history of Project Apollo. Icon Books.
Bok S. (1999). Lying: Moral choice in public and private life. Vintage, first published in 1979.
Brockner J. (1992). The escalation of commitment to a failing course of action: Toward theoretical progress. Academy of Management Review, 17(1), 39–61.
