Why do we underestimate how long it will take to complete a task?
The Planning Fallacy, explained.
What is the Planning Fallacy?
The planning fallacy describes our tendency to underestimate the amount of time it will take to complete a task, as well as the costs and risks associated with that task—even if it contradicts our experiences.
Where this bias occurs
Let’s say John, a university student, has a paper due on Friday, a week from today. John has written many papers of a similar length before, and it generally takes him about a week to get it done. Nonetheless, as he is planning how to divide up his time, John is positive that he can finish the assignment in three days, so he doesn’t start until Tuesday. In the end, he doesn’t have the paper finished in time, and needs to ask for an extension.
Individual effects
Just as the name suggests, the planning fallacy can cause us to plan poorly for the future, causing us to make decisions that ignore realistic estimates of the demands of a task (be it time, money, energy, or something else). It also leads us to downplay the elements of risk and luck; instead, we focus only on our own abilities—and probably an overly optimistic assessment of our abilities at that.
Systemic effects
The planning fallacy affects everybody, whether they are an undergraduate student, a city planner, or the CEO of an organization. When it comes to large-scale ventures, from disruptive construction projects to expensive business mergers, the livelihoods of many people (not to mention a whole lot of money) are at stake, and there are widespread economic and social consequences of poor planning.
Why it happens
We prefer to focus on the positive
Simply put, the planning fallacy stems from our overall bias towards optimism, especially where our own abilities are concerned.11,12 In general, we are oriented towards positivity. We have optimistic expectations of the world and other people; we are more likely to remember positive events than negative ones; and, most relevantly, we tend to favor positive information in our decision-making processes.1,2
When it comes to our own capabilities, we are particularly bad at making accurate judgments. As an example, take one study that asked incoming university students to estimate how they would perform academically, compared to their classmates. On average, participants believed they would outperform 84% of their peers.3 Of course, this figure may well have been accurate for some individual students, but it is mathematically impossible for everybody to be in the top 16%.
All of this means that when we set out to plan a project, we are likely to focus on imagined successful outcomes rather than potential pitfalls, and we are likely to overestimate how capable we (and our team members) are of meeting certain goals. While enthusiasm is certainly important for any venture, it can become toxic if it comes at the expense of being realistic.
We become anchored to our original plan
Anchoring is another type of cognitive bias that plays a big role in the planning fallacy. The term traces back to experimental work by Muzafer Sherif, Daniel Taub, and Carl Hovland. Anchoring is the tendency to rely too heavily on early information when we are making decisions.6 When we draw up an initial plan for a project, we are biased to continue thinking in terms of those initial values—deadlines, budgets, and so on.
Anchoring is especially problematic if our original plans were unrealistically optimistic. Even if our initial predictions were massively inaccurate, we still feel tethered to those numbers as we try to reassess. This leads us to make insufficient adjustments to our plans as we go along, preferring to make minor tweaks rather than major changes (even if major changes are necessary).
We write off negative information
Even if we do take outside information into account, we have a tendency to discount pessimistic views or data that challenge our optimistic outlook. This is the flipside of our positivity bias: our preference for affirmative information also makes us reluctant to consider the downsides.
In the business world, one example of this is known as competitor neglect, which describes how company executives fail to anticipate how their rivals will behave because they are focused on their own organization.3 For example, when a company decides to break into a fast-growing market, it often fails to consider that its competitors are likely to do the same, leading to an underestimation of risk.
More generally, we often make attribution errors when considering our successes and failures. Whereas we tend to ascribe positive outcomes to our talents and hard work, we attribute negative outcomes to factors beyond our control. This makes us less likely to take previous failures into account: we believe that those instances were not our fault, and we convince ourselves that the external factors that caused us to fail will not come up again.4
We face social pressure
Organizational pressure to finish projects quickly, and without problems, is a major reason that the planning fallacy can be so detrimental. Workplace cultures can often be highly competitive, and there may be costs for individuals who voice less enthusiastic opinions about a project, or who insist on a longer timeline than others. At the same time, executives might favour the most overly optimistic predictions over others, giving individuals an incentive to engage in inaccurate, intuition-based planning.
Why it is important
The planning fallacy has consequences for both our professional and personal lives, nudging us to invest our time and money in ill-fated ventures, and keeping us tethered to those projects for far too long. Research has demonstrated how widespread this bias is: In the business world, it has been found that more than 80% of start-up ventures fail to achieve their initial market-share targets.3 Meanwhile, in classrooms, students report finishing about two thirds of their assignments later than they expected to.4
In some fields, such as venture capital, high rates of failure are often ascribed to normal levels of risk and seen as proof that the system is working as it should. However, cognitive scientists such as Dan Lovallo and Daniel Kahneman believe that these figures have a lot more to do with cognitive biases such as the planning fallacy.3 If more people were aware of the planning fallacy, they could take steps to counteract it, such as the ones described below.
How to avoid it
Merely being aware of the planning fallacy is not enough to stop it from happening.5 Even if we have this knowledge, we still risk falling into the trap of believing that this time, the rules won’t apply to us. Most of us have a strong preference to follow our gut, even if its forecasts have been wrong in the past. What we can do is to plan around the planning fallacy, building steps into the planning process that can help us avoid it.
Take the outside view
There are two types of information available for people to use when planning: singular information and distributional information. Singular information is evidence related to the specific case under consideration, whereas distributional information is evidence related to other, similar tasks that were completed in the past.5 These stances are also referred to as the inside and outside views, respectively.3
Ideally, both singular and distributional information should be taken into account when planning. The planning fallacy is likely to arise when we rely solely on the inside view—that is, when we disregard external information about how likely we are to succeed, and instead trust our intuitive guesses about how costly a project will be. Unfortunately, this is exactly what many of us tend to do. Because planning is an inherently future-oriented process, we are inclined to look forward, rather than backward, in time. This leads us to disregard our past experiences.4
Supplementing planning processes with distributional (outside) data, wherever possible, is a solid way to temper expectations for a project.3 If an organization or individual has completed similar projects in the past, they can use the outcomes of those previous experiences to set goals for new ones. It is just as useful to look outside one’s own experiences and see how others have fared. The main point is to make a deliberate effort not to rely solely on intuition.
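The outside view can be sketched as a simple reference-class adjustment: rather than trusting an intuitive estimate alone, scale it by how much similar past projects actually overran. This is a minimal illustration of the idea, not a prescribed formula; the function name and the project data are hypothetical.

```python
from statistics import median

def outside_view_estimate(intuitive_estimate_days, past_projects):
    """Adjust an intuitive (inside-view) estimate using distributional data.

    past_projects: list of (predicted_days, actual_days) pairs from
    similar, completed projects (hypothetical data for illustration).
    """
    # Median ratio of actual to predicted duration across the reference class
    overrun_ratio = median(actual / predicted for predicted, actual in past_projects)
    return intuitive_estimate_days * overrun_ratio

# Hypothetical reference class: predicted vs. actual durations of past projects
history = [(10, 14), (20, 30), (5, 9), (15, 21)]

# An intuitive 12-day guess gets stretched by the typical historical overrun
print(outside_view_estimate(12, history))
```

The point is not the arithmetic itself but the discipline it enforces: the final number is tied to how projects like this one actually went, not to how we feel this one will go.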
Set implementation intentions
Another strategy for combating the planning fallacy is illustrated by a study from the Netherlands, where study participants were given a writing assignment and told to complete it within a week. The participants were split into two groups. Both groups were instructed to set goal intentions, indicating what day they intended to start writing the paper, and what day they believed they would finish. However, the second group also set implementation intentions, specifying what time of day and in what location they would write, and were asked to visualize themselves following through on their plan.
Researchers found that setting specific implementation intentions resulted in significantly more realistic goal-setting.7 At the same time, doing so did not lessen the participants’ optimism; on the contrary, they were even more confident in their ability to meet their goals. They also reported fewer interruptions while they were working. This may be because the process of thinking through the specifics of completing the task at hand resulted in a stronger commitment to following through with one’s plan.
These results show that optimism is not incompatible with realism, as long as it is combined with a carefully thought-out plan.
Use the segmentation effect for better estimates
A related strategy involves breaking up big projects into their component parts, and then planning for the completion of the smaller subtasks instead of the project as a whole. As bad as we are at estimating the amount of time required for relatively large tasks, research has shown that we are much better at planning for small ones: often our estimates are remarkably accurate, and at worst they are overestimates.9 This is therefore a much safer strategy: In practice, it’s much better to allot too much time to a project than too little.
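The segmentation idea can be sketched as building a project estimate bottom-up: estimate each small subtask separately, where our guesses tend to be accurate, and sum the results instead of guessing one number for the whole. The task names and durations below are hypothetical.

```python
def segmented_estimate(subtasks):
    """Sum per-subtask estimates instead of estimating the project as a whole."""
    return sum(hours for _, hours in subtasks)

# Hypothetical breakdown of a report-writing project into small subtasks,
# each easier to estimate accurately than the project taken as one lump
report = [
    ("gather sources", 4),
    ("outline", 2),
    ("draft", 10),
    ("revise", 5),
    ("format and proofread", 3),
]

print(segmented_estimate(report))  # 24 hours
```

A holistic guess for the same project might well have been "a day or two"; itemizing the work surfaces hours that an intuitive estimate quietly skips.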
How it all started
The planning fallacy was first proposed by Daniel Kahneman and Amos Tversky, two foundational figures in the field of behavioural economics. In a 1977 paper, Kahneman and Tversky argued that, when making predictions about the future, people tend to rely largely on intuitive judgments that are often inaccurate. However, the types of errors that people make are not random, but systematic, indicating that they result from uniform cognitive biases.
In this paper, Kahneman and Tversky brought up planning as an example of how bias interferes with our forecasts for the future. “Scientists and writers,” they said, apparently drawing from experience, “are notoriously prone to underestimate the time required to complete a project, even when they have considerable experience of past failures to live up to planned schedules.” They named this phenomenon the “planning fallacy” and argued that it arose from our tendency to ignore distributional (outside) data.5
Example 1 - The Sydney Opera House
Now one of the most iconic man-made structures in the world, the Sydney Opera House was mired in delays and unforeseen difficulties during construction that caused the project to drag on for a decade longer than planned. The original projected cost was $7 million; by the time it was done, it had cost $102 million.4
The Australian Government insisted that construction begin early, wanting to break ground while public opinion about the Opera House was still favorable and funding was still in place. However, the architect had not yet completed the final plans, leading to major structural issues that had to be addressed down the road, slowing the project down and inflating the budget. One major problem: the original podium was not strong enough to support the House’s famous shell-shaped roof, and had to be rebuilt entirely.
Joseph Cahill, the politician who had championed the Opera House, rushed construction along out of fear that political opposition would try to stop it.9 In his enthusiasm, he disregarded criticisms of the project and relied on intuitive forecasts for its costs. While the building, when it was eventually finished, was beautiful and distinctive, it would have been prudent to slow down and take the outside view in planning.
Example 2 - The Canadian Pacific Railway
In 1871, the colony of British Columbia agreed to become a part of Canada. In exchange for joining the Confederation, it was promised that a transcontinental railway, connecting BC to Eastern Canada, would be completed by 1881.4 In the end, the railway was not completed until 1885, and required $22.5 million more in loans than originally predicted.10
In initially planning the railway, its proponents had apparently not considered how difficult it would be to build through the Canadian Shield, as well as through the mountains of BC. Additionally, there was an inadequate supply of workers to build the railroad in British Columbia. The railroad was eventually built by around 15,000 Chinese laborers, who worked in extremely harsh conditions for very little pay.
Summary
What it is
The planning fallacy describes how we are likely to underestimate the costs of a project, such as how long it will take and how much it will cost.
Why it happens
The human brain is generally biased towards positivity, leading us to make overly optimistic predictions about our projects, as well as to disregard information that contradicts our optimistic beliefs. Once we have set unrealistic plans, other biases such as anchoring compel us to stick with them. Pressure from team members, superiors, or shareholders to get things done quickly and smoothly also makes it more costly for us to revise our plans partway through a project.
Example #1 – The Sydney Opera House
The Sydney Opera House is a famous example of the planning fallacy, because it took 10 years longer and nearly $100 million more to complete than was originally planned. One major reason was the government’s insistence on starting construction early, despite the fact that plans were not yet finished.
Example #2 – The Canadian Pacific Railway
The Canadian Pacific Railway was finished four years late and more than $20 million over budget, largely because of a failure to plan for the difficulties of building through mountain ranges and across the Canadian Shield. The eventual completion of the project required the labor of around 15,000 Chinese workers, an example of how failure to adequately plan large projects can have a great human cost.
How to avoid it
The planning fallacy is best avoided by incorporating outside information into the planning process, rather than relying solely on intuition. Other strategies, such as setting specific intentions to implement a plan, envisioning oneself carrying out the plan, and segmenting large projects into smaller subtasks, can also help generate more accurate estimates about how costly something will really be.
Related TDL articles
Why You Might Not Be Sticking To Your Plans
This article explores a few reasons why people often fail to follow through with their plans, including the planning fallacy. Another potential explanation is the Dunning-Kruger effect, which describes how people with low ability tend to overestimate their own skills. The author also discusses the importance of planning for less-than-ideal scenarios, as well as setting implementation intentions.
The Key to Effective Teammates Isn’t Them. It’s You.
As discussed above, one reason the planning fallacy is so common is because of pressures in the workplace and other environments to overachieve, and to always strive for perfection. This article discusses the importance of being authentically ourselves, at work and elsewhere. When we act in a way that prioritizes genuine social connection over our own egos, we help others feel safe to do the same. By checking in with ourselves and our motivations, asking ourselves whether we are acting in accordance with our values and beliefs, we can create an atmosphere more accepting of imperfections.
Sources
- Ackerman, C. E. (2019, April 7). Pollyanna principle: The psychology of positivity bias. PositivePsychology.com. https://positivepsychology.com/pollyanna-principle/
- Hoorens, V. (2014). Positivity bias. In A. C. Michalos (Ed.), Encyclopedia of Quality of Life and Well-Being Research. Springer, Dordrecht. https://doi.org/10.1007/978-94-007-0753-5
- Lovallo, D., & Kahneman, D. (2003, July). Delusions of success: How optimism undermines executives’ decisions. Harvard Business Review. https://hbr.org/2003/07/delusions-of-success-how-optimism-undermines-executives-decisions
- Buehler, R., Griffin, D., & Ross, M. (1994). Exploring the “planning fallacy”: Why people underestimate their task completion times. Journal of Personality and Social Psychology, 67(3), 366-381. https://doi.org/10.1037/0022-3514.67.3.366
- Kahneman, D., & Tversky, A. (1977). Intuitive prediction: Biases and corrective procedures. Decision Research Technical Report PTR-1042-77-6. Reprinted in D. Kahneman, P. Slovic, & A. Tversky (Eds.) (1982), Judgment Under Uncertainty: Heuristics and Biases. Cambridge University Press.
- Tversky, A., & Kahneman, D. (1982). Judgment under uncertainty: Heuristics and biases. Judgment under Uncertainty, 3-20. https://doi.org/10.1017/cbo9780511809477.002
- Koole, S., & Van’t Spijker, M. (2000). Overcoming the planning fallacy through willpower: Effects of implementation intentions on actual and predicted task-completion times. European Journal of Social Psychology, 30(6), 873-888. https://doi.org/10.1002/1099-0992(200011/12)30:6<873::aid-ejsp22>3.0.co;2-u
- Forsyth, D. K., & Burt, C. D. (2008). Allocating time to future tasks: The effect of task segmentation on planning fallacy bias. Memory & Cognition, 36(4), 791-798. https://doi.org/10.3758/mc.36.4.791
- Construction begins. (n.d.). Sydney Opera House. https://www.sydneyoperahouse.com/our-story/sydney-opera-house-history/construction-begins.html
- Lavallé, O. (2008, March 6). Canadian Pacific railway. The Canadian Encyclopedia. https://www.thecanadianencyclopedia.ca/en/article/canadian-pacific-railway
- Optimism bias. (2019, August 22). The Decision Lab. https://thedecisionlab.com/biases/optimism-bias/
- Dunning–Kruger effect. (2020, July 22). The Decision Lab. https://thedecisionlab.com/biases/dunning-kruger-effect/