Imagine building a house. You have the drawings, the contractor, a budget. Everything seems clear: foundation, walls, roof, windows. Yet during construction, unexpected costs arise: the soil is different than expected, the wood is more expensive, additional permits surface, and the specialists' work takes longer than planned. What you thought you knew turns out to be just an assumption, a false certainty.
If building a house shows how we underestimate risks, it also shows how we overestimate certainty. That's a human pattern — not just in construction, but in how we plan, make decisions and design technology.
The illusion of certainty
Behavioral scientists Daniel Kahneman and Amos Tversky showed decades ago that people consistently suffer from what we now call the planning fallacy: the tendency to underestimate the time, costs and risks of actions while overestimating the benefits. This effect occurs even when we know that similar tasks took longer in the past.
Kahneman described this as part of the human cognitive architecture: we're excellent at quick, intuitive judgements but bad at abstract, statistical predictions. Even when data is available, the brain tends to rely on heuristics, mental shortcuts that feel plausible but are often inaccurate.
A classic scientific review of expert overconfidence showed that people who claim to be 90% sure of a prediction are wrong far more often than they think. When thousands of 90% confidence intervals from experts were analyzed, the true values fell outside those intervals far more often than the experts had estimated in advance.
This doesn't mean that people are stupid — it means that our natural thinking process is designed for risks and decisions in an environment of immediate feedback and concrete consequences, not in systems of great abstraction and interdependence such as a commerce platform or large IT deployment.
Why we think we know
There are three major mental mechanisms that together explain why we overestimate certainty:
1. The Illusion of Validity
When we see patterns in data — for example, consistent sales, returns, or customer behavior — we tend to assume that this pattern is predictive of the future, even when it isn't. Kahneman describes this as the illusion of validity: we rely too much on the representativeness of our observations and too little on the uncertainties and noise in data.
In practice, this leads planners and managers to think they have “understood” a platform or project, thinking they have identified all factors — while the underlying risks are not included in their model.
2. Overconfidence Bias in predictions
Scientific research shows that both laymen and experts tend to have too much confidence in the accuracy of their own predictions. The same review study mentioned above concluded that people usually give confidence intervals that are too narrow — meaning they think they have more certainty than statistically justified.
In the context of commerce platforms, this translates directly into project planning: teams routinely give overly comfortable timelines and cost estimates, because the human brain is simply not good at assessing uncertainty.
3. Reference class forecasting (the outside view)
In response to this human tendency, Kahneman and Tversky, together with others, developed the method of reference class forecasting. Instead of relying on internal expectations, people look at statistics from similar, completed projects to make more realistic forecasts.
In the practice of large infrastructure projects (such as railways or bridges), the use of such reference classes has led to far more realistic estimates of costs and time — and far fewer surprises. That same principle — looking at external, historical data rather than internal optimistic assumptions — can make commerce platform projects much more robust.
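To make this concrete, here is a minimal sketch of reference class forecasting in Python. The overrun ratios, the internal estimate and the percentiles are purely illustrative assumptions; in practice you would feed in the actual outcomes of comparable, completed projects.

```python
# Minimal reference class forecasting sketch (illustrative numbers, not real data).
# Idea: adjust an internal estimate using the distribution of cost overruns
# observed in comparable, completed projects, instead of trusting the inside view.

import numpy as np

# Hypothetical overrun ratios (actual cost / estimated cost) from a reference
# class of similar platform projects.
reference_overruns = np.array([1.10, 1.35, 0.95, 1.60, 1.25, 1.80, 1.15, 1.40])

internal_estimate = 500_000  # the team's own "inside view" estimate, in euros

# Instead of a single number, report percentiles of the adjusted estimate.
for p in (50, 80, 95):
    adjusted = internal_estimate * np.percentile(reference_overruns, p)
    print(f"P{p} budget estimate: {adjusted:,.0f}")
```

Planning against the P80 or P95 figure rather than the raw internal estimate is, in essence, what the outside view asks you to do.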
How this plays out in commerce projects
When a company decides to build a new commerce platform or upgrade existing architecture, the risks are not linear and often not fully visible at the start. Teams sometimes think, “We know what we're doing — we've done this before.” But because of the biases above, the reality often looks different.
Example: B2B → D2C extension
A Dutch B2B company decided to launch a direct-to-consumer (D2C) channel. The internal assessment was that technical integration and process adjustment would be easy: “we know our systems, we know what logistics should do, we just do this on the side.”
In reality, return processes, customer segmentation, fulfillment and real-time integrations turned out to be much more complex and heterogeneous. Without a centralized commerce platform, there were duplicate administrations, delays and frustrations among staff and customers. By investing later in an integrated platform with real-time monitoring and data dashboards, these issues could be resolved in weeks rather than months — but the initial assessment was too optimistic.
This discrepancy is typical: managers overestimate control and predictability, especially when internal knowledge is interpreted as certainty, while external data (such as return rates, server load peaks and customer interaction variability) is often not included in initial estimates.
Why people have trouble with abstract risks
Our ancestors lived in environments where risks were visceral and concrete — a misstep could lead to a cliff fall, a wrong hunting strategy could lead to starvation. In that context, our brain worked very well with rapid heuristics.
In abstract, digital production systems, such as a commerce platform, risks are distributed, latent and only visible with a delay. There is no "server about to crash" flag or "return explosion" alert until the issues already have an impact. That is why we wrongly rely on instinctive certainty, while reality remains poorly predictable.
For example, a study by Xiaoxiao Niu and Nigel Harvey showed that, on average, people were 28% too optimistic in their predictions about inflation, and that their overconfidence declined when they first received feedback about previous mistakes. This highlights that without feedback, people continue to trust their internal model — even when that model is inaccurate.
Making the invisible visible: what works?
If we know that people are cognitively inclined to be overconfident and to underestimate uncertainty, what can we do about it in practice, in commerce and beyond?
1. Use historical data (reference classes)
Instead of relying solely on internal assumptions, start with real data from similar projects. This reduces planners' optimism bias and makes hidden risks explicit.
2. Work with calibration and feedback loops
Studies show that when people see actual outcomes before making new predictions, they become less overconfident and make more realistic estimates.
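As a minimal sketch of what such a feedback loop can measure, assume each forecast is logged as a 90% interval together with the actual outcome; the data below is made up for illustration.

```python
# Illustrative calibration check: how often did actual outcomes fall inside
# the 90% confidence intervals the team gave in advance?

forecasts = [
    # (low estimate, high estimate, actual outcome), e.g. task durations in days
    (8, 12, 15),
    (5, 9, 7),
    (10, 14, 18),
    (3, 6, 5),
    (12, 20, 26),
]

hits = sum(low <= actual <= high for low, high, actual in forecasts)
hit_rate = hits / len(forecasts)

print(f"Stated confidence: 90%, observed hit rate: {hit_rate:.0%}")
# A hit rate well below 90% means the intervals are too narrow: overconfidence.
# Showing this number to the team before the next planning round is the feedback loop.
```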
3. Separate assumptions from realistic scenarios
When building a house, you have explicit scenario analyses (bad soil, bad weather, supply problems). The same should apply to commerce systems: what happens when traffic doubles? When returns triple? When the payment gateway goes down? By simulating such scenarios, as sketched below, you uncover hidden assumptions.
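A minimal Monte Carlo sketch of such a stress scenario might look like this; the baseline figures for traffic, returns and capacity are assumptions to be replaced with your own data.

```python
# Illustrative stress scenario: does order handling stay within capacity when
# traffic doubles and the return rate triples at the same time?

import random

BASELINE_ORDERS_PER_DAY = 2_000
BASELINE_RETURN_RATE = 0.08
DAILY_CAPACITY = 5_500  # orders plus returns the operation can process per day

def simulate_day(traffic_factor: float, return_factor: float) -> bool:
    """Return True if a simulated day stays within processing capacity."""
    orders = random.gauss(BASELINE_ORDERS_PER_DAY * traffic_factor, 300)
    returns = orders * BASELINE_RETURN_RATE * return_factor
    return orders + returns <= DAILY_CAPACITY

def overload_probability(traffic_factor: float, return_factor: float, days: int = 10_000) -> float:
    """Estimate the share of simulated days that exceed capacity."""
    failures = sum(not simulate_day(traffic_factor, return_factor) for _ in range(days))
    return failures / days

print("Normal day, overload risk:", overload_probability(1.0, 1.0))
print("Traffic x2 and returns x3, overload risk:", overload_probability(2.0, 3.0))
```

Even a toy model like this forces the team to write down the assumptions (baseline volume, return rate, capacity) that would otherwise stay implicit.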
4. Acceptance of uncertainty as part of decision making
Recognizing that forecasting is never perfect, but can improve, leads to more robust planning. Larger projects must provide space for iterative learning and adjustment, not just a rigid budget plan.
Why certainty offers comfort
So why do we keep clinging to the illusion of certainty? Because feeling that we “know what we're doing” calms us down. It removes the fear of the unknown. Even when we know rationally that predictions can be wrong, detailed planning provides a sense of control.
As surveys of entrepreneurs show, cognitive biases, including overconfidence, play a role in strategic decision-making and risk assessment: people and teams tend to trust their own judgment even when the underlying data is weak.
This is not an individual failure, but a symptom of how our cognitive architecture is shaped: fast, intuitive, and often comfortable — but not always accurate in abstract systems.
From uncertainty to better understanding
Building a commerce platform is like building a home: the hidden assumptions and unexpected details have the biggest impact.
We often underestimate the costs, but more importantly, we overestimate our certainty. This is a universal human trait that comes from how we think, how we learn and how we deal with risks. By being aware of it, we can plan, design and decide with more realism, more robust data and a clearer view of risks.
It is precisely by recognizing the human tendency toward certainty that we can design systems that are resilient, scalable and people-oriented: not because we can predict everything, but because we learn to deal with what we cannot predict.