Early warning signals almost never announce themselves as risks. They appear as friction, hesitation, inconsistency, or quiet workarounds. They surface in conversations rather than dashboards. They are felt before they are measured.[1] This is precisely why they are so often ignored.[2] Organisations are not blind to early warning signals. They are trained to discount them.[3]
What early warning signals actually look like
Early warning signals are rarely dramatic. They show up as:
- teams interpreting the change differently
- managers quietly delaying enforcement
- adoption that looks compliant but lacks confidence
- issues being solved locally rather than escalated
- increasing reliance on informal fixes
Individually, these signals are easy to rationalise. Collectively, they indicate drift. Because they do not threaten delivery milestones immediately, they are rarely treated as urgent.
Why early signals feel unreliable
Early signals are ambiguous. They do not come with certainty or clean attribution. They require interpretation. They force leaders to make judgement calls without full data. In environments that value evidence and rigour, this creates discomfort.
Leaders hesitate to act on what feels subjective. They wait for clearer proof. They look for confirmation in metrics that lag reality. By the time those metrics move, the window for low-cost intervention has already closed.
How organisations learn to suppress weak signals
Most organisations unintentionally train themselves to ignore early signals. This happens through patterns such as:
- challenging concerns without addressing them
- asking for more data when action is what is needed
- rewarding delivery over problem surfacing
- treating early escalation as pessimism
- labelling friction as “change noise”
Over time, people stop raising what they notice. Not because they do not care, but because they have learned it is safer to wait.
This is not a communication failure. It is a behavioural one.
The optimism trap
Optimism plays a powerful role in signal suppression. Leaders want the change to work. They have invested time, credibility, and political capital. They interpret ambiguous signals in the most favourable light. This optimism is rarely naïve. It is protective.
Unfortunately, it biases interpretation toward patience rather than inquiry. Early discomfort is framed as temporary. Concerns are deferred. Optimism delays intervention long enough for small problems to become structural ones.
Why dashboards don’t help early
Dashboards are designed to detect deviation from plan. Early warning signals often precede deviation. They show up before performance drops, before benefits stall, and before risks crystallise. As a result, dashboards remain green while conditions deteriorate underneath. This creates a false sense of security. By the time dashboards reflect a problem, the organisation is already responding to consequences rather than causes.
The cost of waiting for certainty
Waiting for certainty feels responsible. It reduces the risk of overreaction. It protects leaders from acting prematurely. In change, however, waiting for certainty almost always increases cost. Early intervention is usually small and reversible. Late intervention is disruptive, political, and expensive. Certainty comes too late to be useful.
What organisations that act early do differently
Organisations that intervene early treat weak signals as valuable input, not as noise. They:
- encourage escalation without penalty
- treat inconsistency as diagnostic
- ask what friction is telling them
- intervene experimentally rather than decisively
- adjust design before enforcing behaviour
They do not need perfect information. They need enough confidence to explore. This mindset shifts intervention from crisis response to course correction.
Reframing signal detection as a leadership capability
Detecting early warning signals is not about better reporting. It is about leadership judgement. It requires the ability to:
- tolerate ambiguity
- resist optimism bias
- act before proof is complete
- protect people who surface issues
- treat friction as information
These are not technical skills. They are governance and leadership capabilities.
A more practical way to think about risk
Risk is not just what might go wrong. It is what is quietly going wrong already, without being acknowledged. Early warning signals are evidence that the system is adapting in ways that may undermine intended outcomes. Ignoring them does not preserve momentum. It mortgages it. This is one way of thinking about why change succeeds or fails. Other pieces go deeper into how organisations can intervene earlier without destabilising progress.
1. Mintzberg, H. (1973). The Nature of Managerial Work. Harper & Row. Mintzberg’s observational research on managerial work establishes that managers process information primarily through conversation, informal contact, and direct observation — not through formal reporting systems. Early warning signals surface in this informal information environment before they appear in dashboards. They are felt before they are measured precisely because the informal network registers environmental change faster than formal measurement systems. Leaders who rely primarily on formal data sources systematically lag the informal network in detecting emerging risk.
2. Rerup, C. (2005). Learning from Past Experience: Footnotes on Mindfulness and Habitual Entrepreneurship. Scandinavian Journal of Management, 21(4), 451–472. https://doi.org/10.1016/j.scaman.2005.09.010. Rerup’s analysis of habitual attention demonstrates that organisations develop characteristic patterns of attending — they notice what their established categories of concern make salient and discount what falls outside those categories. Early warning signals are typically miscategorised as familiar, manageable noise until they exceed the thresholds that established attention patterns are designed to detect. This explains why organisations are not blind to early signals but are trained to discount them: the discounting is a function of habitual categorisation, not of absence.
3. Weick, K. E., & Sutcliffe, K. M. (2007). Managing the Unexpected: Resilient Performance in an Age of Uncertainty (2nd ed.). Jossey-Bass. Weick and Sutcliffe’s analysis of high-reliability organisations establishes that the systematic suppression of weak signals — through normalisation of deviation, misidentification of indicators as noise, and reluctance to acknowledge difficulty — is the primary mechanism by which organisations train themselves to miss warnings that precede failure. Organisations that manage unexpected events well do so not because they have better information, but because they have built cultures that treat weak signals as worthy of investigation rather than rationalisation.