When to Escalate What You're Noticing — Instability Announces Itself Quietly

Organisations rarely fail suddenly. Instability accumulates through repeated, modest signals that, taken together, form a pattern. Small hesitations appear. Escalations multiply. Cross-functional tensions repeat in predictable ways. Adoption plateaus without obvious explanation. Each individual signal is modest — easy to explain away, contextualise, or reframe as temporary friction.1

The difficulty is not detecting signals. People feel them. Managers notice them. Practitioners see them. The difficulty is interpreting them honestly and acting on that interpretation, especially when action feels disruptive or politically risky.

Why early signals are softened

Early warning signals often surface in ambiguous form. “Teams are confused about priorities.” “There’s some resistance in one region.” “Usage is uneven across departments.” These observations are typically framed as temporary — transitional friction, expected learning curve, predictable dip. Sometimes that interpretation is correct. The organisation is indeed in the middle of change. Some confusion is normal.

Often, though, the softening is protective. Softening signals preserves forward momentum. Escalating them risks political friction. It can feel like admitting the change isn’t landing as smoothly as it was supposed to. So organisations normalise. Each signal is contextualised individually. Rarely are they aggregated to see if there’s a pattern that warrants escalation.

The structural effect of normalisation

When early signals are softened repeatedly, three structural things happen: escalation thresholds rise, local buffering increases, and accountability diffuses. Middle managers compensate quietly. Project teams adjust timelines informally, absorbing the slippage into their own workload rather than surfacing it. Sponsors intervene cautiously, trying to stabilise without confronting the underlying pattern.

Each adaptation appears stabilising in the moment. The system finds ways to absorb shock. But collectively, these local adaptations delay structural correction. They make the problem invisible at governance levels precisely when early intervention would be most effective.2 By the time the signal can no longer be softened or contextualised, the issue is embedded. What could have been addressed with a modest reframe now requires visible authority intervention.

Why this is a governance issue

Ignoring early signals is not a communication failure or a practitioner blind spot. It is a governance decision. Every time an ambiguous pattern is reframed as “normal transition” or “expected learning curve,” the organisation is making an explicit choice: continuity over correction.3 That choice may be appropriate. The change may be worth the friction. The uncertainty may be worth accepting.

But that choice must be made explicitly by people with authority to make it. When it’s implicit — when signals just get softened and normalised without escalation — instability accumulates silently and governance loses visibility. Once instability crosses a threshold, recovery requires visible authority intervention. At that point, the political cost is much higher. The window for preventive action has closed.

What disciplined interpretation looks like

Disciplined early signal interpretation requires several practices working together. It means distinguishing transitional friction from structural contradiction — knowing which conflicts are expected growing pains and which are signs of misalignment. It means examining repeated escalation themes for patterns rather than treating each incident as isolated. It means tracking where managers are absorbing risk quietly, compensating for gaps, buffering systems that should be visible to governance.4 It means testing whether incentives actually contradict stated change intent — not assuming, but checking.

These practices slow the narrative. They surface discomfort. They create pressure to acknowledge that something is misaligned. But they prevent compounding instability by enabling early intervention before local buffering becomes entrenched.
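The aggregation step these practices depend on can be made concrete with a toy sketch. Everything here is hypothetical — the signal records, theme labels, and the threshold of three independent reports are illustrative assumptions, not a prescribed method:

```python
from collections import Counter

# Hypothetical log of escalation signals. In practice these might come
# from retrospectives, support tickets, or steering-committee notes.
signals = [
    {"team": "ops",   "theme": "unclear priorities"},
    {"team": "sales", "theme": "unclear priorities"},
    {"team": "ops",   "theme": "tooling gaps"},
    {"team": "emea",  "theme": "unclear priorities"},
]

# Assumed threshold: three independent reports of the same theme
# warrant escalation rather than local absorption.
ESCALATION_THRESHOLD = 3

def themes_to_escalate(signals, threshold=ESCALATION_THRESHOLD):
    """Aggregate individually modest signals and flag repeated themes."""
    counts = Counter(s["theme"] for s in signals)
    return [theme for theme, n in counts.items() if n >= threshold]

# Each signal alone is easy to contextualise; aggregated, one theme
# crosses the threshold and becomes a governance matter.
print(themes_to_escalate(signals))
```

The point of the sketch is the design choice, not the code: escalation is triggered by a pattern across sources, so no single manager has to make the politically costly call that their local friction is "the" problem.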

Why this matters

Instability rarely emerges without warning. It accumulates through repeated reinterpretation of modest signals. Each signal gets contextualised. Each one gets absorbed into a narrative of normal transition. But patterns form through aggregation.

If signals are softened consistently, governance becomes reactive rather than anticipatory. Intervention happens after momentum is lost or credibility has declined. If signals are interpreted honestly and escalated early, governance remains able to make preventive choices while windows are still open. This is one way of understanding how early signal discipline protects structural coherence and decision quality. Other pieces in this series examine how measurement design and executive authority shape whether those signals are surfaced clearly or softened protectively.


  1. Weick, K. E., & Sutcliffe, K. M. (2007). Managing the Unexpected: Resilient Performance in an Age of Uncertainty (2nd ed.). Jossey-Bass. Weick and Sutcliffe’s concept of normalisation of deviation describes precisely this process: organisations learn to treat anomalous signals as routine through repeated reframing, until the accumulated deviation produces a failure that can no longer be explained as normal. Early warning signal management is the antidote — it requires treating each ambiguous signal as potentially significant rather than immediately contextualising it as expected friction.

  2. Perrow, C. (1984). Normal Accidents: Living with High-Risk Technologies. Basic Books. Perrow’s analysis of interactive complexity shows that local compensations in tightly coupled systems are individually rational but produce latent vulnerability at the system level — each person absorbing a small problem contributes to a distributed pattern of hidden stress that becomes catastrophic when coupling produces simultaneous failure. Middle manager buffering during change is this dynamic: each instance is stabilising locally while the aggregate creates the conditions for structural failure.

  3. Tetlock, P. E. (2005). Expert Political Judgment: How Good Is It? How Can We Know? Princeton University Press. Tetlock documents how decision-makers protect prior commitments by selectively reinterpreting disconfirming evidence — not through deliberate distortion but through motivated reasoning that assigns early-warning signals to the “expected” rather than the “concerning” category. The choice between “continuity” and “correction” is therefore not a neutral assessment but a cognitively constrained one: people with investment in an initiative’s success are systematically disposed to see early signals as transition noise.

  4. Schein, E. H. (1999). Process Consultation Revisited: Building the Helping Relationship. Addison-Wesley. Schein argues that the most diagnostically significant information in an organisation is what it is not surfacing — the conversations that aren’t happening at governance level because they are being absorbed locally. Tracking where managers are compensating quietly is the diagnostic equivalent of looking for what people are choosing not to say; it reveals the structural gaps that the organisation has decided are its own problem to manage rather than governance’s problem to address.