ECOM (the Extended Control Model) extends the single-loop view of COCOM (the Contextual Control Model) by recognising that skilled operators run four control loops in parallel — tracking, regulating, monitoring, and targeting — each at a different tempo and level of abstraction. It is a descriptive model of joint cognitive work rather than a box-and-arrow accident model.
Overview of the framework
Developed by Erik Hollnagel and David Woods as part of their Joint Cognitive Systems programme, ECOM treats control as concurrent rather than sequential. Where COCOM describes a single action–feedback cycle governed by a control mode, ECOM recognises that anything beyond the most reactive task requires multiple, coupled loops operating over different time horizons (Hollnagel & Woods, 2005).
The four layers — described from the fastest to the slowest — are tracking (closed-loop correction of deviations, tens of milliseconds to seconds), regulating (keeping current activity within bounds, seconds to minutes), monitoring (checking that the chosen plan remains adequate, minutes to tens of minutes), and targeting (selecting goals and priorities, tens of minutes to hours). Each loop provides the reference input for the layer below it; each lower loop produces disturbance information for the layer above.
Figure 1 · Four simultaneous control loops. Upper loops set references for lower loops; lower loops feed deviations upward.
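The layered flow in Figure 1 can be sketched as a toy simulation. This is a minimal illustration only, not part of Hollnagel and Woods's formulation: the layer periods, gains, and first-order process model are invented for the demo. Each layer updates at its own tempo and hands a setpoint down; each layer's feedback comes from the loop below it.

```python
class Layer:
    """One ECOM loop: acts at its own tempo and hands a setpoint downward."""

    def __init__(self, name, period, gain):
        self.name = name        # e.g. "tracking"
        self.period = period    # ticks between updates; higher layers are slower
        self.gain = gain        # fraction of the deviation corrected per update
        self.reference = 0.0    # setpoint received from the layer above
        self.output = 0.0       # setpoint passed to the layer below

    def step(self, tick, feedback):
        # Time-scale separation: the layer only acts on its own schedule.
        if tick % self.period == 0:
            self.output = feedback + self.gain * (self.reference - feedback)
        return self.output


def run(goal, ticks):
    # Fast to slow, with invented periods: tracking every tick,
    # targeting every 64 ticks.
    targeting = Layer("targeting", period=64, gain=1.0)
    monitoring = Layer("monitoring", period=16, gain=0.5)
    regulating = Layer("regulating", period=4, gain=0.5)
    tracking = Layer("tracking", period=1, gain=0.5)
    state = 0.0  # the controlled process variable
    for t in range(ticks):
        # References flow downward; each layer's feedback is the loop below it.
        targeting.reference = goal
        monitoring.reference = targeting.step(t, monitoring.output)
        regulating.reference = monitoring.step(t, regulating.output)
        tracking.reference = regulating.step(t, state)
        # A simple first-order process chasing the tracking loop's command.
        state += 0.5 * (tracking.step(t, state) - state)
    return state
```

Run long enough, the process state converges on the targeting layer's goal even though the upper layers act only occasionally — the point being that slow loops steer fast loops rather than acting on the process directly.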
When to use it
Typical applications
Modelling multi-tasking in complex control rooms and cockpits.
Designing displays that distinguish tactical from strategic information.
Analysing loss-of-control events where a lower loop captured attention from a higher one.
Specifying automation that takes over one layer without disrupting the others.
Loss-of-state-awareness events: crew locked into tracking while monitoring degraded.
ATC workload models in which strategic planning must coexist with tactical separation.
Single-pilot and reduced-crew operations where automation carries a layer.
Benefits
Parallelism made explicit. Captures the reality that pilots and controllers do not finish one task before starting the next.
Time-scale separation. Different loops have different reaction times, which helps match interface updates to cognitive tempo.
Links with COCOM. Control modes (strategic, tactical, opportunistic, scrambled) apply to each layer, allowing a richer diagnosis of degraded performance.
Design leverage. Supports principled decisions about what to automate, what to delegate and what the human must retain.
Explains attention capture. Makes clear how a fast-loop anomaly can starve the slow loop, a classic mechanism in controlled flight into terrain and loss-of-control events.
Integrates with joint cognitive systems view. Human and machine elements can occupy different layers and still form one control system.
Usable as a think-aloud structure. Observers can tag utterances and actions to a layer during simulator debriefs.
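The tagging idea above can be sketched as a keyword-cue coder. The cue lists here are hypothetical, not a validated coding scheme; real use would need trained coders and an inter-rater reliability check, as noted under Limitations.

```python
# Hypothetical cue lists for mapping debrief utterances to ECOM layers.
# Checked in insertion order (Python 3.7+ dicts preserve it), so the
# slower, more strategic layers win ties.
LAYER_CUES = {
    "targeting": ("divert", "priorit", "plan", "goal"),
    "monitoring": ("crosscheck", "fuel check", "on profile", "trend"),
    "regulating": ("reduce speed", "set heading", "configure", "adjust"),
    "tracking": ("correct", "centre", "hold", "nudge"),
}


def tag_utterance(text):
    """Return the ECOM layer whose cue first matches, or 'untagged'."""
    lowered = text.lower()
    for layer, cues in LAYER_CUES.items():
        if any(cue in lowered for cue in cues):
            return layer
    return "untagged"
```

For example, "Reduce speed to 210" would be tagged regulating, while "What's the plan for the divert?" would be tagged targeting; anything without a cue falls through to untagged for manual review.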
Limitations
Descriptive, not predictive. ECOM describes how control is organised; it does not, by itself, generate probabilities or failure rates.
Layer boundaries are fuzzy. In practice, tracking and regulating overlap, and the analyst has to judge where a loop ends.
Requires skilled coding. Classifying observed behaviour into the four layers is non-trivial and inter-rater reliability has to be established.
Light on team aspects. ECOM was conceived as an individual-operator model; extending it to crews and sectors requires care.
Evidence base is smaller than COCOM's or FRAM's. Fewer published validation studies exist in aviation for ECOM than for COCOM or for Hollnagel's later models (ETTO, FRAM).
Overlap with other models. Analysts familiar with Rasmussen's skill-rule-knowledge taxonomy or with supervisory control may find the vocabulary competing rather than complementary.
In short
ECOM is the four-layer sibling of COCOM. Use it when the question is how multiple control loops interact under load — particularly in flight-deck, ATC, and single-pilot automation design — rather than when you need an accident-cause model or a quantitative risk number.
References (APA 7)
Hollnagel, E. (1993). Human reliability analysis: Context and control. Academic Press.
Hollnagel, E. (2002). Cognition as control: A pragmatic approach to the modelling of joint cognitive systems. IEEE Transactions on Systems, Man, and Cybernetics, Part A, 32(6), 724–731.
Hollnagel, E., & Woods, D. D. (2005). Joint cognitive systems: Foundations of cognitive systems engineering. CRC Press.
Rasmussen, J. (1983). Skills, rules, and knowledge; signals, signs, and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13(3), 257–266.
Stanton, N. A., Salmon, P. M., Walker, G. H., Baber, C., & Jenkins, D. P. (2013). Human factors methods: A practical guide for engineering and design (2nd ed.). Ashgate.
Woods, D. D., & Hollnagel, E. (2006). Joint cognitive systems: Patterns in cognitive systems engineering. CRC Press.
Further reading
Hollnagel, E. (2012). FRAM: The functional resonance analysis method — Modelling complex socio-technical systems. Ashgate. [For ECOM's successor in Hollnagel's modelling lineage.]
Hollnagel, E., & Woods, D. D. (1983). Cognitive systems engineering: New wine in new bottles. International Journal of Man-Machine Studies, 18(6), 583–600.
Klein, G., Woods, D. D., Bradshaw, J. M., Hoffman, R. R., & Feltovich, P. J. (2004). Ten challenges for making automation a "team player" in joint human-agent activity. IEEE Intelligent Systems, 19(6), 91–95.
Sheridan, T. B. (1992). Telerobotics, automation, and human supervisory control. MIT Press.