Every professional pilot has sat through the CRM slide. Perception. Comprehension. Projection. The three levels of situational awareness, as defined by Mica Endsley in 1988, have become the standard framework for understanding how pilots build, maintain, and lose their picture of what is happening around them.
The model is well known. Its practical application is less so. Most pilots can recite the three levels. Fewer can identify, in real time, which level they are currently operating at — or recognise the specific conditions under which each one degrades. That gap between knowing the model and applying it is where SA failures happen.
The Three Levels
Level 1 is perception — the raw input. What the instruments are showing, what ATC has said, what the other crew member is doing, what the weather is doing outside. Perception is the foundation of everything that follows, and it is the level most vulnerable to disruption.
Level 2 is comprehension — what the perceived information means in the context of the current task. The altimeter is unwinding faster than expected for this phase of flight. The other pilot has gone quiet. ATC has issued a revised clearance that conflicts with the current plan. Comprehension is where raw data becomes situational meaning.
Level 3 is projection — what is going to happen next, and what it implies. Given the current rate of descent and the distance to the airport, will we make the crossing restriction? Given the weather trend, is the alternate still viable? Projection is where situational awareness becomes genuinely useful — it is the anticipatory capacity that allows crews to act ahead of events rather than react to them.
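The crossing-restriction question is, at bottom, simple arithmetic, and making that arithmetic explicit is a useful Level 3 habit. A minimal sketch (the figures are illustrative, not taken from any real procedure):

```python
def required_descent_rate(alt_to_lose_ft: float, distance_nm: float,
                          groundspeed_kt: float) -> float:
    """Descent rate (ft/min) needed to lose the given altitude
    over the given track distance at the given groundspeed."""
    minutes_to_fix = distance_nm / groundspeed_kt * 60  # time remaining to the fix
    return alt_to_lose_ft / minutes_to_fix

# Illustrative numbers: 12,000 ft to lose, 30 nm to run, 300 kt groundspeed.
# 30 nm at 300 kt is 6 minutes, so 12,000 ft / 6 min = 2,000 ft/min required.
rate = required_descent_rate(12_000, 30, 300)
print(round(rate))  # 2000 — at a current 1,500 ft/min, the restriction will be missed
```

The point is not the code but the projection: putting the required rate next to the current rate turns a vague "will we make it?" into a number that can be stated out loud and checked by the other pilot.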
Each level is dependent on the one below it. You cannot comprehend what you have not perceived. You cannot project accurately from a comprehension that is incomplete or wrong. The model is not just a taxonomy — it is a dependency chain, and that dependency has important implications for how and where SA fails.
Where SA Actually Breaks Down
Research consistently shows that the majority of SA failures in aviation (roughly three-quarters in the classic analyses of incident reports) occur at Level 1 — perception. Not because pilots are inattentive, but because the flight deck environment systematically works against reliable perception in ways that are predictable and, with the right habits, manageable.
The primary mechanism is attentional narrowing. Under high workload or stress, the brain reduces the breadth of its attentional focus to manage cognitive load. The scan narrows. Less salient cues drop out. The pilot continues to believe they have a full picture while critical information sits unnoticed at the periphery of the instrument scan or in the behaviour of the other crew member.
The most dangerous SA failure is the one you don't know you're having. The brain doesn't generate a warning when it stops processing information it was never attending to.
The second major failure mode is confirmation bias at Level 2. Once a mental model is established — once you have a comprehension of what is happening — the brain preferentially processes information that confirms it. Data that contradicts the current model is more likely to be discounted, reinterpreted, or simply not registered. This is not a character flaw; it is a feature of human cognition that becomes a liability in a dynamic environment where the situation can change faster than the mental model updates.
Level 3 failures — errors in projection — typically arise from one of two causes. The first is an inaccurate comprehension feeding the projection: if your mental model is wrong, your forecast will be wrong. The second is overconfidence in a projection that was accurate when formed but has since been invalidated by changes in the environment that weren't perceived or comprehended quickly enough.
The Shared Mental Model Problem
Individual SA is only part of the picture on a multi-crew flight deck. What matters operationally is the shared mental model — the degree to which both crew members have the same comprehension of the current situation and the same projection of where it is heading.
Shared mental models degrade silently. Each pilot continues to believe they have a common picture with their colleague while the two models diverge. The divergence often isn't discovered until one pilot takes an action that surprises the other — a call that makes no sense given what the other pilot thinks is happening, or a failure to respond to something that seems obviously necessary.
The mechanism for maintaining a shared mental model is verbalisation. Crews that routinely narrate their picture — that state what they're seeing, what they think it means, and what they expect to happen next — maintain alignment far more reliably than those who process silently and assume a common understanding. This isn't about talking for its own sake. It is a specific cognitive strategy for externalising the mental model in a way that allows it to be checked, challenged, and corrected by the other crew member.
Verbalising your picture isn't a sign of uncertainty. It is the mechanism by which two pilots become one crew.
Recognising Degraded SA in Yourself
The challenge with SA loss is that it is largely self-concealing. The brain, operating with reduced information, does not generate a subjective sense of incompleteness. You do not feel like you've lost your picture — you feel like you have a picture, just a wrong or incomplete one.
There are, however, observable signals that experienced pilots learn to recognise as early indicators of SA degradation in themselves. Confusion about the current clearance or flight path. Uncertainty about what the automation is doing. A vague sense that the situation is developing faster than expected. Silence from the other crew member when you expected a response. Fixation on a single instrument or task to the exclusion of others.
None of these is definitive in isolation. Together, or in combination with elevated workload and time pressure, they are a reliable early warning that the mental model needs to be rebuilt. The correct response is structured and immediate: aviate, then verbalise the current state, then seek confirmation from the other crew member. Stating your picture out loud and asking your colleague to check it is not a sign of weakness. It is the highest-order SA behaviour available to you.
Recognising Degraded SA in Your Colleague
SA degradation in another person is often more visible than it is in yourself. The signals are behavioural: uncharacteristic hesitation, repeated questions about the same information, actions that don't fit the situation, failure to respond to calls, fixation that disrupts the normal flow of tasks.
The culturally difficult part is addressing it. Many pilots notice the signals and say nothing — not because they don't recognise them, but because calling out a colleague's apparent confusion feels presumptuous or implies criticism. This instinct needs to be consciously overridden. The professional obligation is clear: if you observe indicators of reduced SA in the other pilot, you address it directly and immediately.
Practical Strategies That Work
The "what's happening next" discipline. At natural transition points — top of descent, approaching the FAF, entering a hold — deliberately ask yourself: what is the next significant event, when will it occur, and what are the conditions under which it might not go as planned? This is Level 3 SA practised as a habit rather than left to chance.
Verbalise the picture at workload peaks. The moments of highest workload are precisely when verbalisation is most likely to be abandoned — and when it is most needed. A brief crew narration of the current state at high-workload moments takes five seconds and prevents a significant proportion of shared mental model failures.
Set personal SA gates. Before demanding phases of flight, agree with your colleague what you will each be monitoring and what the triggers are for calling a deviation or initiating a contingency. This doesn't just improve SA — it makes degraded SA visible, because a departure from the agreed monitoring plan becomes immediately apparent to both crew members.
Use the approach briefing as an SA tool. The most effective approach briefings don't just cover the procedure — they construct a shared Level 3 picture. What does normal look like at each gate? What are the threats? At what point is the approach no longer salvageable? A crew that has a shared projection before the approach begins will maintain SA through the approach far more reliably than one that builds it on the way down.
Endsley's model is thirty-eight years old. It remains the most practically useful framework for understanding flight deck awareness because it maps so accurately onto the actual failure modes that appear in accident and incident reports. The science is settled. What remains is the daily discipline of applying it — not in the classroom, but on every sector, in every briefing, with every colleague.
Across Every Sector
HPP maps all seven Situational Awareness behaviours across three development levels — from perception and monitoring through to anticipation and contingency planning. Free to start.
Start Free — highperformancepilot.com