There is a version of operating aircraft systems that looks competent from the outside and is actually the most dangerous pattern available: the quick, confident action taken from memory, without verification, in response to a situation that appeared to demand speed. That pattern fails at the point where it is most trusted — under pressure, in unfamiliar territory, or with a system that is not encountered regularly enough for the memory to be reliable. The antidote is not hesitation. It is deliberation: the disciplined habit of confirming before acting, regardless of how confident the action feels.

The behaviour asks for correct operation. Correct means deliberate, considered, and timely — in that order. Not fast. Not instinctive. Deliberate. The urgency that produces premature actions is usually self-generated rather than operationally required. In the vast majority of cases, there is time to check. Time to verify the source. Time to cross-check the intended action against the expected outcome. Taking that time is not weakness. It is the specific discipline that keeps correct operation correct.

Knowledge and Its Decay

The systems used on every flight are the ones whose operation stays current. The flows, the callouts, the sequences — repeated across hundreds of sectors, they become embedded and reliable. The systems that are rarely encountered are a different matter. Their operation may have been learned thoroughly at type rating. But without regular exposure, the detail degrades. What remains is a general familiarity that feels like knowledge but does not have the precision that correct operation requires.

The professional obligation is to know which category each system falls into. The pilot who has high confidence in their frequently-used systems and appropriate humility about the rest — who actively recognises the infrequently-used system as a source of increased risk — is the pilot who applies the right level of deliberation to each. The unfamiliar system gets more care, more verification, more time. Not because the pilot lacks ability, but because they accurately assess the reliability of their own knowledge and act accordingly.

This is where Correctly Identifies Source of Operating Procedures is not a separate behaviour but a precondition for this one. The source is the protection. Never operate from memory when a procedure exists. Never assume the procedure is what you remember, or that what you remember is current. The aircraft has changed. The procedure may have been revised. The memory is not the source — it is a starting point for finding the source, nothing more.

Never assume. Always go to the source. The confidence that comes from memory is not the same as the accuracy that comes from verification.

Deliberate, Considered, Timely

Three words that sit behind the single word correctly: deliberate, considered, and timely. Each of them is doing work that the word correct alone does not capture.

Deliberate means intentional — the action is taken because it has been selected, not because it presented itself. The deliberate action is preceded by a moment of conscious decision: this is the correct action, this is the right time, this is what should happen next. That moment is not long. But it exists, and it is the difference between an action that has been chosen and one that has simply happened.

Considered means evaluated — the intended action has been cross-checked against the available information before being executed. What should happen when this switch is selected? Is the aircraft in the condition that the procedure assumes? Are there any reasons not to proceed? The considered action has been through a brief but genuine evaluation. The unconsidered action has not.

Timely means at the right moment — not too early, not too late, and not rushed. Timeliness is frequently misread as speed. It is not. A timely action is one that achieves the intended outcome within the window that the operation requires. That window is almost always wider than the self-generated urgency suggests.

Cross-Checking and the Error That Follows the Error

Cross-checking is the mechanism that catches the error before it becomes consequential. It applies both before the action — verifying that the intended action is correct — and after it — confirming that the outcome matches the expectation. Both are necessary. The pre-action cross-check prevents the wrong action. The post-action cross-check catches the case where the right action produced an unexpected result, or where the action was not fully completed.

When an error is made — and errors will be made — the response requires the same deliberation as the original action. The temptation is to correct quickly, to undo what was done before anyone notices, to restore normal operation as fast as possible. That temptation should be resisted. The knee-jerk correction, made to save face rather than to solve the problem, risks compounding the original mistake. The system is now in an unexpected state. The correct recovery from that state may not be the simple reversal of the previous action. It requires the same assessment, the same source verification, the same deliberation that the original action should have received.

◈ The Sit-on-Your-Hands Discipline

The instruction to sit on your hands is not literal. It is a reminder that the physical impulse to act — to reach for the control, to select the system, to initiate the procedure — arrives before the cognitive process that should precede it has completed. Sitting on your hands is the discipline of allowing that process to complete before the action follows.

It takes practice in normal operations, where the consequences of a premature action are recoverable, to build the habit that holds in abnormal operations, where they may not be. The deliberation built in routine flying is the deliberation available under pressure.

Associated Equipment and the Interface Responsibility

The behaviour extends beyond the aircraft systems themselves to associated equipment — GPU, de-icing rigs, refuelling bowsers, tugs, ground support equipment of all kinds. In most cases the pilot does not operate this equipment directly. But the pilot is the technical authority on the aircraft side of every interface, and that responsibility requires knowledge.

The ground engineer connecting the GPU needs to know the correct connection point, the acceptable voltage and frequency range, and the aircraft's response when external power is applied. The de-icing operator needs to know which surfaces require treatment, which fluids are approved, and how holdover time interacts with the planned departure. The refueller needs to know the correct fuel grade, the required quantity, and the locations of the fuelling points. The tug operator needs to know the towing limits, the nose gear constraints, and the crew's intentions for pushback.

None of this requires the pilot to operate the ground equipment. It requires the pilot to be the accurate, reliable source of the aircraft-side information that the ground crew need to do their work correctly. That is a knowledge requirement — and it is subject to exactly the same discipline as any other system operation. Know the answer accurately. If uncertain, find the source before responding. Never guess at an interface value on the basis that it probably doesn't matter.

↔ Connects With
Situational Awareness
Correct operation requires an accurate picture of the aircraft's current state before any action is taken. The cross-check before acting, and the monitoring of the outcome after, are Situational Awareness behaviours applied directly to system operation.
↔ Connects With
Workload Management
The self-generated urgency that drives premature actions is a workload management failure — the sense that there is no time to verify is almost always wrong. Protecting the time to check is a workload management discipline as much as a knowledge one.
↔ Connects With
Follows SOPs Unless Safety Dictates a Deviation
The source of the operating procedure is the SOP. Correctly identifying and following that source before acting is the connection between this behaviour and the discipline of SOP compliance — both depend on the same habit of going to the authoritative reference rather than relying on memory.
↔ Connects With
Professionalism
The willingness to acknowledge uncertainty — to say "I need to check that" rather than providing an answer that may be wrong — is an integrity behaviour. The pilot who guesses at an interface value to avoid appearing uncertain is prioritising self-image over safety.
✦ High Performance Pilot
Build the Deliberation Habit on the Line

High Performance Pilot structures your development of Correctly Operates Aircraft Systems and Associated Equipment across three levels — Foundation, Proficient, and Mastery. The habit is built in normal operations. Free to start.

Start Free — highperformancepilot.com
✦ High Performance Brief
Brief the System Knowledge Before It Matters
High Performance Brief structures your threat-and-competency-led briefing — where system limitations, infrequently-used procedures, and ground equipment interfaces are identified and verified before the flight begins.