On Flying Small Planes and Making Large Decisions

Brian Hunt
3 min read

I hold an FAA private pilot certificate. I don't fly often enough — fewer hours than I'd like in any given year, never as current as I want to be. But the hours I have logged have made me a better security professional, a better consultant, and probably a better decision-maker in general.

Not because of the aeronautics. Because of how aviation teaches you to think about risk under uncertainty.

The Go/No-Go Decision

Every flight requires a go/no-go decision. Weather, aircraft condition, pilot currency, fuel, alternates, personal minimums. Student pilots are taught to make this decision formally, with a checklist, before every departure. Experienced pilots often internalize it — but the underlying structure stays the same.

What makes aviation's version of this decision-making useful is that the stakes are unambiguous. A bad call in a light aircraft, in deteriorating weather, without instrument training, ends badly. There's no escalation path, no ability to call a senior partner, no "let's revisit this next quarter." The feedback loop is immediate and irreversible.

Most professional decisions don't carry that kind of consequence. Which means most professionals never develop the discipline to actually make them with appropriate rigor. They defer, they hedge, they let ambiguity linger. Aviation doesn't give you that option.

Get-There-Itis

Aviation has a name for the psychological trap that kills more pilots than mechanical failure: get-there-itis. The pressure — internal, social, logistical — to complete the flight even when the conditions have changed. The meeting on the other end. The family waiting. The client who's already annoyed about the delay.

Every experienced pilot knows someone who flew into IMC — instrument meteorological conditions — without an instrument rating because they were 40 miles from home and convinced they could make it. Some of them didn't. The ones who did didn't necessarily make a better decision — they got lucky.

I watch the professional equivalent of get-there-itis in security work constantly. The audit that has to be done by the board meeting, even though the evidence isn't there. The system that has to go live on the announced date, even though the penetration test findings haven't been remediated. The deal that has to close, even though due diligence on the target's security posture raised real questions. The pressure to continue is always present. The discipline to stop is what you have to build deliberately.

Situational Awareness Is a Practice

Pilots talk about situational awareness constantly — knowing where you are, where you're going, what the weather is doing, what the aircraft is telling you, what ATC just said. It sounds simple. It's not. It's a continuous active process of building and updating a mental model of a dynamic environment, under cognitive load, while also flying the plane.

The security equivalent is threat intelligence, monitoring, and organizational context — maintaining an accurate picture of what's happening in your environment so that anomalies register as anomalies rather than noise. Organizations that are good at this tend to detect breaches in hours. Organizations that aren't detect them in months. The difference is almost never the tooling. It's whether the people using the tools have cultivated the habit of active situational awareness.

Aviate. Navigate. Communicate. — aviation's priority order when things go wrong.

Keep the aircraft flying first. Know where you are second. Talk to someone third. In security incident response, the equivalent is: contain the bleeding, understand the scope, then start the stakeholder communication. Organizations that invert these priorities — that go straight to executive briefings before they understand what happened — create more chaos, not less.

I'll take the small planes whenever I can. They keep the thinking sharp.
