A few years ago, I was about to hire someone who looked extraordinary on paper. Fourteen years of directly relevant experience, strong references, articulate in interviews, clear strategic thinking. My whole team was enthusiastic. Every rational signal said yes. And yet something felt off. I couldn't name it precisely. In the final interview, there was a moment — a brief hesitation when I asked about a team he'd restructured — that registered as odd in a way I couldn't articulate or justify to anyone else in the room.
I hired him anyway, because the case for doing so was overwhelming and my discomfort had no articulable basis. He was gone within eight months. The hesitation, it turned out, was the visible tip of a pattern — a consistent tendency to manage up effectively while being dismissive and occasionally demeaning to peers and direct reports. The data I had been looking at was real; it just wasn't the data that mattered most.
I've thought about that hire many times in the years since. Not primarily because it proved that gut instinct should override analysis — it doesn't, most of the time, and I'll come to the cases where it does. I've thought about it because it forced me to take seriously a question I'd been dismissing as unscientific: what is the intuition actually doing when it fires, how reliable is it, and what conditions determine whether it's signal or noise?
Those questions don't have simple answers, but they have better answers than "always trust your gut" or "never trust your gut." The better answers depend on understanding what intuition actually is — which is less mystical and more specific than the popular framing suggests.
What intuition actually is — the research that changes how to think about it
The word "intuition" gets used in two very different ways in leadership conversations. One camp treats it as a mystical signal — a form of superior knowing that bypasses the limitations of rational analysis, somehow more trustworthy than anything systematically derived. The other camp dismisses it entirely as bias dressed in respectable clothing. Both positions are wrong, and the truth is considerably more useful.
The psychologist Gary Klein spent decades studying expert decision-making in genuinely high-stakes environments — military commanders, firefighters, intensive care nurses, chess grandmasters. His core finding, published as the Recognition-Primed Decision model, was that experienced professionals under pressure don't actually make decisions by generating options and evaluating them systematically. They recognize patterns — configurations of cues that match patterns accumulated through experience — and those patterns trigger a sense of what to do. That "sense" isn't a vague emotional feeling; it's a compressed inference drawn from a large experiential library.
The key word in Klein's research is "experienced." What he and subsequent researchers found is that expert intuition is reliable when it's built in an environment where the feedback loop between action and outcome is reasonably tight, consistent, and legible. Chess players, firefighters, nurses — intuition works for them because experience generates calibrated pattern recognition. The feedback is fast enough and clear enough that wrong patterns get corrected and right patterns get reinforced.
Leadership is different. Leadership operates in environments where feedback is slow, partial, confounded by variables beyond any individual's control, and often ambiguous about cause and attribution. The leadership decision you made three years ago doesn't get tested cleanly against a clear outcome — too many other things changed between the decision and the result. This makes leadership intuition significantly harder to calibrate than the word "intuition" usually implies, and significantly less reliable than it feels.
The three gut signals worth slowing down for
That said, not all gut feelings are equal. After thinking about this carefully — including, frankly, after making both kinds of errors in my own leadership practice — I've found three categories of intuitive signal that are genuinely worth slowing down for, even when you can't fully articulate them.
Incongruence signals. When something someone says doesn't match how they say it. When a plan looks right on paper but something about its internal logic feels strained. When a number is technically accurate but creates an uneasy sense that it's hiding something. This kind of discomfort often means your pattern recognition has picked up a contradiction that your conscious analysis hasn't yet articulated. The hire I described at the opening was this kind of signal — not a vague feeling about the person, but a specific, localized incongruence between what was being said and the physical and tonal details of how it was being said. That specificity is what distinguishes incongruence signals from general discomfort.
Overcorrection signals. When you're instinctively pushing toward something in a way that feels unusual for you — when a characteristically cautious person is getting excited about a risky bet, or when a characteristically decisive person suddenly wants more time. These reversals from your own baseline often indicate that the situation has activated something real in your experiential library. The reversal is data. It doesn't tell you what the right answer is, but it tells you that the situation warrants more careful attention than the surface analysis alone might suggest.
Convergent signals. When multiple independent gut reactions, from different experienced people with different perspectives and different things to gain or lose, point in the same direction. One person's instinct is an anecdote. A convergent set of instincts from people who come at the situation from different angles starts to look like evidence of something in the data that the explicit analysis isn't capturing. I've been in hiring discussions where three experienced people who had interviewed a candidate all had the same vague unease, each arriving at it independently — and the unease turned out to be tracking a real pattern. The convergence was the signal.
When gut instinct is just bias — the harder truth
The harder truth that leadership conversations about intuition tend to avoid is that gut instinct, particularly in organizational and interpersonal contexts, is frequently not wisdom. It's familiarity. It's pattern-matching to people and situations that are comfortable and recognizable — which means it systematically disadvantages anyone who doesn't fit the familiar template.
Research on hiring decisions consistently shows that "culture fit" intuitions — the sense that someone will fit in, that they seem like the right kind of person — correlate more strongly with demographic similarity to the existing team than with any actual predictor of job performance. What feels like an accurate cultural read is often a similarity bias operating below the threshold of awareness. The people making these judgments are not bad people and they are not consciously discriminating; they are humans whose pattern-recognition systems were calibrated in environments that are not the modern workplace.
The same dynamic applies beyond hiring. The decision that feels intuitively right because it resembles previous decisions that worked out. The direction that feels obviously correct because the person proposing it has the right credentials and the right demeanor. The plan that instinctively seems sound because it's structured like previous plans that succeeded. In each case, the intuition is tracking familiarity and resemblance, not underlying merit. And in each case, that familiarity may or may not be relevant to the actual question at hand.
I've tried to build a specific practice around this risk. When I have a negative gut reaction to a person or an idea, I ask myself explicitly: is this discomfort because something is actually wrong, or because something is unfamiliar? These two things feel identical from the inside — both produce the same visceral sense that something is off. The only way to tell them apart is to name specifically what is triggering the reaction. Vague discomfort that can't be specified is almost always bias or unfamiliarity. A specific, articulable concern — a particular inconsistency, a specific risk that hasn't been addressed, a concrete pattern from past experience that this situation resembles in troubling ways — is more likely to be genuine signal.
The "name it precisely" practice — the most reliable intervention I've found
The single most useful thing I've done with my own intuitive responses — over years of trying different approaches — is to enforce a naming discipline before acting on any strong gut feeling, positive or negative.
The discipline is simple: before acting on an intuition, I force myself to write down exactly what the intuition is tracking. Not "something feels off" — that's a description of the feeling, not the content. What specifically is off? What is the precise claim or pattern or inconsistency that triggered the response? Where does it come from in my experience — what does this situation actually resemble that I've seen before?
The exercise takes five minutes, and it produces one of two results. Either I can name the specific concern clearly, in which case I have something concrete to investigate, to probe in conversation, to present to others as a genuine question rather than a vague discomfort. Or I can't name it — the feeling is real but it won't resolve into a specific claim — which usually means it's unfamiliarity or bias rather than genuine pattern recognition. In the second case, I've learned to weight the intuition much less heavily, and to require more from the explicit analysis before acting.
This is not a perfect filter. Genuine signal can sometimes be so compressed and tacit that it resists articulation even when it's real — which is why I don't dismiss intuitions I can't name, I just downweight them. But the discipline of trying to name them produces far better calibration than either "always trust your gut" or "always override it with analysis."
Why the best leaders treat intuition as a hypothesis, not a conclusion
The leaders I most respect have a characteristic relationship to their own gut feelings. They take them seriously — they don't dismiss them — but they treat them as the beginning of an inquiry rather than the end of one. When the gut says something, they ask: what is this pointing to? What data would confirm or disconfirm it? Is there something specific here that I haven't fully examined yet? Who else should I consult before I rely on this?
That stance is different from dismissing intuition. And it's different from treating it as oracular. It's using the intuition as a prompt for more disciplined investigation — which is, ultimately, what good decision-making looks like.
The same leaders are also honest about the patterns in their own intuitive errors. They know whether their gut tends to over-trust people who are like them, or under-trust people who are unlike them. They know whether their discomfort with certain types of risk is calibrated or overcautious. They've built a kind of meta-awareness about when their intuition is more reliable and when it requires more scrutiny. That meta-awareness is itself a learnable skill — but it requires honest engagement with the times your gut was wrong, not just the times it was right.
The second hire — and what it illustrates about integration
There's a second chapter to the story I opened with, and it's worth telling here. Two years after the hire I described at the beginning — the one where I overrode my instinct and regretted it — I interviewed someone for a similar role who produced the opposite effect. Every piece of paper was slightly less impressive than the previous hire's. But something about the conversation felt right in a way I found difficult to articulate. Specific answers she gave reflected a kind of honest self-assessment I found striking. A description of a failure she'd navigated had a quality of real learning rather than repackaged success. I hired her.
She became one of the strongest contributors I've worked with over fourteen years. She's since led her own team and, by every account I have, done genuinely exceptional work.
What was the gut tracking in that case? I've spent time on this question. I think it was something specific: the quality of her self-awareness and the coherence between what she said and how she said it. Not an indescribable mystical signal — an incongruence signal in reverse. Everything was congruent. The self-assessment matched the evidence. The emotional tone matched the content. When I tried to name it, I could.
The point isn't that my gut was "right" in the second case and "wrong" in the first. The point is that the second intuition had properties that made it more reliable: it was specific rather than vague, it was nameable when I tried to name it, and it pointed toward a concrete hypothesis about the candidate rather than a general impression. Those properties made it more trustworthy — not as a conclusion, but as a hypothesis worth acting on with appropriate care. That's the integration I've been working toward for years. The goal isn't to separate intuition from analysis. It's to make sure they're genuinely talking to each other.