Diplomatic Dispatches

Is logic killing decision-making?

Some years ago, I was summoned to a windowless room in the Foreign Office to meet with country specialists, intelligence analysts, and military planners. I was newly promoted, the only woman, and my (far more senior and experienced) boss was overseas. They needed me to provide political cover for what they called "a surgical operation" against an adversary.

They were very excited about their plan. Spreadsheets full of risk assessments. Maps with colored bits. Timelines plotted to the minute. Everyone very serious, very logical, very sure of themselves.

Once I'd absorbed what they actually wanted to do, I asked what seemed like an obvious question:

"What happens if they retaliate?"

Silence. Not the good kind. The "are-you-serious" kind.

"Of course they won't retaliate," came the patronising response.

"Our data indicates it wouldn't be in their interests. We've modeled all the scenarios."

They had indeed modeled scenarios. Lots of them. All based on the assumption that our adversary would behave rationally and logically, exactly as the models forecast, rather than like, say, a human being who'd just been publicly humiliated.

Despite my misgivings, I was overruled and they did the thing.

Guess what happened next? Immediate, escalatory retaliation that nearly dragged us into a larger conflict. Because it turns out that when you corner someone, their response isn't dictated by what your data says would be "logical" - it's dictated by what they need to do to avoid looking weak in front of their own people and their allies, underpinned by grievance and outrage.

The room full of experts had committed a cardinal sin of modern decision-making:

They'd confused having data with having understanding.

The Comfort of Process

What struck me most wasn't that they got it wrong - everyone gets things wrong. It was how confident the data had made them feel. The spreadsheets, the models, the risk assessments - they'd become a security blanket that insulated them from the discomfort of uncertainty.

And here's the uncomfortable truth: that security blanket is everywhere now.

Think about the last time someone in your organization said "we need more data" before making a decision. How often was that really about needing information, vs needing to feel less exposed?

How often do our "rigorous processes" exist primarily to ensure that when things go wrong, nobody can be blamed?

After all, no one has ever lost their job for being too logical.

What We're Actually Measuring

The problem isn't that data is bad - it's that we've started treating it as a substitute for judgment. We measure what's easy to measure and then act as if we've measured what matters.

We can tell you exactly how many people clicked a link, but not why they immediately bounced. We can model consumer preferences in extraordinary detail, but we still can't explain why people buy Red Bull despite hating the taste or think stripy toothpaste works better than plain.

The dirty secret?

Humans don't live in logic.

We live in stories, feelings, and relationships.

We make decisions with our gut and then rationalise them afterward. The spreadsheets are capturing the rationalisation, not the underlying motivations.

Reading Between the Lines

The most effective people I know don't argue better - they diagnose differently.

When someone says "I need to see more data," they usually mean "I need to feel less exposed."

When they say "This isn't the right time," they mean "I don't want to be the one who suggested this if it fails."

When they say "Let me think about it," they mean "Help me find a way to say yes without looking stupid."

The breakthrough happens when you stop treating these as obstacles to overcome and start treating them as information to work with.

The resistance isn't logical - it's emotional. And emotions don't show up in your risk assessment matrix.

The AI Paradox

Here's the irony: just as artificial intelligence gets better at logic, analysis and pattern recognition, we're doubling down on competing with machines at the very things they'll soon do better than us.

What if we stopped trying to out-logic AI and leaned into what makes us irreplaceably human?

Our ability to read the room. To sense when someone's not telling the whole truth, or is at breaking point. To know that the technically best solution might be politically impossible. To understand that timing matters more than perfection, and that sometimes the right decision is the one that people can actually live with, not the one that looks best on paper.

Machines can process information faster than we ever will. But they can't walk into a room and instinctively know that despite what everyone's saying, the real issue is that Sarah and Mike had a fight yesterday and now nobody wants to take sides.

The Strategic Value of Uncertainty

Perfect information is the enemy of timely action. The people who get things done have learned to make decisions with incomplete information, to trust their instincts even when those instincts disagree with their analysis.

This drives logical people insane. But it's how anything actually happens.

What if, instead of trying to eliminate uncertainty, we got better at navigating it?

What if we treated our inability to predict everything as information rather than a failure?

What if we admitted that some of the most important things - trust, timing, cultural fit, intuition - resist quantification entirely?

What if vibes were actually a metric?

Moving Forward

I'm not suggesting we abandon analysis. Good data, properly interpreted, is invaluable. But data without wisdom about human nature is just expensive noise.

The future advantage goes to people who can hold both truths simultaneously: that rigorous thinking matters and that humans aren't spreadsheets.

Who can make decisions without needing to justify every step in advance. Who can read what people actually mean, not just what they say.

In a world where machines will handle the logic, our competitive advantage is increasingly our humanness.

Our messiness, our contradictions, our ability to care about irrational things like dignity and fairness and being heard; about dancing and karaoke; about laughing at how idiotic we are.

These aren't bugs to be optimized away. They're features to lean into.

Those experts in their windowless room taught me something invaluable: the most dangerous person in any decision-making process isn't the one who lacks data.

It's the one who mistakes their data for wisdom.

Sometimes the most rational thing you can do is admit that logic has its limits. And in a world full of humans, those limits come up a lot sooner than most of us would like to admit.


What's your take on today's article? Did it land? Miss the mark? Change your perspective?

If you enjoyed this read, the best compliment I could receive would be if you shared it.

How to Diplomat

I'm a former diplomat turned coach, speaker, and writer, sharing insights from international diplomacy to help you understand, influence, persuade, and lead more effectively. Subscribe and join 1,000+ newsletter readers every week!
