On February 11 this year, Prime Minister Benjamin Netanyahu arrived at the White House with one objective: to persuade President Trump to join Israel in a major military operation against Iran.
He headed downstairs to the Situation Room and, over the next hour, laid out a four-part plan – as reconstructed in remarkable detail by Jonathan Swan and Maggie Haberman of the New York Times: assassinate the supreme leader, cripple Iran’s military capacity, trigger a popular uprising and install a secular government. The risks of inaction, he argued, were greater than the risks of action.
When he finished, Trump replied: “Sounds good to me.”
The following day, US intelligence analysts reviewed the same presentation. The regime-change scenarios were described by CIA director John Ratcliffe in a single word: “farcical”. Secretary of State Marco Rubio offered a blunter translation: “In other words, it’s bullshit.” General Dan Caine told the president this was standard Israeli operating procedure: “They oversell, and their plans are not always well-developed.”
Trump was unbothered. Regime change, he said, would be “their problem”. On February 28, the war began.
It is tempting to conclude that the problem is simply Trump – his ego, his certainty, his allergy to being told he is wrong. And maybe that is part of it. But here is the uncomfortable part: the cognitive patterns on display in that Situation Room are not unusual. They are human – and far more common than we like to think.
Three of them are worth understanding.
Every day we are bombarded with far more information than we can consciously process. The brain handles this by filtering – selecting what to focus on and what to discard. This is not a flaw; it is what allows us to function. Without it we would be paralysed.
But the filters are not neutral. One of the most powerful is confirmation bias: the tendency to favour information that supports what we already believe, and to discount information that challenges it.
Think about how nice it feels when you chat with ChatGPT and it understands and agrees with you on so many things. Imagine instead if it constantly pushed back, questioned your assumptions, told you that you might be wrong. You would probably use it a lot less.
We are drawn to confirmation – it feels like validation, and we seek it out without realising we are doing so. Trump had regarded Iran as a uniquely dangerous adversary for decades. Netanyahu’s presentation didn’t persuade him of something new – it confirmed what he already believed. The intelligence assessment that followed was processed through the same filter. He heard the parts that fit and set aside the parts that didn’t.
We consistently overestimate what we can achieve and underestimate how complex things will turn out to be. How many of us have misjudged how long something would take? The home renovation that was supposed to take three months and took a full year; the Ikea furniture we were sure we could assemble in an afternoon, only to end up calling someone in to finish the job. And when things do work out, we credit our judgement rather than our luck – becoming ever more convinced we will succeed next time, even when the next situation has nothing to do with the last one.
Trump’s most recent data point was a spectacular commando raid that had captured Venezuela’s Nicolás Maduro without a single American life lost. When Tucker Carlson warned him that a war with Iran would destroy his presidency, Trump was unmoved. Carlson asked how he knew it would be OK. “Because it always is,” Trump replied.
Even when we try to anticipate how others will respond, we tend to do it by imagining how we would feel in their position – and then assuming they would feel the same way. Psychologists call this the empathy gap: we cannot fully simulate the emotional and psychological state of someone whose circumstances, history, and incentives are radically different from our own. We fill the gap with our own logic, without realising that is what we are doing.
Trump assumed that Iranians, confronted with the collapse of their regime, would welcome change. History suggests the opposite: populations under attack tend to rally, not fracture, and regimes facing extinction tend to fight, not fold. The empathy gap is one of the most consequential – and most overlooked – errors in high-stakes decisions.
These three patterns are not character flaws – they are features of human cognition that show up in boardrooms and strategy meetings every day. The difference is that in most settings, someone eventually pushes back hard enough to change the outcome. In that Situation Room, nobody did. We explore why in the next article.
Pantelis Solomou is a Cyprus-based behavioural scientist