What the 2026 Middle East War Is Teaching Us About Communications

The war that escalated on 28 February 2026 has compressed years of theoretical discussion about information warfare into a matter of days. For communications professionals, it offers a live case study in what happens when the assumptions underpinning crisis plans meet reality — and how quickly those assumptions can fail.

A prayer app as a weapons platform

US and Israeli operators compromised the BadeSaba prayer app — used by tens of millions of Iranians — to deliver messages urging military personnel to lay down arms. No broadcast tower. No leaflet drop. The app bypassed state internet blackouts entirely, precisely because it was a trusted domestic platform operating freely inside Iran’s heavily filtered internet.

If you are still thinking about channel strategy the way you did five years ago — social, paid, owned, earned — that framing no longer covers the territory. The channels themselves are now contested infrastructure. Your audiences are operating inside a cognitive warzone whether they have registered that fact or not.

Seventeen minutes

During the October 2024 earthquake in Iran — a period of heightened geopolitical tension — social media was flooded with claims of a covert nuclear test within 17 minutes of the event, before the CTBTO had issued anything and before a single professional seismologist had examined the data. A Johns Hopkins University study published in February 2025 documented the full arc: the claims were wrong, the correction came hours later, and by the time official bodies confirmed a tectonic event, large parts of the audience had moved on carrying the wrong version. 

Your communications plan probably still assumes time to convene a response team, draft a statement, clear legal, and issue something considered. That 17-minute window does not care about your approval process.

The question is not whether you can respond that fast. The question is whether you have pre-approved holding positions and decision frameworks that allow someone to publish without calling anyone first.

The weapon nobody was watching

The Strait of Hormuz did not close primarily through military force. It closed through insurance.

P&I clubs — including Gard, Skuld, NorthStandard, and the London P&I Club — issued 72-hour cancellation notices for war risk cover. Additional War Risk Premiums jumped from approximately 0.2% to 1% of hull value. Maersk, Hapag-Lloyd, and CMA CGM suspended operations or rerouted around the Cape of Good Hope. Transit through the Strait fell by around 80% within 24 hours of strikes commencing.

The weapon was financial. The battlefield was actuarial. And the signal was present in the insurance markets before a single commercial vessel was touched.

For communicators, this reframes what intelligence actually means. Monitoring social media for brand mentions is one thing. Monitoring war risk premium surges, freight rate movements, and P&I club bulletins as early-warning indicators of escalation is another — and the absence of that second capability is what is costing organisations their response windows right now.

Your war intelligence function needs to look much more like a trading desk than a media monitoring dashboard.
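To make the shift concrete, the monitoring described above can be sketched as a simple threshold alert. Everything here is illustrative: the baseline figure, the multiplier, and the idea of a quote feed are assumptions for the sketch, not a real market data API.

```python
# Sketch: flag Additional War Risk Premium (AWRP) quotes that breach an
# escalation threshold. Figures and data source are illustrative assumptions.

BASELINE_AWRP = 0.002   # assumed peacetime AWRP (~0.2% of hull value)
ALERT_MULTIPLIER = 2.5  # assumed threshold: flag quotes above 2.5x baseline

def check_awrp(latest_quotes: list[float], baseline: float = BASELINE_AWRP) -> list[str]:
    """Return an alert message for each quote breaching the threshold."""
    alerts = []
    for quote in latest_quotes:
        if quote >= baseline * ALERT_MULTIPLIER:
            alerts.append(
                f"AWRP at {quote:.2%} of hull value "
                f"({quote / baseline:.1f}x baseline): possible escalation signal"
            )
    return alerts

# The article cites a move from roughly 0.2% to 1% of hull value:
print(check_awrp([0.002, 0.01]))
```

The same pattern extends to freight rates or P&I bulletin counts; the point is that the trigger is a number crossing a threshold, not a journalist's phone call.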

Fact-checking is losing

Reactive debunking does not work once an audience is emotionally captured by a narrative. The correction arrives too late and often reinforces the original claim through repetition. You are playing whack-a-mole against opponents who do not need to be accurate — they just need to be first and emotionally resonant.

The US Department of Defense has drawn the same conclusion. It is now building information inoculation into warfighter training — warning personnel in advance that manipulation is coming and showing them what the techniques look like before they encounter them. The DEPICT framework provides the structure: Discrediting, Emotional language, Polarisation, Impersonation, Conspiracy promotion, and Trolling.

The principle scales to any organisation whose workforce, clients, or stakeholders are likely to be targeted during a regional war affecting supply chains and energy prices. Pre-bunking is more durable than a fact-check page.

Train people on the mechanics of manipulation before the pressure arrives, not while it is happening.

The information sovereignty problem

Iran’s internet connectivity dropped to as low as 4% during the conflict, according to NetBlocks monitoring data. The state had already shifted from reactive censorship to a whitelist model — only authorised services permitted, with anyone straying outside the approved list leaving a permanent digital trail.

This is the extreme end of government control, but it exists on a spectrum that includes EU regulatory frameworks, platform content moderation, and national emergency communications protocols that can be activated faster than most organisations anticipate.

Your message may be accurate, timely, and well-crafted — yet still fail to reach its intended audience because a filter, sovereign or otherwise, intervened. Relying on a single channel is a structural vulnerability. Build redundancy now, not after the first failure.

The shape of the job

The 2026 Middle East war is a convergence event — kinetic and narrative operations running in parallel, insurance markets functioning as strategic weapons, seismic monitoring networks exposed to manipulation, and trusted civilian apps repurposed for psychological operations.

None of this fits neatly within a communications function designed for a slower, more linear world. If your communications plan predates AI as an operational reality, predates information warfare as standard statecraft, or predates financial instruments as conflict tools, the gap between your plan’s assumptions and current realities is significant.

Key takeaways

  1. The gap between a kinetic event and public narrative has collapsed to minutes. Social media was flooded with false nuclear test claims within 17 minutes of an Iran earthquake — before any official body had looked at the data. Your approval process needs to reflect that reality, not ignore it.

  2. The Strait of Hormuz didn’t close through military force. It closed through insurance. P&I club cancellation notices and surging war risk premiums were the real early-warning signal — visible in financial markets before a single commercial vessel was hit. War intelligence needs to monitor beyond media.

  3. Trusted civilian infrastructure is now a legitimate psyop delivery mechanism. Operators compromised a prayer app used by tens of millions of people to bypass state internet blackouts. Channel strategy built around owned, earned, and paid media doesn’t account for this.

  4. Reactive fact-checking loses against opponents who only need to be first and emotionally resonant. The US Department of Defense is now training warfighters in pre-bunking — exposing people to manipulation techniques before they encounter them. The same approach applies to any workforce operating in a contested information environment.

  5. Digital filters — sovereign, regulatory, or platform-level — can strand accurate, well-crafted messages before they reach their intended audience. Single-channel dependency is a structural vulnerability, not a resource constraint.

This article was researched with the assistance of AI tools, with all sources and factual claims verified by the author. AI was also used to assist with grammar and style, as the author writes in British English as a second language.

Author: Philippe Borremans

Image: AI generated

