Episode 333: Securing the State: Crisis Management and Counterterrorism Strategy with Professor Sir David Omand

In this episode, we host Professor Sir David Omand to explore crisis management, counterterrorism, and intelligence at the highest levels of the British state. Drawing on a career that includes senior roles at GCHQ, the Home Office, the Cabinet Office, and the Joint Intelligence Committee, Sir David reflects on how governments prepare for crises, why some threats are missed despite warning signs, and what effective decision-making looks like when events move faster than institutions.

We discuss the origins and logic of the UK’s CONTEST counterterrorism strategy, the importance of resilience and maintaining “conditions of normality” in crisis management, and the challenge of making sound judgements under conditions of uncertainty, ambiguity and institutional pressure. From warning failure and public trust to societal risk and the practical realities of managing national emergencies, this conversation offers valuable lessons in how governments and organisations can think more clearly, respond more effectively, and build resilience before the next crisis hits.

Professor Sir David Omand is a Visiting Professor in the Department of War Studies at King’s College London, a member of the editorial board of the academic journal Intelligence and National Security, and a member of the advisory board of Paladin Capital, which invests in cyber security start-ups.

He has held senior posts across the UK’s security, intelligence, and defence institutions, including Director of GCHQ, Permanent Secretary at the Home Office, and the first UK Security and Intelligence Coordinator in the Cabinet Office, responsible to the Prime Minister for the professional health of the intelligence community, national counterterrorism strategy, and ‘homeland security’. He served for seven years on the UK Joint Intelligence Committee and writes widely on intelligence, counterterrorism strategy, resilience, and the ethics of secret intelligence.

The International Risk Podcast brings you conversations with global experts, frontline practitioners, and senior decision-makers who are shaping how we understand and respond to international risk. From geopolitical volatility and organised crime, to cybersecurity threats and hybrid warfare, each episode explores the forces transforming our world and what smart leaders must do to navigate them. Whether you’re a board member, policymaker, or risk professional, The International Risk Podcast delivers actionable insights, sharp analysis, and real-world stories that matter.

The International Risk Podcast is sponsored by Conducttr, a realistic crisis exercise platform. Conducttr offers crisis exercising software for corporates, consultants, humanitarian, and defence & security clients. Visit Conducttr to learn more.

Dominic Bowen is the host of The International Risk Podcast and Europe’s leading expert on international risk and crisis management. As Head of Strategic Advisory and Partner at one of Europe’s leading risk management consulting firms, Dominic advises CEOs, boards, and senior executives across the continent on how to prepare for uncertainty and act with intent. He has spent decades working in war zones, advising multinational companies, and supporting Europe’s business leaders. Dominic is the go-to business advisor for leaders navigating risk, crisis, and strategy; trusted for his clarity, calmness under pressure, and ability to turn volatility into competitive advantage. Dominic equips today’s business leaders with the insight and confidence to lead through disruption and deliver sustained strategic advantage.

Subscribe for all our updates!

Tell us what you liked!

Transcript

[00:00:01] David Omand: I have this trilogy of emergencies, crises and disasters, and I put crises in the middle because you don’t manage crises; they manage you. That’s the point: the chief executive won’t actually know straight off what to do. If he thinks he does, he may well be seriously misguided.

[00:00:20] Dominic Bowen: Welcome back to the International Risk Podcast, where we discuss the latest world news and significant events that impact businesses and organisations worldwide. This episode is brought to you by Conducttr. Conducttr software helps you design and deliver crisis exercises without needing a big team or weeks of preparation. You can create a central exercise library with Conducttr Worlds, and you can generate reports that support your governance and compliance requirements. So, if you want flexible, realistic crisis exercises that are easy to adopt, then Conducttr is worth a look.

[00:00:53] Dominic Bowen: And I have a quick favour to ask before we start today. If you’re a regular listener, please subscribe to and follow the International Risk Podcast. It’s the simplest way to support the show, and it helps us reach listeners who need this content. And my commitment to you is that it will keep improving every part of the experience, from our guests to the quality of the research and the practical insights we provide. And if there’s a guest you think we should bring on the podcast or a risk that you want unpacked, send it through to us, and I promise we read all of your comments. Please hit the subscribe or follow button now, and let’s jump into today’s episode.

[00:01:27] Dominic Bowen: Today’s guest is Professor Sir David Omand. He’s a Visiting Professor in the Department of War Studies at King’s College London, and he’s held senior positions across UK security, intelligence and defence institutions, including as Director of the UK’s Government Communications Headquarters, which many of you will know as GCHQ. He was Permanent Secretary at the Home Office and the first UK Security and Intelligence Coordinator at the Cabinet Office. He served for seven years on the UK Joint Intelligence Committee. He writes widely on intelligence, counterterrorism, resilience and the ethics of secret intelligence.

[00:02:01] Dominic Bowen: I’m Dominic Bowen, and I’m the host of the International Risk Podcast. Now, if you lead a business, a public agency or really any organisation that can’t afford surprises, this episode really is a masterclass in decision-making under uncertainty: how to spot signals, avoid strategic surprises, build resilience, and also gain an understanding of the UK’s CONTEST strategy, the national risk management system designed to keep society functioning freely and with confidence at the same time. Sir David, welcome to The International Risk Podcast.

[00:02:33] David Omand: Hi.

[00:02:34] Dominic Bowen: Whereabouts in the world do we find you today, David?

[00:02:36] David Omand: North London.

[00:02:37] Dominic Bowen: Beautiful spot, especially in spring. Is it too early to say spring, or have we reached spring in London?

[00:02:41] David Omand: No, no! The sun is shining. It’s a beautiful day outside. The daffodils are just coming out.

[00:02:47] Dominic Bowen: Fantastic. You should work in tourism as well. That sounded very appealing. Well, David, you’ve written several great books, including How Spies Think: Ten Lessons in Intelligence.

[00:02:55] Dominic Bowen: Now, in my experience, intelligence is not necessarily about delivering certainty to decision-makers, but more about reducing uncertainty. And that requires us not to assume that everyone thinks like us. We’ve seen that with Russia’s invasion of Ukraine. We’ve seen that time and time again in geopolitics over the last decade. But it also requires us to ensure that we have habits or systems that mean we’re continually testing assumptions, that we remain curious, and that we continually test data and analyse incomplete information.

[00:03:25] Dominic Bowen: Can you tell us what made you write this book, and what parts of it do you think have had the most impact on your readers?

[00:03:31] David Omand: I wrote it because I’ve had a lot of experience, either as part of the intelligence community or as a customer for the product of the British intelligence community. And I felt there was a place for a book that was accessible to the ordinary reader that would demystify, on the one hand, the James Bond imagery and, on the other hand, the sort of assumption that intelligence analysts can somehow predict the future. Nobody can predict the future. There are no crystal balls, even in the Joint Intelligence Committee. We don’t sit around a table looking at a crystal ball.

[00:04:02] David Omand: But the purpose of intelligence, and of good analysis, is to enable better decisions to be taken by reducing ignorance about what we might face. And nowadays, most of that information is open source, if you know where to look, if you put the effort in. But there’s a tiny little bit of it which is called secret intelligence, which is information other people, your adversaries, people who mean you harm, don’t want you to have, and they may go to extremely violent lengths to prevent you from getting it. That’s what secret intelligence agencies are there for: to provide that extra information that will hopefully reduce the risk that these adversaries pose.

[00:04:43] Dominic Bowen: Now, you’ve seen real warning failures during your career. What’s the actual root cause of that? Is it collection gaps? Is it the analysis? Is it our intrinsic biases? And we speak about that a lot on the podcast.

[00:04:54] David Omand: It’s all of those. In different combinations, in different circumstances. And clearly, if something goes wrong, if a terrorist gets through our defences, if a bomb goes off or there’s a knife attack, by definition it’s an intelligence failure because we didn’t stop it, despite the fact that, if I remember rightly, since March 2017, 43 late-stage terrorist plots were actually stopped by MI5 and the police. But some got through.

[00:05:19] David Omand: I have in my mind a sort of simple-minded comparison with road traffic. You try to drive sensibly and carefully. There are times, taking somebody in an emergency to the casualty department, where you might have your foot down on the accelerator. But if there is an accident, are you guilty of careless driving? Are you guilty of reckless driving? Well, you knew perfectly well you should have been doing certain things, but you deliberately didn’t. Or you got into the vehicle with alcohol in your system. That’s reckless, and that’s culpable.

[00:05:54] David Omand: So when you look at an intelligence failure, you have to ask: were people, in good faith, trying to do their best? Did the system let them down? Did the previous generation let them down by not setting up the kind of collection that would have given the information that would have helped? Or was it just chance?

[00:06:10] David Omand: There’s a term, ‘normal accidents’, which I think is a very useful one, from Charles Perrow’s book on the Three Mile Island near-meltdown, where he said that any sufficiently complicated bureaucratic or technical system will occasionally break, and you’ll get unusual combinations of this goes wrong and then this goes wrong and then that goes wrong. Result: failure. One of the ways you deal with that is buffering, so that any one of those things going wrong doesn’t mean that the whole system is going to end up in some sort of disaster.

[00:06:42] David Omand: So, I just think that before we throw around terms like ‘intelligence failure’, you need to think about what was known at the time, not what we know now, because with the passage of time, it becomes very clear. One example which I can think of was one of the 7/7 bombers, one of the bombers who attacked the London Underground, who was on MI5’s radar, and they were engaged in a surveillance operation of somebody else, another quite serious suspect. And that serious suspect met somebody. The watchers could see this, but they didn’t know who it was.

[00:07:18] David Omand: Now the question was: do you continue to follow the original target that you were told to follow, or do you stop that and go and follow the new contact? For all you know, they may just have been exchanging the address of a pizza parlour. You don’t know, and you can’t know. But then, sadly, long afterwards, you discover that that other individual actually might have been the starting clue that could have led to the frustration of a terrorist attack. You’ll never know at the time.

[00:07:45] Dominic Bowen: It’s very, very difficult when we do debriefs after a crisis with our corporate clients and with different government actors. One of the things I always have to really emphasise is this: not based on the information we have today, two weeks after we’ve resolved the crisis, but at that time, at 11:34, when the crisis management team decided to pause operations, close factories, send staff home, or pay the ransom, based on the information you had at that time and only the information you had at that time, was that the most reasonable decision? Not based on the information you have now. Hindsight’s very easy, but that’s something I think we do. We go, “Oh, but it was the wrong decision.” It’s the wrong decision with the information we have now, but not with the information that you reasonably had at that time.

[00:08:22] David Omand: I think the word “reasonable” that you use is the key to it. Would a reasonable person, well-trained in whatever it is, have taken that decision? This occurs all the time in a military context, where potentially a junior officer is leading some patrol, he turns left thinking that that’s the safer route, and it turns out they get ambushed. Now, you don’t hang that young officer out to dry.

[00:08:46] David Omand: Yes, with hindsight, you can see it’s a mistake. But then when you start to investigate, if you discover that actually there was intelligence that showed that was not a good move to make, but it never got to the young officer, then you can begin to dig into the system and potential systemic problems which might need fixing.

[00:09:04] Dominic Bowen: No, I think that’s a great point, David. And one of the things you talked about was whether previous generations have set us up for failure, and there’s this assumption sometimes that intelligence cycles fail because of a scarcity of information. But in 2026, I think there’s a real problem around the contamination of information, deception, and manipulation at an industrial scale now.

[00:09:24] Dominic Bowen: Now this is not new; Napoleon was using deception, and the Romans were using deception. But how they’re used today, from state disinformation to deepfakes, synthetic media and, of course, the amplification that artificial intelligence brings, is different. So I wonder, if you’re advising companies or policymakers today, what can be done with the intelligence cycle so that it stays resilient in a world where there’s just so much white noise and, more importantly, deliberate misinformation?

[00:09:49] David Omand: What can one say? Take care. Put to the back of your mind what you want to see in the information and actually start to look critically at what it does tell you. This is where deceptions work, because usually they’re telling the customer or the chief executive or the government what it is they secretly want to see. That was the case with the Iraq War and the mobile biological weapons (BW) trailers that Saddam was alleged to have; that intelligence came from a defector who was debriefed and interrogated by the German secret service. And they passed the information on to the United States and to us. And it was very credible because this guy had worked on Saddam’s BW programmes the first time around.

[00:10:33] David Omand: The problem was that he faked a lot of the information he passed on because he wanted Saddam overthrown; he wanted to encourage the American intervention. And as you can imagine, when this landed in the Pentagon, this confirmed all the fears. It was the intelligence they wanted to see. And a point that’s allied to that is that even when the information you’re looking at doesn’t quite fit, or it might be internally contradictory, or you have other reports that don’t point in that direction, the natural human temptation is to argue away the inconsistencies.

[00:11:11] David Omand: I’ve seen that happen time and time again because it’s a sort of confirmation bias. You want this information to be true. It isn’t quite fitting the way it should. So what are the reasons why it isn’t? And it’s the same with the scientists conducting an experiment, and there are one or two points that don’t lie on the straight line on the graph. Do you blame the equipment and say, ‘Gosh, take those points out; just scrub them from the data,’ or do you say, ‘No, actually, this is the beginning of scientific discovery’? There’s something going on here we hadn’t realised, and so often that’s the clue to some great innovation.

[00:11:48] Dominic Bowen: Yeah, definitely – that’s certainly one of the things we work with crisis management teams on. The most important thing I think a crisis management chair can do is not make decisions, but actually facilitate discussion and facilitate debate, and that’s how we come up with the best possible decisions. But it’s not easy, and despite having many great business leaders, a lot of them really struggle with that.

[00:12:07] David Omand: It’s never easy. Part of my definition of a crisis is that the person at the top, the CEO or chair or minister, doesn’t know what to do. If they did know what to do, you’d pull the plans out of the cupboard, you’d ring up the emergency services, and you’d assemble a team to manage the emergency. But in a crisis, by definition, it’s very novel, or a series of things have hit you that don’t seem to fit previous patterns. It’s what I call the rubber levers test: you pull on the lever, and it doesn’t seem to be connected to action on the ground.

[00:12:40] David Omand: I think good leaders at this point recognise that and realise that they’ve actually got to pull in their own team, they’ve got to pull in outside expertise, and mobilise so that they find a path through the crisis. And it may well be quite a novel path – not something they’ve practised or rehearsed. This is a paradox: the more determined the chief executive, the more experienced they are, the more threatening it is to be in a situation where you don’t know what to do.

[00:13:09] Dominic Bowen: David, you mentioned the rubber levers test, and you’ve also mentioned in your writing the arc of crisis model. I’m really glad you mentioned that because I think they’re great concepts around the moment the controls stop working as expected and this typical arc of a crisis, from first signs to confusion and then moving on to clarification. Can you talk us through those two concepts? I think they’re so valuable.

[00:13:31] David Omand: Yes. I have this trilogy of emergencies, crises and disasters, and I put crises in the middle because you don’t manage crises; crises manage you. That’s the point: the chief executive won’t actually know straight off what to do. If he thinks he does, he may well be seriously misguided. So you have to mobilise, get resources around, and say to the team, ‘Together, we will find a way through this,’ rather than taking the top-down line.

[00:14:00] David Omand: ‘I’m the boss, I know what I’m doing.’ And, as I say, it can be quite threatening in those circumstances for the person at the top to be in that situation. So probably in this arc of crisis, something happens. It comes to notice. If the windows have been blown in by a terrorist explosion outside, or the basement is flooded, it is pretty obvious something serious has happened.

[00:14:24] David Omand: But it might not be like that. It might be that the problem has been cooking away quietly for some years, ignored by the C-suite. Yes, the numbers were falling in this particular region, and they kept falling, and they’re still falling. And then suddenly you’ve got a crisis, and the market has suddenly woken up to this, and you have to do something about it. So that’s a first stage.

[00:14:49] David Omand: Then, of course, you’ve got your team together, you’ve mobilised. It’s harder to do that than you would think, because the message going out to the company is: forget about what we were saying before about the priorities. Now there is a real and present danger in front of us, and that’s what we’ve got to crack fairly quickly before it turns into a disaster for us. So that message can only come from the top. Hard to say it, but the fact that British Prime Minister Boris Johnson missed the first four or five emergency meetings on Covid sent the wrong signal.

[00:15:14] David Omand: You know, this is a health emergency; the Secretary of State for Health can chair that. But it wasn’t a health emergency, it was a national emergency, an economic emergency, as we very quickly discovered. So that message from the top has got to be clear: we’re going to mobilise, we’ll find a way through this.

[00:15:42] David Omand: Then I hope you get to the sort of consolidation stage where you’ve worked out your path. You’ve still got to be prepared to take account of further information or indeed further manifestations of crisis. You know, things hit you and then they hit you again and then they hit you again. So you can’t just make a plan and then that’s it, it’s all over. And then you get through this consolidation phase, I hope to a sort of clearing up phase.

[00:16:09] David Omand: But one should never forget there’s a sort of hidden final phase, which is calling to account and, if necessary, retribution. If people are dead, there are going to be inquests. If a company has really lost value, shareholders are going to demand heads on a plate. So in the course of managing the crisis, a little bit of space has to be kept for thinking ahead to how all this is going to play out.

[00:16:34] Dominic Bowen: That last stage that you labelled retribution, in the corporate world we more often call lessons learned and strengthening. But yes, I find that actually the most difficult stage, because I think last year I worked with 25 different companies on crises, and I think only one was actually willing to go through that final step. All the others were keen just to get back to business, put it behind them and move on, which is, I think, just such a lost opportunity to strengthen systems and to be stronger for the next crisis, which we know is coming.

[00:17:01] David Omand: It’s related to a point that I make in my book about building up a reputation, not just in the media but in the public mind, the customer’s mind, the investor’s mind, for reliability: telling the truth, being open, being as transparent as you can within the obvious commercial confines. That stands you in good stead when something bad happens because you will be given a little bit of slack. Otherwise, one could think of BP in the Macondo oil spill case: if you’ve had problems before with an oil refinery fire, and problems before with oil spills in Alaska, the reaction from the media may be far more sceptical than it needs to be. And the same is true of governments.

[00:17:44] David Omand: If you build up a reputation when things are quiet, it will stand you in good stead when suddenly something unexpected happens. If you get it wrong, of course, then you’ll pay quite a price if people simply don’t believe the explanation you’re trying to give as to why the company is in this difficulty or that difficulty.

[00:18:03] Dominic Bowen: I made a very similar point to a crisis communications team, which is a subset of a larger crisis management team for a very large multinational company, about a week ago. And one of the things I said to them is that the response to this crisis actually needed to start six months before the crisis. You need to have those strong relationships with the media, with your stakeholders, with the investors. If you’re only starting to build that now that the crisis has occurred, you’re about six months behind the ball.

[00:18:28] David Omand: That is so true. And there are steps you can take to build that relationship: giving media access to facilities, for instance, so they understand more about what the company is about. And the same is true of investors. If it all suddenly hits them as a terrible shock, and there’s a level of mistrust because there’s been a lack of transparency about the accounts, for example, about exactly how some complex company was working, then when you run into a problem, it’s so much harder to do something about it.

[00:18:58] David Omand: So the other part of that is: do not try to spin yourself out of a crisis, because the first things you say may well turn out to be untrue as more information comes in. And these days, particularly with social media, people are so quick to spot inconsistencies between what the regional manager was saying and what head office is saying. So taking genuine care… I have a slight hobby horse here about giving apologies when things go wrong, even if in the end it’s not your fault.

[00:19:29] David Omand: A lot of difficulties could have been avoided if public authorities had stood up and said, we’re sorry, something happened on our watch, people got hurt, lessons will be learned. We’ll put things as right as we possibly can, but we’re sorry. If the corporate lawyers get involved, they will say, well, that’s an admission of guilt. When the court cases arrive, you’ll regret having said that. I don’t think that’s true at all.

[00:19:54] David Omand: And I think courts really understand when a chief executive or government minister stands up and says, ‘We’re sorry. I’m sorry that on my watch this happened.’ And that goes a long way, I think, to assuaging the feelings of next of kin, for example, that somehow those in charge have been trying to avoid taking responsibility.

[00:20:16] Dominic Bowen: Yes. As an adviser to executive teams and boards, I’m often at their companies when they’re making significant announcements, often announcements that are not pleasant for the employees. And I always make a point: I’m actually normally not staring at the chair of the board or the CEO during that announcement. I often position myself near the front of the room and I’m staring back at the employees, looking at their faces and asking: are they nodding? Are they accepting this as the truth?

[00:20:38] Dominic Bowen: Is this true to them, and are they understanding, or is there resistance? Is there shaking of the head? Are they whispering to each other? And I think that’s much more important to me, as someone who’s advising their leadership team: do the employees believe you? Do they think it’s genuine?

[00:20:52] Dominic Bowen: Do they appreciate it? And are they going to be a positive force working with you to find a solution, or have you already got them offside? Are they already going to be resisting you? And, David, as well as being an accomplished author, you’re also the architect of the UK’s counterterrorism strategy, CONTEST. Now, for our listeners, perhaps you can explain: why was this strategy necessary?

[00:21:13] Dominic Bowen: What were you trying to achieve when you designed CONTEST?

[00:21:15] David Omand: 9/11 was an enormous shock globally, and it was a shock to the United Kingdom too. We lost more Britons in New York than in any previous or subsequent terrorist incident because there were so many working in the Twin Towers. I arrived on the scene in September 2002, so some time had passed. There had been the intervention in Afghanistan, but the threat was very clearly still there. So I sent a note out to, if you like, Whitehall, everyone I could think of, saying, ‘I think we actually ought to have a national counterterrorism strategy.’

[00:21:54] David Omand: I asked for volunteers. Let’s put a team together. And this is the military, the Foreign Office, civil departments, local government, central government, you name it, lots of people involved, and the police. So I sent this minute out saying we ought to construct this strategy and then present it to the Cabinet, the British Cabinet, to see whether that’s the way they want to proceed in a world of very significant threat.

[00:22:21] David Omand: Contest was a name I dreamt up in the bath, literally lying there taking the phrase counterterrorism strategy and picking letters in sequence. And if you try it, you can make quite a lot of words that way. Contest seemed to me to have the right kind of resonance, if you like. My experience in defence had been that, basically, if you can’t get it on the front of a T-shirt and it hasn’t got a fairly resonant codename, it will just be one of those glossy documents that goes in a cupboard and is never looked at again.

[00:22:50] David Omand: If you want a counterterrorism strategy that actually people use and that lives, then you have to have that kind of simplification. So I sent this out. We put a team together. The Prime Minister allowed me to chair an official Cabinet committee. We did a lot of work for a couple of months.

[00:23:07] David Omand: And then, early in 2003, I presented the results to the Cabinet and they accepted it. The underlying logic of the British position at that point was: we want to defeat the terrorists by denying them what they seek. And what they seek is fear and disruption. If we can maintain conditions of normality, then we’re prevailing and the terrorists are losing. I may say that at the same time, the United States had declared war on al-Qaeda and the first sentence of the US strategy was, ‘America is at war.’

[00:23:41] David Omand: British ministers did not think the United Kingdom was at war, so we needed a different approach. So it hinges around this idea of normality. Can you reduce the risk, not eliminate it? Because if you try to eliminate something like terrorism, then you’re liable to be drawn into measures, detention without trial, for example, which in the long run will not achieve your purpose. So you’re going to reduce the risk from terrorism to the point at which people feel sufficiently secure to go about their normal business.

[00:24:13] David Omand: I added, at the end of the statement of aim which we framed, the words ‘freely and with confidence’. ‘Freely’ is a message to ministers: you will not have succeeded in this objective if you’ve had to suspend our cherished rights and freedoms in the course of doing it. And ‘with confidence’ is the touchstone. Do people still travel on holiday, on aeroplanes? Do people use the London Underground even after it’s been bombed? Is there inward investment?

[00:24:42] David Omand: Do opinion polls show that people are not in fear? So that led us to a strategic aim. It hasn’t really changed from the one I presented to the Cabinet early in 2003. And that’s over 20 years. So we’ve stuck to the same approach, the same logic, for over 20 years.

[00:24:59] David Omand: That’s hugely important because some of the things you need to do take a very long time. You know, giving all the emergency services interoperable radios, for example, is very expensive and takes a long, long time. So that sort of continuity really helps. And underneath it, of course, you have to turn that noble-sounding objective into things to do, things to spend money on, measures to take. And that’s where I quite shamelessly stole from the City of London the risk equation, expected value.

[00:25:31] David Omand: It’s the likelihood of the bad things happening, your vulnerability to that kind of bad thing, and then the impact if they get through our defences: both in the short term, clearing up and dealing with casualties, and in the long term, for how long society is going to be disturbed. Given that the objective is normality, now you can operate on all those variables. And I certainly still believe that the best, most effective counterterrorism strategies have a bit of all of them. And you don’t put all your money on catching terrorists because, as we know, tragically, some will get through the net.
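The expected-value framing Sir David describes can be sketched in a few lines of code. This is an illustrative toy model only: the scenario names, the probabilities, and the simple multiplicative form are assumptions made for the sake of the example, not the actual CONTEST methodology.

```python
# Toy expected-value risk model in the spirit of the discussion above.
# All numbers and scenario names are invented for illustration.

def expected_risk(likelihood, vulnerability, impact):
    """Expected loss for one scenario: P(attempt) x P(success | attempt) x impact."""
    return likelihood * vulnerability * impact

# Each strand of a balanced strategy operates on a different variable:
# pursuing terrorists lowers likelihood, protective measures lower
# vulnerability, and preparedness lowers impact (faster return to normality).
scenarios = {
    "vehicle bomb": {"likelihood": 0.10, "vulnerability": 0.30, "impact": 100.0},
    "knife attack": {"likelihood": 0.40, "vulnerability": 0.60, "impact": 10.0},
}

baseline = sum(expected_risk(**s) for s in scenarios.values())

# Example: a protective measure (say, vehicle barriers) halves
# vulnerability to the vehicle-bomb scenario.
scenarios["vehicle bomb"]["vulnerability"] *= 0.5
mitigated = sum(expected_risk(**s) for s in scenarios.values())

print(f"baseline risk:  {baseline:.2f}")   # 0.10*0.30*100 + 0.40*0.60*10
print(f"mitigated risk: {mitigated:.2f}")
```

The point of the sketch is the one Sir David makes: because total risk is a sum over scenarios of likelihood, vulnerability and impact, you can reduce it by operating on any of the variables, and a balanced strategy spends on all of them rather than betting everything on a single lever.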

[00:26:09] David Omand: You don’t put all your money on pouring concrete and putting up barbed wire and so on. So you need that sort of balance between the elements. It will shift over time. We, I think, tripled the size of our security service as a result of this strategy in order to do the first part of the programme, which we call Pursue, and worked with some quite unusual countries overseas, as well as our usual allies, to exchange information, uncover networks, and use advanced technologies, bulk access to metadata and so on.

[00:26:41] David Omand: And that’s been hugely successful. The number of trials that have resulted in conviction is extraordinarily high, because the evidence in the end was so good. There’s a point worth making about the protection side of all of this: does the public feel protected? That’s where we came up with this security-by-design concept. Because if the objective is normality, you don’t want armed policemen on every street corner, you don’t want lots of barbed wire and barriers and so on.

[00:27:10] David Omand: What you want is conditions of normality, and you can design those in so that the average punter isn’t aware that they are actually being protected. So if you walk down Whitehall, the main thoroughfare in the centre of London, where the government is, you will find at the edge of the pavement what look like stone balustrades, almost 18th-century in style. They were actually designed specially by the security service. They won’t fracture into shards of concrete if a car bomb explodes; they keep the car bomb away from the building.

[00:27:43] David Omand: You don’t know that. Another very good example was the London Olympics of 2012: if you go to the Olympic Park, you see water being used as a barrier. If you try to drive your car with the bomb attached, you sink to the bottom. And that’s the approach the United States has taken with its new embassy in London, which has what in old-fashioned terms you might call a moat around it. But it looks fine, and it adds to the general feeling of normality, rather than having 12-foot-high walls with spikes on the top. These are just examples of the way in which you accentuate that sense of normality but actually reduce the risk.

[00:28:21] Dominic Bowen: I’m a big fan of moats. I was in Damascus a couple of weeks ago, and one of the hotels I was visiting and assessing actually had a moat around it. It was fantastic: it meant they didn’t need fences on that side, because the moat was extremely deep, and it looked beautiful. Your comment about not eliminating terrorism is, I think, a really valuable one, and considering how CONTEST is more of a risk-management framework, I wonder…

[00:28:44] Dominic Bowen: How do we balance Prevent, Pursue, Protect and Prepare? How did you balance them during your time, and how have you seen that evolve since, especially when it comes to budgets that are limited?

[00:28:55] David Omand: I don’t think there’s any great science involved. It’s about applying experience, trying to spot where there may be gaps, and keeping an eye on the technologies used by the adversary, where you may suddenly need to spend more on this rather than that. For example, there was a period in which we spent a lot of money with private industry developing those X-ray machines for airports. Thankfully, the technology has moved on, and they’re much, much more effective than they were initially. At one stage I started to work on this in terms of game theory, minimax and maximin, and said to the team: you’ve got to come up with ideas which minimise the maximum harm a determined adversary could do.

[00:29:41] David Omand: In the case of the United Kingdom, Sellafield contains a large amount of highly enriched uranium and plutonium. That’s a clear danger, so a large amount of money was spent making it impregnable. It’s under international inspection as well. I’d be very confident in saying no terrorist is going to break in and carry off nuclear materials. But you can’t afford to do that for everything.

[00:30:05] David Omand: So the other is maximin, which is maximising the minimum level of assurance you can give the general public. At train stations you can’t have airport-type screening; there are just too many people going through. We did actually try that once, we ran a trial, but people are in such a hurry in the morning that it would never have worked. So what can you do?
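The minimax idea Sir David describes can be illustrated with a toy model: the defender chooses the protective measure whose worst case is smallest, on the assumption that a determined adversary will attack wherever the remaining achievable harm is greatest. The targets, harm figures, and reduction factor below are entirely invented for illustration.

```python
# Toy minimax: pick the defence that minimises the maximum harm an
# adversary could still inflict. All figures are illustrative assumptions.

harm = {"nuclear site": 1000, "airport": 400, "train station": 200}

def worst_case(harm_after_defence: dict) -> int:
    # A determined adversary attacks wherever remaining harm is greatest.
    return max(harm_after_defence.values())

def defend(target: str) -> dict:
    # Assumption: hardening a target cuts the harm achievable there by 90%.
    return {t: (h // 10 if t == target else h) for t, h in harm.items()}

# Minimax choice: the defence whose worst case is smallest.
best = min(harm, key=lambda t: worst_case(defend(t)))
print(best, worst_case(defend(best)))
```

In this sketch, hardening the highest-harm target first is the minimax choice, which mirrors the Sellafield example: spend heavily where the maximum possible harm is greatest, then use maximin thinking (a baseline level of assurance everywhere) for the rest.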

[00:30:25] David Omand: Well, you can put in more closed-circuit television cameras and have them monitored, including with the latest artificial-intelligence support, and you can have more armed police available at very short notice, so that even if something does happen, it can be stamped on really very quickly. Take the experience in London of the London Bridge attack, where the armed police were on the scene within minutes of the first call, containing what could have been an even more dangerous situation. Yes, it was dangerous, people were hurt, but nonetheless that kind of preparation reduces the overall risk. I might just make a point about risk here, because we had two kinds of risk in our minds.

[00:31:08] David Omand: One is the risk to the individual: anyone in London might ask, if I walk out, what is the risk that I am going to be involved in a terrorist incident? It’s minuscule. If I were foolish enough to ride a bicycle in London, my risk would be an order of magnitude greater. If, like some kids I see, I rode my bicycle without a helmet, then forget terrorist risk.

[00:31:28] David Omand: You’re taking on yourself an extraordinary level of risk. So it is still the case that, although the threat of a terrorist attack is ‘substantial’ according to the Joint Terrorism Analysis Centre, which means an attack is quite likely, the chances that you personally are going to be in the wrong place at the wrong time are low. But what we worried about was what I call societal risk, where it would only take one or two successful attacks to provoke a reaction, perhaps from the extreme right wing. Then you would find, say, a mosque being torched, and communities coming out to defend their territory, to defend their religious sites.

[00:32:06] David Omand: You would get a riot; perhaps a street march would turn into a riot, somebody would get their head broken, and you would have the first martyr. That’s how the Northern Ireland conflict started back in 1969, at Burntollet Bridge, where a civil rights march was set upon by ultra-loyalist thugs. And it all spiralled from there until we lost control of the streets and you had fighting in the streets. The worry was that social harmony, which I’m proud of in the UK (we are a very peaceful place for the most part), could really be disrupted by terrorism, where the terrorists would deliberately seek out disruptive things to do and would incite the counter-reaction.

[00:32:50] David Omand: And of course one of the reasons why terrorists want to commit some atrocity is that they know it will inflame the public and probably lead to overreaction by the government. Then they can point and say, well, we told you so, we told you this government was fascistic, or whatever it might be; look at what they’ve just done. My idea of normality is very much bound up with this idea of social harmony, different people just living together in peace.

[00:33:18] Dominic Bowen: I think that’s really, really important. And I wonder, just briefly, David: a successful counterterrorism strategy, or really any security or policing strategy, results in nothing visible. Sometimes, of course, it can result in increased arrests, et cetera, but generally speaking, if it’s working well, there isn’t a terrorist attack. How do we measure success? How difficult is it, and how difficult was it for you, to justify the budgets and the people and the effort when, if you were being effective, very little was visibly happening?

[00:33:45] David Omand: When we started, there was really no difficulty in getting the budget. Gordon Brown was Chancellor of the Exchequer, and he was very supportive because he could see the longer-term consequences, including the economic consequences, of being seen to be an unsafe country. I’ve never come across an objective measure of success, because it’s about counterfactuals.

[00:34:06] David Omand: What would have happened if you hadn’t had the strategy, if you hadn’t spent the money, if you hadn’t poured the concrete? So that kind of comparison can’t be done on a small scale. One of the things you can do is track the adversary. So, for example, hijacking has been almost entirely eliminated for the simple reason that every civil airliner on the planet has to have a lockable steel door to the cockpit. And even if there’s mayhem going on in the cabin, don’t unlock the door.

[00:34:35] David Omand: Just land as quickly as you can. It was one of our most successful measures. It involved the North American authorities and the European air traffic control authorities saying, ‘We will not let an airliner from any of these little airlines across our territory unless they’ve got the lockable steel door.’ And eventually ICAO and so on mandated this. And that’s a good example where you can really see you’ve made a difference. Of course, what then happens, coming back to my point, is the terrorists then say, ‘All right, we’ll have to smuggle a bomb on board.’

[00:35:07] David Omand: And you get things like al-Awlaki’s parcel bomb put on a freight flight, or the Detroit ‘underpants bomber’ trying to detonate a bomb that he had secreted around his private parts. The bomb went off, but it didn’t bring down the aeroplane, and it did him a certain amount of damage. The adversary will respond. It’s one of the big differences from emergency management against natural disasters, floods and fires and so on, where God doesn’t fight back: the terrorist adversary will adapt.

[00:35:39] David Omand: And the main way we see that in the United Kingdom is that they are not, at present, as far as I know, trying complex plots like 9/11, or the airlines plot to bring down airliners simultaneously over the Atlantic. It’s the individual with a knife strapped to their wrist, or renting a car and running it into a crowd. That shift, which is in many ways a good shift, means the risk of mass casualties is much lower, but it is much, much harder for the authorities to detect. The individual who may switch from passive support, get enraged by something seen on social media or by circumstances we’re seeing today in the Gulf, and decide, ‘I must do something,’ and go out and do it: that’s extremely difficult to pre-empt.

[00:36:25] Dominic Bowen: And of course, David, many of our listeners work in business or even in public policy, not necessarily in secret agencies. If you had to pick one of the lessons that either you’ve written about, learned about, or spoken about during your career, and our listeners could start applying it tomorrow to improve their analysis and their decision-making processes, what would that one thing be, and what would it look like in practice?

[00:36:48] David Omand: Oh, well, I feel you’re putting me on the spot here, so I will be provocative. My one lesson would be: tell the truth to each other. Forget the office politics, forget the spin, forget the media outside that may be wanting certain messages and so on. Just tell the truth. And this is what the intelligence analysts really, really try to do.

[00:37:10] David Omand: To be honest in their appreciations, and to say when they don’t know. And when it comes to surviving crises, if the boss has created a climate where people won’t reveal problems because they feel the boss is going to lash out at them, they keep it quiet, they try to solve the problems themselves, and they fail, and before you know where you are it has become a C-suite issue. If only they’d known two months earlier, it could have been sorted with minimal damage to the company. There are so many instances where, for understandable reasons, we’re not as honest as we should be.

[00:37:48] Dominic Bowen: And David, just in the last 30 seconds, I wonder, when you look around the world today in March 2026, what are the risks that concern you the most around the world?

[00:37:57] David Omand: Oh well, let’s put to one side the really big things: climate change, mass migrations, we know all of that. What I’m more worried about is the way that advanced technology is getting into the hands of people who can use it to enrich themselves or use it to harm other people. The AIs of today, the large language models, can do their own coding. You don’t have to be an IT expert to be able to hold a company to ransom.

[00:38:27] David Omand: And then there’s the advent of quantum computing. Most of the systems that most companies are using will not stand up when a quantum computer at scale becomes available. Nor will they be able to contact their customers or their suppliers, because some of them will have upgraded to post-quantum cryptography and others won’t, so the messages will bounce back. It could be chaos.

[00:38:51] David Omand: It’s one of those things you just need to keep an eye on. But it is not rocket science. What needs to be done is known; it just needs a little care and attention to make sure it does happen. But the really, really big risks, the existential risks, are climate change.

[00:39:06] David Omand: And if President Xi decided to put a blockade around Taiwan, it is very difficult to calculate what would then happen, but it wouldn’t be good.

[00:39:16] Dominic Bowen: Thanks for unpacking that, David. For anyone who is interested in learning more about quantum computing and its potential risks, we did an episode on that in late 2025, so I think it’s worth going back and having a look at. David, thank you very much for coming on the podcast today. I really enjoyed our conversation.

[00:39:31] David Omand: It’s been fascinating.

[00:39:33] Dominic Bowen: Well, that was a great conversation with Professor Sir David Omand. He’s a Visiting Professor in the Department of War Studies at King’s College London, and I really appreciated hearing his thoughts on crisis management as well as the UK government’s counterterrorism policy, CONTEST. Please remember, if you prefer to watch your podcast, you can find us on YouTube. Please do go there and remember to subscribe and, hopefully, like our content. Today’s episode was produced and coordinated by Edward Penrose.

[00:39:58] Dominic Bowen: I’m Dominic Bowen, your host. Thanks very much for listening. We’ll speak again in the next couple of days. Thank you for listening to this episode of the International Risk Podcast.

[00:40:06] Dominic Bowen: For more episodes and articles, visit theinternationalriskpodcast.com. Follow us on LinkedIn, Bluesky and Instagram for the latest updates and to ask your questions to our host, Dominic Bowen. See you next time.
