Nuclear accidents. Sea level rise. Terror threats. The world is full of potential catastrophes, but most of the time, most of us are oblivious to them.
Still, at times, experts warn the rest of us about these potential crises. Sometimes those warnings work, but many times they go unheeded. Why do we ignore information we could use to stave off a disaster?
Prominent national security expert Richard Clarke SM ’79 weighed in on this issue at MIT’s latest Starr Forum event on Wednesday, making the case that we should be more receptive to the possibility of dire news, as well as more systematic about analyzing it.
Clarke, the former chief counter-terrorism advisor on the National Security Council, expanded on ideas in his new book, “Warnings,” asserting that specialists in a range of fields can “see the thing buried in the data that other people don’t see. They see it first.”
Clarke called these people “Cassandras,” after the figure in Greek mythology who could see the future, and described them as experts with accumulated knowledge and a willingness to explore worst-case scenarios.
“It just can’t be any old person off the street saying the sky is falling,” Clarke said. “It has to be a recognized, acknowledged expert in the field they were giving the warning in. … They had to have studied it and been data-driven.”
Prove me wrong
Examples of this dynamic abound. Engineers warned that Japan’s nuclear power industry was vulnerable to natural disasters well before the 2011 earthquake and tsunami that struck the Fukushima Daiichi nuclear plant. Experts stated that New Orleans was vulnerable to flooding before Hurricane Katrina hit in 2005. Climate scientists, for decades, have warned the world that global warming could upend life as we know it.
And Clarke, for his part, gained a significant public profile after being one of the U.S. security officials most concerned about the threat of the al Qaeda terror group before the attacks of Sept. 11, 2001.
But plenty of dire-sounding warnings can also be unfounded and ultimately incorrect. So how can leaders — in government, business, or elsewhere — distinguish between legitimate fears and simplistic scare-mongering?
To Clarke, a person with a legitimate warning to offer will be willing to have their ideas tested by others: “Cassandras repeatedly say, ‘Well, I gave my data to other experts in the field and said, prove me wrong, and none of them could. They could never prove my data wrong.’”
Why not act?
But if experts are often raising concerns, why do those warnings get ignored? Clarke emphasized that being quick to recognize concerns produces its own set of problems, starting with a lack of consensus. When experts are “yelling to a decision maker, ‘There’s a problem,’ the decision maker says, ‘Yeah? Who else believes you? What other experts in the field agree?’”
Then too, Clarke said, data-based concerns over catastrophes can be ignored due to what he calls “first occurrence syndrome,” namely, the fact that many potential problems have “never happened before, in the memory of the people involved.” New Orleans, for example, had never flooded to the degree it did during Hurricane Katrina. It is easier to imagine that history will continue within its recent bounds.
Meanwhile, Clarke noted, there can be a “diffusion of accountability” in organizations. A data scientist had repeatedly warned the credit reporting firm Equifax that it was vulnerable to being hacked, he said. But responsibility for acting on that warning was essentially distributed among several people, which can lead to institutional inertia.
Additionally, Clarke said, staving off disaster, especially in matters related to climate change, can require unpalatable measures: “You might have to do something ideologically abhorrent to you. You might have to raise taxes or try carbon capture or enact regulations.” Thus solutions meant to pre-empt catastrophes of all kinds can languish.
See the sea rise
The Starr Forum consists of a series of public discussions, sponsored by MIT’s Center for International Studies, focused on global security issues and other matters of international politics. About 125 people attended the event Wednesday, which was open to the public.
Clarke’s remarks were followed by a dialogue with counterintelligence expert and Center for International Studies Fellow Joel Brenner, as well as a question-and-answer session with the audience.
In his remarks, Clarke observed that being a “Cassandra” can take a heavy psychological toll on experts who find their ideas marginalized.
“A lot of these people get agitated when they are ignored,” he said.
Brenner largely concurred, but wryly noted that “a lot of these people have a special talent for burning bridges” within the organizations they are serving. Still, Brenner noted, the complications of contemporary society and technology mean it is generally safe to assume, at any given time, that “something is going seriously wrong somewhere.”
Asked to produce a hierarchy of issues for us to worry about, Clarke emphasized the vast problems that sea level rise, as a product of climate change, could create in the decades ahead. Rather than the consensus estimate of 3 feet of sea level rise by the year 2100, Clarke stated, we could see 6 to 9 feet of sea level rise by 2050 or 2075.
“Think about the economic, political, social implications of that,” Clarke said. “Some countries disappear. Mass migrations of people.” He also cited the potential for economic “collapse” in some areas.
Still, Clarke did try to inject some hope into the proceedings.
“I believe in good government’s ability to be rational and save the world from some of these disasters,” Clarke said, adding: “I think it’s an optimistic book, because it holds out the hope that if you had systematic thinking [and] rational analysis, systems thinking, if you want to call it that, we could see problems coming, and stop them from being really big problems.”