The Timeliest Book of 2020
Toby Ord is one of the smartest people I know, but even he couldn’t have predicted how timely his just-published book would be.
Written over the last five years, The Precipice is a guide to thinking more clearly about the future and the importance of preventing global catastrophes, and it includes a substantial section on pandemics.
Ord is eminently qualified to talk about this. He’s a Senior Research Fellow in Philosophy at Oxford University and has advised the WHO, the World Bank and countless others. He previously worked on global poverty, but in recent years has focused more on the long-term future of our species.
Pandemics are featured prominently in his book – including the danger of bioterrorists who genetically engineer pathogens. These potential human-made pandemics, disturbingly, could be more lethal than the coronavirus.
But it’s not just pandemics we should be thinking about. Ord also discusses other neglected threats that could spell the end of the human experiment, including:
--Nuclear war (which we seem to have forgotten about since the end of the Cold War, but which is still very much a threat)
--Climate change
--Artificial intelligence gone awry
--Natural disasters such as supervolcanoes or asteroids.
Some dismiss these as sci-fi scenarios. But in light of current events, such objections are less and less convincing.
Ord makes a powerful argument that caring for the future is the most important moral issue of our time. If the human race survives, we’re talking billions, even trillions, of people yet to come. They are the true silent majority, and we have a responsibility to them. To paraphrase the Bible, Love your future neighbor as yourself.
I had some lighthearted banter with Ord about how to safeguard the future of the human race. Okay, it’s not so lighthearted. But I think it’s interesting and important, and I hope you agree.
Having studied long-term threats for years, what lessons can we learn from the coronavirus pandemic?
A key lesson is the value of preparation. This isn’t a case where we were blindsided. There was widespread agreement among pandemic experts that we were underprepared, but many of the plans they suggested, including stockpiling protective equipment, were not implemented, or were defunded and sold off by successive administrations. This is a case where governments were responding to short-term incentives, skimping on protection against disasters that were unlikely to strike on their watch in order to give more immediate, tangible benefits to voters.
With this disaster fresh in our minds, we need to change this dynamic. We need people to take seriously how much their governments care about catastrophic risks when deciding whom to vote for. We need leaders to actually lead on these issues: to make the grown-up choice to protect their citizens rather than raid the funds for short-term benefits. We need both citizens and leaders to pay more attention to the non-partisan experts on these topics. And perhaps we need some institutional reforms: taking away the temptation to be imprudent by ring-fencing some of these budgets, or by having them set by arm’s-length expert bodies outside of direct political control.
At one point in the book, you estimate that humanity has a one in six chance of not making it through the next century. A roll of the dice. And roughly a one in two chance over the coming centuries. What are the most important steps to increase our survival as a species?
In the long term I think there’s a 1 in 2 chance not just that humanity survives, but that we achieve one of the best possible futures available to us. The remaining probability is that we squander our potential, whether by going extinct or by failing less dramatically — if we continue to treat each other poorly, or build a world that is mediocre rather than exceptional.
To ensure that humanity survives this century, we need to grow up fast. I sometimes think of humanity as an adolescent. We are a young species, and have got to a point where our power has outstripped our wisdom — where we are strong enough to destroy ourselves and everything we could become, but not sensible enough to avoid doing so. We are just old enough to get ourselves in trouble. At a really high level, we need to transform ourselves into a more mature, more sensible species. This means taking existential risk seriously — the world currently spends more on ice cream than it does on safeguarding our future.
Our most impressive steps towards moral progress have been driven by dedicated individuals and groups agitating for change and shifting public opinion. We need a fundamental reorientation in our moral thinking, towards the vast future we are failing to protect; towards the story of humanity through the ages and the role that our generation must play in it. This is something we can all play a role in — we can discuss these issues with the people we care about; we can be responsible, informed citizens, urging our leaders to make the right choices; we can inspire our children, and ourselves, to do the hard work required to safeguard our future.
Sometimes I think, like Steven Pinker, that the world has gotten a lot better. But other times I sink into despair. Where do you stand on the optimism/pessimism spectrum?
I’m an admirer of Pinker’s work, and I’m firmly in the optimistic camp. I certainly have moments when I’m sad, or angry, at the avoidable suffering in the world today, and at the risks humanity is imposing on itself. But it is my love of humanity and optimism about our potential that makes me so frustrated at our failures.
I’m awed by the progress we’ve made, which Pinker and others have highlighted — in the last two centuries we have lifted millions of people out of extreme poverty; we have rid the world of some of the most horrific diseases. We are all living longer, healthier lives than any of our predecessors; we have more freedoms to choose our lives and loves; we are able to enjoy the creations of the hundred billion people that came before us. But there is so much further we could go — we are at the foothills of this inspiring project of building a world that we can truly be proud of, free of suffering and injustice, filled with more of what makes life good.
It is not despair so much as impatience at how long it is taking to get ourselves to safety. I am confident that we can solve these problems in time, and rise to the challenge of securing a flourishing future for our descendants. But in order to do this, we need to survive this dangerous period we have found ourselves in.
You argue convincingly that we have a moral obligation to the billions (or trillions) of humans to come. I agree with you. But it’s pretty hard to visualize our 14th-great-grandkids, and so to make that moral truth salient. Any suggestions on how?
I think this is a familiar problem that isn’t restricted to future generations. We find it hard to really grasp the suffering of people in distant countries, burdened by poverty and ill-health. We even struggle to empathize with our future selves, let alone our distant descendants — we leave things to the last minute, we take on too many commitments, we give ourselves hangovers.
Something I find powerful is remembering that we were once future generations ourselves. We are much more comfortable with the fact that there were people in the distant past, with lives of their own, and plans for the future. And some of these people were thinking of us — one of my favorite quotes in the book is from Seneca, who was writing 2,000 years ago about how scientific knowledge would develop in the future: “There will come a time when our descendants will be amazed that we did not know things that are so plain to them ... Let us be satisfied with what we have found out, and let our descendants also contribute something to the truth.” How could Seneca, or Shakespeare, have begun to imagine the world we occupy today? It would have been no less mysterious to them than the future is to us. Yet we benefit every day from things done for us by our ancestors when we were future generations. If we tackle existential risks this century, our descendants will look back on us with profound gratitude for what we’ve done for them.
This pandemic is, of course, horrific. But you are even more worried about human-made pathogens causing worse pandemics. Can you talk about that?
We can be confident that the current pandemic is not an existential risk to humanity. This is small comfort amidst the level of death and disruption we are all living through, but is important to highlight.
The most important reason to think that natural pandemics are unlikely to pose much existential risk is that they are a threat humanity has lived under for our whole history. This argument applies to natural risks in general: from the fact that we have survived for 200,000 years, we can be confident that these risks together are not very high, or else our survival would be exceedingly unlikely. We have seen truly devastating pandemics — the Black Death, the 1918 flu — that have killed millions of people; but humanity has survived them. There are features of our modern civilization that have raised this risk (global travel; dense populations; industrial agriculture), but these are balanced against things that make us much safer (modern medicine; public health; institutions).
With engineered pathogens, the past provides no comfort. We are seeing startling progress in biology that is allowing us to make pathogens more dangerous than anything in nature. This raises risks both from accidents in legitimate research, and from deliberate misuse.
We tend to think that such research is well-regulated, and that scientists must be mindful of the risks, but this is unfortunately not true. In the book I discuss the case of Ron Fouchier, a Dutch scientist, researching a deadly strain of flu that kills 60% of humans it infects, but which (fortunately) is not transmissible between humans. He was able to make the virus transmissible between mammals, and tried to publish his method in major journals. This sort of dangerous research needs to be better scrutinized, so that we can decide whether it is worth the risks it poses to us — yet there is very little transparency or scrutiny. Labs are not required to report accidents, so we don’t know how frequently such pathogens escape; and Fouchier’s research wasn’t even conducted in the most secure type of biosafety lab.
Likewise, when it comes to deliberate use of pathogens, we are much less safe than we like to think. We rightly have a global treaty against the development and use of bioweapons, but the body charged with upholding this ban — the Biological Weapons Convention — has a smaller budget than a typical McDonald’s restaurant, and just four employees.
Time and time again, we are forced to conclude that humanity is not taking these risks seriously enough. As progress in biotechnology continues, we will develop more powerful techniques to create dangerous pathogens, and these will become accessible to an ever-increasing pool of people. We need to use our time wisely to prepare for this future, and develop safeguards to protect ourselves before it is too late.
Do you think this pandemic will help people wake up to the other massive dangers to humanity – AI, nuclear war, the climate crisis – or will we be so focused on the coronavirus that it will hurt the cause of long-term planning?
It would be foolish to make any firm predictions — as you suggest, it could really go either way.
I think that there is every chance that we learn the right lessons from this tragedy: that it causes us to take risks more seriously, and that we become better equipped to deal with them. It’s striking that this was not unforeseeable — there was a whole community of scientists and officials telling us that we were vulnerable to this sort of pandemic, and that we should be doing more to mitigate this risk, and prepare for its impacts. I hope that citizens and governments will see the value in preparing for low-probability events, like this one, ahead of time, and in taking seriously the warnings and advice of experts.
Learning the right lessons will involve not just identifying and patching our vulnerabilities, but pointing towards strengths we didn’t know we had. The unprecedented measures governments have taken in response to the pandemic, and the public support for doing so, should make us more confident that when the stakes are high we can take decisive action to protect ourselves and our most vulnerable. And when faced with truly global problems, we are able to come together as individuals and nations, in ways we might not have thought possible. This isn’t about being self-congratulatory, or ignoring our mistakes, but about seeing the glimmers of hope in this hardship.
I love the dedication. Can you tell me how you came up with that?
The book’s dedication is:
“To the hundred billion people before us,
who fashioned our civilization;
To the seven billion now alive,
whose actions may determine its fate;
To the trillions to come,
whose existence lies in the balance.”
Something I’ve tried to put across in the book is what I call ‘the perspective of humanity’. One of the most important moral trends of recent times has been the gradual acceptance that all people matter equally, and that our compassion should extend beyond our communities or nations to those in distant countries we may never meet. We are a long way from living up to these ideals, but it has become part of our moral fabric. I think we need to extend this a step further: not just the global perspective, but the perspective of humanity over all of the past and future.
I’ve accepted the arguments for this view for a long time. But over the course of writing this book—researching the history of humanity; thinking about everything we’ve created; and contemplating our future potential—I began to feel this compassion in a way that I hadn’t done before. I felt overwhelmed with gratitude for the generations that came before us, with love for our present generation who I believe will rise to the challenges of existential risk, and with a deep optimism for what the future holds.
The sections on the threat of nuclear war are eye-opening. Can you tell us about one near disaster from the past? And why have people discounted this threat since the end of the Cold War?
Some of the most alarming near misses occurred during one of the most well-known episodes, the Cuban Missile Crisis. The one I explore in the most detail is an incident aboard the Soviet submarine B-59.
At the peak of the crisis, US ships identified a Soviet submarine in the seas around Cuba and dropped low-explosive depth charges in an attempt to force it to surface. Inside the submarine, conditions were dire — the ventilation system had failed, and the high temperatures and CO2 levels were causing crew members to fall unconscious. The ship had lost radio contact with its commanders in Moscow, so had no idea what was going on above the surface.
Thinking that the explosions were a sign that war had broken out, the submarine’s captain ordered the crew to ready their secret weapon, a nuclear torpedo. Firing the weapon required the agreement of the ship’s political officer and authorization from Moscow. Despite the lack of authorization from Moscow, the officer gave his consent. By sheer luck, the commander of the entire flotilla happened to be on the submarine — Captain Vasili Arkhipov. He was the only person with the authority to override the joint decision of the captain and political officer. Arkhipov refused to give his consent to firing the weapon, instead ordering the ship to surface and await orders from Moscow.
It seems likely that had Arkhipov not been on the submarine, the weapon would have been fired. And statements from senior US commanders make it clear that, had the weapon been used, the US would have responded with nuclear force. We also now know that there were many more Soviet missiles in Cuba than the US had thought, which would have almost certainly been deployed in an exchange. Overall, there is every chance that the events on the B-59 would have set off a full-scale nuclear exchange, causing untold death and destruction, and potentially threatening humanity itself. And this is just one example — we know of several similarly close calls, and there are likely many more that we’ll never know about.
There are good reasons to think that nuclear risk has fallen since the end of the Cold War — the main source of risk was a US-Russia exchange, and this seems much less likely today. And both countries have reduced their nuclear arsenals substantially from their peaks. This is not grounds for complacency, though — there is no guarantee that this situation will endure. We are seeing a worrying rise in tensions between nuclear powers, and the collapse of key arms control treaties. New technologies might disrupt the strategic balance; new nuclear powers could rise.
This is one of the reasons we need a concerted focus on existential risk itself, rather than on particular risks such as nuclear war or climate change. For the concern about particular risks is very reactive (taking decades to build up after the new risk is understood) and fails to generalize when an old risk fades and a new one appears.