5 questions for Toby Ord on the risk of human extinction

What sort of extinction-level
catastrophes should we be concerned about? What can we do to meet these
challenges? And what are the odds we survive this century? Toby Ord joined
Political Economy to discuss these questions.

Toby is a senior research
fellow at the Future of Humanity Institute at Oxford University, where he
studies the long-term future of humanity. He is the author of the recently
released “The Precipice: Existential Risk and the Future of Humanity.”

Below is an abbreviated transcript of our conversation. You can read our full discussion here. You can also subscribe to my podcast on iTunes or Stitcher, or download the podcast on Ricochet.

Pethokoukis: We’ve never detected any sign of life beyond Earth. A previous guest, Robin Hanson, explains this with the theory of the “Great Filter” — the idea that at least one of the steps to colonizing space must be really unlikely. So what’s going to get us, and when’s it going to get us, if at all?

Ord: I hope that nothing
gets us. But my best guess is that there’s about a one in six chance that we
don’t make it through this century, and a 50/50 chance that we make it through until
the Earth is no longer habitable — or this entire part of our galaxy or beyond
is no longer habitable.

Homo sapiens is about 200,000 years old, and during that time we have been
exposed to all of these risks from asteroids and comets and supernova
explosions and things like that. So we know that those natural risks must be
fairly low per century, or we couldn’t have got through 2,000 centuries.
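A rough back-of-envelope version of that argument (a sketch, not Ord’s exact numbers): if the natural extinction risk were a constant p per century, the chance of surviving 2,000 centuries would be (1 − p)^2000 ≈ e^(−2000p). For survival to have been at all likely — say, better than even odds — p would need to be below about ln(2)/2000, or roughly 0.035 percent per century. A long track record of survival thus caps how large the natural risk can plausibly be.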

But I’m particularly
worried about humanity’s exponentially escalating power — I think we first
reached the point where we could threaten our entire species’ survival when we developed
nuclear weapons. Now the man-made risk could well be substantially higher than the
natural risk, and it’s these anthropogenic risks I’m most worried about.

What sort of risks do we face
from pandemics?

The Black Death and the Columbian
Exchange (when the Americas and the Old World exchanged diseases) may each have
killed as much as a tenth of the world’s population at the time. But it is
still difficult for natural pandemics to do us in, as partly evidenced by the
fact that we have survived 2,000 centuries.

Man-made pandemics are a
different story. We’ve seen a lot of lab escapes of extremely serious pathogens,
for instance. And the gap between the world’s best scientists developing a
major new technique in biotechnology and that technique being usable by
undergraduate students is about two years, so there is a reasonable chance that
new biotechnology could be used by someone with the motivation to destroy all
of humanity. Ultimately, my best guess is that there’s about a one-in-30 chance
over the next century that a successful attempt at this destroys humanity.

Experts seem to be concerned
about artificial intelligence. You’re saying they’re right to be concerned?

Yeah, there are a lot of
leading AI experts who are concerned. Humans are currently the most cognitively
capable species on the planet, and leading AI scientists think there’s a
more-than-50-percent chance that over the next century we will develop systems
that exceed the abilities of humans across all domains.

If we do create more
cognitively advanced systems, would we be at their mercy? I think that we
probably will survive such a transition, but there’s a really non-negligible
chance that we don’t. Making sure that these systems are within our control or
making sure that these systems are motivated to produce the kind of utopia that
we would dream of are extremely difficult things to do. And it’s the people who
are trying to work out how to solve those problems who are the leading voices
of concern about this.

There are going to be people who
say, “Let’s ban these technologies until our ethics and our wisdom becomes
greater — and that may never happen, so until that happens, we just need to
stop.” What do you say to them?

If there were a fundamental
kind of renunciation of technological progress, I think that itself would
destroy our future potential — we would achieve only a tiny fraction of what we
could have done. But there could be a more careful version of going slow in the
riskiest areas until we have shown ourselves ready — maybe we wait until we’ve
gone an entire century without a world war. That could be a good approach for a
more sane and coordinated world.

In our less sane and less
coordinated world, I’m not sure that having the few people who care about these
risks pushing for going slow would achieve very much, because the more
responsible groups would effectively be abdicating the control of the technology
to the less responsible ones.

What is it about humanity that
makes us less likely to look ahead and prepare for these risks, even though
we’re soaked in a culture that is focused on catastrophe?

Our popular culture, I think,
is a mixed bag on this. In the case of asteroid protection, the films Deep
Impact and Armageddon actually seem to have helped a lot in getting the funding
that was needed to scan the skies for asteroids. But when it comes to other
risks, I wonder whether popular culture just makes things worse by associating
them directly with comic-book plots. The end of the world is often a
superstimulus used by people who want to take the lazy way out when writing
fiction. It’s seen as a kind of gauche device, and that makes the risks feel
very unreal.

As for our own
psychological shortcomings, there are a couple of them. One is scope
neglect: the failure to register that something affecting a million times as
many people is a million times worse, and to respond with correspondingly
larger measures. There’s also probability neglect — very small probabilities
tend either to be overweighted or simply rounded down to zero.

There’s also a problem where we have a lot of trouble responding to things that aren’t vivid to us. If we can see and feel something, we make appropriate responses. But if it’s just someone telling us that it’s important — and we’re just looking at numbers on a piece of paper — we have trouble acting.