Saturday, July 26, 2014

The 5 Likeliest Roads to Ruin

Anders Sandberg recently wrote an essay entitled The 5 Biggest Threats to Human Existence. Since Sandberg is a Research Fellow at the Future of Humanity Institute at the University of Oxford, we might expect him to get things right. But I think only some of the five items he mentions are substantial risks to human existence. Below is Sandberg's list, along with my comments on each item.

#1. Nuclear war

There is no arguing with this item on the list. Some people seem to think that the threat of nuclear war ended back when the Cold War ended, but that is not true. The United States and Russia still each have about 8000 nuclear weapons. With recent tensions related to Ukraine, it sometimes seems as if the Cold War is still here. You also have to consider the threat posed by other nuclear-armed nations, such as India and Pakistan.

#2. Bioengineered Pandemic

There is also no arguing with this item on Sandberg's list. The danger of a killer super-virus run amok seems as real as ever, particularly in light of the recent discovery of forgotten smallpox virus vials in a Maryland lab, and the discovery that some workers at the Centers for Disease Control were exposed to live anthrax bacteria. Both news items suggest that conditions at biological research labs are more lax than we imagine. One can only imagine the conditions at some foreign labs that might be brewing up the next super-virus. The terrifying thing about genetically engineered viruses is that they can in theory be created in small labs that could be hidden anywhere.

#3. Superintelligence

Here is the first misfire on Sandberg's list. This is the idea that intelligent machines may take over the planet and get rid of us. But there is almost no chance of that happening in the foreseeable future. For superintelligence to develop, Moore's Law must stay in effect for several more decades, meaning that computer hardware power and speed keeps doubling roughly every 18 months. But a computer chip expert recently predicted that Moore's Law will not stay in effect past the year 2020, as engineers find it harder and harder to pack more processing power into a tiny space.

There is also the fact that superintelligent machines would require software billions of times more powerful than existing computer software. Software does not improve at any exponential rate comparable to Moore's Law; it progresses at a much slower rate, probably no greater than about 10% per year. Technologists have imagined that we will be able to take a gigantic shortcut to machine intelligence by scanning the human brain and transferring “the software of the brain” to a computer. This is fanciful wishful thinking, and there is no particularly good reason to think it will be possible any time soon.
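To make the contrast concrete, here is a rough back-of-the-envelope calculation using the growth figures assumed above (an 18-month hardware doubling time and 10% annual software improvement, taken as stated rather than as measured constants):

```python
import math

# Rough comparison of the two growth rates discussed above.
# Assumptions (from the figures in this post, not measured constants):
#   - hardware doubles every 18 months (Moore's Law)
#   - software capability improves about 10% per year

def hardware_growth(years, doubling_months=18):
    """Growth factor if capability doubles every `doubling_months` months."""
    return 2 ** (years * 12 / doubling_months)

def years_to_reach(factor, annual_rate=0.10):
    """Years needed to multiply capability by `factor` at a fixed annual rate."""
    return math.log(factor) / math.log(1 + annual_rate)

print(f"Hardware after 30 years: about {hardware_growth(30):,.0f}x")           # ~1,000,000x
print(f"Years of 10%/yr growth to reach a billion-fold gain: {years_to_reach(1e9):.0f}")  # ~217 years
```

Under those assumptions, three more decades of Moore's Law would multiply hardware power about a million-fold, while a billion-fold software improvement at 10% per year would take on the order of two centuries.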

#4. Nanotechnology

I think Sandberg also errs in putting this item on his list. If you believe Eric Drexler, nanotechnology will eventually allow precise atomic manufacturing, something which would have earth-shaking results for industry (and might create gigantic risks along the lines of “gray goo,” nanotechnology run amok). But I suspect that such hopes and fears are overblown. Nobel Prize winner Richard Smalley argued that Drexler was way off the mark, that it will never be possible to use nanotechnology for precise atomic manufacturing, and that there is no risk of nanotechnology running amok along the lines of the “gray goo” scenario.

#5. Unknown unknowns

I guess you cannot argue with this vague item on Sandberg's list, except to say that it doesn't really belong: the purpose of such a list is to warn of specific dangers, so there is not much point in including a vague catchall.

What items should we list as the five greatest risks to human civilization? I would suggest the list below.

#1 Nuclear war

This item is included for the reason given above: there are still many thousands of these weapons in existence.

#2. Bioengineered Pandemic

This item is included because it seems all too possible that some future lab might brew up a virus far more deadly than Ebola or smallpox, and all too possible that such a virus might escape that lab (by accident or by design).

#3. Environmental ruin

Why did Sandberg omit this item from his list? Far greater than the risk of superintelligence or nanotechnology is the risk that we will make the planet uninhabitable (or barely livable) for ourselves through our polluting activities. There is the possibility that global warming might trigger events (such as the melting of methane hydrates or excessive ocean acidification) that act as feedback loops, creating even more global warming. The result might be human extinction. For a post estimating the odds of such an extinction, see here.

#4. An attack from beyond our planet, natural or purposeful

This item (ignored by Sandberg) includes all threats from outside our planet. One such threat is a large solar flare that ruins our electronics, in an event similar to an electromagnetic pulse attack. Another is the risk of an asteroid or comet striking our planet. If an asteroid only 20 kilometers wide hit our planet, it could be enough to kill everyone, with most dying from starvation during the impact winter that followed. Still another threat from the skies is an extraterrestrial invasion: invading extraterrestrials might decide to wipe us out entirely and take the planet for themselves. While each of these threats is rather remote, together they add up to a significant risk.
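To get a feel for why a single large impactor is a civilization-level threat, here is a rough order-of-magnitude estimate. The 20-kilometer size comes from the paragraph above; the density and impact speed are illustrative assumptions typical of a rocky asteroid, not figures from any source cited here:

```python
import math

# Order-of-magnitude impact energy for a 20 km asteroid.
# The size comes from the post; density and speed are illustrative assumptions.
DIAMETER_M = 20_000          # 20 km diameter
DENSITY = 3000               # kg/m^3, assumed rocky body
SPEED = 20_000               # m/s, assumed impact speed

radius = DIAMETER_M / 2
mass = DENSITY * (4 / 3) * math.pi * radius ** 3     # ~1.3e16 kg
energy_joules = 0.5 * mass * SPEED ** 2              # ~2.5e24 J
megatons_tnt = energy_joules / 4.184e15              # 1 megaton TNT = 4.184e15 J

print(f"Impact energy: roughly {megatons_tnt:.1e} megatons of TNT")   # ~6e8 megatons
```

Under those assumptions the impact would release hundreds of millions of megatons of energy, several times the estimated energy of the impact blamed for the dinosaur extinction.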



#5. Resource depletion

Many are worried that Peak Oil will soon occur. If we run out of easy-to-obtain oil, the whole forward momentum of our current civilization may start to reverse, leading to a downward spiral of collapse. It may seem unthinkable that this could cause the collapse of civilization, but no one in the Roman Empire around 350 AD imagined how great a collapse would occur in the centuries ahead.

What can we do to reduce these risks? We can accelerate programs for dismantling nuclear weapons. We can tighten international treaties on biological warfare, and strengthen protocols for inspecting labs that might create biological weapons. We can increase the modest funding of astronomical programs that monitor near-earth asteroids. What can the average person do? He or she can help reduce risks #3 and #5 by doing the same thing: conserving, and reducing the consumption that uses up our resources and worsens global warming. Remember, you can't really consume your way to happiness, but we just might consume our way to extinction or collapse.
