The report analyzes five types of future risks: economic, environmental, geopolitical, societal, and technological. Here is their graph showing economic risks:
The graph plots likelihood on the horizontal axis (more likely events to the right) and impact on the vertical axis (higher-impact events toward the top). The graph projects out only 10 years.
This graph seems pretty reasonable. Economic inequality (meaning the concentration of too much money in the hands of too few people) is correctly listed as one of the biggest risks. Fiscal imbalances (meaning that governments such as the USA have borrowed far too much money and are running excessive deficits) are also correctly listed among the biggest risks. If we were to make a graph like this showing economic risks for the next fifty years, it would have to show a risk not listed on the above graph: the risk of widespread unemployment caused by automation and the use of robots.
Here is the report's graph showing environmental risks:
Upon seeing this you may complain that some serious environmental risks are not mentioned here, but it seems that to avoid making the graph too crowded, the authors split up the environmental risks and put some of them on their list of societal risks. Here is that graph:
These two graphs cover most of the bases in terms of environmental risk, and correctly list water supply crises and food supply crises as two of the biggest risks. I don't know why the graph rates "rising religious fanaticism" as a greater threat than "vulnerability to pandemics"; that ranking makes no sense. The chance that you will be killed by some new pandemic (which in a worst case might kill hundreds of millions) is vastly greater than your chance of being killed by an angry religious fanatic, and I'm not sure there's any evidence that the number of religious fanatics is increasing.
Here is the report's graph on geopolitical risks:
There is one inexplicable omission from this graph and the other graphs: there is no mention of the risk of nuclear war. The graph merely mentions the risk of “diffusion of weapons of mass destruction,” which presumably means more countries acquiring nuclear weapons. The fact is that the world still has many thousands of nuclear weapons, and even if there is no further proliferation of nuclear weapons, we face a very serious risk that these weapons will one day be used. Every single year that thousands of nuclear weapons exist, there is a substantial chance that they will be used, due to accident, miscalculation, or an isolated case of insanity or rage by a local commander such as a missile sub captain.
Here is the report's graph on technological risks:
This is the one graph in the series that I find to be something of a misfire. I don't know what is meant by "unforeseen consequences of climate change mitigation"; is it a reference to the hazards of geoengineering? The inclusion here of "failure of intellectual property regime" as a major risk seems inappropriate, as does "proliferation of orbital debris" (a risk to astronauts, but not to the average man).
A suitable graph showing technological risks over the next fifty years would list as major threats the following items (among others):
- The risk of runaway nanotechnology reproducing out of control
- The risk of the genetic engineering of lethal diseases
- The danger of automation causing a large increase in unemployment
- The risk that an electromagnetic pulse weapon will destroy our electronic infrastructure
Here is the report's graph listing the Top 5 Risks by Likelihood and Impact:
My only objection to this summary graph is, again, that it inexplicably ignores the risk with greatest impact: the risk of nuclear war. A conservative estimate of the risk of a nuclear war is 1 percent per year. The United States has approximately 2150 active nuclear weapons (7700 in all), and Russia has 1800 (8500 in all). Every year those weapons continue to exist, there is a chance of nuclear war, through things such as a software error, a mechanical error (as in Fail Safe), a deliberate launch of weapons by a sub or missile base commander afflicted by insanity or rage (as in Dr. Strangelove), or one side misidentifying something as a nuclear attack (such as happened in 1995, described here). Nuclear war should have been listed as the threat with greatest impact.
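To see why even a small annual probability matters, consider how a 1-percent-per-year risk compounds over decades. The short sketch below (my own illustration, not from the report) treats each year as an independent 1% chance and computes the probability of at least one nuclear war within a given horizon:

```python
# Sketch: cumulative risk from a constant annual probability,
# assuming independent years (a simplifying assumption).
# With annual probability p, the chance of at least one
# occurrence in n years is 1 - (1 - p)**n.

def cumulative_risk(annual_p: float, years: int) -> float:
    """Probability of at least one occurrence in `years` independent years."""
    return 1.0 - (1.0 - annual_p) ** years

for n in (10, 50, 100):
    print(f"{n} years: {cumulative_risk(0.01, n):.1%}")
# 10 years:  about 9.6%
# 50 years:  about 39.5%
# 100 years: about 63.4%
```

So under the essay's conservative 1-percent-per-year estimate, the odds of a nuclear war within fifty years approach two in five, which is why omitting it from a ten-year risk graph, let alone a fifty-year one, is hard to justify.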