The Vulnerable World Hypothesis

The vulnerable world hypothesis is the idea that there may be some technologies that, once discovered, destroy by default the civilization that invents them. The concept was introduced by the philosopher Nick Bostrom in 2018. He also proposes a classification of such vulnerabilities, counterfactual examples of how technological development could have gone wrong, and policy recommendations such as differential technological development. If such a vulnerability exists, the solutions presumably needed to survive it (e.g. global governance or extreme surveillance) are controversial.

The urn analogy
According to Bostrom: "One way of looking at human creativity is as a process of pulling balls out of a giant urn. The balls represent possible ideas, discoveries, technological inventions. Over the course of history, we have extracted a great many balls – mostly white (beneficial) but also various shades of gray (moderately harmful ones and mixed blessings). The cumulative effect on the human condition has so far been overwhelmingly positive [...]. What we haven’t extracted, so far, is a black ball: a technology that invariably or by default destroys the civilization that invents it. The reason is not that we have been particularly careful or wise in our technology policy. We have just been lucky."
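The urn metaphor can be rendered as a toy probabilistic model. The sketch below is illustrative only: the parameters p_black and p_gray are arbitrary assumptions for the sake of the simulation, not estimates from Bostrom. It draws balls at random until the first black ball appears.

```python
import random

def draw_until_black(p_black=0.001, p_gray=0.1, seed=0):
    """Simulate Bostrom's urn: draw balls until a black ball
    (a civilization-devastating technology) is drawn.

    p_black and p_gray are illustrative probabilities, not estimates.
    Returns the total number of draws and the count of each color.
    """
    rng = random.Random(seed)
    counts = {"white": 0, "gray": 0, "black": 0}
    while counts["black"] == 0:
        r = rng.random()
        if r < p_black:
            counts["black"] += 1   # devastating by default
        elif r < p_black + p_gray:
            counts["gray"] += 1    # moderately harmful / mixed blessing
        else:
            counts["white"] += 1   # beneficial
    return sum(counts.values()), counts

draws, counts = draw_until_black()
```

Under this model the expected number of draws before the first black ball is 1/p_black, which captures the point of the quote: a long run of mostly white balls says nothing about whether a black ball is in the urn, only that it has not yet been drawn.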

Definition
Bostrom defines the vulnerable world hypothesis as the possibility that “If technological development continues then a set of capabilities will at some point be attained that make the devastation of civilization extremely likely, unless civilization sufficiently exits the semi-anarchic default condition.”

The "semi-anarchic default condition" refers here to having :


 * 1) Limited capacity for preventive policing.
 * 2) Limited capacity for global governance.
 * 3) Diverse motivations

Types of vulnerability
Bostrom distinguishes multiple types of vulnerabilities:

 * Type 0: a technology carries a hidden risk and inadvertently devastates the civilization that invents it.
 * Type 1: a technology gives small groups of people the ability to cause mass destruction.
 * Type 2: a technology has the potential to devastate civilization, and powerful actors are incentivized to use it, for example because using it first seems to bring an advantage, or because of some tragedy-of-the-commons scenario.

Easy nukes
Before 1933, nuclear scientists did not suspect that research in nuclear physics would lead to the creation of powerful bombs. In 1933, Ernest Rutherford, known as the father of nuclear physics, dismissed as "moonshine" the idea that nuclear reactions could be used to produce energy. The same year, though, Leo Szilard conceived the idea of the nuclear chain reaction, the basis for nuclear bombs. The idea was widely known by 1939. Producing nuclear bombs still turned out to be extremely hard, notably due to the difficulty of acquiring enriched uranium. The Manhattan Project, which aimed to create the first nuclear bombs, employed up to 130,000 people at its peak. Even today, producing nuclear bombs remains very difficult, costly and time-consuming.

The "easy nukes" thought experiment proposed by Nick Bostrom opens the question of what would have happened if nuclear bombs had been easier to produce. Luckily, sufficiently enriched uranium is extremely hard to produce and only a few states can afford to make nuclear bombs. But there was a priori no obvious reason why there wouldn't be any easier way to unleash nuclear energy, for example with only a small laboratory or even a few household items.

This illustrates a situation in which small groups of people gain the ability to cause mass destruction (a type-1 vulnerability), and shows how technological development could easily have gone much worse.

Atmospheric ignition
An example of a type-0 vulnerability would have occurred if the first nuclear bomb test had been able to ignite the atmosphere. This was predicted not to occur for the Trinity nuclear test in a report commissioned by Robert Oppenheimer. But the report has been deemed shaky given the potential consequences: “One may conclude that the arguments of this paper make it unreasonable to expect that the N + N reaction could propagate. An unlimited propagation is even less likely. However, the complexity of the argument and the absence of satisfactory experimental foundation makes further work on the subject highly desirable.”

Real-world candidates
While this is a subject of debate, some technologies that have been pointed out as potential vulnerabilities are advanced artificial intelligence, nanotechnology and synthetic biology (synthetic biology may give the ability to easily create enhanced pandemics).

Implications
Pausing technological progress may not be possible or desirable. An alternative would be to prioritize the technologies that are expected to have a positive impact, and to delay those that may be catastrophic, a principle called differential technological development.

The potential solutions vary depending on the type of vulnerability. Dealing with type-2 vulnerabilities may require very effective global governance and international cooperation. For type-1 vulnerabilities, if the means of mass destruction ever become accessible to individuals, it is likely that at least some small fraction of the population would use them. For example, Aum Shinrikyo actively tried to develop weapons of mass destruction in the early 1990s for terrorist attacks. In extreme cases, adopting mass surveillance might be needed to avoid the destruction of civilization. This dire possible implication received significant media coverage.