Talk:The Precipice: Existential Risk and the Future of Humanity

Pronouns
Minimize first-person pronouns (in this case we/our) per MOS:PERSON. I'm not sure whether the acceptable use examples apply to any instances here. WeyerStudentOfAgrippa (talk) 16:20, 6 March 2020 (UTC)

Edition for editing
Hi, if anyone is home here. I have the Hachette edition and wondered if anyone minds using it as the primary source? I would like to fill this out further. Greenbound (talk) 14:36, 10 July 2021 (UTC)

Copyvio
I just realized all of the content here is taken verbatim from the Vandermerwe source: https://forum.effectivealtruism.org/posts/NxpxrC7uYjAT5Jqgd/toby-ord-s-the-precipice-is-published. Nicely written, but it will all need to be paraphrased. Greenbound (talk) 23:36, 11 July 2021 (UTC)


 * Are you saying that there is text in this article that violates copyright? If so, please remove the offending text. There are also relevant cleanup tags, e.g.: Template:Copyvio. Biogeographist (talk) 14:54, 12 July 2021 (UTC)

Copyvio: Synopsis
This entire section is taken from a source. Leaving it here so we can still view it. I wanted to edit this article anyway, so I will make it a project.

Part One: The Stakes
Ord places our time within the broad sweep of human history: showing how far humanity has come in 2,000 centuries, and where we might go if we survive long enough. He outlines the major transitions in our past—the Agricultural, Scientific, and Industrial Revolutions. Each is characterised by dramatic increases in our power over the natural world, and together they have yielded massive improvements in living standards. During the twentieth century, with the detonation of the atomic bomb, humanity entered a new era. We gained the power to destroy ourselves, without the wisdom to ensure that we avoid doing so. This is what Ord calls "the Precipice", and how we navigate this period will determine whether humanity has a long and flourishing future, or no future at all. Ord shows how the case for safeguarding humanity from existential risk draws support from a range of moral perspectives. Yet it remains grossly neglected—humanity spends more each year on ice cream than it does on protecting its future.

Part Two: The Risks
Ord explores the science behind the risks we face. In Natural Risks, he considers threats from asteroids and comets, supervolcanic eruptions, and stellar explosions. He shows how we can use humanity's 200,000 year history to place strict bounds on how high the natural risk could be. In Anthropogenic Risks, he looks at risks we have imposed on ourselves in the last century, such as nuclear war, extreme climate change, and environmental damage. In Future Risks, he turns to threats that are on the horizon from emerging technologies, focusing in detail on engineered pandemics, unaligned artificial intelligence, and dystopian scenarios.

Part Three: The Path Forward
Ord surveys the risk landscape and gives his own estimates for each risk. He also provides tools for thinking about how they compare and combine, and for prioritising between risks. He estimates that nuclear war and climate change each pose more risk than all the natural risks combined, and that risks from emerging technologies are higher still. Altogether, Ord believes humanity faces a 1 in 6 chance of existential catastrophe by the end of the century. He argues that it is in our power to end these risks today, and to reach a place of safety. He outlines a grand strategy for humanity, provides policy and research recommendations, and shows what individuals can do. His look at humanity's potential extends across millions of centuries and includes an analysis of the time and energy requirements to populate the "affectable universe."