TESCREAL

TESCREAL is an acronym neologism, proposed and advocated by computer scientist Timnit Gebru and philosopher Émile P. Torres, standing for transhumanism, extropianism, singularitarianism, cosmism, rationalism, effective altruism, and longtermism. Gebru and Torres argue that these ideologies should be treated as an "interconnected and overlapping" group with shared origins, and allege that the movement allows its proponents to invoke the threat of human extinction to justify societally expensive or detrimental projects. They consider it pervasive in social and academic circles in Silicon Valley centered around artificial intelligence. As such, the acronym is sometimes used to criticize a perceived belief system associated with Big Tech.

Origin
Gebru and Torres coined the "TESCREAL" acronym in 2023, first using it in a draft of a paper titled "The TESCREAL bundle: Eugenics and the promise of utopia through artificial general intelligence". The paper was later published in First Monday in April 2024, though Torres and Gebru popularized the term elsewhere prior to the paper's publication. According to Gebru and Torres, transhumanism, extropianism, singularitarianism, (modern) cosmism, rationalism, effective altruism, and longtermism are a "bundle" of "interconnected and overlapping ideologies" that emerged from twentieth-century eugenics, with shared progenitors. They use the term "TESCREAList" to refer to people who subscribe to, or appear to endorse, most or all of the ideologies captured in the acronym.

Analysis
According to critics of these philosophies, TESCREAL describes overlapping movements endorsed by prominent individuals in the tech industry to provide intellectual backing for pursuing and prioritizing projects including artificial general intelligence (AGI), life extension, and space colonization. Science fiction author Charles Stross, using the example of space colonization, argued that the ideologies allow billionaires to pursue massive personal projects driven by a right-wing interpretation of science fiction, on the grounds that not pursuing such projects presents an existential risk to society. Gebru and Torres write that, using the threat of extinction, TESCREALists can justify "attempts to build unscoped systems which are inherently unsafe". Media scholar Ethan Zuckerman argues that by considering only goals valuable to the TESCREAL movement, futuristic projects with more immediate negatives, such as racial inequity, algorithmic bias, and environmental degradation, can be justified. Danyl McLauchlan, a politics writer speaking on Radio New Zealand, said that many of these philosophies may have started off with good intent but might have been pushed "to a point of ridiculousness."

Philosopher Yogi Hale Hendlin has argued that by both ignoring the human causes of societal problems, and over-engineering solutions, TESCREALists ignore the context from which many problems arise. Camille Sojit Pejcha wrote in Document Journal that TESCREAL is a tool to concentrate power by tech elites. Dave Troy described TESCREAL in The Washington Spectator as an "ends justifies the means" movement that is antithetical to "democratic, inclusive, fair, patient, and just governance". Gil Duran wrote that "TESCREAL", "authoritarian technocracy", and "techno-optimism" were phrases being used in early 2024 to describe a new ideology emerging in the tech industry.

Gebru, Torres, and others have likened TESCREAL to a secular religion due to its parallels to Christian theology and eschatology. Writers in Current Affairs compared these philosophies and the ensuing techno-optimism to "any other monomaniacal faith... in which doubters are seen as enemies and beliefs are accepted without evidence". They argue pursuing TESCREAL would prevent an actual equitable shared future.

Ozy Brennan, writing in a magazine affiliated with the Centre for Effective Altruism, criticized Gebru and Torres's approach of grouping different philosophies as if they were a "monolithic" movement. Brennan argues Torres has misunderstood these different philosophies, and has taken philosophical thought experiments out of context. James Pethokoukis, of the American Enterprise Institute, disagrees with criticizing proponents of TESCREAL. He argued that the tech billionaires criticized in a Scientific American article for allegedly espousing TESCREAL had achieved significant accomplishments with their wealth to advance society. Danyl McLauchlan has noted that critics of the TESCREAL bundle have objected to what they see to be disparate and sometimes conflicting ideologies being grouped together, but opines that TESCREAL is a good way to describe and consolidate many of the "grand bizarre ideologies in Silicon Valley".

According to Torres, "If advanced technologies continue to be developed at the current rate, a global-scale catastrophe is almost certainly a matter of when rather than if." Torres considers that "perhaps the only way to actually attain a state of ‘existential security’ is to slow down or completely halt further technological innovation", and criticized the longtermist view that technology, although dangerous, is essential for human civilization to achieve its full potential. Ozy Brennan contends that Torres's proposal to slow or halt technological development represents a more extreme position than TESCREAL ideologies, preventing many improvements in quality of life, healthcare, and poverty reduction that technological progress enables.

Artificial General Intelligence (AGI)
Much of the discourse around an existential risk from AGI occurs among supporters of the TESCREAL ideologies. TESCREALists are considered either "AI accelerationists", in that they see AI as the only way to pursue a utopian future where problems are solved, or "AI doomers", in that they consider AI likely to be unaligned with human survival and likely to cause human extinction. Despite the risk, many "AI doomers" consider the development of AGI inevitable and argue that only by developing and aligning AGI first can existential risk be averted.

Gebru has likened the conflict between "AI accelerationists" and "AI doomers" to a "secular religion selling AGI enabled utopia and apocalypse". Torres and Gebru argue that both groups use hypothetical AI-driven apocalypses and utopian futures to justify unlimited research, development, and deregulation of technology. By considering only far-reaching future consequences, creating hype for unproven technology, and fear-mongering, Torres and Gebru allege TESCREALists distract from the impacts of technologies that may negatively impact society, disproportionately harm minorities through algorithmic bias, and have a detrimental impact on the environment.

Reception of TESCREAL within the AGI community has been mixed, especially when "laid out... without nuance". An MIT Technology Review report notes that when Gebru presented TESCREAL in a 2023 talk, the audience audibly laughed.

Bias against minorities
Gebru and Torres argue that TESCREAL ideologies directly originate from twentieth-century eugenics, and argue that the bundle of ideologies advocates for a second wave of new eugenics. Others have similarly argued that the TESCREAL ideologies developed from earlier philosophies that have been used to provide justification for mass murder and genocide. Some prominent figures who have contributed to TESCREAL ideologies have been alleged to be racist and sexist. McLauchlan has said that, while "some people in these groups want to genetically engineer superintelligent humans, or replace the entire species with a superior form of intelligence" others "like the effective altruists, for example, most of them are just in it to help very poor people ... they are kind of shocked ... that they've been lumped into this malevolent ... eugenics conspiracy".

Alleged "TESCREALists"
Venture capitalist Marc Andreessen has self-identified as a TESCREAList. He published "The Techno-Optimist Manifesto" in October 2023, which has been described by Jag Bhalla and Nathan J. Robinson as a "perfect example" of the TESCREAL ideologies. In the document, he argued that more advanced artificial intelligence could save countless future potential lives, and that those working to slow or prevent its development should be condemned as murderers.

Elon Musk has been described as sympathetic to some TESCREAL ideologies. In August 2022, Musk tweeted that William MacAskill's longtermist book What We Owe the Future was a "close match for my philosophy". Some writers consider Elon Musk's Neuralink to pursue TESCREAList goals. Some AI experts have complained about the focus of Musk's xAI company on existential risk, arguing that it and other AI companies have ties to TESCREAL movements. Dave Troy considers Musk's natalist views as originating from TESCREAL ideals.

Sam Altman and much of the OpenAI board have been described as supporting TESCREAL movements, especially in the context of the attempted firing of Altman in 2023. Gebru and Torres have urged Altman against pursuing TESCREAL ideals.

Self-identified transhumanists Nick Bostrom and Eliezer Yudkowsky, both influential in discussions around existential risk from AI, have also been described as leaders of the TESCREAL movement.

Sam Bankman-Fried, former CEO of the FTX cryptocurrency exchange, was a prominent and self-identified member of the effective altruist community prior to his arrest. According to The Guardian, after FTX's collapse, administrators of the bankruptcy estate are attempting to recoup about $5 million that they allege was transferred to a non-profit to help secure the purchase of a historic hotel that has been repurposed for conferences and workshops associated with longtermism, rationalism, and effective altruism. The property hosted liberal eugenicists and other speakers with racist and misogynistic histories, and has raised speculation about how much people within the "TESCREAL" movement may have continued to benefit from FTX money.

Longtermist and effective altruist William MacAskill, who had frequently collaborated with Bankman-Fried to coordinate philanthropic initiatives, has been described as a TESCREAList.