
The Invisible Blueprint of Silicon Valley

When Elon Musk, Peter Thiel, Sam Altman, and a handful of other tech titans talk about the future, they rarely mention ethics or humanity. Instead, they speak of transhumanism, longtermism, and AGI as if these were the natural next steps in human evolution. Behind these buzzwords lies a more elaborate ideology: TESCREAL, a term coined by philosopher Émile Torres and computer scientist Timnit Gebru in 2023. The acronym stands for Transhumanism, Extropianism, Singularitarianism, Cosmism, Rationalism, Effective Altruism, and Longtermism.

“TESCREAL is a mouthful. What does it mean?” – Doug Henwood, Jacobin interview

Transhumanism: The Engine of Value

At its core, TESCREAL promotes a total utilitarian view: the universe is better the more value it contains, and humans are merely the substrate that generates that value. Transhumanists believe that by radically re‑engineering the human organism, whether through genetic editing, brain‑computer interfaces, or full mind uploads, we can create a post‑human species capable of colonizing space and running vast simulations.

“Humans are a ‘biological bootloader for digital superintelligence.’” – Elon Musk, X post

“If we have a superintelligence, then we have a super‑engineer. If we have a super‑engineer, then we can engineer paradise.” – Émile Torres

AGI as the Catalyst

The vision hinges on Artificial General Intelligence. Without AGI, proponents argue, humanity will never achieve the engineering feats required to colonize Mars or harvest galactic energy. AGI, once achieved, could multiply itself, replacing biological humans with digital entities that live in simulated universes or inhabit robotic bodies.

Rationalism and Decision Theory

Rationalism, championed by Eliezer Yudkowsky, seeks to purge human decision‑making of cognitive biases. The idea is that by optimizing our rationality, we become better stewards of the future. Critics point to Yudkowsky's infamous "torture versus dust specks" thought experiment, which asks whether torturing one person for decades could be preferable to giving an astronomically large number of people a momentary speck of dust in the eye. It highlights a chilling utilitarian calculus that prioritizes aggregate numbers over human dignity.

IQ, Eugenics, and a Dark Legacy

TESCREALism's emphasis on "IQ realism" echoes the eugenics movement of the early 20th century. Although Alfred Binet devised the first IQ tests to identify schoolchildren who needed extra help, they were quickly repurposed to justify racial hierarchies, and modern proponents of IQ realism often carry forward the same assumptions, albeit under the banner of "effective altruism." The result is a form of digital eugenics, where the goal is to engineer a superior post‑human species.

“Transhumanism is eugenics on steroids.” – Émile Torres

The Silicon Valley Titans

  • Elon Musk – Longtermism enthusiast who sees Mars as a stepping stone to a galaxy‑wide civilization.
  • Peter Thiel – Life‑extension advocate who wants to defeat death and live indefinitely in his own biological body.
  • Sam Altman – CEO of OpenAI, believes AGI is essential for space colonization.
  • Marc Andreessen – Listed himself as a TESCREAList in his Twitter bio.

These leaders share what Torres calls a pro‑extinctionist stance: humanity is a means to an end, not an end in itself. Their wealth and influence allow them to push this agenda forward with little input from the broader public.

Coercion and the Future of Humanity

The TESCREAL worldview is inherently elitist and undemocratic. It envisions a future ruled by post‑humans, with little regard for human consent. Rationalists claim that objective, universal truths justify their plans, effectively sidelining diverse human perspectives.

“They are doing this without our consent, and they don’t really care one bit about what the rest of us have to say.” – Émile Torres

The Future in Question

TESCREALism is not a fringe ideology; it is the invisible hand guiding the most powerful tech companies in the world. Whether the pursuit of superintelligence will lead to a utopia of infinite value or a dystopia where humans are obsolete remains to be seen. What is clear is that the conversation about AGI, transhumanism, and the ethics of technology must include a broader, more inclusive dialogue—one that does not assume humanity’s obsolescence as the inevitable end.

Source: Jacobin, Interview by Doug Henwood (2025‑11‑14)