
Understanding "longtermism": Why this suddenly influential philosophy is so toxic

Nick Bostrom has explored the possibility of engineering "radically enhanced" human beings by genetically screening embryos for "desirable" traits, destroying those that lack these traits, and then growing new embryos from stem cells.

This engineered person might be so different from us — so much more intelligent — that we would classify them as a new, superior species: a posthuman. According to Bostrom's 2020 "Letter From Utopia," posthumanity could usher in a techno-utopian paradise marked by wonders and happiness beyond our wildest imaginations.

Referring to the amount of pleasure that could exist in utopia, the fictional posthuman writing the letter declares: "We have immense silos of it here in Utopia. It pervades all we do, everything we experience. We sprinkle it in our tea."

Central to the longtermist ideology is the idea of "existential risk," introduced by Bostrom in 2002. He originally defined it as any event that would prevent us from creating a posthuman civilization, although a year later he implied that it also includes any event that would prevent us from colonizing space and simulating enormous numbers of people in giant computer simulations (this is the article that Elon Musk retweeted).

More recently, Bostrom redefined the term as anything that would stop humanity from attaining what he calls "technological maturity," or a condition in which we have fully subjugated nature and maximized economic productivity to the limit — the ultimate Baconian and capitalist fever-dreams.

salon.com/2022/08/20/understan

Salon · Understanding "longtermism": Why this suddenly influential philosophy is so toxic
Whatever we may "owe the future," it isn't a bizarre and dangerous ideology fueled by eugenics and capitalism

For longtermists, there is nothing worse than succumbing to an existential risk: That would be the ultimate tragedy, since it would keep us from plundering our "cosmic endowment" — resources like stars, planets, asteroids and energy — which many longtermists see as integral to fulfilling our "longterm potential" in the universe.

What sorts of catastrophes would instantiate an existential risk? The obvious ones are nuclear war, global pandemics and runaway climate change. But Bostrom also takes seriously the idea that we already live in a giant computer simulation that could get shut down at any moment (yet another idea that Elon Musk seems to have gotten from Bostrom).

Bostrom further lists "dysgenic pressures" as an existential risk, whereby less "intellectually talented" people (those with "lower IQs") outbreed people with superior intellects.

It should be clear from this why the "Future of Humanity Institute" sends a shiver up my spine. This institute isn't just focused on what the future might be like. It's advocating for a very particular worldview — the longtermist worldview — that it hopes to actualize by influencing world governments and tech billionaires. And to this point, its efforts are paying off.

Robin Hanson is, alongside William MacAskill, a "research associate" at the Future of Humanity Institute. He is also a "men's rights" advocate who has been involved in transhumanism since the 1990s.

In his contribution to the 2008 book "Global Catastrophic Risks," which was co-edited by Bostrom, he argued that in order to rebuild industrial civilization if it were to collapse, we might need to "retrace the growth path of our human ancestors," passing from a hunter-gatherer to an agricultural phase, leading up to our current industrial state.

How could we do this? One way, he suggested, would be to create refuges — e.g., underground bunkers — that are continually stocked with humans. But not just any humans will do: if we end up in a pre-industrial phase again, the best candidates would be people who already know how to survive without modern technology.

Robin Hanson's big plan is to take people from contemporary hunter-gatherer cultures and stuff them into underground bunkers with instructions to rebuild industrial civilization if ours collapses.


More recently, Hanson became embroiled in controversy after he seemed to advocate for "sex redistribution" along the lines of "income redistribution," following a domestic terrorist attack carried out by a self-identified "incel."

This resulted in Slate wondering whether Hanson is the "creepiest economist in America."

Not to disappoint, Hanson doubled down, writing a response to Slate's article titled "Why Economics Is, and Should Be, Creepy."

But this isn't the most appalling thing Hanson has written or said. Consider another blog post published years earlier entitled "Gentle Silent Rape," which is just as horrifying as it sounds. Or perhaps the award should go to his shocking assertion that "the main problem" with the Holocaust was that there weren't enough Nazis!

In 2021, William MacAskill defended the view that caring about the long term should be the key factor in deciding how to act in the present.

When judging the value of our actions, we should not consider their effects on people alive today, but rather their effects a thousand or even a million years from now.

Should we help the poor today? Those suffering from the devastating effects of climate change, which disproportionately affects the Global South?

No, we must not let our emotions get the best of us: we should instead follow the numbers, and the numbers clearly imply that ensuring the birth of 10^45 digital people — this is the figure MacAskill uses — must be our priority.

Although the suffering of 1.3 billion people is very bad, MacAskill would admit, the difference between 1.3 billion and 10^45 is so vast that if there's even a tiny probability that one's actions will help create these digital people, the expected value of that action could be far greater than the expected value of helping those living and suffering today.

Morality, in this view, is all about crunching the numbers; as the longtermist Eliezer Yudkowsky once put it, "Just shut up and multiply."
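To see how this "shut up and multiply" arithmetic plays out, consider a sketch with illustrative numbers (the probability here is my own, not one MacAskill gives): suppose an action has just a one-in-a-trillion-trillion chance — 10^-24 — of helping bring those 10^45 digital people into existence. Its expected value is still 10^-24 × 10^45 = 10^21 lives, roughly a trillion times the 1.3 × 10^9 people suffering today, even if we could help every one of them with certainty. On this logic, the speculative far future swamps the present almost no matter how small the probability becomes.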

By understanding the social milieu in which longtermism has developed over the past two decades, one can begin to see how longtermists have ended up with the bizarre, fanatical ideology they are now promoting to the world.

One can begin to see why Elon Musk is a fan of longtermism, or why the leading "new atheist" Sam Harris contributed an enthusiastic blurb for MacAskill's book.

As noted elsewhere, Harris is a staunch defender of "Western civilization," believes that "We are at war with Islam," has promoted the race science of Charles Murray — including the argument that Black people are less intelligent than white people because of genetic evolution — and has buddied up with far-right figures like Douglas Murray, whose books include "The Strange Death of Europe: Immigration, Identity, Islam."

It makes sense that such individuals would buy into the techno-utopian worldview of longtermism, according to which the West is the pinnacle of human development, the only solution to our problems is more technology, and morality is reduced to a mathematical exercise.

One must wonder, when MacAskill implicitly asks "What do we owe the future?", whose future he's talking about. The future of indigenous peoples? The future of the world's nearly 2 billion Muslims? The future of the Global South? The future of the environment, ecosystems and our fellow living creatures here on Earth? I don't think I need to answer those questions for you.