
The philosophy of "longtermism": salvation or danger?

2021-11-07T20:31:16.407Z


The climate crisis shows how short-sighted humanity is. A school of thought lavishly funded by Bitcoin billionaires and people like Elon Musk wants to change that. A good thing? Only at first glance.



Illustration of a human-machine robot: Our biggest problem is short-termism

Photo: jacquesdurocher / Getty Images

The strangest thing about the 2018 TED Talk by British philosopher William MacAskill is that the climate crisis features only marginally in it.

Even though MacAskill has made the future of mankind his central theme.

In jeans and a T-shirt, MacAskill speaks for a good ten minutes about "existential risks" to humanity and the idea of "effective altruism," which he and others have made popular. He works at an institution called the Global Priorities Institute. But listening to him, you don't get the impression that the climate crisis is one of his main priorities.

The philosopher does mention "the possibility of extreme climate change" once, but lumps it in with biological weapons, out-of-control artificial intelligence and geoengineering, i.e. deliberate interventions in the Earth system itself. In other words: alongside threats that are, for now, rather hypothetical.

Then he adds: "I am not saying that any one of these risks is particularly likely." Anyone informed about the scale and threat of the climate crisis can only frown at that.

There is a lot of money involved

A well-known fan and promoter of the new philosophical movement MacAskill stands for is Skype co-founder Jaan Tallinn. And he seems to think similarly: when CNBC asked him at the end of 2020 what he believed were the greatest existential risks to mankind, Tallinn listed synthetic biology, uncontrollable artificial intelligence and, third, "unknown unknowns," i.e. risks we don't even know about yet. Climate change, on the other hand, is "non-existential," according to the multimillionaire, as long as no completely uncontrollable worst-case scenario occurs. And that is still considered unlikely. Global catastrophes and terrible suffering, however, are very likely if humanity does not finally act.

You may now be wondering why the views of a young British philosopher and an Estonian tech millionaire should interest you.

Because MacAskill, together with other philosophers such as Nick Bostrom, best known for his book "Superintelligence," and the Australian Toby Ord, stands for a school of thought that originated at the University of Oxford and has become particularly influential in Silicon Valley.

Its protagonists have christened it "longtermism".

People like Tallinn use their vast fortunes to fund this school of thought and its projects.

And when you think its ideas through to the end, they sometimes have very unsettling implications.

Always with the distant future in view

Toby Ord once defined the term longtermism as follows: "The view that the most important criterion for the value of our actions today is how these actions will affect the distant future." What could possibly be wrong with that?

The philosopher Phil Torres, on the other hand, described longtermism in a much-discussed, thoroughly polemical essay as "lavishly financed and increasingly dangerous."

The longtermists, he argues, are technology believers and, because they want to value future, as-yet-unlived lives as highly as those of people living today, sometimes cynical and inhumane.

Longtermism, Torres writes, is a "secular religion."

Is he right?

$46 billion and immodestly named institutes

Longtermism is closely related to MacAskill's idea of "effective altruism," and that idea, in turn, is astonishingly powerful for a philosophical concept.

One member of the scene estimates that $46 billion is currently available to be invested in "effective altruism" projects.

Its basic idea is as simple as it is sensible: to use charitable donations in such a way that they generate measurable results that are as positive as possible.

So charity plus scientific method.

The longtermism and effective altruism scene is closely networked.

They write books together and meet at immodestly named institutions like the Future of Humanity Institute, the Future of Life Institute, and the Global Priorities Institute.

Top donors from the crypto scene

Top donors include Silicon Valley billionaires like Facebook co-founder Dustin Moskovitz and cryptocurrency entrepreneur Sam Bankman-Fried.

Many other donors also come from the crypto scene.

Elon Musk, in turn, donated money to Nick Bostrom's Future of Humanity Institute at the University of Oxford, where Ord also works.

Longtermism and effective altruism are currently the most popular schools of thought among many of the rich in Silicon Valley.

They are in turn closely linked to transhumanism, i.e. the idea that humans could technologically enhance themselves, and to other visions of the future, all of which sound primarily like science fiction.

If you consider all of this alongside the immortality fantasies of people like billionaire and self-confessed Trump fan Peter Thiel and the space adventures of the super-rich like Musk and Jeff Bezos, the overall picture is an uncomfortable one: Could it be that the ultra-rich financiers of the longtermist idea like it above all for one reason, namely that it seems a good excuse not to have to deal with the suffering and existential dangers of the present?

And to devote themselves to their own hobbies instead?

After all, it is about nothing less than "the potential of humanity."

Our biggest problem is short-termism

In fact, as the climate summit in Glasgow is once again making clear in a very worrying way, humanity's problem at the moment is primarily the opposite: short-termism. China, India and Australia do not want to commit to phasing out coal-fired power generation anytime soon, and neither does the USA. This is not least because the gains and relief achievable in the short term this way, economically and politically, are valued more highly than the catastrophic suffering that the climate catastrophe will cause, in countries like India and China in particular.

In this respect, longtermism can be seen as a perfectly sensible counterweight to something that plays a central role in both the economy and the human psyche: the present is considered to be worth more than the future.

In psychological experiments, people very often opt for a smaller reward now instead of a significantly larger one in two weeks or next year.

A bird in the hand is very popular with us humans.

Discount rate set to zero

This phenomenon has entered the economic sciences in the form of so-called discount rates: things that will only be available in the future are generally valued lower than things that can already be had today.

Exactly which discount rate should be applied, and why, is as controversial as the answers are diverse, but the basic principle is seldom questioned: a hundred euros today is worth more than a hundred euros next year, and much more than a hundred euros in ten years.
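To make that concrete: the standard formula discounts a future amount FV, due in t years, back to a present value PV using a rate r. The 3 percent in the example below is purely illustrative, not a figure from this debate:

$$PV = \frac{FV}{(1+r)^{t}}, \qquad \frac{100\ \text{euros}}{(1.03)^{10}} \approx 74\ \text{euros}$$

A hundred euros a decade away is thus worth only about three quarters as much today.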

Given the phenomenon of inflation, there is of course a certain sense in that.

But it is different if you apply the same way of thinking to future suffering.

Better to drive 180 on the autobahn today and keep burning coal; the disasters that our lifestyle triggers are still a long way off anyway.

From a cosmic perspective, is the climate crisis irrelevant?

Longtermism reverses this logic: the lives of future people are supposed to be worth just as much as those of people living today.

This would set the discount rate for human suffering to zero.

Thought through to the end, however, this can have dire consequences.

Because if humanity continues to exist for a very long time and all lives are of equal value, the people of the future will always be in the overwhelming majority compared to those of the present.
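How lopsided that majority becomes is easy to illustrate with a back-of-the-envelope calculation; the figures are assumptions made up for this example, not numbers from the longtermist literature. Suppose humanity survives a mere 10,000 more years at roughly today's population, i.e. about ten billion lives per century:

$$100\ \text{centuries} \times 10^{10}\ \frac{\text{lives}}{\text{century}} = 10^{12}\ \text{future lives} \approx 125 \times 8 \times 10^{9}\ \text{people alive today}$$

At a discount rate of zero, these hypothetical future lives would outweigh the present in any moral calculation by two orders of magnitude, and on longer timescales by far more.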

This could also be a justification for protecting and caring for today's innovators and engineers rather than fighting poverty in the rest of the world.

Because where is there more "potential" than in Silicon Valley?

As a result, many tech billionaires prefer to prepare for their own lives after the apocalypse instead of doing all they can to improve the world in the here and now.

Longtermism critic Phil Torres writes: "Viewed cosmically, even a climate catastrophe that reduces human civilization by 75 percent for two millennia would be little more than a small outlier, something like a 90-year-old who stubbed his toe once when he was two years old."

Source: Spiegel
