#Longtermists

Replied in thread

@LillyHerself

#Longtermist like #ElonMusk, who thinks that even his own trans daughter has lost her value for humanity, as she cannot fulfill her biological destiny anymore, for instance?

In earlier times, what you mention would've been very much in line with #SocialDarwinism, too.

And if you think of #Longtermists' belief in making #humanity a multi-planetary species, that is going to cost hundreds of billions.
We cannot support "suckers."

Sad to say, but here we are.

Replied in thread

@timnitGebru

#Technofeudalists and the perceived #AI threat incongruence

(1/n)

I'm surprised that you, as an #AI and #TESCREAL expert, see a discrepancy in this. For the morbid and haughty minds of #Longtermists like #Elon, there is no discrepancy, IMHO:

1) At least since Goebbels, fascists have often accused others of what they have done or are about to do themselves; or they deflect, flood the zone, etc. That's on the tactical/communications-strategy level.

2) More...

Replied in thread

@lproven What scares me is the possibility that this scenario is true and some people in power know, yet carry on as if everything were fine. That's pretty omnicidal.
I don't know whether #longtermists are just climate crisis deniers, or whether they know and this is all a diversion while they plan to live in caves underground. But with a sterile Earth in 1,500 years, talking about humanity in a million years is just bonkers.

Continued thread

For #longtermists, there is nothing worse than succumbing to an #existential #risk: That would be the ultimate tragedy, since it would keep us from plundering our "#cosmic #endowment" — resources like stars, planets, asteroids and energy — which many longtermists see as integral to fulfilling our "longterm potential" in the universe.

What sorts of catastrophes would instantiate an existential risk? The obvious ones are nuclear #war, global #pandemics and runaway #climate #change. But Bostrom also takes seriously the idea that we already live in a giant computer #simulation that could get shut down at any moment (yet another idea that #Musk seems to have gotten from Bostrom).

Bostrom further lists "#dysgenic #pressures" as an existential risk, whereby less "intellectually talented" people (those with "#lower #IQs") outbreed people with #superior #intellects.

When talking about #longtermism it's utterly crucial to remember that we're not talking about FUTURE people, we're talking about HYPOTHETICAL people.

'Future' presumes they're actually going to happen. In fact, the likelihood of any scenario remotely resembling the #longtermist future is vanishingly small.

So #longtermists are willing to sacrifice billions of now-living people for a bizarre, essentially counterfactual fantasy.