i am confused about the agi safety vs longtermism debate. as far as i can tell, longtermism is mostly just about agi safety, with some "and also bio risk!"s thrown in?

Aug 14, 2022 · 8:34 PM UTC

also somewhat confused by the branding of longtermism, since most longtermists seem to think agi safety is an existential risk in the very short term :)
EAs are subconsciously trying to position themselves as the rulers and maintainers of the status quo to favor their own genetic pool.
Looking forward to you disproving the Jarzynski-Crooks fluctuation dissipation theorem my dude. I've got the Nobel prize committee on speed dial, lmk
Lmaoo. Most people understand equilibrium thermodynamics but not out-of-equilibrium thermo. Big difference. The latter favors self-replicating/adaptive life while the former describes thermalization and black body radiation. A human produces far more entropy than a big rock.
* a posteriori, a priori would be impossible since it’s an empirical question to begin with
Replying to @marcelsalathe @sama
Really far-away and self-sufficient islands could close themselves off at the first sign of a 100% deadly pandemic. Scientists in Antarctica could also wait it out for a while, then fly to some not-yet-inhabited island. Submarines. Lots of survival options.
errr if that's how ethics worked, then we'd have a century-long tradition of utilitarians strongly advocating becoming sperm bank donors, and the "e/acc" tribe would instead be advocating bombing the sperm bank to sabotage other people's samples before donating lots yourself
Replying to @slouischarles @sama
Oh I agree. The upshot is that if we get through this humanity is going to be unimaginably amazing in 200 years' time.
But we rightly worry about other pandemics that don't exist yet. Prioritizing a risk should depend on how likely we think it is to happen, not how similar it is to what has already happened (though of course precedent may affect how likely we think it is).