i am confused about the agi safety vs longtermism debate. as far as i can tell, longtermism is mostly just about agi safety, with some "and also bio risk!"s thrown in?

Aug 14, 2022 · 8:34 PM UTC

also somewhat confused by the branding of longtermism, since most longtermists seem to think agi safety is an existential risk in the very short term :)
Replying to @sama
… can anyone explain this to me, or direct me to articles that talk about this? Feel like I’m missing a lot of context 🫠
Replying to @sama
avoiding human extinction is pretty important for both the long and medium term
Replying to @sama
Maybe greater cosmic risks? Runaway climate risks? (large enough) Nuclear stuff?
Replying to @sama
There's nuclear too!
Replying to @sama
This debate is like putting the cart before the horse... Everything is a risk over a long enough horizon
Replying to @sama
AI x-risk is not long-term, but any x-risk matters for the long term, since there will only be a long-term future if we don't all die soon. Also, because AI risk seems science-fictionish to most people, it gets tagged as something that happens in the long-term future.
Replying to @sama
I personally use longtermism to describe the long-term trajectory of civilization and consciousness. This naturally includes existential risks such as AI and biotech, and also includes culture, technological progress, and our complex trajectory as we progress into the future.
Replying to @sama
Not sure what MacAskill's position on this is now, but in 2020 he argued against the 'Hinge of History' thesis, which roughly claims that "we live at the most important time in history". globalprioritiesinstitute.or…