There are only two real ones:
1. Malevolent superintelligence.
2. The simulation ends.
And various permutations thereof. (I suppose biological lifeforms losing consciousness during mind uploading is another one, but it can be considered a subset of the first.)
Nuclear war isn’t an X risk. It wasn’t one during the height of the Cold War, and it certainly isn’t today, when total megatonnage is lower by more than an OOM. Perhaps if the world’s Great Powers had continued building up their arsenals at early-1950s US rates for a century or so. But guess what: that didn’t happen. Any boring old supervolcano will release more energy than all the world’s nuclear arsenals combined, by orders of magnitude.
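As a sanity check on the energy comparison, a rough sketch. Both input figures are assumptions: total current arsenal yield is often put at a few thousand megatons, and published energy estimates for a VEI-8 "supervolcano" eruption vary widely, spanning very roughly 10^20 to 10^22 J.

```python
import math

MT_TNT_IN_JOULES = 4.184e15    # standard conversion: 1 megaton of TNT

arsenal_megatons = 2_500       # assumed rough total yield of today's arsenals
supervolcano_joules = 1e21     # assumed mid-range estimate for a VEI-8 eruption

arsenal_joules = arsenal_megatons * MT_TNT_IN_JOULES   # ~1.0e19 J
gap_in_oom = math.log10(supervolcano_joules / arsenal_joules)

print(f"supervolcano vs. all arsenals: ~10^{gap_in_oom:.1f} times more energy")
```

Under these assumed inputs the gap comes out to roughly two orders of magnitude; picking higher supervolcano estimates widens it further.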
Pandemics aren’t an existential risk. Even 99% lethality combined with a 99% infection rate is not enough; both would need to converge to 100%, and a pathogen can’t really manage that without being highly intelligent itself. So: a subset of (1).
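To put the 99%/99% figure in perspective, a back-of-the-envelope calculation (the world population number is an assumption, roughly today's):

```python
population = 8_000_000_000   # assumed world population
infection_rate = 0.99
lethality = 0.99

death_fraction = infection_rate * lethality    # ~0.9801
survivors = population * (1 - death_fraction)

print(f"{survivors:,.0f} survivors")   # on the order of 150 million people
```

Even at those extremes, the surviving population is comparable to that of a large modern country, which is why extinction requires both rates to hit 100%.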
Climate change isn’t an existential risk. Even if ALL the locked-up carbon were released into the atmosphere, you’d still need roughly ten times as much to unleash a runaway greenhouse effect. That isn’t happening for another billion years. In the meantime, enjoy having many fewer droughts and famines thanks to global warming.
GRBs and really big asteroids are existential risks, but they occur on geological timescales of around a billion years or more. That is pretty meaningless on the timescale of human civilization, or even of a posthuman civilization confined to a single planet.
To those redpilled on IQ and its heritability, dysgenics would seem like an existential risk. But it’s not. The problem is self-correcting in the long run, even if said correction will likely be quite nasty. That long run may take many centuries, but this is a blink of an eye even on historical timescales – if not on the timescale over which we’ve been broadcasting radio emissions into space. But that’s where (2) comes in.
This is all pretty obvious if you seriously think about it.
But most people, and all institutions, don’t. So there are big misalignments between the existential risks we actually face and the ones we discuss and worry about.