
Thanks to @TheAllMemeingEye for suggesting I duplicate and modify the following market:
https://manifold.markets/TheAllMemeingEye/what-will-be-the-pdoom-of-these-ind?r=NGZh
Artificial superintelligence (ASI) here means any artificial intelligence able to carry out any cognitive task better than 100% of the unenhanced biological human population.
P(doom) here means the probability of humanity being wiped out by misaligned ASI.
Ideally, each individual will have publicly expressed their P(doom) within the past year, either directly or indirectly, e.g.:

- "I think it's practically guaranteed we're all gonna die" = 99%
- "I think it's a toss-up whether we'll survive" = 50%
- "there's a small but significant risk it'll kill us" = 10%
- "the risks are negligible" = 1%

Failing that, they may be contacted and asked for their P(doom) as defined above.
If no P(doom) can be obtained (e.g., the individual has died or declines to give their opinion), their option may be resolved N/A.