On Manifold Markets, the question "If Artificial General Intelligence has an okay outcome, what will be the reason? — L. Earth's present civilization crashes before powerful AGI, and the next civilization that rises is wiser and better at ops. (Exception to 'okay' as defined originally, will be said to count as 'okay' even if many current humans die.)" currently trades at a probability of 1.1%.
This question is currently tracked only on Manifold Markets. When the same question appears on additional platforms, a cross-platform consensus probability is computed.