On Manifold Markets, the question "If Artificial General Intelligence has an okay outcome, what will be the reason? — C. Solving prosaic alignment on the first critical try is not as difficult, nor as dangerous, nor taking as much extra time, as Yudkowsky predicts; whatever effort is put forth by the leading coalition works inside of their lead time." currently trades at a probability of 6.6%.
This question is currently tracked on Manifold Markets only. When the same question appears on additional platforms, we compute a cross-platform consensus probability.