
In 2050, will the general consensus among experts be that the concern over AI risk in the 2020s was justified?
Closes 2051
72% chance
This question is managed and resolved by Manifold.
People are also trading
By 2028, will I believe that contemporary AIs are aligned (posing no existential risk)? · 33% chance
Is the nature of AI risk completely misunderstood today with respect to the state of the art in 2030? · 30% chance
Will there be a massive catastrophe caused by AI before 2030? · 24% chance
Will AI be considered safe in 2030? (resolves to poll) · 72% chance
Will AI wipe out AI before the year 2030? · 4% chance
At the beginning of 2027, what percentage of Manifold users will believe that an AI intelligence explosion is a significant concern before 2075? · 70% chance
Will humanity wipe out AI x-risk before 2030? · 16% chance
By end of 2028, will AI be considered a bigger x risk than climate change by the general US population? · 61% chance
Will AI take over the world by 2100? · 49% chance
Will we have a sufficient level of international coordination to ensure that AI is no longer threat before 2030? · 22% chance
@jonsimon Neither matters. What this market cares about is: "Was the probability they placed on the world being destroyed by AI justified by the evidence they had at the time?"
@IsaacKing Whose probability/concern needs to be justified? Laypeople? Computer scientists? Computer scientists who responded to the AI Impact survey? Existential safety advocates / the AI existential risk community? Eliezer Yudkowsky?
I mainly ask because I think the probabilities of, say, extinction would range from something like 5% (maybe laypeople and computer scientists) to 50% (average existential safety advocate) to >99.9% (Yudkowsky).