This post outlines a fake framework for thinking about how we might navigate the future. I found it useful for my own thinking; hopefully you will find it useful as well.
Stages of Survival
Whether or not we build AGI, one of five things must result.
Dune solution: The new world permanently lacks AGIs.
What about the option where AGI just isn't that powerful? I mean, maybe it's really useful, but maybe you can only get the equivalent of an Einstein working for you in exchange for half a data center's worth of computation (and even self-improvement only offers a factor of 2x or so).
Such a world doesn't look very different from our world and doesn't clearly fall into any of your buckets. Sure, maybe some AIs eventually become dictators, but chance plays a large role there (even political savants are far from guaranteed winners). For your buckets to be exclusive, you really need AGI to be not just general intelligence but something that offers near-godlike powers.