12 Comments

The Suicide Caucus is a great term

Re. Personalised propaganda.

I can’t see an argument that having to tell the same lie to everyone, rather than being able to tell different lies to different people, is an important constraint on politicians’ ability to lie.

Is the “Suicide Caucus” code for people who don’t believe in top-down control of AI? Giving control of superintelligence to large governments and mega-corps under the guise of safety seems extremely fraught. I believe in a more decentralized future where ~everyone has access to extremely high-performance models, and where people who do bad things with AI, as well as purely rogue AI, are countered by people who are good (with AI).

… kinda like how the world works now. The worst atrocities of modern history were all performed by governments, fyi. Why do we think giving them control over ASI will suddenly make them benevolent and immediately stop war mongering and propagandizing their subjects?

The mistake here is treating AI as a tool.

Zvi et al. have argued pretty extensively about how proliferation of AI will lead to human extinction. It's fine to disagree but you should address those arguments.

Some basic points you'd need to address:

- AI is not merely a tool, and lots of paths to extinction involve loss-of-control scenarios.

- Other paths to extinction don't resemble "rogue AI" at all, e.g. total economic replacement of humans below subsistence wages. It's not obvious how "good people" counter this.

- Fundamentally, it's implausible to assume that a world with ASI works "kinda like how the world works now." That sounds like you don't think ASI would be really transformative, which is weird.

Also, that's not how the world works now. Power is very unevenly distributed: Average Joe does not have the same power as governments and mega-corporations. Yes, governments have committed the worst atrocities, but it is not average Joes who have prevented even more atrocities from happening; rather, it is other governments.

Furthermore, pure chaos and anarchy, without any coordination by central powers, already leads to even worse outcomes today: civil wars, chaos, and warlords, until eventually one warlord beats the others and takes over.

This is a false dichotomy. The answer to "Who should have this power?" is not "everyone" or "governments and corporations".

It's no-one.

Would it be fair to say that the most crucial way that an AI is _not_ a tool is that, when presented with a top level goal, it can (and generally must) create sub-goals. And the whole key to making it useful, making it save human labor, is that the user _can't_ be in the loop for these sub-goals. Perhaps the user might have a veto over the dozen topmost sub-goals, but reaching for finer human control than that negates the whole advantage of automating the work.

( Of course, an uncontrolled sub-goal might echo Dr. Strangelove and be "not a thing a sane man would do." )
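The shape of that argument can be sketched in code. This is a purely hypothetical toy (the planner, function names, and depth limit are all invented for illustration, not any real agent framework): a human veto applies only to the topmost layer of sub-goals, and everything beneath it runs unreviewed, which is exactly where the labor savings and the loss of oversight both come from.

```python
# Hypothetical sketch of goal decomposition with a top-level-only human veto.
# `decompose` is a stand-in; in a real agent it would be a model call.

def decompose(goal: str) -> list[str]:
    """Toy planner: split a goal into three named sub-goals."""
    return [f"{goal} / step {i}" for i in range(1, 4)]

def run_agent(top_goal: str, veto=None, depth: int = 0, max_depth: int = 2) -> list[str]:
    """Recursively expand goals; only depth 0 is shown to the human."""
    executed = []
    for sub in decompose(top_goal):
        # The human veto applies only to the dozen-or-so topmost sub-goals.
        if depth == 0 and veto is not None and veto(sub):
            continue
        if depth + 1 < max_depth:
            # Deeper sub-goals are expanded with no human in the loop.
            executed += run_agent(sub, veto=None, depth=depth + 1, max_depth=max_depth)
        else:
            executed.append(sub)  # leaf action: performed with no review at all
    return executed

# Veto one top-level sub-goal; everything under the survivors runs unreviewed.
actions = run_agent("ship product", veto=lambda s: "step 2" in s)
```

The design choice the comment points at is visible in the recursion: passing `veto=None` below depth 0 is what makes the agent useful, and also what makes its leaf actions invisible to the user.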

To be in favour of this top-down control outcome, one must believe that the risk of the controller using the AI improperly, without opposition, is smaller than the risk of distributed AI.

Personally, the events of the last few weeks have pushed me very strongly towards open source AI. Having the US government be sole controller of advanced AI systems is now looking a lot more like the nightmare scenario. If anything, China has been much better both on safety approaches and open sourcing.

At this point, I think Xi Jinping Thought everywhere forever may be one of the least bad imaginable outcomes. He may be a murderous, genocidal tyrant, but at least I'm pretty sure he's strongly in favour of the continued existence of biological humans.

"Human in the loop? We can and will take the human out of the loop once the human is not contributing to the loop."

Well said. Just look at automated stock trading as an example.
