Discussion about this post

jpr:

"If everyone woke up one morning believing only a quarter of what we believe, and everyone knew everyone else believed it, they’d walk out into the street and shut down the datacenters, soldiers and police officers walking right alongside moms and dads."

I think this quote kind of captures the tragedy of Yudkowsky. You guys are so, so bad at politics.

I'm sure you can think of many serious issues where a majority agrees, and yet collective action doesn't happen. And if your response is, "those aren't truly shared beliefs", then you must realize what an impossibly high standard "shared belief" actually is, and that mass movements do not work that way.

Case in point -- you guys have done everything you can to wave away the concerns of people losing their jobs to AI, even though they are aligned with you on the only point you think matters, which is to shut down development. You have a mass political movement ready to go! But it'd be a big tent, with a lot of people who don't share your beliefs about the economy, etc.

Maybe this is why Yudkowsky spends so much time wheedling policymakers. He just can't conceive of how to engage with politics himself, so he has to delegate it. The problem with this is that he does not have billions of dollars to lobby with.

Aris C:

I just don't understand. This entire review, and just two paragraphs on orthogonality, which is ultimately the heart of the issue: this idea that ASI, for all its knowledge and wisdom, is still stupid in that it pursues objectives single-mindedly.

We (humans) don't do that. Why would ASI?

And OK, let's grant that orthogonality is true. That implies there is no objective morality, right? Because if there were, wouldn't the ASI be aware of it, and comply with it? And if there isn't, then why the qualms about only taking lethal action when it's legal, as part of some international accord?

43 more comments...
