More wit and wisdom of Sam Altman:


The other 2% are e/acc.


The Roon quote has multiple duplicated paragraphs, starting with "People who call themselves accelerationist scoff".

In the list of possible opinions of accelerationists, another possibility, which you mentioned some weeks ago, is "You think AGI is real and will kill us, but you are OK with it."

"You can adopt" should be "adapt", twice?


Instead of making people feel low status by analogizing AI to their parent, maybe try "you are to AI as Vladimir Putin's father is to Putin." Putin's dad was a war vet who fought the Nazis and got wounded during WWII. A total badass!

He brought up Putin to be a good communist (aligned) and a total badass (capable), reading the works of Marx and Lenin and learning judo from a young age. Putin was smart, speaking German as a second language and earning a PhD in economics.

He then joined the KGB, eventually rose to become the leader of Russia, courted a friendly relationship with America, and is now putting the world in a state of existential crisis. So ask: how much smarter and more ruthless would a Putin-AI have to be than Putin himself to make you feel like it's a genuine existential threat? My guess is that the answer is "not much."


NGL, I'm a little suspicious of Nvidia funding all these unicorns. If you're REALLY supply-constrained, why do you need to give a startup with no revenue $700M to buy GPUs from you?


You use 'mature' to describe what technologically advanced societies collectively aren't; I recall Eliezer Yudkowsky using 'sane'. Do you basically mean the following?

'Sharing a vision of the overall good, sharing a realistic plan for attaining that good, and - despite the absence of a hegemonic enforcer - never defecting.'

If so, might 'cooperative' and/or 'loyal' capture that better in a single word? They seem more appropriate for valiant pursuit of improbable success than 'mature' and 'sane' do.


EY claiming that security mindset is somehow genetic really rankles. Why would security mindset be any less learnable than any other high-level skill? Sure, some people are more naturally paranoid, but it's still possible to get good at honestly modeling the adversary's moves in a game, and anyone with the right incentives is likely to git gud, fast. Maybe Eliezer could spend time thinking about how to teach security mindset, the way people teach chess or poker, instead of giving up and justifying that with some determined-at-birth narrative.
