By all reports, and as one would expect, Google's Gemini looks to be substantially superior to GPT-4. We now have more details on that, and also word that Google plans to deploy it in December. Manifold gives it an 82% chance of happening this year, and a similar probability of Gemini being superior to GPT-4 on release.
The Roon quote has multiple duplicated paragraphs, starting with "People who call themselves accelerationist scoff".
In the list of possible opinions of accelerationists, another possibility that you mentioned some weeks ago is "You think AGI is real and will kill us, but you are okay with it."
Instead of making people feel low status by analogizing AI to their parent, maybe try "you are to AI as Vladimir Putin's father is to Putin." Putin's dad was a war vet who fought the Nazis and got wounded during WWII. A total badass!
He brought up Putin to be a good communist (aligned) and a total badass (capable), reading the works of Marx and Lenin and learning judo from a young age. Putin was smart, speaking German as a second language and earning a PhD in economics.
He then joined the KGB, eventually rose to become the leader of Russia, courted a friendly relationship with America, and is now putting the world in a state of existential crisis. Now ask how much smarter and more ruthless the Putin-AI would have to be than Putin himself to make you feel like it's a genuine existential threat. My guess is that the answer is "not much."
NGL I'm a little suspicious of Nvidia funding all these unicorns. If you're REALLY supply constrained, why do you need to give a startup with no revenue $700M to buy GPUs from you?
You use 'mature' to describe what technologically advanced societies collectively aren't; I recall Eliezer Yudkowsky using 'sane'. Do you basically mean the following?
'Sharing a vision of the overall good, sharing a realistic plan for attaining that good, and - despite the absence of a hegemonic enforcer - never defecting.'
If so, might 'cooperative' and/or 'loyal' capture that better in a single word? They seem more appropriate for valiant pursuit of improbable success than 'mature' and 'sane' do.
EY claiming that security mindset is somehow genetic really rankles. Why is security mindset any less learnable than any other high-level skill? Sure, some people are more naturally paranoid, but it's still possible to get good at being honest about the adversary's moves in a game, and anyone with appropriate incentives is likely to git gud, fast. Maybe Eliezer could spend time thinking about how to teach security mindset, the way people teach chess or poker, instead of giving up and justifying it with some determined-at-birth narrative.
AI #27: Portents of Gemini
More wit and wisdom of Sam Altman:
https://nitter.net/nearcyan/status/1697322347701702962#m
The other 2% are e/acc.
"You can adopt" should be "adapt", twice?