Discussion about this post

Gerald Monroe

> Scott Alexander: I think removing a 10% chance of humanity going permanently extinct is worth another 25-50 years of having to deal with the normal human problems the normal way.

What screams out here is the cost of this extra "25-50 years" of "having to deal with the normal human problems the normal way": the most common "normal human problem," one that has killed roughly 90% of every human to live so far, is aging. This means (per GPT-5) approximately 1.375-2.75 billion extra human deaths as the "cost" of a 25-50 year delay to the tool that could even start on a cure.
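
As a rough check on those totals, here is a minimal back-of-envelope sketch. The ~55 million aging-related deaths per year is my assumption (it is the annual rate the quoted figures imply), not a number stated in the comment:

```python
# Back-of-envelope check of the "1.375-2.75 billion extra deaths" figure.
# Assumed (not from the comment): ~55 million aging-related deaths per
# year worldwide, the annual rate the quoted totals imply.
DEATHS_PER_YEAR = 55_000_000

for delay_years in (25, 50):
    extra_deaths = DEATHS_PER_YEAR * delay_years
    print(f"{delay_years}-year delay: ~{extra_deaths / 1e9:.3f} billion extra deaths")
# 25-year delay: ~1.375 billion extra deaths
# 50-year delay: ~2.750 billion extra deaths
```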

It's also trading off Scott Alexander's future life, and the lives of essentially everyone presently alive today (with a 25-50 year delay to superintelligence, even very young people will likely die of aging, assuming some decade-long delays for even a superintelligence to collect enough data and run enough experiments to devise effective treatments). It's the death of (probably) everyone old enough to read this in favor of people you will never live to meet.

This is interesting because mathematically you need to care a LOT about people you will never see for this to be "profitable". A modest discount rate plus an acknowledgment that extreme value drift is inevitable makes this declaration lose in EV. ("Well, I care about my great-grandchildren, but it's harder and harder to care about their hypothetical cyborg human-animal-AI hybrid children, or THEIR tank-grown AI-hybrid children, or THEIR purely synthetic 3D-printed children, or THEIR nanotech children, or THEIR virtual children that exist only in hypothetical virtual universes simulated using...")

It seems you need to care about each n+1 generation 68% as much as you care about yourself (assuming this delays a treatment for aging by 75 years: 50 years until the ASI, plus 25 more years before a clinically effective treatment that halts and reverses aging in almost all adults is devised by teams of ASIs, many billions of robots working around the clock, and many trillions of new scientific experiments).
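
For what it's worth, here is a minimal sketch of one model that reproduces that 68% figure (my reconstruction; the comment does not show its work): value generation n at d^n of your own generation, put the beneficiaries of the cure 3 generations (75 years) out, and solve for the per-generation discount factor d at which their total discounted value just equals your own generation's, i.e. d^3/(1-d) = 1:

```python
# Break-even per-generation discount factor d, under the (assumed) model
# that the aging cure benefits generations 3, 4, 5, ... (75+ years out),
# each valued at d**n: solve d**3 / (1 - d) = 1, i.e. d**3 + d - 1 = 0.
lo, hi = 0.0, 1.0
for _ in range(60):  # bisection on (0, 1); d**3 + d - 1 is increasing in d
    mid = (lo + hi) / 2
    if mid**3 + mid - 1 < 0:
        lo = mid
    else:
        hi = mid
print(f"break-even discount factor: {lo:.3f}")  # ~0.682, i.e. the ~68%
```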

The issue is, again, value drift. OK, so probably the next generation of humans will still have two arms and two legs, engage in sexual reproduction, have a sense of smell and taste, enjoy eating animals, and communicate via vibrating vocal cords and hand gestures. But the neural implants that become common...

And the AI tutors...

And the genetic modifications on the generation after that...

And the vat-grown children of the generation after that, with animal genes introduced to slash cancer risk by 99% and slow aging...

And the purely 3D-printed synthetic bio bodies that let the generation after that stop aging entirely, reducing it to a maintenance problem, and redesign failure-prone human organs...

And the generation after that, which mostly ditches flesh bodies...

And then the cyborg collectives, downloaded skills and memories, and...

See, you lose any sense of value. Just humans, acting in their own best interests, are going to be as alien in future generations as the hypothetical "destroy all value" ASI societies that Zvi is afraid of.

Mike

Normally when people sign a statement saying "Don't do X", they in fact mean "Don't do X", not "Do X but in a weird way". There are not, typically, maximally awful unsaid parts. The statement is just what is said. If you have concerns, you can of course just ask some questions.

Great job on Sriram's part. More of that, please.
