Nick Bostrom
This was, simultaneously, one of the driest and most terrifying books I have ever read.
Really, the conclusion summarized it well:
“Before the prospect of an intelligence explosion, we humans are like small children playing with a bomb. Such is the mismatch between the power of our plaything and the immaturity of our conduct. Superintelligence is a challenge for which we are not ready now and will not be ready for a long time. We have little idea when the detonation will occur, though if we hold the device to our ear we can hear a faint ticking sound.”
It is what the title promises: the paths by which we might achieve superintelligence (including, I’d note, an argument that pursuing it is both necessary and inevitable), a harrowing discussion of exactly how many ways it can go wrong, and some strategies we can start trying now to keep it from going all Skynet on us. Or, as is more likely, from wiping out humanity without really noticing, because we were a convenient source of raw materials.
Like I said: terrifying.
But valuable. I’m also convinced this book should be required reading for any AI course. And, y’know, a good chunk of the population beyond that: I count AI as one of the three most likely existential threats out there.1
So hey, want to somehow be a little bored and scared out of your mind at the same time? Read it.
1. I’ve got it tied with “Global War, Nuclear” and “Climate Change.” Lower on the list are “A Pandemic With 100% Transmission Rate and 90-Plus Percent Lethality” and “Something From Space.”