Sympathetic Stupid

Tuesday, August 02, 2005

Nick Bostrom: Existential Risks

Apparently there are an infinite number of PhD students and an infinite number of Windows boxes. This guarantees that a well-researched, overly long thesis will be written on every topic.

That's probably unfair on Nick Bostrom, as this paper is a well-written, dispassionate analysis of the future of humanity. And it's really quite readable and not too long. His basic thesis is:

Because of accelerating technological progress, humankind may be rapidly approaching a critical phase in its career. In addition to well-known threats such as nuclear holocaust, the prospects of radically transforming technologies like nanotech systems and machine intelligence present us with unprecedented opportunities and risks.

His paper is an analysis of the 'existential risks' which become more common as technology becomes more powerful. There are some boring ones - nuclear holocaust (yawn), asteroid or comet impact (double yawn), misuse of nanotechnology (of course), badly programmed superintelligence (is there another kind?) - and some more interesting ones. I like 'we're living in a simulation and it gets shut down'. Although from my perspective it's more likely that we're living in a simulation and it crashes because it's running on a Windows machine. Sorry, cheap shot.

(Hey, maybe Microsoft's dominance of the world is saving us from the Matrix possibility; they can't even protect their OS from eight-year-old script kiddies, so how could they create an all-encompassing super app? And who else has the power?)

But it's not just the big ones, the bangs, that are possible. There are also crunches, which the race would survive but with its technology lost, setting us back thousands of years (that's right, even the Internet). I've always liked dysgenic pressures. Basically, we'll breed ourselves dumber:

It seems that there is a negative correlation in some places between intellectual achievement and fertility.

Speaking from experience, Nick? I hear ya.

Then there are shrieks, which restrict the race to but a small fraction of its potential. The easy one is a repressive totalitarian global regime. And I'll leave that there.

And lastly, whimpers: these tend to be more insidious. For example, our potential or even our core values are eroded by evolutionary development. In other words, we evolve into a suboptimal situation. His example is a colonization race, where all we do is manufacture and send out colonization probes. But of course this raises the question of what 'optimal' life is. Tough one.

I could go on for hours, it's a very interesting paper.

One last nugget: the Fermi Paradox. We've seen no signs of extraterrestrial life, so Earth-like planets generally don't give rise to life that goes on to colonize the universe; if they did, we would have seen some signs by now. This leads to the conclusion that there must be at least one Great Filter, an event or developmental stage which species essentially never get past. 'If the Great Filter isn't in our past, we must fear it in our (near) future.'
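
Just to make that logic concrete, here's a toy back-of-the-envelope sketch in Python. The planet count and step probabilities are entirely made up by me, not taken from Bostrom's paper; the point is only the shape of the argument.

    # Toy version of the Great Filter argument. Every number here is a
    # made-up placeholder, purely for illustration.

    # Hypothetical count of Earth-like planets in our galaxy.
    n_planets = 1e9

    # Hypothetical odds of passing each step on the road from dead rock
    # to galaxy-colonizing civilization.
    steps = {
        "abiogenesis": 0.5,
        "complex life": 0.5,
        "intelligence": 0.5,
        "colonization tech": 0.5,
    }

    # Chance of a planet making it through every step.
    p_all = 1.0
    for p in steps.values():
        p_all *= p

    print(f"Expected colonizers: {n_planets * p_all:,.0f}")
    # Even with these coin-flip odds we'd expect over 60 million colonizing
    # civilizations, and we observe zero. So at least one step must be
    # vastly harder than a coin flip -- that step is the Great Filter.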

To me, that's kinda scary. But apparently the consensus is that it's probably in our past, mainly because we haven't seen any other life forms. Wait, is that circular reasoning? I don't quite get it. If it is in our past, the human race is unimaginably improbable, and so very, very special.