The latest issue of IEEE Spectrum, a journal for speculative engineering geeks, is devoted to "the singularity," that moment when our society changes so dramatically that it becomes incomprehensible to people who lived before it. The issue is packed with free online essays by singularity thinkers like science fiction author Vernor "Rainbows End" Vinge, Rodney Brooks of MIT's AI Lab, and Ray "Singularity is Near" Kurzweil. The whole issue is well worth a serious read. But my favorite part by far is Vinge's essay; a computer scientist as well as a novelist, he writes singularity scenarios in his fiction that are both compelling and realistic. In the essay, he breaks the singularity down into the five most likely scenarios, any of which he thinks could happen by 2030.
Vinge writes that these scenarios include:
The AI Scenario: We create superhuman artificial intelligence (AI) in computers.
The IA Scenario: We enhance human intelligence through human-to-computer interfaces; that is, we achieve intelligence amplification (IA).
The Biomedical Scenario: We directly increase our intelligence by improving the neurological operation of our brains.
The Internet Scenario: Humanity, its networks, computers, and databases become sufficiently effective to be considered a superhuman being.
The Digital Gaia Scenario: The network of embedded microprocessors becomes sufficiently effective to be considered a superhuman being.
Later, he writes about how the singularity will probably be a "hard takeoff," or a very rapid transformation, rather than a gentle, gradual shift:
What I'm thinking of would probably be the result of intentional research, perhaps a group exploring the parameter space of their general theory. One of their experiments finally gets things right. The result transforms the world, in just a matter of hours.
I base the possibility of hard takeoff partly on the known potential of rapid malcode (remember the Slammer worm?) but also on an analogy: the most recent event of the magnitude of the technological singularity was the rise of humans within the animal kingdom. Early humans could effect change orders of magnitude faster than other animals could. If we succeed in building systems that are similarly advanced beyond us, we might experience a similar incredible runaway.
Vinge's essay also surveys many of the other pieces in this special issue of IEEE Spectrum, so it's a great place to dive in. Image via IEEE Spectrum.