Is It Time to Give Up on the Singularity?

George Dvorsky and Ramez Naam
6/05/14 2:16pm

Some futurists and science fiction writers predict that we're on the cusp of a world-changing "Technological Singularity." Skeptics say there will be no such thing. Today, I'll be debating author Ramez Naam about which side is right.

A recent article by Erik Sofge in Popular Science really got my hackles up. Sofge argued that the Singularity is nothing more than a science-fiction-infused, faith-based initiative — a claim I take great exception to given (1) the predictive power and importance of speculative fiction, and (2) the very real possibility of our technologies escaping the confines of our comprehension and control. In his article, Sofge described futurist Ramez Naam as a "Singularity skeptic," which prompted me to contact him and propose a debate. Here's how our conversation unfolded.

George: You were recently described by Sofge as a "Singularity skeptic," which for me came as a bit of a surprise given your amazing track record as a futurist and scifi novelist. You've speculated about such things as interconnected hive-minds and the uploading of human consciousness to a computer — but you draw the line, it would seem, at super artificial intelligence (SAI). Now, I'm absolutely convinced that we'll eventually develop a machine superintelligence with capacities that exceed our own by an order of magnitude — leading to the Technological Singularity, or so-called Intelligence Explosion (my preferred term).
But if I understand your objections correctly, you're suggesting that the pending preponderance of highly specialized AI will never amount to a superintelligence — and that our AI track record to date proves this. I think it's important that you clarify and elaborate upon this, not least because you're denying something that many well-respected thinkers and AI theorists describe as an existential risk. I'm also hoping you can provide your own definition of the Singularity, just to ensure that we're talking about the same thing.