Why the Singularity won't be as big a deal as you might think

In this excerpt from mathematician and network scientist Samuel Arbesman's new book, The Half-Life of Facts: Why Everything We Know Has an Expiration Date, you'll discover something weirdly comforting. Though we hear a lot about "future shock" from rapidly changing technology, history teaches us that humans are generally not overwhelmed by rapid scientific and technological change. When the Singularity comes along, transforming the entire world with superintelligent computers, it might not be as much of a change as we think.


When Carl Linnaeus worked out his methodology for organizing all living things in the eighteenth century, his taxonomy had three kingdoms (roughly translated as animal, vegetable, and mineral) and further divisions into classes, orders, genera, and species. Biologists now have five kingdoms, new subdivisions between kingdoms and classes called phyla (singular phylum), families between orders and genera, and even three larger overarching divisions above kingdoms known as domains. As our knowledge has grown from thousands of species to millions, so too has our system of classification.


Similarly, the way we categorize diseases has grown rapidly. The International List of Causes of Death, first adopted in 1893, contained about 150 categories. As of 2012, we are up to the tenth revision of the International Statistical Classification of Diseases and Related Health Problems, known as ICD-10. Released in 1990, it has 12,420 codes, nearly double the number in the previous revision, ICD-9, which had come out only a little more than ten years earlier. As facts have proliferated, the way we manage and think about knowledge has also had to grow, with our classification systems ramifying in sophisticated and complex ways.

On the one hand, being exposed to more complexity, whether in the categorization of diseases, of living things, or in the many other classification systems we use, from types of occupations to Internet domain names, could make us more intelligent. Just as being exposed to cognitively demanding television shows and video games seems to increase our ability to think critically, so too could more facts, and their attendant complex classification systems, make us smarter.

But on the other hand, our brains have a certain capacity. For example, when it comes to social ties, there seems to be an upper bound on how many people we can regularly interact with and keep in our minds, known as Dunbar's number. Is the same thing true for changing knowledge? Upon being confronted with his ignorance of the Copernican notion that the Earth orbits the Sun, Sherlock Holmes argued this very point:

"You see," he explained, "I consider that a man's brain originally is like a little empty attic, and you have to stock it with such furniture as you choose. A fool takes in all the lumber of every sort that he comes across, so that the knowledge which might be useful to him gets crowded out, or at best is jumbled up with a lot of other things, so that he has a difficulty in laying his hands upon it. Now the skilful workman is very careful indeed as to what he takes into his brain-attic. He will have nothing but the tools which may help him in doing his work, but of these he has a large assortment, and all in the most perfect order. It is a mistake to think that that little room has elastic walls and can distend to any extent. Depend upon it there comes a time when for every addition of knowledge you forget something that you knew before. It is of the highest importance, therefore, not to have useless facts elbowing out the useful ones."


We very likely can't handle every piece of knowledge that comes our way, and while being exposed to more and more might help us think better, we no doubt have limits when it comes to dealing with rapidly changing facts. This sounds like bad news: our brains simply won't be able to handle all of this knowledge and information, or the rapidity with which it changes. There are workarounds, such as relying on online search engines. But, happily, it turns out that even when rapid change happens, it's not as overwhelming as we might think.

Many futurists are concerned with what are termed singularities: periods of change so rapid and profound, driven by technology, that the state of the world is forever altered. These phase transitions happen so quickly that they can permanently alter humanity's relationship with its surroundings. The quintessential singularity that futurists dwell on is the potential creation of superhuman machine intelligence. Many scientists think this is either very far off or will never happen at all, but suppose it did: how would a singularity affect us? Would it tax our cognitive limits, or would we be able to cope?


Chris Magee, an MIT professor who studies the rapid technological change around us, and Tessaleno Devezas of the University of Beira Interior in Portugal, decided to use history as a guide. Focusing on two events that have already happened, Magee and Devezas examined how humanity has dealt with fast change. They first looked at how the Portuguese gained control over increasingly large portions of the Earth's surface over the course of the fifteenth century, as their empire grew. They also looked at the progression of humanity's increasingly accurate measurement of time over the last millennium or so. In both cases certain facts shifted rapidly, following exponentially fast growth and culminating in what many would argue was the crossing of some sort of singularity threshold. In the case of Portugal, the country established a nearly globe-encompassing maritime empire; in the case of clocks, timepieces became so advanced that the measurement of time grew far more precise than human perception.

But humanity assimilated these changes quite well. When speaking about the innovation in timekeeping, Magee and Devezas wrote:

These large changes were absorbed over time apparently without major disruption; for example, no mention is made of "clock riots" even though there was resistance and adaptation was needed. In given communities, the large changes apparently happened within less than a generation.


It is safe to strike a somewhat optimistic tone, recognizing that change, while it might surprise many of us, is not entirely destabilizing. Facts can change in a startlingly complex variety of ways. But far from our knowledge fluctuating at random, the changes are systematic and predictable. And humans are very adaptable, capable of understanding how knowledge changes.

Adapted from THE HALF-LIFE OF FACTS: Why Everything We Know Has an Expiration Date by Samuel Arbesman by arrangement with Current, a member of Penguin Group (USA), Inc., Copyright (c) Samuel Arbesman, 2012.

