While the comparison between the computer and the human brain is one that has been made for over half a century, the way each one processes information could not be more different. Now, IBM researchers have designed a revolutionary chip that, for the first time, actually mimics the functioning of a human brain.
Are we finally on the verge of true artificial intelligence?
Earlier this year, IBM's Watson computer made history by trouncing Jeopardy champs Ken Jennings and Brad Rutter in an intimidating display of computer overlord-dom. But to compare Watson's computing power to the complexity of the human brain would still constitute a pretty epic oversimplification of what it means to "think" like a human.
When the human brain formulates a thought, learns a new skill, or digs deep in its archives to recover a memory, it does so in a uniquely dynamic way. There are billions upon billions of neurons in that head of yours, and the strength and number of each one's connections with other neurons are constantly in flux. The plastic nature of these neural networks allows computation and memory to become closely intertwined, the result being a fantastically efficient and powerful "processor."
Computers, by comparison, must trudge through information one bit at a time, channeling each bit back and forth between connected, but discrete, processor and memory units. The more complicated the task, the more bits of information the computer needs to shift back and forth between its distinct components.
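To make that shuttling concrete, here's a toy sketch in Python. This is not how IBM's chip works, and it is deliberately simplified; it just models the conventional fetch-then-compute pattern, where every operand takes a round trip from a "memory" list into a processor "register" before any arithmetic can happen. The function name and the transfer counter are illustrative inventions, not a real API.

```python
# Toy model of the processor/memory shuttle (illustrative only).
# Every operation forces a round trip between a "memory" list and a
# processor "register", and we count those round trips.

def sum_with_transfer_count(memory):
    """Sum values the way a simple stored-program machine would:
    fetch each operand from memory into a register, then accumulate."""
    register = 0   # the processor's working storage
    transfers = 0  # memory -> processor round trips
    for address in range(len(memory)):
        operand = memory[address]  # fetch: memory -> processor
        transfers += 1
        register += operand        # compute happens only in the register
    return register, transfers

total, moves = sum_with_transfer_count([3, 1, 4, 1, 5])
# total == 14, moves == 5: one fetch per operand, no matter how fast the CPU is
```

The point of the toy: the number of transfers grows with the size of the task, which is exactly the "trudging" the article describes — and each of those transfers costs time and energy on real hardware.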
Some people may object to the use of the word "trudge" to describe the way a computer goes about making sense of information, but compared to the efficiency of the brain there's just no other way to describe it. Sure, modern computers can churn through impressive amounts of information at impressive speeds, but that speed comes at the cost of enormous quantities of power.