
If your brain were a computer, how much storage space would it have?


The comparison between the human brain and a computer is not a perfect one, but it does lend itself to some interesting lines of inquiry. For instance: what is the storage capacity of your brain?


The answer to this question – how much storage space is there inside the average human head? – varies considerably, depending on who you ask. Some estimates come in as low as 1 terabyte, or approximately 1,000 gigabytes. These days, you can purchase an external hard drive with twice that capacity for under a hundred bucks.


Another commonly cited estimate puts the figure at closer to 100 terabytes of storage. Slate's Forrest Wickman explains the reasoning behind this number:

The human brain contains roughly 100 billion neurons [Ed. note: closer to 86 billion, actually, but now we're just being nitpicky]. Each of these neurons seems capable of making around 1,000 connections, representing about 1,000 potential synapses, which largely do the work of data storage. Multiply each of these 100 billion neurons by the approximately 1,000 connections it can make, and you get 100 trillion data points, or about 100 terabytes of information.
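Wickman's arithmetic can be sketched in a few lines of Python, under the same simplistic assumption he makes – that each synapse stores exactly one byte:

```python
# Back-of-envelope version of the 100-terabyte estimate.
# Assumes (simplistically) one byte of storage per synapse.
NEURONS = 100_000_000_000        # ~100 billion neurons
SYNAPSES_PER_NEURON = 1_000      # ~1,000 connections per neuron

total_synapses = NEURONS * SYNAPSES_PER_NEURON   # 100 trillion synapses
terabytes = total_synapses / 1_000_000_000_000   # 1 byte each, decimal TB

print(f"{terabytes:.0f} TB")  # 100 TB
```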

The reasoning behind the 100-terabyte estimate has its flaws. It assumes, for example, that each synapse stores 1 byte of information. In reality, each one could conceivably store more or less than that. Consider, for example, that a synapse can exist in more states than either on or off. As we've explained previously:

Your basic synapse is a connection between two neurons: a presynaptic neuron, and a postsynaptic neuron. Presynaptic neurons release neurotransmitters, which dock with receptors on the postsynaptic neuron and activate what are known as ion channels in the postsynaptic cell membrane.

Ion channels are like a neuron's gatekeepers; they allow charged atoms such as sodium, potassium and calcium into and out of the cell, and are thought to play an important role in the regulation of synaptic plasticity, i.e. the strengthening or weakening of neuronal connections over time.

All this is to say that when neurons talk to one another, there's more regulating their communication than a simple on/off switch.


Most of the computer chips that we use to model brain activity operate in this binary fashion – but the brain probably doesn't work this way.
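If a synapse can occupy more than two distinguishable states, information theory offers a quick way to see how the per-synapse figure grows: it scales as log2 of the number of states. A minimal sketch – the state counts below are purely illustrative, not measured values:

```python
import math

# Information content of a synapse with N distinguishable states.
# A binary on/off switch holds 1 bit; more states hold more.
for states in (2, 4, 26):  # hypothetical state counts
    bits = math.log2(states)
    print(f"{states:>2} states -> {bits:.2f} bits per synapse")
```

Even a modest number of distinguishable states would multiply the brain's capacity several-fold over the one-bit-per-synapse picture.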

Consider, also, that synapses are often interdependent, and will rely on one another to convey a single piece of information. While it's logical to assume that the brain's extensive neural networks greatly improve its processing speed (a couple of years ago, researchers writing in Science concluded that the number of nerve impulses executed by one human brain per second is "in the same ballpark [as] the 6.4×10¹⁸ instructions per second that human kind [could] carry out on its general purpose computers in 2007"), it's also possible that they do so at the expense of storage capacity. Then again, Northwestern University psychologist Paul Reber argues precisely the opposite – and his storage capacity approximation blows our previous estimates out of the water:

... neurons combine so that each one helps with many memories at a time, exponentially increasing the brain’s memory storage capacity to something closer to around 2.5 petabytes [1 petabyte ≈ 1,000 terabytes]. For comparison, if your brain worked like a digital video recorder in a television, 2.5 petabytes would be enough to hold three million hours of TV shows. You would have to leave the TV running continuously for more than 300 years to use up all that storage.
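Reber's DVR comparison checks out arithmetically. A quick sanity check – the implied recording rate is derived from his own numbers, not stated in the article:

```python
# Sanity-checking Reber's 2.5-petabyte DVR comparison.
PETABYTE = 1_000_000_000_000_000          # decimal bytes
capacity = 2.5 * PETABYTE
tv_hours = 3_000_000                      # Reber's figure

bytes_per_hour = capacity / tv_hours      # implied DVR recording rate
years = tv_hours / 24 / 365               # continuous playback time

print(f"~{bytes_per_hour/1e6:.0f} MB per hour of TV")   # ~833 MB/hour
print(f"~{years:.0f} years of continuous recording")    # ~342 years
```

Roughly 833 MB per hour is in the range of standard-definition DVR recording, and 342 years squares with Reber's "more than 300 years."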


So, which is it? One terabyte? 100 terabytes? 2,500 terabytes? Or can you fit an entire human consciousness into just 300 megabytes (approximately 60 3-minute MP3s), as suggested in an episode of Caprica? Perhaps these questions are irrelevant. As Reber himself says: "if your brain worked like a digital video recorder, 2.5 petabytes would be enough to hold three million hours of TV shows." We've already established that our brains don't work like DVRs, or the vast majority of computers, for that matter, and so down the rabbit hole we go: how much brain-space does a memory occupy? Does a more detailed memory take up more space than a foggy one? Have forgotten memories been deleted, or have they been relegated to some forgotten subfolder in the dusty corners of your consciousness? Does a deeply rooted, subconscious bias take up more space than a transient dream? Is each encoded in a different file format? And while we're exploring the brain/computer/file-size/file-type metaphor: what is the cognitive equivalent of a GIF, anyway?


Perhaps a better question is whether the size of memories and the storage capacity of the human mind are things that can be measured at all. Reason would suggest that the brain's capacity is, in fact, limited, and therefore can be measured. Determining what it's limited by, exactly, and how to quantify those limits, would be a significant boon to fields as diverse as neuroscience, robotics and computer science – especially where the three overlap.



