The human brain has been described as a massively parallel computing machine. But just how powerful is it? A recent brain scan analysis is offering some unexpected results.

Image: Diffusion MRI glyphs (an imaging technique related to fMRI). Each of the brain's voxel values is represented by an ellipsoid. (CC)

Neuroscientist Harris Georgiou from the National and Kapodistrian University of Athens in Greece recently took it upon himself to count the number of discrete "CPU cores" at work in the human brain as it performs simple tasks in a functional magnetic resonance imaging (fMRI) machine.


This is quite a task when you think about it. The human brain consists of about 100 billion neurons, each forming roughly 10,000 connections with other neurons. By using an fMRI machine, and by quantizing the brain into 3D pixels called voxels (each about five cubic millimeters in size), Georgiou was able to map the activity of the entire brain on a grid of 60 x 60 x 30 voxels. Through this method, he produced a dataset of around 30 million data points.
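A quick back-of-the-envelope check shows how those numbers fit together. The grid dimensions are from the article; the number of scan time points below is an assumption, chosen only to illustrate how a grid of roughly 100,000 voxels grows into a ~30 million-point dataset over the course of a scan:

```python
# Grid size is from the article; time_points is a hypothetical value
# picked to show how the ~30 million figure can arise.
voxels = 60 * 60 * 30            # 3D grid of voxels covering the brain
time_points = 280                # assumed number of fMRI volumes in a scan
data_points = voxels * time_points

print(voxels)                    # 108,000 voxels per scan volume
print(data_points)               # ~30 million values in the full dataset
```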


Brain activity was recorded as participants performed one of two simple tasks: a visuo-motor task and a visual recognition task. MIT's Technology Review describes the findings:

Although the analysis is complex, the outcome is simple to state. Georgiou says that independent component analysis reveals that about 50 independent processes are at work in human brains performing the complex visuo-motor tasks of indicating the presence of green and red boxes. However, the brain uses fewer processes when carrying out simple tasks, like visual recognition.

That's a fascinating result that has important implications for the way computer scientists should design chips intended to mimic human performance. It implies that parallelism in the brain does not occur on the level of individual neurons but on a much higher structural and functional level, and that there are about 50 of these.

Georgiou points out that a typical voxel corresponds to roughly three million neurons, each with several thousand connections with its neighbors. However, the current state-of-the-art neuromorphic chips contain a million artificial neurons each with only 256 connections. What is clear from this work is that the parallelism that Georgiou has measured occurs on a much larger scale than this. [emphasis added]
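The independent component analysis mentioned in the quoted passage is a standard signal-processing technique: it unmixes recorded signals into statistically independent sources. The toy sketch below is not Georgiou's pipeline (real fMRI ICA uses dedicated neuroimaging software), just a minimal FastICA implementation, with a tanh contrast function and synthetic "recordings", to show what "independent processes" means in this context:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "recordings": 3 independent source signals mixed linearly, standing in
# for voxel time series (synthetic data, not Georgiou's actual measurements).
t = np.linspace(0, 8, 2000)
sources = np.c_[np.sin(2 * t),               # smooth oscillation
                np.sign(np.sin(3 * t)),      # square wave
                rng.laplace(size=t.size)].T  # noise-like signal
mixing = rng.normal(size=(3, 3))
observed = mixing @ sources                  # what the "scanner" sees

def fastica(x, n_components, n_iter=200):
    """Minimal FastICA (deflation scheme, tanh contrast)."""
    x = x - x.mean(axis=1, keepdims=True)
    # Whiten via eigendecomposition of the covariance matrix.
    d, e = np.linalg.eigh(np.cov(x))
    z = (e @ np.diag(1 / np.sqrt(d)) @ e.T) @ x
    w = np.zeros((n_components, z.shape[0]))
    for i in range(n_components):
        wi = rng.normal(size=z.shape[0])
        wi /= np.linalg.norm(wi)
        for _ in range(n_iter):
            g = np.tanh(wi @ z)
            # Fixed-point update: E[z*g(w.z)] - E[g'(w.z)] * w
            wi_new = (z * g).mean(axis=1) - (1 - g**2).mean() * wi
            # Deflation: stay orthogonal to components already found.
            wi_new -= w[:i].T @ (w[:i] @ wi_new)
            wi_new /= np.linalg.norm(wi_new)
            converged = abs(abs(wi_new @ wi) - 1) < 1e-8
            wi = wi_new
            if converged:
                break
        w[i] = wi
    return w @ z  # recovered independent components

recovered = fastica(observed, n_components=3)
print(recovered.shape)  # one row per recovered independent process
```

Applied to fMRI data, the same idea separates whole-brain activity into independent spatio-temporal processes; counting them is how the figure of about 50 arises.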

So an artificial equivalent of a brain-like cognitive structure may not require a massively parallel architecture at the level of single neurons. Instead, and as noted by Georgiou, one could be built using "a properly designed set of limited processes that run in parallel on a much lower scale."
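As a purely illustrative sketch of that idea (not Georgiou's design, and with hypothetical names throughout): rather than simulating billions of neuron-level units, one could run on the order of 50 coarse-grained processes in parallel, each standing in for the aggregate work of millions of neurons:

```python
from concurrent.futures import ThreadPoolExecutor

# Roughly the number of independent processes Georgiou's analysis found.
N_PROCESSES = 50

def cognitive_process(process_id):
    # Hypothetical stand-in for one high-level functional unit
    # (e.g. a visuo-motor stream), not a neuron-level simulation.
    return f"process-{process_id}: done"

# ~50 coarse workers in parallel, instead of 100 billion neuron-level units.
with ThreadPoolExecutor(max_workers=N_PROCESSES) as pool:
    results = list(pool.map(cognitive_process, range(N_PROCESSES)))

print(len(results))
```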


Read the entire article at MIT's Technology Review, and check out the scientific study here.