Your Brain Has As Much Memory As The Entire Internet
New study suggests the memory capacity of the human brain is 10 times more than once thought
You may not remember where you left your car keys (or even what you ate for breakfast), but a new study in the journal eLife suggests the human brain is capable of storing 10 times more data than previously thought. The findings may help researchers design precise, energy-efficient artificial intelligence.
“This is a real bombshell in the field of neuroscience,” said Terry Sejnowski, professor at the Salk Institute in California and coauthor on the study, in a prepared statement. “Our new measurements of the brain’s memory capacity increase conservative estimates by a factor of 10 to at least a petabyte, in the same ballpark as the World Wide Web.”
First, some background. When thoughts form in your brain, neurons exchange information across specific junctions called synapses. Each neuron can have thousands of synapses, connecting it to thousands of other neurons. These synapses change over time, depending on how often you use them. When the same neurons connect across the same synapses over and over again, the ties between them grow strong and form memories. This theory of memory formation is often summarized as, “neurons that fire together, wire together.”
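The “fire together, wire together” rule above can be sketched in a few lines of code. This is a minimal illustration of the Hebbian idea only; the variable names, learning rate, and firing patterns are all invented for the example and do not come from the study.

```python
# Minimal sketch of the Hebbian rule "neurons that fire together,
# wire together": repeated co-activity strengthens a synaptic weight.
# The learning rate and activity patterns here are illustrative only.
eta = 0.1      # learning rate (arbitrary choice)
weight = 0.0   # starting synaptic strength

# 1 = the neuron fired on that trial, 0 = it stayed silent
pre_activity  = [1, 1, 0, 1, 1]
post_activity = [1, 1, 1, 0, 1]

for pre, post in zip(pre_activity, post_activity):
    weight += eta * pre * post  # grows only when both neurons fire

print(round(weight, 2))  # three coincident firings -> 0.3
```

Each pass through the loop strengthens the connection only when both neurons are active at once, which is why frequently co-activated pairs end up with the strongest ties.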
Salk scientists were studying rat brains when they first noticed that one neuron could form two nearly identical synapses with another neuron, which seemed redundant. But since these synapses appeared to be the same size, and synapses only come in three sizes (small, medium and large) they figured that there couldn’t be much difference between the two. When they looked closer, however, they found slight, unexpected differences.
“We were amazed to find that the difference in the sizes of the pairs of synapses was very small, on average, only about 8 percent different in size,” said Tom Bartol, Salk staff scientist and coauthor on the study, in a prepared statement. “This was a curveball from nature.”
Eight percent may not seem like a lot, but when they plugged that figure into an algorithm for studying the human brain, this minor difference threw the system for a loop, and predicted the existence of 26 distinct synapse sizes rather than three, a nearly 10-fold increase, corresponding to roughly 4.7 bits of information at each synapse. Prior estimates had put the storage capacity of each synapse at between 1 and 2 bits. “This is roughly an order of magnitude of precision more than anyone has ever imagined,” Sejnowski said.
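The jump from 3 to 26 distinguishable sizes is where the bit figures come from: a synapse that can reliably take one of N sizes can encode log2(N) bits. The quick calculation below checks those numbers and sketches the petabyte-scale estimate; the figure of roughly 10^15 synapses in a human brain is a commonly cited estimate used here as an assumption, not a number from the article.

```python
import math

# Three distinguishable synapse sizes -> old per-synapse estimate
old_bits = math.log2(3)   # ~1.58 bits, within the prior 1-2 bit range

# 26 distinguishable sizes -> new per-synapse estimate
new_bits = math.log2(26)  # ~4.70 bits, matching the study's figure

# Hypothetical whole-brain scale-up, assuming ~10^15 synapses
# (a commonly cited estimate, not taken from this article)
synapses = 1e15
total_petabytes = synapses * new_bits / 8 / 1e15

print(f"{old_bits:.2f} bits  {new_bits:.2f} bits  {total_petabytes:.2f} PB")
```

Under that assumed synapse count, 4.7 bits per synapse works out to roughly half a petabyte, which is the same ballpark as Sejnowski’s “at least a petabyte” estimate.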
Although the adult brain runs on only about 20 watts of continuous power (similar to a dim light bulb), our minds are incredibly efficient at storing and retrieving memory. Studying how our brains manage to pull off such high-precision activities with so little power could inform a generation of computer scientists trying to build precise but energy-efficient computers capable of deep learning, speech recognition and more advanced artificial intelligence.
“The implications of what we found are far-reaching,” adds Sejnowski. “Hidden under the apparent chaos and messiness of the brain is an underlying precision to the size and shapes of synapses that was hidden from us.”
“This trick of the brain absolutely points to a way to design better computers.”