You knit me together in my mother’s womb. Thank You for making me so wonderfully complex. Your workmanship is marvelous! (Psalm 139:14)
Whoever said the human brain is the most highly organized collection of matter in the universe was more correct than they could have known. New research modeled tiny structures within nerve cells and discovered a clever tactic brains use to increase computing power while maximizing energy efficiency. This design could form the basis of an entirely new, more efficient class of computers.
Neurobiologists from the Salk Institute of La Jolla, California, and the University of Texas, Austin, collaborated to build 3-D computer models that mimic tiny sections of rat hippocampus—a brain region in mammals where neurons constantly process and store memories. One of the models, published in the biological journal eLife, helped reveal that the sizes of synapses change within minutes.
Synapses occur at junctions between nerve cells, like two people holding hands. Each cell can have a thousand “hands” contacting as many neighbors to form a dizzying 3-D array with billions of connections and pathways. Each junction transfers information between cells by passing along tiny chemicals called “neurotransmitters.”
Groundbreaking imaging published in 2011 revealed many more of these nerve-to-nerve connections than previously imagined, prompting comparisons between the human brain and the total number of switches in all the computers and internet connections on earth. It turns out that the sizes of these synaptic connection points shift with use or disuse, a process called "synaptic plasticity": synapses strengthen when learning occurs and weaken when unused.
How do brains do it? They store and transmit information not with the simplistic 0s and 1s of computer code but with degrees of synapse strength. In other words, instead of encoding a single all-or-nothing input spike, they distinguish 26 different levels of synaptic strength. The authors of the new research looked for possible advantages of this complicated molecular variability. They wrote that synapses "might reflect a sampling strategy designed for energetic efficiency." Nerve cells use the size and stability of each synapse to process and record information such as memories.
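The arithmetic behind these figures is easy to check. A minimal sketch (assuming the study's count of 26 distinguishable synaptic sizes) converts that count into bits per synapse and spells out the petabyte comparison that follows:

```python
import math

# 26 distinguishable synaptic strength levels, the count reported in the study
levels = 26
bits_per_synapse = math.log2(levels)  # information one graded synapse can encode
print(f"about {bits_per_synapse:.1f} bits per synapse")  # about 4.7 bits

# For comparison: a binary (0/1) switch carries exactly 1 bit,
# so each graded synapse holds several times more information.

# A petabyte expressed in bits: 10**15 bytes x 8 bits per byte
petabyte_in_bits = 10**15 * 8
print(f"{petabyte_in_bits:,} bits in a petabyte")
```

Because log2(26) is roughly 4.7, each synapse stores several bits rather than the single bit of a digital switch, which is where the efficiency advantage comes from.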
Separate research, published in Nature Communications, found that biochemical communication inside each synapse constantly monitors and adjusts synaptic plasticity. This includes positive feedback loops and a safety mechanism to prevent cell death, according to a research summary article published in Nature by Christine Gee and Thomas Oertner of the Center for Molecular Neurobiology Hamburg.

Terry Sejnowski, co-senior author of the eLife study, told the Salk Institute, "We discovered the key to unlocking the design principle for how hippocampal neurons function with low energy but high computation power. Our new measurements of the brain's memory capacity increase conservative estimates by a factor of 10 to at least a petabyte, in the same ballpark as the World Wide Web."
What’s a petabyte? 8,000,000,000,000,000 bits of information. The mind-boggling levels of organization and necessary regulatory protocols in synapses refute all notions that brains evolved from single cells through merely natural processes. The strategies, algorithms, and design principles brains employ could only have come from an otherworldly Architect whose genius mankind can only dream of copying. —Brian Thomas, M.S.