Introducing the brain: how synapses handle information (Introduction)

by David Turell, Wednesday, June 05, 2024, 17:18 (in reply to David Turell)

Using information theory:

https://www.livescience.com/health/neuroscience/the-brain-can-store-nearly-10-times-mor...

"The brain may be able to hold nearly 10 times more information than previously thought, a new study confirms.

"Similar to computers, the brain's memory storage is measured in "bits," and the number of bits it can hold rests on the connections between its neurons, known as synapses. Historically, scientists thought synapses came in a fairly limited number of sizes and strengths, and this in turn limited the brain's storage capacity. However, this theory has been challenged in recent years — and the new study further backs the idea that the brain can hold about 10-fold more than once thought.

"In the new study, researchers developed a highly precise method to assess the strength of connections between neurons in part of a rat's brain. These synapses form the basis of learning and memory, as brain cells communicate at these points and thus store and share information.

***

"In the human brain, there are more than 100 trillion synapses between neurons. Chemical messengers are launched across these synapses, facilitating the transfer of information across the brain. As we learn, the transfer of information through specific synapses increases. This "strengthening" of synapses enables us to retain the new information. In general, synapses strengthen or weaken in response to how active their constituent neurons are — a phenomenon called synaptic plasticity.

***

"To measure synaptic strength and plasticity, the team harnessed information theory, a mathematical way of understanding how information is transmitted through a system. This approach also enables scientists to quantify how much information can be transmitted across synapses, while also taking account of the "background noise" of the brain.

"This transmitted information is measured in bits, such that a synapse with a higher number of bits can store more information than one with fewer bits...One bit corresponds to a synapse sending transmissions at two strengths, while two bits allows for four strengths, and so on.

"The analysis suggested that synapses in the hippocampus can store between 4.1 and 4.6 bits of information. The researchers had reached a similar conclusion in an earlier study of the rat brain, but at that time, they'd crunched the data with a less-precise method. The new study helps confirm what many neuroscientists now assume — that synapses carry much more than one bit each...

"The findings are based on a very small area of the rat hippocampus, so it's unclear how they'd scale to a whole rat or human brain. It would be interesting to determine how this capacity for information storage varies across the brain and between species, Yu said."

Comment: of course, the brain carries information, so using Shannon information theory makes perfect sense.

