Introducing the brain: the plasticity of the single neuron (Introduction)

by David Turell @, Friday, July 12, 2019, 19:21 (1721 days ago) @ David Turell

The dendrites play a huge role in processing the input of information signals:

https://medicalxpress.com/news/2019-07-neuron.html

"How do neurons process information? Neurons are known to break down an incoming electrical signal into sub-units. Now, researchers at Blue Brain have discovered that dendrites, the neuron's tree-like receptors, work together—dynamically and depending on the workload—for learning.

***

"Their results show that when a neuron receives input, the branches of the elaborate tree-like receptors extending from the neuron, known as dendrites, functionally work together in a way that is adjusted to the complexity of the input.

"The strength of a synapse determines how strongly a neuron feels an electric signal coming from other neurons, and the act of learning changes this strength. By analyzing the "connectivity matrix" that determines how these synapses communicate with each other, the algorithm establishes when and where synapses group into independent learning units from the structural and electrical properties of dendrites. In other words, the new algorithm determines how the dendrites of neurons functionally break up into separate computing units and finds that they work together dynamically, depending on the workload, to process information.
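A rough way to picture the grouping step described above: treat the dendritic tree as a set of branches with a symmetric electrical-coupling matrix, and merge strongly coupled branches into one unit. This is only a toy sketch of the idea, not the Blue Brain algorithm; the matrix values, the threshold, and the connected-components grouping are all illustrative assumptions.

```python
# Toy sketch (NOT the published algorithm): group dendritic branches into
# independent "computing units" by thresholding an assumed electrical-coupling
# matrix. Branches whose mutual coupling reaches the threshold merge into one
# unit; weakly coupled branches act as separate units.

def computing_units(coupling, threshold):
    """Return sorted lists of branch indices forming independent units.

    coupling  -- symmetric matrix (list of lists) of coupling strengths
    threshold -- coupling at or above this value merges two branches
    """
    n = len(coupling)
    seen = [False] * n
    units = []
    for start in range(n):
        if seen[start]:
            continue
        # flood-fill over strongly coupled branches
        unit, stack = [], [start]
        seen[start] = True
        while stack:
            b = stack.pop()
            unit.append(b)
            for other in range(n):
                if not seen[other] and coupling[b][other] >= threshold:
                    seen[other] = True
                    stack.append(other)
        units.append(sorted(unit))
    return units

# Three branches: 0 and 1 are strongly coupled, branch 2 is electrically remote.
C = [[1.0, 0.8, 0.1],
     [0.8, 1.0, 0.1],
     [0.1, 0.1, 1.0]]
print(computing_units(C, 0.5))  # -> [[0, 1], [2]]
```

With this picture, "learning units" emerge wherever the tree's geometry and conductances make coupling fall below threshold, which is why the count of units can differ from the count of visible branches.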

***

"This newly observed dendritic functionality acts like parallel computing units meaning that a neuron is able to process different aspects of the input in parallel, like supercomputers. Each of the parallel computing units can independently learn to adjust its output, much like the nodes in deep learning networks used in artificial intelligence (AI) models today.

"Comparable to cloud computing, a neuron dynamically breaks up into the number of separate computing units demanded by the workload of the input.

***

"Additionally, the research reveals how these parallel processing units influence learning, i.e. the change in connection strength between different neurons. The way a neuron learns depends on the number and location of parallel processors, which in turn depend on the signals arriving from other neurons. For instance, certain synapses that do not learn independently when the neuron's input level is low, start to learn independently when the input levels are higher.
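The input-dependence described above can be caricatured with passive cable behavior: background synaptic input raises membrane conductance, which shortens the electrical length constant, so signals attenuate more between two sites a fixed distance apart and the sites decouple. Everything here is an assumed illustration (unit-free conductances, an arbitrary independence threshold), not the paper's biophysical model.

```python
# Toy illustration (assumed passive-cable behavior, not the paper's model):
# more background input -> higher total conductance -> shorter length
# constant -> weaker coupling between branch sites -> more independent units.

import math

def coupling(distance, g_background):
    # Steady-state attenuation exp(-distance / lambda) along a passive cable,
    # with length constant lambda shrinking as total conductance grows.
    g_leak = 1.0  # resting leak conductance (arbitrary units)
    lam = 1.0 / math.sqrt(g_leak + g_background)
    return math.exp(-distance / lam)

for g_bg in (0.0, 4.0, 16.0):
    c = coupling(0.5, g_bg)
    independent = c < 0.5  # assumed threshold for independence
    print(f"background={g_bg:>4}: coupling={c:.2f}, independent={independent}")
```

At zero background the two sites stay coupled (one unit); as background input grows, the same pair of sites drops below threshold and behaves as separate processors, matching the quoted observation that synapses which do not learn independently at low input levels begin to do so at higher ones.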

***

"'The method finds that in many brain states, neurons have far fewer parallel processors than expected from dendritic branch patterns. Thus, many synapses appear to be in 'grey zones' where they do not belong to any processing unit," explains lead scientist and first author Willem Wybo. "However, in the brain, neurons receive varying levels of background input and our results show that the number of parallel processors varies with the level of background input, indicating that the same neuron might have different computational roles in different brain states."

"'We are particularly excited about this observation since it sheds a new light on the role of up/down states in the brain and it also provides a reason as to why cortical inhibition is so location-specific. With the new insights, we can start looking for algorithms that exploit the rapid changes in pairing between processing units, offering us more insight into the fundamental question of how the brain computes," concludes Gewaltig."

Comment: The neuron turns out to be a minicomputer all within itself. No wonder our brain is so powerful at its size, compared to those of other animals.
