Introducing the brain; complexity as seen by a computer geek (Introduction)

by David Turell @, Tuesday, November 13, 2018, 00:01 (677 days ago) @ David Turell

An amazing description of the brain's massive networks, neurons and controls:

"The idea of intelligent machines captivates the imagination of many, and especially how they would compare to humans. Specifically, one fundamental question that seems to come up frequently is about the underlaying mechanisms of intelligence — do these artificial neural networks really work like the neurons in our brain?

"a brain neuron has three components:

"The dendrites (the input mechanism) — tree like structure that receives input through synaptic connections. The input could be sensory input from sensory nerve calls, or “computational” input from other neural cells. A single cell can have as many as 100K inputs (each from a different cell)

"The Soma (the calculation mechanism) — this is the cell body where inputs from all the dendrites come together, and based on all these signals a decision is made whether to fire an output (a “spike”). This is a bit of a generalisation, as some of the calculation already happens before the Soma, and is encoded in the dendritic structure of the cell.

"The axon (the output mechanism) — once a decision was made to fire an output signal (thus making the cell active), the axon is the mechanism that carries the signal, and through a tree like structure as its terminal, it delivers this signal to the dendrites of the next layer of neurons via a synaptic connection.


"Plasticity — one of the unique characteristics of the brain, and the key feature that enables learning and memory is its plasticity — ability to morph and change. New synaptic connections are made, old ones go away, and existing connections become stronger or weaker, based on experience. Plasticity even plays a role in the single neuron — impacting its electromagnetic behavior, and its tendency to trigger a spike in reaction to certain inputs.


"The complexity and robustness of brain neurons is much more advanced and powerful than that of artificial neurons. This is not just about the number of neurons, and the number of dendritic connections per neuron — which are orders of magnitude of what we have in current ANNs. But it’s also about the internal complexity of the single neuron: as detailed below, the chemical and electric mechanisms of the neurons are much more nuanced, and robust compared to the artificial neurons. For example, a neuron is not isoelectric — meaning that different regions in the cell may hold different voltage potential, and different current running through it. This allows a single neuron to do non linear calculations, identify changes over time (e.g moving object), or map parallel different tasks to different dendritic regions — such that the cell as a whole can complete complex composite tasks. These are all much more advanced structures and capabilities compared to the very simple artificial neuron.

"Implementation — the neurons in the brain are implemented using very complex and nuanced mechanisms that allow very complex non linear computations:

"chemical transmission of signals between neurons in the synaptic gap, through the use of neurotransmitters and receptors, amplified by various excitatory and inhibitory elements.

"Excitatory / inhibitory Post synaptic potential that builds up to action potential, based on complex temporal and spatial electromagnetic waves interference logic

"Ion channels and minute voltage difference a governing the triggering of spikes in the Soma and along the axon.

"the overall network architecture of neurons in the brain is much more complex than most ANNs. Especially, your common next door feed forward network, where each layer is connected only to the previous and next layers. But even compared to multi layered RNNs, or residual networks, the network of neurons in the brain is ridiculously complex, with tens of thousands of dendrites crossing “layers” and regions in numerous directions.


"Power consumption — the brain is an extremely efficient computing machine, consuming on the order of 10 Watts. This is about one third the power consumption of a single CPU…"

Comment: I have not included his comments about how AI researchers try to copy the brain, but basically it is an impossible task to achieve the same result. What he seems to describe is that each neuron is like a little computer attached to all those other little computers. But that is really not what is meant: see the next entry.
