Matt: Dangerous AI? (Introduction)
> > Matt: I think that in the next 100 years or so, we'll be able to mimic a great majority of human behaviors, but "true" AI is something that I do not believe will be possible until the underlying hardware more accurately models a brain.

> That will require the computing ability of 100 billion neurons with trillions of synapses, which can themselves modify the electrical impulses they receive. Modelling the brain in humans raises surgical problems because it will require electrodes in brain areas as well as very careful 3-D depictions of axon/synapse mapping, notwithstanding the problem that brain plasticity throws another curve at the results. No two brains are the same in finer detail, so it would have to be a generic AI brain.

From an engineering perspective, this really isn't terribly difficult, just costly. Von Neumann architecture is what prevents us from really implementing this, and as I've stated previously, HP has a "memristor" architecture that is a huge step towards a hardware-based neuron. And while the notion of "billions of connections" may flummox you, the history of computer science is littered with the remains of "that can't be done!" With smart selections we can set aside some parts for hardware computing and do other parts in software; otherwise the result will be either too slow or too simplistic. But the key observation is this:

It isn't necessary to COPY the human brain neuron for neuron (with all its biochemical processes); it's only necessary to exhibit the same behavior for a given input.

And this is downright trivial. Self-modifying programs have been around since the earliest stored-program computers.

If the human brain is a computer (and I advocate that it IS), then it follows directly that it is capable of emulation by *any* Turing-complete machine.

See above note on behavior.

Not at all. It isn't necessary to make new hardware connections, because software handles that. Let me state it another way: programs that rewrite themselves WHILE THEY ARE BEING RUN are trivial (well, to a programming expert; a minimal sketch follows below). I know this because this kind of programming is also a huge security gap in computing at large.

As for the final sentence, "No two brains are the same in finer detail, so it would have to be a generic AI brain":

That's called window dressing. That part will be addressed by the fact that social context alters meanings:

http://www.youtube.com/watch?v=DF39Ygp53mQ

^^ The same exact robot. One is dressed as male, one as female.

Basically, a generic AI brain dressed differently will be interpreted appropriately.
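For readers who have not seen this done, the claim that programs can rewrite themselves while they are running is easy to demonstrate. Below is a minimal Python sketch of my own, not taken from the post; the names `respond` and `adapt` are purely illustrative, and it uses only the standard library.

```python
# Minimal sketch of a program that rewrites one of its own functions
# while it is running. All names here are illustrative.

def respond(x):
    """Initial behavior: simply echo the input."""
    return "echo: " + x

def adapt(new_source):
    """Compile new source code at runtime and swap it in for `respond`."""
    namespace = {}
    exec(new_source, namespace)                   # build the replacement function
    globals()["respond"] = namespace["respond"]   # rebind while the program runs

print(respond("hello"))   # -> echo: hello

# The running program now rewrites its own behavior.
adapt("def respond(x):\n    return 'shout: ' + x.upper() + '!'")

print(respond("hello"))   # -> shout: HELLO!
```

The same mechanism, executing freshly generated code inside a running process, is also why code injection remains such a persistent security problem, which is the "security gap" referred to above.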
--
\"Why is it, Master, that ascetics fight with ascetics?\"
\"It is, brahmin, because of attachment to views, adherence to views, fixation on views, addiction to views, obsession with views, holding firmly to views that ascetics fight with ascetics.\"
Complete thread:
- Matt: Dangerous AI? - David Turell, 2013-12-19, 14:22
- Matt: Dangerous AI? - xeno6696, 2013-12-21, 19:06
- Matt: Dangerous AI? - David Turell, 2013-12-21, 19:56
- Matt: Dangerous AI? - xeno6696, 2013-12-23, 04:26
- Matt: Dangerous AI? - David Turell, 2013-12-23, 15:48
- Matt: Dangerous AI? - xeno6696, 2013-12-23, 17:10
- Matt: Dangerous AI? - David Turell, 2013-12-24, 01:08
- Matt: Dangerous AI? - xeno6696, 2013-12-23, 17:10
- Matt: Dangerous AI? - David Turell, 2013-12-26, 05:16
- Matt: Dangerous AI? - xeno6696, 2013-12-26, 20:26
- Matt: Dangerous AI? - David Turell, 2013-12-27, 00:12
- Matt: Dangerous AI? - xeno6696, 2013-12-27, 03:40
- Matt: Dangerous AI? - David Turell, 2013-12-29, 15:00
- Matt: Negative thoughts about AI - David Turell, 2014-01-02, 15:08