Introducing the brain: needs compared to AI (Introduction)

by David Turell @, Thursday, February 10, 2022, 18:43 (1015 days ago) @ David Turell

AI scientists are surprised at how large neural networks need to be:

https://www.quantamagazine.org/computer-scientists-prove-why-bigger-neural-networks-do-...

"Fundamental mathematical results had suggested that networks should only need to be so big, but modern neural networks are commonly scaled up far beyond that predicted requirement — a situation known as overparameterization.

"In a paper presented in December at NeurIPS, a leading conference, Sébastien Bubeck of Microsoft Research and Mark Sellke of Stanford University provided a new explanation for the mystery behind scaling’s success. They show that neural networks must be much larger than conventionally expected to avoid certain basic problems. The finding offers general insight into a question that has persisted over several decades.

***

"A network’s size determines how much it can memorize.

***

"When neural networks first emerged as a force in the 1980s, it made sense to think the same thing. They should only need n parameters to fit n data points — regardless of the dimension of the data.
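The classical "n parameters for n data points" intuition can be seen in one dimension with polynomial interpolation: a degree-(n-1) polynomial has exactly n coefficients and can pass through n distinct points exactly. A minimal sketch (the data values are made up for illustration):

```python
import numpy as np

# Five data points in one dimension (arbitrary illustrative values).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 2.0, 0.0, 5.0, 3.0])

# A degree-4 polynomial has exactly 5 coefficients (parameters),
# so it can interpolate all 5 points exactly.
coeffs = np.polyfit(x, y, deg=len(x) - 1)
fitted = np.polyval(coeffs, x)

print(np.allclose(fitted, y))  # True: n parameters suffice to fit n points
```

Note that this only shows the fit is *possible* with n parameters; the article's point is that merely fitting the data is not the whole story.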

“'This is no longer what’s happening,” said Alex Dimakis of the University of Texas, Austin. “Right now, we are routinely creating neural networks that have a number of parameters more than the number of training samples. This says that the books have to be rewritten.”
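To make "more parameters than training samples" concrete, one can count the weights and biases of a small fully connected network and compare against a dataset size. The layer widths below are hypothetical, chosen only to be MNIST-scale:

```python
# Hypothetical fully connected classifier: 784 inputs, two hidden
# layers of 512 units, 10 outputs. Sizes are illustrative, not from
# the article.
layer_sizes = [784, 512, 512, 10]
n_train = 60_000  # e.g. the MNIST training-set size

# Each dense layer contributes fan_in * fan_out weights plus fan_out biases.
n_params = sum(a * b + b for a, b in zip(layer_sizes, layer_sizes[1:]))

print(n_params)             # 669706 parameters
print(n_params > n_train)   # True: over 10x more parameters than samples
```

Even this modest network is overparameterized by an order of magnitude, which is exactly the regime Dimakis describes as routine.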

***

"In their new proof, the pair show that overparameterization is necessary for a network to be robust. They do it by figuring out how many parameters are needed to fit data points with a curve that has a mathematical property equivalent to robustness: smoothness.
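The robustness-as-smoothness idea can be illustrated numerically: perturb an input slightly and measure how much the output moves. A smooth function's output changes in proportion to the perturbation; a rapidly oscillating one "goes haywire." This sketch is a crude finite-sample proxy, not the paper's construction; both test functions are made up:

```python
import numpy as np

rng = np.random.default_rng(0)

def sensitivity(f, x, eps=1e-3, trials=100):
    """Largest observed output change per unit of input perturbation —
    a rough empirical stand-in for local smoothness (Lipschitz behavior)."""
    worst = 0.0
    for _ in range(trials):
        d = rng.normal(size=x.shape)
        d = eps * d / np.linalg.norm(d)          # small random direction
        worst = max(worst, abs(f(x + d) - f(x)) / eps)
    return worst

x = rng.normal(size=10)
smooth = lambda v: float(np.tanh(v).sum())       # gentle slopes everywhere
jagged = lambda v: float(np.sin(50 * v).sum())   # rapid oscillation

# The smooth fit is far less sensitive to small perturbations.
print(sensitivity(smooth, x) < sensitivity(jagged, x))  # True
```

In the paper's terms, a network with too few parameters can still fit the data, but only with a "jagged" curve; extra parameters buy the slack needed to fit it smoothly.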

***

"Other research has revealed additional reasons why overparameterization is helpful. For example, it can improve the efficiency of the training process, as well as the ability of a network to generalize. While we now know that overparameterization is necessary for robustness, it is unclear how necessary robustness is for other things. But by connecting it to overparameterization, the new proof hints that robustness may be more important than was thought, a single key that unlocks many benefits.

“'Robustness seems like a prerequisite to generalization,” said Bubeck. “If you have a system where you just slightly perturb it, and then it goes haywire, what kind of system is that? That’s not reasonable. I do think it’s a very foundational and basic requirement.'”

Comment: an unfair comparison, I admit, but it hints at the underlying complexity of our brain, which is built from the beginning to handle anything we throw at it. Only design can explain our brain's capacity for handling new uses.

