Copyright © 2022 AnswerCult

Backpropagation is the cornerstone on which the entire success of artificial neural networks (ANNs) rests. There is no practical substitute for it.

It was long suspected that with enough elements in a representation, even very simple elements, you could in principle represent any desired problem-solver. But the number of possible weight settings is so utterly, unimaginably enormous that by blind search you would never find one of the good weight vectors.

The backpropagation algorithm is a method of computing these weights from scratch, iteratively. It runs reasonably efficiently, and because the network is differentiable end to end, it can tell you at any point which way to adjust each weight, even across many, many dimensions. Without this algorithm, the deep learning revolution could not have happened.
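To make the idea concrete, here is a minimal sketch (not any library's actual implementation; all names like `W1`, `d_h`, and the learning rate are illustrative choices): a tiny two-layer network learns XOR, and the backward pass applies the chain rule layer by layer to get, for every weight, which direction to nudge it.

```python
import numpy as np

# Tiny 2-layer network trained by backpropagation to learn XOR.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Randomly initialized weights: 2 inputs -> 8 hidden units -> 1 output.
W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

lr = 0.5
for step in range(10000):
    # Forward pass: compute the network's current answers.
    h = sigmoid(X @ W1 + b1)      # hidden activations
    out = sigmoid(h @ W2 + b2)    # network output

    # Backward pass: chain rule, layer by layer.
    # With a cross-entropy loss and sigmoid output, the error signal
    # at the output pre-activation simplifies to (out - y).
    d_out = out - y
    # Propagate the error back through W2 and the hidden sigmoid.
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient step on every weight: this is the "which way to adjust
    # which weight" information the text describes.
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(0)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(0)

print(out.ravel())  # after training, should sit close to [0, 1, 1, 0]
```

The key point the example illustrates: the gradient for every weight in both layers comes out of one forward pass plus one backward pass, instead of probing the astronomically large weight space blindly.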