Backpropagation is the cornerstone on which the success of artificial neural networks (ANNs) rests. There is no real substitute for it.
It was long suspected that with enough elements in a representation, even very simple elements, you can in principle represent any desired problem-solver. But the number of possible weight settings is so utterly, unimaginably enormous that it's inconceivable you would ever stumble on one of the good weight vectors by chance.
The backpropagation algorithm is a way of computing these weights iteratively, starting from scratch. It runs reasonably efficiently, and the whole network is differentiable (that is, you can tell at any point which way to adjust each weight, even across many, many dimensions). Without this algorithm, the deep learning revolution could not have happened.
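To make this concrete, here is a minimal sketch (a toy example in plain numpy, not any particular library's method) of that iterative process: a tiny two-layer network starts from random weights and learns XOR. Each step, the backward pass uses the chain rule to tell us which way to nudge every weight so the error goes down.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])        # XOR targets

# random starting weights: 2 inputs -> 8 hidden units -> 1 output
W1 = rng.normal(0., 1., (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0., 1., (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1. / (1. + np.exp(-z))

lr = 0.5
losses = []
for step in range(20000):
    # forward pass: compute the network's current answers
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((out - y) ** 2)))

    # backward pass: the chain rule tells us, for every weight,
    # which direction reduces the error
    d_out = (out - y) * out * (1. - out)
    d_h = (d_out @ W2.T) * h * (1. - h)

    # nudge every weight a little "downhill"
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(0)

print(np.round(out.ravel(), 2))   # predictions should approach 0, 1, 1, 0
```

The point is not this particular network but the mechanism: because every operation is differentiable, the error signal can flow backwards through the layers, assigning each weight its own adjustment in a single pass.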