Backpropagation

Backpropagation is a calculus-based algorithm that incrementally adjusts the weights and biases of an artificial neural network in order to minimize the loss (error) between the network's predictions and the training targets.
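As a minimal sketch of this idea, the following code trains a tiny three-node chain (input, one hidden node, one output node) by repeatedly applying the chain rule backward from the loss to each weight and bias, then taking a gradient-descent step. The training pair, starting weights, and learning rate are all hypothetical values chosen only for illustration:

```python
import math

# Hypothetical training pair: input x with target y (illustrative values).
x, y = 1.5, 0.8

# Parameters for a 3-node chain: input -> hidden -> output.
w1, b1 = 0.4, 0.1   # hidden node weight and bias (arbitrary starting values)
w2, b2 = 0.3, -0.2  # output node weight and bias

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

lr = 0.5  # learning rate (hypothetical)

for step in range(1000):
    # Forward pass: compute the prediction and the loss.
    z1 = w1 * x + b1
    a1 = sigmoid(z1)
    z2 = w2 * a1 + b2
    a2 = sigmoid(z2)
    loss = 0.5 * (a2 - y) ** 2  # squared-error loss

    # Backward pass: chain rule from the loss back to each parameter.
    d_a2 = a2 - y                  # dL/da2
    d_z2 = d_a2 * a2 * (1 - a2)    # dL/dz2 (sigmoid derivative: a2*(1-a2))
    d_w2 = d_z2 * a1               # dL/dw2
    d_b2 = d_z2                    # dL/db2
    d_a1 = d_z2 * w2               # dL/da1
    d_z1 = d_a1 * a1 * (1 - a1)    # dL/dz1
    d_w1 = d_z1 * x                # dL/dw1
    d_b1 = d_z1                    # dL/db1

    # Gradient-descent update: move each parameter against its gradient.
    w1 -= lr * d_w1
    b1 -= lr * d_b1
    w2 -= lr * d_w2
    b2 -= lr * d_b2

print(f"final prediction: {a2:.3f} (target {y})")
```

After enough iterations the prediction converges toward the target; real networks apply the same forward/backward pattern layer by layer over many weights at once, usually with vectorized matrix operations.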

This diagram shows how backpropagation fits into artificial neural network model training:

To understand the backpropagation equations, let's trace the backward flow of data through a very simple three-node neural network:

using: