# Backpropagation

In a nutshell, backpropagation consists of:

- Doing a feedforward operation.
- Comparing the output of the model with the desired output.
- Calculating the error.
- Running the feedforward operation backwards (backpropagation) to spread the error to each of the weights.
- Using this to update the weights and get a better model.
- Continuing this until we have a model that is good.
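The steps above can be sketched as a minimal training loop for a single sigmoid neuron. This is an illustrative sketch, not the course's exact code: the toy data, the learning rate, and the choice of cross-entropy loss are all assumptions made for the example.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: 4 points, 2 features, AND-like binary labels (illustrative)
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = np.array([0.0, 0.0, 0.0, 1.0])

rng = np.random.default_rng(0)
w = rng.normal(size=2)
b = 0.0
lr = 0.5  # learning rate (hypothetical choice)

for epoch in range(2000):
    # 1. Feedforward
    y_hat = sigmoid(X @ w + b)
    # 2-3. Compare with the desired output and compute the error
    error = y_hat - y
    # 4. Backpropagate: gradient of the cross-entropy loss w.r.t. each weight
    grad_w = X.T @ error / len(y)
    grad_b = error.mean()
    # 5. Update the weights
    w -= lr * grad_w
    b -= lr * grad_b
```

After enough epochs the rounded predictions `np.round(sigmoid(X @ w + b))` match the labels, which is the "model that is good" the last step refers to.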

Assume we have an incorrectly classified point, as shown below.

*(Figure: Backpropagation 1)*

*(Figure: Backpropagation 2)*

# Math

*(Figure: Backpropagation Math 1)*

*(Figure: Backpropagation Math 2)*

*(Figure: Backpropagation Math 3)*
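The figures above are not reproduced here; the standard result they build toward is the gradient descent weight update, where each weight moves a small step against the gradient of the error $E$ (using $\alpha$ for the learning rate):

```latex
w_i' \leftarrow w_i - \alpha \, \frac{\partial E}{\partial w_i}
```

Backpropagation is the procedure for computing each $\partial E / \partial w_i$ efficiently, layer by layer, using the chain rule described next.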

# Chain Rule

The chain rule is the main technique used in calculating derivatives for backpropagation.

If B is a function of A, and A is a function of x, then the partial derivative of B with respect to x is just the partial derivative of B with respect to A times the partial derivative of A with respect to x: ∂B/∂x = (∂B/∂A) · (∂A/∂x).

So basically, when composing functions, their derivatives multiply.
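The "derivatives multiply" claim can be checked numerically. Here the functions B(A) = A² and A(x) = sin(x) are arbitrary example choices; the chain-rule derivative is compared against a central finite-difference approximation.

```python
import math

def A(x):           # inner function (example choice)
    return math.sin(x)

def B(a):           # outer function (example choice)
    return a ** 2

def composed(x):    # B(A(x)) = sin(x)^2
    return B(A(x))

x = 0.7

# Chain rule: dB/dx = (dB/dA) * (dA/dx) = 2*A(x) * cos(x)
analytic = 2 * A(x) * math.cos(x)

# Numerical check via central finite differences
h = 1e-6
numeric = (composed(x + h) - composed(x - h)) / (2 * h)

assert abs(analytic - numeric) < 1e-6
```

The two values agree to within the finite-difference error, which is exactly what the chain rule predicts.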

*(Figure: Math — Feed Forward)*

*(Figure: Math — Backpropagation)*

*(Figure: Math — Backpropagation 2)*

# Derivative of the Sigmoid Function

The sigmoid function σ(x) = 1 / (1 + e⁻ˣ) has a convenient derivative: σ'(x) = σ(x) · (1 − σ(x)).

*(Figure: Sigmoid Derivative)*
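A quick sketch verifying the identity σ'(x) = σ(x)(1 − σ(x)) against a finite-difference approximation (the sample point and step size are arbitrary choices for the check):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_prime(x):
    # d/dx sigmoid(x) = sigmoid(x) * (1 - sigmoid(x))
    s = sigmoid(x)
    return s * (1.0 - s)

# Numerical check with central differences at an arbitrary point
x = 0.3
h = 1e-6
numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)
assert abs(sigmoid_prime(x) - numeric) < 1e-8
```

This reuse of the already-computed activation σ(x) is why the sigmoid derivative is cheap to evaluate during backpropagation.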