- Rectified Linear Unit (ReLU)
An activation function with the following rules:
- If the input is negative or zero, the output is 0.
- If the input is positive, the output equals the input.
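As a quick sketch, this rule is just an elementwise `max(0, x)`; in PyTorch it is available as `nn.ReLU` (the tensor values below are made up for illustration):

```python
import torch
import torch.nn as nn

relu = nn.ReLU()

x = torch.tensor([-2.0, 0.0, 3.5])
print(relu(x))  # tensor([0.0000, 0.0000, 3.5000]): negatives and zero map to 0, positives pass through
```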

- Cross-Entropy Loss
In the PyTorch documentation, you can see that the cross-entropy loss function actually involves two steps:
- It first applies a softmax (specifically, LogSoftmax) to any output it sees.
- Then it applies NLLLoss, the negative log likelihood loss.
It then returns the average loss over a batch of data. Since the softmax is applied internally, we do not have to specify one in the forward function of our model definition, but we could do this another way (see the sketch below): apply LogSoftmax in the model and use NLLLoss as the criterion.
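A minimal sketch of the two equivalent setups, using made-up logits and targets just for illustration:

```python
import torch
import torch.nn as nn

# hypothetical raw model outputs (logits) for 3 samples and 4 classes
logits = torch.randn(3, 4)
targets = torch.tensor([0, 2, 1])

# option 1: CrossEntropyLoss takes raw logits (softmax is applied internally)
loss = nn.CrossEntropyLoss()(logits, targets)

# option 2: the "other way" -- apply LogSoftmax in the model, then use NLLLoss
log_probs = nn.LogSoftmax(dim=1)(logits)
loss_alt = nn.NLLLoss()(log_probs, targets)

print(loss, loss_alt)  # the two losses match
```

In both cases the model itself outputs raw scores; the only difference is whether the (log) softmax lives inside the loss function or inside the model's forward pass.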