Recap:
- prediction: a = ŷ = σ(z), where z = wᵀx + b and σ(z) = 1 / (1 + e⁻ᶻ)
- loss of one example: L(a, y) = −(y log a + (1 − y) log(1 − a))
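The recap above can be sketched in Python. This is a minimal sketch assuming NumPy; the names `w`, `b`, `x`, `y`, `a` follow the notation in the notes:

```python
import numpy as np

def sigmoid(z):
    # logistic sigmoid: maps any real z into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def predict(w, b, x):
    # prediction for one example: a = y_hat = sigma(w^T x + b)
    z = np.dot(w, x) + b
    return sigmoid(z)

def loss(a, y):
    # cross-entropy loss of one example:
    # L(a, y) = -(y log a + (1 - y) log(1 - a))
    return -(y * np.log(a) + (1 - y) * np.log(1 - a))
```

With `w = 0` and `b = 0`, the prediction is 0.5 for any input, and the loss of a positive example (y = 1) is log 2.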
In logistic regression, we want to modify the parameters, w and b, in order to reduce this loss.
The first step is to compute the derivative of the loss with respect to the prediction a; in Python we denote it as `da`, where

da = dL/da = −y/a + (1 − y)/(1 − a)

Stepping backward through z = wᵀx + b then gives dz = dL/dz = a − y, and finally dwᵢ = xᵢ · dz and db = dz.
If you want to do gradient descent with respect to just this one example, what you would do is update each parameter in the direction that reduces the loss:

w := w − α · dL/dw
b := b − α · dL/db

where α is the learning rate.
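One full step, forward pass, backward pass, and parameter update, for a single example can be sketched as follows. This is an illustrative sketch assuming NumPy; the derivative variable names (`dz`, `dw`, `db`) follow the Python naming convention used in the notes, and `alpha` is the learning rate:

```python
import numpy as np

def gradient_step(w, b, x, y, alpha=0.01):
    # forward pass for one example: a = sigma(w^T x + b)
    z = np.dot(w, x) + b
    a = 1.0 / (1.0 + np.exp(-z))
    # backward pass: derivatives of the loss, named dvar for dL/dvar
    dz = a - y      # dL/dz = a - y
    dw = x * dz     # dL/dw_i = x_i * dz
    db = dz         # dL/db = dz
    # gradient descent update on this one example
    w = w - alpha * dw
    b = b - alpha * db
    return w, b
```

For example, starting from w = [0, 0], b = 0 with x = [1, 2], y = 1 and alpha = 0.1: the prediction is a = 0.5, so dz = −0.5, dw = [−0.5, −1.0], db = −0.5, and the update gives w = [0.05, 0.1], b = 0.05.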