Logistic Regression Gradient Descent

Recap:

- $z = w^T x + b$
- prediction: $\hat{y} = a = \sigma(z)$
- loss of one example: $\mathcal{L}(a, y) = -\big(y \log(a) + (1 - y)\log(1 - a)\big)$
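The recap quantities can be sketched as a minimal forward pass in Python (the feature values, weights, and bias below are illustrative assumptions, not values from the lecture):

```python
import numpy as np

def sigmoid(z):
    """Sigmoid activation sigma(z) = 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical single training example with two features.
x = np.array([1.0, 2.0])   # features x1, x2
y = 1.0                    # true label
w = np.array([0.1, -0.2])  # weights (assumed initial values)
b = 0.05                   # bias

z = np.dot(w, x) + b       # linear step: z = w^T x + b
a = sigmoid(z)             # prediction: y-hat = a = sigma(z)
# Cross-entropy loss of this one example.
loss = -(y * np.log(a) + (1 - y) * np.log(1 - a))
```

Since $y = 1$ here, the loss reduces to $-\log(a)$, which is small only when the prediction $a$ is close to 1.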

In logistic regression, we want to modify the parameters $w$ and $b$ in order to reduce this loss.

  1. Compute the derivative of the loss with respect to the prediction:

     $\frac{d\mathcal{L}(a, y)}{da} = -\frac{y}{a} + \frac{1 - y}{1 - a}$

     In Python code we denote this quantity as `da`, where $a$ is the prediction $\sigma(z)$.

If you want to do gradient descent with respect to just this one example, you would do the following: use the chain rule to get $dz = \frac{d\mathcal{L}}{dz} = a - y$, then compute $dw_1 = x_1\,dz$, $dw_2 = x_2\,dz$, and $db = dz$, and finally update $w_1 := w_1 - \alpha\,dw_1$, $w_2 := w_2 - \alpha\,dw_2$, and $b := b - \alpha\,db$, where $\alpha$ is the learning rate.
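A minimal sketch of that single-example gradient-descent step, assuming two features and illustrative starting values (none of the concrete numbers come from the lecture):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical single example with two features.
x1, x2, y = 1.0, 2.0, 1.0
w1, w2, b = 0.1, -0.2, 0.05
alpha = 0.1                      # learning rate (assumed)

# Forward pass.
z = w1 * x1 + w2 * x2 + b
a = sigmoid(z)

# Backward pass for this one example.
da = -y / a + (1 - y) / (1 - a)  # dL/da
dz = a - y                       # dL/dz = da * a * (1 - a), via the chain rule
dw1 = x1 * dz                    # dL/dw1
dw2 = x2 * dz                    # dL/dw2
db = dz                          # dL/db

# Gradient-descent update.
w1 -= alpha * dw1
w2 -= alpha * dw2
b -= alpha * db
```

Because $y = 1$ and the initial prediction is below 1, `dz` is negative, so the update pushes $w_1$, $w_2$, and $b$ in the direction that raises $a$ and lowers the loss.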