Vectorizing Logistic Regression’s Gradient Output

In the gradient descent computation, you would do the following:

For each training example, $dz^{(i)} = a^{(i)} - y^{(i)}$, so we stack these into the $1 \times m$ row vector

$dZ = [\,dz^{(1)}\ dz^{(2)}\ \cdots\ dz^{(m)}\,]$

and this is the quantity we want to compute without a loop.

Recall $A = [\,a^{(1)}\ \cdots\ a^{(m)}\,]$ and $Y = [\,y^{(1)}\ \cdots\ y^{(m)}\,]$, so

$dZ = A - Y$

We had the second for loop, over the $m$ training examples, for the gradient accumulations:

$db=0$; $db += dz^{(1)}$; $db += dz^{(2)}$; $\cdots$; $db += dz^{(m)}$; $db /= m$

This for loop can be written as

$db = \frac{1}{m}\sum_{i=1}^{m} dz^{(i)}$

and likewise for $dw$:

$dw=0$; $dw += x^{(1)}dz^{(1)}$; $dw += x^{(2)}dz^{(2)}$; $\cdots$; $dw /= m$

This for loop can be written as

$dw = \frac{1}{m} X\, dZ^{T}$

so in Python you would write

db=np.sum(dZ)/m
dw=np.dot(X,dZ.T)/m
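Putting the pieces together, here is a minimal NumPy sketch of the vectorized gradient computation; the toy shapes (n_x = 3 features, m = 5 examples) and the zero-initialized parameters are assumptions for illustration:

import numpy as np

np.random.seed(0)
n_x, m = 3, 5                         # assumed toy sizes: 3 features, 5 examples
X = np.random.randn(n_x, m)           # inputs, one column per training example
Y = np.random.randint(0, 2, (1, m))   # binary labels, shape (1, m)
w = np.zeros((n_x, 1))                # parameters (assumed initial values)
b = 0.0

Z = np.dot(w.T, X) + b                # forward pass: Z has shape (1, m)
A = 1 / (1 + np.exp(-Z))              # sigmoid activations
dZ = A - Y                            # dZ = A - Y, shape (1, m)
db = np.sum(dZ) / m                   # scalar: db = (1/m) * sum of dz^(i)
dw = np.dot(X, dZ.T) / m              # (n_x, 1): dw = (1/m) * X dZ^T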

Vectorization to Avoid For Loops

For Loop Approach

$J=0$; $dw=np.zeros((n_x,1))$; $db=0$

for $i=1$ to $m$:

         $z^{(i)} = w^{T}x^{(i)} + b$

         $a^{(i)} = \sigma(z^{(i)})$

         $J += -[\,y^{(i)}\log a^{(i)} + (1-y^{(i)})\log(1-a^{(i)})\,]$

         $dz^{(i)} = a^{(i)} - y^{(i)}$

for $k=1$ to $m$:

         $dw += x^{(k)}\,dz^{(k)}$

         $db += dz^{(k)}$

and then divide the accumulators by $m$:

$J/=m;$

$dw/=m$

$db/=m$
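For comparison, a runnable sketch of this loop version; the sigmoid helper, the toy data, and the zero initialization are assumptions for illustration:

import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

np.random.seed(0)
n_x, m = 3, 5                          # assumed toy sizes
X = np.random.randn(n_x, m)
Y = np.random.randint(0, 2, (1, m))
w = np.zeros((n_x, 1))
b = 0.0

J = 0.0; dw = np.zeros((n_x, 1)); db = 0.0
dz = np.zeros(m)                       # store dz^(i) for the second loop
for i in range(m):                     # first loop: forward pass and cost
    z_i = (np.dot(w.T, X[:, [i]]) + b).item()
    a_i = sigmoid(z_i)
    J += -(Y[0, i] * np.log(a_i) + (1 - Y[0, i]) * np.log(1 - a_i))
    dz[i] = a_i - Y[0, i]
for k in range(m):                     # second loop: accumulate gradients
    dw += X[:, [k]] * dz[k]
    db += dz[k]
J /= m; dw /= m; db /= m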

Vectorization Approach

$J=0$; $dw=np.zeros((n_x,1))$; $db=0;$

Z=np.dot(w.T,X)+b

A=sigmoid(Z)

dZ=A-Y

dw=np.dot(X,dZ.T)/m

db=np.sum(dZ)/m

Then the parameters $w$ and $b$ are updated as $w := w - \alpha\,dw$ and $b := b - \alpha\,db$, where $\alpha$ is the learning rate.
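As a final sketch, a complete training loop combining the vectorized gradients with the parameter update; the learning rate alpha = 0.01, the iteration count, and the toy data are assumptions:

import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

np.random.seed(0)
n_x, m = 3, 100                        # assumed toy sizes
X = np.random.randn(n_x, m)
Y = np.random.randint(0, 2, (1, m))    # binary labels
w = np.zeros((n_x, 1))
b = 0.0
alpha = 0.01                           # learning rate (assumed value)

for _ in range(1000):                  # one vectorized gradient step per iteration
    Z = np.dot(w.T, X) + b             # forward pass over all m examples at once
    A = sigmoid(Z)
    dZ = A - Y
    dw = np.dot(X, dZ.T) / m
    db = np.sum(dZ) / m
    w = w - alpha * dw                 # w := w - alpha * dw
    b = b - alpha * db                 # b := b - alpha * db

Note that vectorization removes the loops over examples and features, but one explicit for loop remains over the gradient descent iterations themselves.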