Regularized Logistic Regression (CS229)

Cost Function for Logistic Regression

Recall the cost function for logistic regression:
$$J(\theta)=-\frac{1}{m}\left[\sum_{i=1}^my^{(i)}\log\left(h_{\theta}(x^{(i)})\right)+(1-y^{(i)})\log\left(1-h_{\theta}(x^{(i)})\right)\right]\tag{1}$$

Cost Function for Regularized Logistic Regression

All we have to do is add a regularization term to $(1)$:

$$J(\theta)=-\frac{1}{m}\left[\sum_{i=1}^my^{(i)}\log\left(h_{\theta}(x^{(i)})\right)+(1-y^{(i)})\log\left(1-h_{\theta}(x^{(i)})\right)\right]+\frac{\lambda}{2m}\sum_{j=1}^n\theta_j^2\tag{2}$$

where $j=1,2,3,\dots,n$. Note that the bias term $\theta_0$ is not penalized.
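Equation $(2)$ can be sketched in vectorized NumPy; this is a minimal illustration (the function names `sigmoid` and `regularized_cost` are my own, not from the notes), assuming $X$ is an $m\times(n+1)$ design matrix whose first column is all ones:

```python
import numpy as np

def sigmoid(z):
    """Logistic function 1 / (1 + e^{-z})."""
    return 1.0 / (1.0 + np.exp(-z))

def regularized_cost(theta, X, y, lam):
    """Regularized cost J(theta) of eq. (2).
    theta: (n+1,) parameters; X: (m, n+1) with leading ones column;
    y: (m,) labels in {0, 1}; lam: regularization strength lambda."""
    m = X.shape[0]
    h = sigmoid(X @ theta)
    # Cross-entropy term of eq. (1)
    cross_entropy = -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m
    # Penalty term: theta_0 (the bias) is excluded from the sum
    penalty = (lam / (2 * m)) * np.sum(theta[1:] ** 2)
    return cross_entropy + penalty
```

As a sanity check, with $\theta=\mathbf{0}$ every hypothesis value is $0.5$, so the cost is $\log 2$ regardless of the labels, and the penalty term vanishes.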

Implementation

Repeat
$$\theta_0:=\theta_0-\alpha\frac{1}{m}\sum_{i=1}^m(h_{\theta}(x^{(i)})-y^{(i)})x_0^{(i)}\tag{3}$$
$$\theta_j:=\theta_j-\alpha\left[\frac{1}{m}\sum_{i=1}^m(h_{\theta}(x^{(i)})-y^{(i)})x_j^{(i)}+\frac{\lambda}{m}\theta_j\right]\tag{4}$$

where
$$h_{\theta}(x^{(i)})=\frac{1}{1+e^{-\theta^\top x^{(i)}}}$$

As in regularized linear regression, the term in square brackets in $(4)$ is the result of computing $\frac{\partial}{\partial\theta_j}J(\theta)$.
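The update rules $(3)$–$(4)$ can be sketched as one simultaneous gradient step; this is an illustrative sketch (the name `gradient_step` is my own), assuming the same design-matrix convention as above, with the penalty applied only to $\theta_j$ for $j\ge 1$:

```python
import numpy as np

def sigmoid(z):
    """Logistic function 1 / (1 + e^{-z})."""
    return 1.0 / (1.0 + np.exp(-z))

def gradient_step(theta, X, y, alpha, lam):
    """One simultaneous update of eqs. (3) and (4).
    theta: (n+1,) parameters; X: (m, n+1) with leading ones column;
    y: (m,) labels; alpha: learning rate; lam: lambda."""
    m = X.shape[0]
    error = sigmoid(X @ theta) - y     # h_theta(x^(i)) - y^(i), shape (m,)
    grad = (X.T @ error) / m           # unregularized gradient for all j
    grad[1:] += (lam / m) * theta[1:]  # add (lambda/m) * theta_j for j >= 1 only
    return theta - alpha * grad
```

Repeating `theta = gradient_step(theta, X, y, alpha, lam)` until the cost stops decreasing implements the loop above; note that `grad[0]` receives no penalty, matching equation $(3)$.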