
Introduction to Linear Regression and Logistic Regression

Overview

Linear regression and logistic regression are two of the most basic models in machine learning. Linear regression is generally used for prediction (regression) problems, while logistic regression is generally used for classification problems. The two models are distinct, yet closely related.

The Linear Regression Model

Suppose the training set is
$$T = \{(x_1,y_1),(x_2,y_2),\dots,(x_n,y_n)\}$$
The fitted function is
$$f(x_i) = wx_i + b,\quad i=1,2,\dots,n$$
With least squares, we look for the line that minimizes the sum of squared residuals over all samples, so the loss function is
$$J(w,b)=\sum_{i=1}^n(f(x_i)-y_i)^2=\sum_{i=1}^n(y_i-wx_i-b)^2$$
求損失函式的最小值
arg min w , b J ( w , b ) = min w , b i = 1 n ( f ( x i ) y i ) 2 = min w , b i = 1 n ( y i w x i b ) 2 \arg\min_{w,b} J(w,b)=\min_{w,b}\sum_{i=1}^n(f(x_i)-y_i)^2=\min_{w,b}\sum_{i=1}^n(y_i-wx_i-b)^2
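As a quick sanity check, the loss above can be evaluated directly and compared against an off-the-shelf least-squares fit. The sketch below is illustrative only: the toy data (a noisy line with slope 2 and intercept 1) and the use of NumPy's `polyfit` are assumptions, not part of the original derivation.

```python
import numpy as np

# Toy 1-D data, assumed for illustration: y is roughly 2x + 1 plus noise
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=x.size)

def loss(w, b):
    """Sum-of-squares loss J(w, b) = sum_i (y_i - w*x_i - b)^2."""
    return np.sum((y - w * x - b) ** 2)

# np.polyfit with degree 1 minimizes exactly this least-squares objective;
# it returns the coefficients highest degree first, i.e. (w, b)
w_hat, b_hat = np.polyfit(x, y, 1)

# Perturbing the optimum in either coordinate should never decrease the loss
assert loss(w_hat, b_hat) <= loss(w_hat + 0.1, b_hat)
assert loss(w_hat, b_hat) <= loss(w_hat, b_hat + 0.1)
```

With enough samples the fitted `(w_hat, b_hat)` lands close to the true `(2, 1)` used to generate the data.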
Taking partial derivatives with respect to $w$ and $b$:
$$\frac{\partial J(w,b)}{\partial w}=\frac{\partial\left(\sum_{i=1}^n\left(w^2x_i^2+(y_i-b)^2-2wx_i(y_i-b)\right)\right)}{\partial w}=2\sum_{i=1}^n\left(wx_i^2-x_i(y_i-b)\right)$$

$$\frac{\partial J(w,b)}{\partial b}=\frac{\partial\left(\sum_{i=1}^n\left(w^2x_i^2+(y_i-b)^2-2wx_i(y_i-b)\right)\right)}{\partial b}=\frac{\partial\left(\sum_{i=1}^n\left(w^2x_i^2+y_i^2-2by_i+b^2-2wx_iy_i+2wx_ib\right)\right)}{\partial b}=2\sum_{i=1}^n(b+wx_i-y_i)=2nb-2\sum_{i=1}^n(y_i-wx_i)$$
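Setting both partial derivatives to zero gives the familiar closed-form least-squares estimates $w = \sum_i(x_i-\bar{x})(y_i-\bar{y})/\sum_i(x_i-\bar{x})^2$ and $b = \bar{y}-w\bar{x}$. The sketch below (NumPy, with assumed toy data) codes the two gradients exactly as derived above and checks that they vanish at that closed-form solution:

```python
import numpy as np

# Toy data, assumed for illustration: y is roughly 3x - 2 plus noise
rng = np.random.default_rng(1)
x = np.linspace(0, 5, 30)
y = 3.0 * x - 2.0 + rng.normal(scale=0.3, size=x.size)

def grad(w, b):
    """Analytic gradients, transcribed from the derivation above."""
    dw = 2 * np.sum(w * x**2 - x * (y - b))       # dJ/dw
    db = 2 * x.size * b - 2 * np.sum(y - w * x)   # dJ/db
    return dw, db

# Closed-form solution obtained by setting both partials to zero
x_bar, y_bar = x.mean(), y.mean()
w_hat = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
b_hat = y_bar - w_hat * x_bar

# At the minimizer both gradients should vanish (up to float rounding)
dw, db = grad(w_hat, b_hat)
assert abs(dw) < 1e-6 and abs(db) < 1e-6
```

If the transcription of either partial derivative were wrong, the gradient at the closed-form optimum would not be (numerically) zero, so this doubles as a check on the algebra.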