
Linear Regression in Machine Learning

Closed-form solution (without penalty term)

$$
E(w)=\frac{1}{2}\sum_{i=1}^{N}\bigl(y(x_i,w)-t_i\bigr)^2
$$

$$
E(w)=\frac{1}{2}(Xw-T)^T(Xw-T)=\frac{1}{2}\bigl(w^TX^TXw-T^TXw-w^TX^TT+T^TT\bigr)
$$

Since $T^TXw=w^TX^TT$ is a scalar and $T^TT$ does not depend on $w$:

$$
\min_w E(w)=\min_w\Bigl(\frac{1}{2}w^TX^TXw-w^TX^TT\Bigr)
$$

$$
\frac{\partial E}{\partial w}=X^TXw-X^TT=0 \quad\Rightarrow\quad w=(X^TX)^{-1}X^TT
$$
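The closed-form solution above can be checked numerically. A minimal sketch with NumPy, assuming a synthetic dataset generated from $t = 2x + 1$ plus Gaussian noise (the data and its true weights are illustrative assumptions, not from the source):

```python
import numpy as np

# Synthetic data: t = 2x + 1 plus small noise (illustrative assumption)
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=50)
t = 2.0 * x + 1.0 + 0.1 * rng.standard_normal(50)

# Design matrix with a bias column: X = [1, x]
X = np.column_stack([np.ones_like(x), x])

# Closed-form solution w = (X^T X)^{-1} X^T T,
# computed via a linear solve rather than an explicit inverse
w = np.linalg.solve(X.T @ X, X.T @ t)
print(w)  # close to the true weights [1, 2]
```

Using `np.linalg.solve` on the normal equations instead of forming `(X^T X)^{-1}` explicitly is both faster and numerically safer.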

Closed-form solution (with penalty term)

$$
E(w)=\frac{1}{2}\sum_{i=1}^{N}\bigl(y(x_i,w)-t_i\bigr)^2+\frac{\lambda}{2}\lVert w\rVert^2
$$

$$
E(w)=\frac{1}{2}(Xw-T)^T(Xw-T)+\frac{\lambda}{2}w^Tw
$$

$$
\min_w E(w)=\min_w\Bigl(\frac{1}{2}w^TX^TXw-w^TX^TT+\frac{\lambda}{2}w^Tw\Bigr)
$$

$$
\frac{\partial E}{\partial w}=X^TXw-X^TT+\lambda w=0 \quad\Rightarrow\quad w=(X^TX+\lambda I)^{-1}X^TT
$$
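The penalized (ridge) solution differs from the unpenalized one only by the $\lambda I$ term, and its effect is to shrink the weights. A sketch comparing the two on a high-degree polynomial fit; the sine target, degree 9, and $\lambda = 10^{-3}$ are all assumed values for illustration:

```python
import numpy as np

# Small noisy dataset from a sine curve (illustrative assumption)
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=20)
t = np.sin(np.pi * x) + 0.1 * rng.standard_normal(20)

# Degree-9 polynomial design matrix: columns [1, x, x^2, ..., x^9]
X = np.vander(x, N=10, increasing=True)

lam = 1e-3  # penalty strength λ (assumed value)

# Ridge solution w = (X^T X + λI)^{-1} X^T T
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ t)

# Unpenalized least-squares solution for comparison
w_ols = np.linalg.lstsq(X, t, rcond=None)[0]

# The penalty shrinks the weight vector's norm
print(np.linalg.norm(w_ridge), np.linalg.norm(w_ols))
```

Note that $X^TX+\lambda I$ is always invertible for $\lambda>0$, so the ridge solution exists even when $X^TX$ is singular or ill-conditioned.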

Overfitting:

[figure omitted]

A good fit:

[figure omitted]

References: HIT (Harbin Institute of Technology) machine learning lecture slides, the Watermelon Book (Zhou Zhihua, *Machine Learning*), PRML (Bishop, *Pattern Recognition and Machine Learning*)