Machine Learning Code Exercises
阿新 · Published 2018-11-23
ex1 (Linear Regression)
Result: (plot omitted)
1. computeCost (compute the cost)
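The quantity being computed is the standard squared-error cost for linear regression, with hypothesis $h_\theta(x) = \theta^{\mathsf{T}} x$:

$$J(\theta) = \frac{1}{2m}\sum_{i=1}^{m}\left(\theta^{\mathsf{T}}x^{(i)} - y^{(i)}\right)^2$$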
function J = computeCost(X, y, theta)
    m = length(y);                        % number of training examples
    % Vectorized squared-error cost over all training examples
    J = sum((X * theta - y).^2) / (2*m);
end
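A minimal sanity check, using made-up toy data (my own example, not part of the exercise):

X = [ones(3,1), (1:3)'];      % design matrix with an intercept column
y = [2; 4; 6];                % targets lie exactly on y = 2x
theta = [0; 0];
J = computeCost(X, y, theta)  % (4 + 16 + 36) / (2*3) = 9.3333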
2. gradientDescent (gradient descent)
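Each iteration applies the simultaneous update rule

$$\theta_j := \theta_j - \frac{\alpha}{m}\sum_{i=1}^{m}\left(\theta^{\mathsf{T}}x^{(i)} - y^{(i)}\right)x_j^{(i)}$$

where $x_1^{(i)} = 1$ is the intercept term: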
function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
    m = length(y);                     % number of training examples
    J_history = zeros(num_iters, 1);   % cost recorded at every iteration
    theta_s = theta;                   % theta from the previous iteration
    for iter = 1:num_iters
        % theta(1) and theta(2) must be updated simultaneously, so the
        % updates use theta_s (last iteration's theta) rather than a
        % partially updated theta.
        theta(1) = theta(1) - alpha / m * sum(X * theta_s - y);
        theta(2) = theta(2) - alpha / m * sum((X * theta_s - y) .* X(:,2));
        theta_s = theta;
        J_history(iter) = computeCost(X, y, theta);  % track convergence
    end
end
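A short sketch of running the solver on the same toy data; alpha and num_iters here are my own choices, not values from the exercise:

X = [ones(3,1), (1:3)'];
y = [2; 4; 6];
[theta, J_history] = gradientDescent(X, y, zeros(2,1), 0.1, 1500);
theta                    % approaches [0; 2] for this data
plot(J_history)          % the cost should decrease monotonically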