Linear regression implementation in Octave

Date: 2019-01-09 06:08:33

Tags: machine-learning octave linear-regression gradient-descent

I recently tried to implement linear regression in Octave, but I cannot get my solution past the online judge. Here is the code:

function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)

m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters
    for i = 1:m
        temp1 = theta(1) - (alpha/m) * (X(i,:)*theta - y(i,:));
        temp2 = theta(2) - (alpha/m) * (X(i,:)*theta - y(i,:)) * X(i,2);
        theta = [temp1; temp2];
    end

    J_history(iter) = computeCost(X, y, theta);

end

end
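
The function above calls computeCost, which is not shown in the question. A minimal sketch of what it presumably looks like, assuming the standard mean-squared-error cost J(theta) = (1/(2m)) * sum((X*theta - y).^2):

```octave
function J = computeCost(X, y, theta)
  % Mean-squared-error cost for linear regression.
  % X is m x n (first column all ones), y is m x 1, theta is n x 1.
  m = length(y);
  J = sum((X * theta - y) .^ 2) / (2 * m);
end
```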

I am aware of the vectorized implementation, but I wanted to try the iterative approach first. Any help would be greatly appreciated.

1 answer:

Answer 0 (score: 0)

You don't need the inner for loop; you can use the sum function instead. More importantly, your version updates theta after every single training example (a stochastic-style update), whereas batch gradient descent updates theta once per iteration using all m examples at once, which is likely what the grader expects.

In code:

for iter = 1:num_iters
    j = 1:m;

    temp1 = sum((theta(1) + theta(2) .* X(j,2)) - y(j));
    temp2 = sum(((theta(1) + theta(2) .* X(j,2)) - y(j)) .* X(j,2));

    theta(1) = theta(1) - (alpha/m) * temp1;
    theta(2) = theta(2) - (alpha/m) * temp2;

    J_history(iter) = computeCost(X, y, theta);
end

It would also be a good exercise to implement the vectorized solution as well and compare the two, to see for yourself how much more efficient vectorization is in practice.
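
For reference, a fully vectorized version might look like the sketch below. It assumes the usual setup (the first column of X is all ones, and computeCost is the mean-squared-error cost referenced above); the function name is my own choice for illustration:

```octave
function [theta, J_history] = gradientDescentVectorized(X, y, theta, alpha, num_iters)
  % Batch gradient descent with the update written as one
  % matrix-vector expression; works for any number of features.
  m = length(y);
  J_history = zeros(num_iters, 1);

  for iter = 1:num_iters
    % X * theta is the m x 1 vector of predictions; X' * residuals
    % gives the full gradient in a single product.
    theta = theta - (alpha/m) * (X' * (X * theta - y));
    J_history(iter) = computeCost(X, y, theta);
  end
end
```

Because the update is a single matrix expression, Octave's built-in linear algebra routines do the summation over examples, which is typically much faster than an explicit loop over i = 1:m.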