Vectorization of the logistic regression cost

Asked: 2013-11-12 19:50:17

Tags: matlab vectorization logistic-regression

I have this code for the cost of logistic regression, in MATLAB:

function [J, grad] = costFunction(theta, X, y)

m = length(y); % number of training examples

J = 0;
grad = zeros(size(theta));

% Cost: average cross-entropy over all examples
% (use the natural log, not log10 -- earlier version had that mistake)
sums = zeros(m, 1);
for i = 1:m
    h = sigmoid(theta' * X(i,:)');
    sums(i) = -y(i) * log(h) - (1 - y(i)) * log(1 - h);
end
J = (1/m) * sum(sums);

% Gradient, one element per parameter
temp_thetas = zeros(size(theta));
for i = 1:length(theta)
    tempo = zeros(m, 1);
    for j = 1:m
        tempo(j) = (sigmoid(theta' * X(j,:)') - y(j)) * X(j,i);
    end
    temp_thetas(i) = sum(tempo);
end

grad = (1/m) .* temp_thetas;

% =============================================================

end
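For reference, the loop-based computation above can be sketched in NumPy as well (a minimal sketch: the `sigmoid` helper and the tiny dataset are made up here for illustration, not from the original post):

```python
import numpy as np

def sigmoid(z):
    # Standard logistic function.
    return 1.0 / (1.0 + np.exp(-z))

def cost_loop(theta, X, y):
    """Loop-based logistic-regression cost and gradient,
    mirroring the MATLAB code above."""
    m = len(y)
    J = 0.0
    for i in range(m):
        h = sigmoid(X[i] @ theta)
        J += -y[i] * np.log(h) - (1 - y[i]) * np.log(1 - h)
    J /= m
    grad = np.zeros_like(theta)
    for j in range(len(theta)):
        for i in range(m):
            grad[j] += (sigmoid(X[i] @ theta) - y[i]) * X[i, j]
    grad /= m
    return J, grad

# Tiny made-up dataset: 4 examples, intercept column + 1 feature.
X = np.array([[1.0, 0.5], [1.0, -1.0], [1.0, 2.0], [1.0, 0.0]])
y = np.array([1.0, 0.0, 1.0, 0.0])
J, grad = cost_loop(np.zeros(2), X, y)
# With theta = 0 every prediction is 0.5, so J = log(2).
```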

I need to vectorize it, but I don't know how, or why. I'm a programmer, so I like for loops. But when it comes to vectorizing this, I draw a blank. Any help? Thanks.

2 answers:

Answer 0 (score: 34)

function [J, grad] = costFunction(theta, X, y)
hx = sigmoid(X * theta);
m = length(X);

J = (-y' * log(hx) - (1 - y')*log(1 - hx)) / m;
grad = X' * (hx - y) / m;

end
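A quick way to convince yourself the vectorized form is equivalent to the loops: compute both on the same data and compare. A NumPy sketch (the `sigmoid` helper and the random data are assumptions for illustration):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
m, n = 50, 3
X = rng.normal(size=(m, n))
y = (rng.random(m) < 0.5).astype(float)
theta = rng.normal(size=n)

# Vectorized cost and gradient (the answer's formulas).
hx = sigmoid(X @ theta)
J_vec = (-y @ np.log(hx) - (1 - y) @ np.log(1 - hx)) / m
grad_vec = X.T @ (hx - y) / m

# Loop version, as in the original question, for comparison.
J_loop = sum(-y[i] * np.log(sigmoid(X[i] @ theta))
             - (1 - y[i]) * np.log(1 - sigmoid(X[i] @ theta))
             for i in range(m)) / m
grad_loop = np.array([sum((sigmoid(X[i] @ theta) - y[i]) * X[i, j]
                          for i in range(m)) for j in range(n)]) / m
```

Both pairs agree to floating-point precision, which is why the two-line vectorized version can replace the nested loops.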

Answer 1 (score: 2)

The code should be:

function [J, grad] = costFunction(theta, X, y)

hx = sigmoid(X * theta);
m = length(X);

J = sum(-y .* log(hx) - (1 - y) .* log(1 - hx)) / m;
grad = X' * (hx - y) / m;

end