One-vs-all regression

Date: 2015-05-16 21:21:34

Tags: matlab logistic-regression

I have been reviewing one of Andrew Ng's machine learning examples that I found at https://github.com/jcgillespie/Coursera-Machine-Learning/tree/master/ex3. The example deals with logistic regression and one-vs-all classification. I have a question about this function:

function [all_theta] = oneVsAll(X, y, num_labels, lambda)
%ONEVSALL trains multiple logistic regression classifiers and returns all
%the classifiers in a matrix all_theta, where the i-th row of all_theta 
%corresponds to the classifier for label i
%   [all_theta] = ONEVSALL(X, y, num_labels, lambda) trains num_labels
%   logistic regression classifiers and returns each of these classifiers
%   in a matrix all_theta, where the i-th row of all_theta corresponds 
%   to the classifier for label i

% Some useful variables
m = size(X, 1);
n = size(X, 2);

% You need to return the following variables correctly 
all_theta = zeros(num_labels, n + 1);

% Add ones to the X data matrix
X = [ones(m, 1) X];

% ====================== YOUR CODE HERE ======================
% Instructions: You should complete the following code to train num_labels
%               logistic regression classifiers with regularization
%               parameter lambda. 
%
% Hint: theta(:) will return a column vector.
%
% Hint: You can use y == c to obtain a vector of 1's and 0's that tell us 
%       whether the ground truth is true/false for this class.
%
% Note: For this assignment, we recommend using fmincg to optimize the cost
%       function. It is okay to use a for-loop (for c = 1:num_labels) to
%       loop over the different classes.
%
%       fmincg works similarly to fminunc, but is more efficient when we
%       are dealing with a large number of parameters.
%
% Example Code for fmincg:
%
%     % Set Initial theta
%     initial_theta = zeros(n + 1, 1);
%     
%     % Set options for fminunc
%     options = optimset('GradObj', 'on', 'MaxIter', 50);
% 
%     % Run fmincg to obtain the optimal theta
%     % This function will return theta and the cost 
%     [theta] = ...
%         fmincg (@(t)(lrCostFunction(t, X, (y == c), lambda)), ...
%                 initial_theta, options);
%

initial_theta = zeros(n + 1, 1);

options = optimset('GradObj', 'on', 'MaxIter', 50);

for i = 1:num_labels

    c = i * ones(size(y));
    fprintf('valores')
    [theta] = fmincg (@(t)(lrCostFunction(t, X, (y == c), lambda)), initial_theta, options);
    all_theta(i,:) = theta;

end


% =========================================================================


end

I know that lrCostFunction takes theta, X, y and lambda as parameters, but I cannot tell from the code posted above where the value of t comes from; in particular, in this part:

[theta] = fmincg (@(t)(lrCostFunction(t, X, (y == c), lambda)), initial_theta, options);

Any help?

3 Answers:

Answer 0 (Score: 7)

fmincg takes a handle to the objective function as its first argument, which in this case is a handle to lrCostFunction.

If you step into fmincg.m, you will find the following lines:

argstr = ['feval(f, X'];                      % compose string used to call function

%---Code will not enter the following loop---%
for i = 1:(nargin - 3) %this will go from 1 to 0, thus the loop is skipped
   argstr = [argstr, ',P', int2str(i)];
end
% following will be executed
argstr = [argstr, ')'];

At the end of the snippet above, the resulting string is:

argstr = 'feval(f, X)'

If you look a little further ahead, you will see:

[f1 df1] = eval(argstr);                      % get function value and gradient

So the function handle f is evaluated with the argument X. Note that inside fmincg, X is the name of the current parameter vector (the initial_theta you passed in), not your data matrix. In other words, t = X, the current theta, which makes sense: the initial theta is updated on each iteration and converges to the final parameter vector of the logistic regression classifier.
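To see the binding in isolation, here is a minimal sketch, independent of fmincg, that builds the same kind of anonymous function and evaluates it by hand with a trial theta. It assumes lrCostFunction from the assignment, with the signature [J, grad] = lrCostFunction(theta, X, y, lambda), and uses made-up toy data:

% The anonymous function freezes X, (y == c) and lambda at creation time;
% only t remains free and is supplied whenever the handle is called.
m = 5; n = 3;
X = [ones(m, 1) rand(m, n)];   % toy design matrix with bias column
y = [1; 2; 1; 3; 2];           % toy labels
c = 1;                         % class currently being trained
lambda = 0.1;

costFun = @(t) lrCostFunction(t, X, (y == c), lambda);

% fmincg never sees X, y or lambda; it only ever calls costFun(theta):
theta_trial = zeros(n + 1, 1);
[J, grad] = costFun(theta_trial);   % here t takes the value theta_trial
fprintf('cost at trial theta: %f\n', J);

This is all fmincg does internally: it repeatedly calls the handle with its current parameter vector and uses the returned cost and gradient to decide the next step.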

Answer 1 (Score: 5)

You can actually substitute y == i directly:

for i = 1:num_labels
    [theta] = fmincg(@(t)(lrCostFunction(t, X, (y == i), lambda)), initial_theta, options);
    all_theta(i,:) = theta;
end
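For completeness, a tiny illustration of what (y == i) passes to lrCostFunction: a logical 0/1 vector marking the examples that belong to class i (made-up labels for the example):

% (y == i) builds the binary label vector for the current class i.
y = [1; 2; 3; 2; 1];
i = 2;
binary_labels = (y == i)       % logical column vector: [0; 1; 0; 1; 0]
% double(y == i) gives the same values as a numeric vector if needed.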

Answer 2 (Score: 3)

Try this:

for i = 1:num_labels,
    [all_theta(i,:)] = fmincg (@(t)(lrCostFunction(t, X, (y == i), lambda)), initial_theta, options);
end;

You also don't need to initialize all_theta at the beginning.
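On that last remark: MATLAB grows all_theta automatically on indexed assignment, so the explicit zeros(num_labels, n + 1) is not strictly required, although preallocating remains the tidier habit. A minimal sketch with made-up sizes:

% Toy sizes, for illustration only.
num_labels = 3;
n = 4;

clear all_theta                              % start without preallocation
for i = 1:num_labels
    all_theta(i, :) = i * ones(1, n + 1);    % row assignment grows the matrix
end
size(all_theta)                              % returns [3 5]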