Implementing and plotting a perceptron in MATLAB

Date: 2011-02-03 03:24:04

Tags: matlab artificial-intelligence

I am reviewing the code from the Toronto perceptron MATLAB code.

The code is:

function [w] = perceptron(X,Y,w_init)

w = w_init;
for iteration = 1 : 100  %<- in practice, use some stopping criterion!
  for ii = 1 : size(X,2)         %cycle through training set
    if sign(w'*X(:,ii)) ~= Y(ii) %wrong decision?
      w = w + X(:,ii) * Y(ii);   %then add (or subtract) this point to w
    end
  end
  sum(sign(w'*X)~=Y)/size(X,2)   %show misclassification rate
end

I'm reading about how to apply this function to a data matrix X and a target vector Y, but I don't know how to use it. As far as I understand, it returns a weight vector, which can then be used for classification.

Could you give an example and explain it?

I tried:

X=[0 0; 0 1; 1 1]
Y=[1 0; 2 1]
w=[1 1 1]
Result = perceptron( X, Y, w )

??? Error using ==> mtimes
Inner matrix dimensions must agree.

Error in ==> perceptron at 15
            if sign(w'*X(:,ii)) ~= Y(ii) 

    Result = perceptron( X, Y, w' )

??? Error using ==> ne
Matrix dimensions must agree.

Error in ==> perceptron at 19
        sum(sign(w'*X)~=Y) / size(X,2);     

Thanks.

Thanks for your answer. One more question: what happens to the algorithm if I change Y to [0, 1]?

So this perceptron code won't work with Y = [0, 1] for any input data, correct?

----------------------------- EDIT -----------------------------

One more question: if I want to plot the line that divides the 2 classes, I know we can get that line by solving a linear system related to the weights, but how? What should I do? I am trying something like this:
% the initial weights
w_init = [ 1 1 1]';  
% the weights returned from perceptron    
wtag   = perceptron(X,Y,w_init,15);

% concatenate both
Line = [wtag,w_init] 

% solve the linear system, am I correct doing this?
rref(Line')

% plot???

3 Answers:

Answer 0 (score: 17):

You should first understand what each input means:

  • X is the input matrix of examples, of size M x N, where M is the dimension of the feature vector and N is the number of samples. Since the model the perceptron uses for prediction is Y = w*X + b, you have to supply one extra dimension in X that is constant, usually set to 1, so that the b term is "built in" to X. In the example for X below, I set the last entry of X to 1 in all the samples.
  • Y is the correct classification of each sample from X (the classification you want the perceptron to learn), so it should be an N-dimensional row vector, with one output for each input example. Since the perceptron is a binary classifier, it should have only 2 distinct possible values. Looking at the code, you can see that it checks the sign of the prediction, which tells you that the allowed values of Y should be -1 and +1 (not, for example, 0 and 1).
  • w is the weight vector you are trying to learn.
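To see why the labels must be -1/+1 rather than 0/1 with this update rule, here is a minimal Python/NumPy sketch of the same update (a translation for illustration, not the MATLAB code itself): when a misclassified sample has label 0, the correction term X(:,ii)*Y(ii) is the zero vector, so the weights never move.

```python
import numpy as np

w = np.array([0.5, 0.5, 0.5])
x = np.array([0.0, 1.0, 1.0])  # one sample (last entry is the constant bias input)

# With a -1/+1 label, a misclassification produces a nonzero correction:
y = -1
w_new = w.copy()
if np.sign(w @ x) != y:        # sign(w'*x) = +1 here, so this sample is misclassified
    w_new = w + x * y          # weights actually change
assert not np.allclose(w_new, w)

# With a 0 label, the "correction" x * 0 is the zero vector:
y = 0
w_zero = w + x * y             # update has no effect, so nothing is ever learned
assert np.allclose(w_zero, w)
```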

So, try calling the function with:
X=[0 0; 0 1; 1 1];
Y=[1 -1];
w=[.5; .5; .5];
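As a sanity check, here is a hedged Python/NumPy translation of the training loop run on that tiny example (each column of X is one sample, and the last row is the constant bias input); this is a sketch of the same algorithm, not the MATLAB function itself.

```python
import numpy as np

X = np.array([[0, 0],
              [0, 1],
              [1, 1]], dtype=float)  # 2 samples as columns; last row is the bias input
Y = np.array([1, -1])
w = np.array([0.5, 0.5, 0.5])

for _ in range(100):                       # in practice, use a stopping criterion
    for ii in range(X.shape[1]):           # cycle through the training set
        if np.sign(w @ X[:, ii]) != Y[ii]: # wrong decision?
            w = w + X[:, ii] * Y[ii]       # then add (or subtract) this point to w

# after training, every sample should be classified correctly
assert np.all(np.sign(w @ X) == Y)
```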

EDIT

Call the perceptron algorithm with the following code and view the results graphically:

% input samples
X1=[rand(1,100);rand(1,100);ones(1,100)];   % class '-1'
X2=[rand(1,100);1+rand(1,100);ones(1,100)]; % class '+1'
X=[X1,X2];

% output class [-1,+1];
Y=[-ones(1,100),ones(1,100)];

% init weight vector
w=[.5 .5 .5]';

% call perceptron
wtag=perceptron(X,Y,w);
% predict
ytag=wtag'*X;


% plot prediction over original data
figure;hold on
plot(X1(1,:),X1(2,:),'b.')
plot(X2(1,:),X2(2,:),'r.')

plot(X(1,ytag<0),X(2,ytag<0),'bo')
plot(X(1,ytag>0),X(2,ytag>0),'ro')
legend('class -1','class +1','pred -1','pred +1')
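To draw the separating line itself (which the edited question also asks about), note that the decision boundary is the set of points where wtag(1)*x1 + wtag(2)*x2 + wtag(3) = 0; solving for x2 gives the line to plot directly, with no need for `rref`. A Python sketch with a hypothetical weight vector (the actual values depend on the training run):

```python
import numpy as np

w = np.array([2.0, -3.0, 0.5])   # hypothetical learned weights [w1, w2, bias]

# Boundary: w1*x1 + w2*x2 + b = 0  =>  x2 = -(w1*x1 + b) / w2
x1 = np.array([0.0, 2.0])        # two x-coordinates spanning the plot range
x2 = -(w[0] * x1 + w[2]) / w[1]  # matching y-coordinates on the boundary line

# Any point on the line satisfies the boundary equation:
assert np.allclose(w[0] * x1 + w[1] * x2 + w[2], 0.0)
```

In MATLAB the same formula would be something like `plot(xs, -(wtag(1)*xs + wtag(3))/wtag(2))` for a suitable range `xs`.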

Answer 1 (score: 10):

If you're interested, here is a small perceptron demo written in a fairly tutorial fashion:

function perceptronDemo
%PERCEPTRONDEMO
%
%   A simple demonstration of the perceptron algorithm for training
%   a linear classifier, made as readable as possible for tutorial
%   purposes. It is derived from the treatment of linear learning
%   machines presented in Chapter 2 of "An Introduction to Support
%   Vector Machines" by Nello Cristianini and John Shawe-Taylor.
%
%

    Data  = createTrainingData;
    Model = trainPerceptron( Data );

end
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
function Model = trainPerceptron( Data )
%TRAINPERCEPTRON

    DOWN   = 1;
    ACROSS = 2;

    assert( isequal( unique( Data.labels ), [-1; +1] ), ...
        'Labels must be -1 or +1' );

    % ---------------------------------------------------------------------
    % Normalise the data by calculating z-scores
    %
    %   This makes plotting easier, but is not needed by the algorithm.
    %

    sampleMean   = mean( Data.samples );
    sampleStdDev = std(  Data.samples );
    Data.samples = bsxfun( @minus,   Data.samples, sampleMean   );
    Data.samples = bsxfun( @rdivide, Data.samples, sampleStdDev );

    % ---------------------------------------------------------------------
    % Calculate the squared radius of the smallest ball that encloses the
    % data and is centred on the origin. This is used to provide an
    % appropriate range and step size when updating the threshold (bias)
    % parameter.
    %

    sampleSize = size( Data.samples, DOWN );
    maxNorm    = realmin;
    for iObservation = 1:sampleSize
        observationNorm = norm( Data.samples(iObservation,:) );
        if observationNorm > maxNorm
            maxNorm = observationNorm;
        end
    end
    enclosingBallRadius        = maxNorm;
    enclosingBallRadiusSquared = enclosingBallRadius .^ 2;

    % ---------------------------------------------------------------------
    % Define the starting weight vector and bias. These should be zeros,
    % as the algorithm omits a learning rate, and it is suggested in
    % Cristianini & Shawe-Taylor that learning rate may only be omitted
    % safely when the starting weight vector and bias are zero.
    %

    Model.weights = [0.0 0.0];
    Model.bias    = 0.0;

    % ---------------------------------------------------------------------
    % Run the perceptron training algorithm
    %
    %   To prevent program running forever when nonseparable data are
    %   provided, limit the number of steps in the outer loop.
    %

    maxNumSteps = 1000;

    for iStep = 1:maxNumSteps

        isAnyObsMisclassified = false;

        for iObservation = 1:sampleSize

            inputObservation = Data.samples( iObservation, : );
            desiredLabel     = Data.labels(  iObservation    ); % +1 or -1

            perceptronOutput = sum( Model.weights .* inputObservation, ACROSS ) + Model.bias;
            margin           = desiredLabel * perceptronOutput;

            isCorrectLabel   = margin > 0;

            % -------------------------------------------------------------
            % If the model misclassifies the observation, update the
            % weights and the bias.
            %

            if ~isCorrectLabel

                isAnyObsMisclassified = true;

                weightCorrection = desiredLabel  * inputObservation;
                Model.weights    = Model.weights + weightCorrection;

                biasCorrection   = desiredLabel .* enclosingBallRadiusSquared;
                Model.bias       = Model.bias   + biasCorrection;

                displayPerceptronState( Data, Model );

            end % if this observation misclassified.

        end % loop over observations

        if ~isAnyObsMisclassified
            disp( 'Done!' );
            break;
        end

    end % outer loop

end % TRAINPERCEPTRON
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
function Data = createTrainingData
%CREATETRAININGDATA
%
%   Return a structure containing training data suitable for linear
%   classification.
%

    sampleAsize   = 1024;
    sampleBsize   = 1024;

    sampleAmean   = [ 5.5 5.0 ];
    sampleAstdDev = [ 0.5 1.0 ];

    sampleBmean   = [ 2.5 3.0 ];
    sampleBstdDev = [ 0.3 0.7 ];

    Data.samples  = [ normallyDistributedSample( sampleAsize, sampleAmean, sampleAstdDev ); ...
                      normallyDistributedSample( sampleBsize, sampleBmean, sampleBstdDev ) ];

    Data.labels   = [  ones(sampleAsize,1); ...
                      -ones(sampleBsize,1) ];

    % ---------------------------------------------------------------------
    % Randomly permute samples & class labels.
    %
    %   This is not really necessary, but done to illustrate that the order
    %   in which observations are evaluated does not matter.
    %

    randomOrder   = randperm( sampleAsize + sampleBsize );
    Data.samples  = Data.samples( randomOrder, : );
    Data.labels   = Data.labels(  randomOrder, : );

end % CREATETRAININGDATA
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
function samples = normallyDistributedSample( sampleSize, sampleMean, sampleStdDev )
%NORMALLYDISTRIBUTEDSAMPLE
%
%   Draw a sample from a normal distribution with specified mean and
%   standard deviation.
%

    assert(    isequal( size( sampleMean ), size( sampleStdDev ) ) ...
            && 1 == size( sampleMean, 1 ),                         ...
        'Sample mean and standard deviation must be row vectors of equal length.' );

    numFeatures = numel( sampleMean );
    samples     = randn( sampleSize, numFeatures );
    samples     = bsxfun( @times, samples, sampleStdDev );
    samples     = bsxfun( @plus,  samples, sampleMean   );

end % NORMALLYDISTRIBUTEDSAMPLE
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
function displayPerceptronState( Data, Model )
%DISPLAYPERCEPTRONSTATE

    hFig = figure( 1 );
    clf;
    set( hFig,                        ...
        'NumberTitle', 'off',         ...
        'Name',         mfilename,    ...
        'MenuBar',      'none',       ...
        'Color',        [1.0 1.0 1.0] );

    displayXmin = -4;
    displayXmax =  4;
    displayYmin = -4;
    displayYmax =  4;

    hAx = subplot( 1, 1, 1 );
    axis('equal');
    set( hAx,                                  ...
        'Box',      'on',                      ...
        'NextPlot', 'add',                     ...
        'xgrid',    'on',                      ...
        'ygrid',    'on',                      ...
        'xlim',     [displayXmin displayXmax], ... % Bounds suitable for Z-scored data
        'ylim',     [displayYmin displayYmax]  );
    xlabel( 'x_1' );
    ylabel( 'x_2' );

    % ---------------------------------------------------------------------
    % Plot data points from the two classes
    %

    isPositiveClass = Data.labels >  0;
    isNegativeClass = Data.labels <= 0;

    plot( hAx, Data.samples(isPositiveClass,1), Data.samples(isPositiveClass,2), 'b+' );
    plot( hAx, Data.samples(isNegativeClass,1), Data.samples(isNegativeClass,2), 'rx' );

    % ---------------------------------------------------------------------
    % Display parameters for separating hyperplane in title
    %

    xWeight   = Model.weights(1);
    yWeight   = Model.weights(2);
    bias      = Model.bias;

    szTitle   = sprintf( 'Linear classifier parameters: %0.2f x_1 + %0.2f x_2 + %0.2f = 0', xWeight, yWeight, bias );
    title( szTitle );

    % ---------------------------------------------------------------------
    % Plot separating hyperplane
    %

    y1 = ( (xWeight*displayXmin) + bias ) ./ -yWeight;
    y2 = ( (xWeight*displayXmax) + bias ) ./ -yWeight;

    plot( hAx, [displayXmin; displayXmax], [y1, y2], 'k-', 'linewidth', 2 );

    pause(0.1);

end % DISPLAYPERCEPTRONSTATE
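The distinctive step in this demo is that the bias correction is scaled by the squared radius R² of the smallest origin-centred ball enclosing the data, following the Cristianini & Shawe-Taylor treatment it cites. A minimal Python sketch of that update rule on toy data (an illustration of the rule under assumed toy samples, not a port of the full demo):

```python
import numpy as np

samples = np.array([[ 2.0,  1.0],
                    [ 1.5,  2.0],
                    [-2.0, -1.0],
                    [-1.0, -2.0]])
labels = np.array([1, 1, -1, -1])

# Squared radius of the smallest origin-centred ball enclosing the data
R2 = max(np.linalg.norm(s) ** 2 for s in samples)

w = np.zeros(2)   # start at zero so the learning rate can safely be omitted
b = 0.0

for _ in range(1000):                   # cap iterations for nonseparable data
    mistakes = 0
    for x, y in zip(samples, labels):
        if y * (w @ x + b) <= 0:        # non-positive margin => misclassified
            w = w + y * x               # weight update
            b = b + y * R2              # bias update scaled by R^2
            mistakes += 1
    if mistakes == 0:                   # a full clean pass means convergence
        break

# every training point should end with a positive margin
assert all(y * (w @ x + b) > 0 for x, y in zip(samples, labels))
```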

Answer 2 (score: 1):

Try this:

perceptron([1 2 1 2], [1 0 1 0], 0.5);