Plotting a classification decision boundary line based on perceptron coefficients

Date: 2016-06-25 01:17:29

Tags: r plot machine-learning

This is actually a duplicate of this question. However, I'd like to ask a very specific question about plotting the decision boundary line from the perceptron coefficients I obtained through a basic "hand-rolled" coding experiment. As you can see, the coefficients extracted from the logistic regression yield a nice decision boundary line:

[plot: decision boundary from the logistic regression coefficients]

Based on the glm() results:

(Intercept)       test1       test2 
   1.718449    4.012903    3.743903 
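(For context, a plausible reconstruction of the glm() fit, which the post itself doesn't show; the call below assumes the same file and the same scaling used in the perceptron code further down:)

dat = read.csv("perceptron.txt", header = FALSE)             # same data file as below
dat[, 1:2] = scale(dat[, 1:2])                               # scale test1/test2 as in the perceptron code
colnames(dat) = c("test1", "test2", "y")
fit = glm(y ~ test1 + test2, family = binomial, data = dat)  # standard logistic regression
coef(fit)                                                    # should resemble the coefficients above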

The coefficients from the perceptron experiment are completely different:

     bias     test1     test2 
 9.131054 19.095881 20.736352 

To make answering easier, here is the data, and here is the code:

# DATA PRE-PROCESSING:
dat = read.csv("perceptron.txt", header=F)
dat[,1:2] = apply(dat[,1:2], MARGIN = 2, FUN = function(x) scale(x)) # scaling the data
data = data.frame(rep(1,nrow(dat)), dat) # introducing the "bias" column
colnames(data) = c("bias","test1","test2","y")
data$y[data$y==0] = -1 # Turning 0/1 dependent variable into -1/1.
data = as.matrix(data) # Turning data.frame into matrix to avoid mmult problems.

# PERCEPTRON:
set.seed(62416)
no.iter = 1000                           # Number of loops
theta = rnorm(ncol(data) - 1)            # Starting a random vector of coefficients.
theta = theta/sqrt(sum(theta^2))         # Normalizing the vector.
h = theta %*% t(data[,1:3])              # Performing the first f(theta^T X)

for (i in 1:no.iter){                    # We will recalculate 1,000 times
  for (j in 1:nrow(data)){               # Each time we go through each example.
      if (h[j] * data[j, 4] < 0) {       # If the hypothesis disagrees with the sign of y,
        theta = theta + sign(data[j, 4]) * data[j, 1:3] # add/subtract the example to/from theta;
      }                                  # otherwise leave theta as it is.
  }
  h = theta %*% t(data[,1:3])            # Calculating h() after iteration.
}
theta                                    # Final coefficients
mean(sign(h) == data[,4])                # Accuracy

Question: if all we have are the perceptron coefficients, how do we plot the boundary line (the way it was done above with the logistic regression coefficients)?

2 Answers:

Answer 0 (score: 0):

Well... it turns out it works exactly the same way as in the case of logistic regression, even though the coefficients are very different: take the minimum and maximum of the abscissa (test1), add a small margin, compute the corresponding test2 values on the decision boundary (where 0 = theta_0 + theta_1 * test1 + theta_2 * test2), and draw the line between those points:

palette(c("tan3","purple4"))                       # one color per class
plot(test2 ~ test1, col = as.factor(y), pch = 20, data = data,
     main = "College admissions")
(x = c(min(data[,2]) - .2, max(data[,2]) + .2))    # test1 range plus a small margin
(y = c((-1/theta[3]) * (theta[2] * x + theta[1]))) # test2 on the boundary: 0 = theta[1] + theta[2]*x + theta[3]*y
lines(x, y, lwd = 3, col = rgb(.7, 0, .2, .5))     # draw the decision boundary
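(A side note not in the original answer: the reason the "very different" coefficients give essentially the same line is that the boundary 0 = theta^T x is unchanged when theta is rescaled, so only the direction of the vector matters. Normalizing both vectors to unit length, using the values printed above, makes them directly comparable:)

glm.coef  = c(1.718449, 4.012903, 3.743903)   # (Intercept), test1, test2 from glm()
perc.coef = c(9.131054, 19.095881, 20.736352) # bias, test1, test2 from the perceptron
glm.coef  / sqrt(sum(glm.coef^2))             # approx. 0.299 0.698 0.651
perc.coef / sqrt(sum(perc.coef^2))            # approx. 0.308 0.644 0.700 -- similar directions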

[plot: perceptron decision boundary over the scaled data]

Answer 1 (score: -1):

The perceptron weights are computed so that when theta^T X > 0 the example is classified as positive, and when theta^T X < 0 it is classified as negative. This means the equation theta^T X = 0 is the perceptron's decision boundary.
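Concretely (spelling out a step the answer leaves implicit), with the column order bias, test1, test2 used in the question, solving theta^T X = 0 for test2 gives

    test2 = -(theta[1] + theta[2] * test1) / theta[3]

which is exactly the line that Answer 0 computes with y = (-1/theta[3]) * (theta[2] * x + theta[1]).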

The same logic applies to logistic regression, except that the condition is now sigmoid(theta^T X) > 0.5.
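(A tiny check of that equivalence, not part of the original answer: the logistic sigmoid crosses 0.5 exactly at 0, so thresholding sigmoid(theta^T X) at 0.5 classifies identically to thresholding theta^T X at 0:)

sigmoid = function(z) 1 / (1 + exp(-z))   # logistic function
z = c(-2, -0.1, 0, 0.1, 2)                # some example scores theta^T X
sigmoid(z) > 0.5                          # FALSE FALSE FALSE  TRUE  TRUE
z > 0                                     # identical results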