How to use wrapper feature selection algorithms in R?

Asked: 2016-04-20 14:13:54

Tags: r algorithm feature-selection

I have several algorithms: rpart, kNN, logistic regression, randomForest, Naive Bayes and SVM. I want to use forward/backward and genetic algorithm selection to find the best subset of features for a particular algorithm.

How can I implement wrapper-style forward/backward and genetic selection in R?

3 Answers:

Answer 0 (score: 1)

I have been testing wrappers, so I can give you some package names in R. What is a wrapper: (image in the original answer)

Now to the methods. The MASS package: choosing a model by AIC in a stepwise algorithm:

stepAIC(model, direction = "both", trace = FALSE)
stepAIC(model, direction = "backward", trace = FALSE)
stepAIC(model, direction = "forward", trace = FALSE)
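As a minimal runnable sketch (using the built-in mtcars data, which is my own choice and not from the answer), stepAIC starts from a fitted model and adds/drops terms to minimise AIC:

```r
library(MASS)  # ships with R; provides stepAIC()

# Fit a full linear model on the built-in mtcars data, then let stepAIC
# add/drop predictors in both directions to minimise AIC.
full_model <- lm(mpg ~ ., data = mtcars)
step_model <- stepAIC(full_model, direction = "both", trace = FALSE)

# The terms that survive the search are the selected feature subset.
formula(step_model)
```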

The caret package: backward feature selection (recursive feature elimination):

control <- rfeControl(functions = lmFuncs, method = "repeatedcv", number = 5, verbose = TRUE)
rfe_results <- rfe(x, y, sizes = c(1:10), rfeControl = control)
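For a complete runnable example (my own sketch, assuming the mlbench package for the Sonar data; rfFuncs swaps in random forests, which suit classification better than the lmFuncs shown above):

```r
library(caret)
library(mlbench)  # provides the Sonar data
data(Sonar)

x <- Sonar[, -61]
y <- Sonar$Class

# Recursive feature elimination with random-forest ranking, 5-fold CV.
control <- rfeControl(functions = rfFuncs, method = "cv", number = 5)
set.seed(1)
rfe_results <- rfe(x, y, sizes = c(2, 4, 8, 16), rfeControl = control)
predictors(rfe_results)  # the selected feature subset
```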

Supervised feature selection using genetic algorithms:

gafs_ctrl <- gafsControl(functions = rfGA, method = "cv")
gafs_results <- gafs(x, y, gafsControl = gafs_ctrl)

Simulated annealing feature selection:

safs_ctrl <- safsControl(functions = rfSA, method = "cv")
safs_results <- safs(x, y, iters = 10, safsControl = safs_ctrl)
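A fuller runnable sketch of the genetic-algorithm wrapper (my own example on the Sonar data from mlbench; the tiny popSize/iters values are only to keep the demo fast, a real search needs far larger settings):

```r
library(caret)
library(mlbench)  # provides the Sonar data
data(Sonar)

x <- Sonar[, -61]
y <- Sonar$Class

# GA wrapper around random forests; deliberately tiny settings for speed.
ga_ctrl <- gafsControl(functions = rfGA, method = "cv", number = 2)
set.seed(1)
gafs_results <- gafs(x, y, iters = 2, popSize = 4, gafsControl = ga_ctrl)
gafs_results$optVariables  # the selected feature subset
```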
I hope this gives you a good overview. There are many more methods out there...

Answer 1 (score: 0)

The caret package in R has extensive functionality and makes it easy to switch between the algorithms you mention.
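For instance (a sketch of my own using the built-in iris data), the same train() call covers every algorithm in the question; only the method string changes:

```r
library(caret)

# One interface for many models: "rpart", "knn", "rf", "glm", "nb",
# "svmRadial", ... are all selected via the `method` argument.
ctrl <- trainControl(method = "cv", number = 5)
set.seed(1)
fit_rpart <- train(Species ~ ., data = iris, method = "rpart", trControl = ctrl)
fit_knn   <- train(Species ~ ., data = iris, method = "knn",   trControl = ctrl)
fit_knn$results
```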

There is also plenty of documentation on their website.

Hope this helps.

Answer 2 (score: 0)

Here is some code for forward feature selection:

# Forward step: given the already-selected `features`, find the one
# additional feature that maximises kNN accuracy on the test set.
library(class)  # provides knn()

selectFeature <- function(train, test, cls.train, cls.test, features) {
  current.best.accuracy <- -Inf  # negative infinity
  selected.i <- NULL
  for (i in 1:ncol(train)) {
    current.f <- colnames(train)[i]
    if (!current.f %in% features) {
      pred <- knn(train = train[, c(features, current.f), drop = FALSE],
                  test  = test[, c(features, current.f), drop = FALSE],
                  cl = cls.train, k = 3)
      test.acc <- sum(pred == cls.test) / length(cls.test)

      if (test.acc > current.best.accuracy) {
        current.best.accuracy <- test.acc
        selected.i <- current.f
      }
    }
  }
  return(selected.i)
}


## Prepare the Sonar data (from the mlbench package)
library(caret)
library(mlbench)
data(Sonar)
set.seed(1)
inTrain <- createDataPartition(Sonar$Class, p = .6)[[1]]
allFeatures <- colnames(Sonar)[-61]
train <- Sonar[ inTrain,-61]
test  <- Sonar[-inTrain,-61]
cls.train <- Sonar$Class[inTrain]
cls.test <- Sonar$Class[-inTrain]

# use correlation with the class label to pick the first feature
cls.train.numeric <- as.numeric(cls.train == "M")  # R -> 0, M -> 1
features <- c()
current.best.cor <- 0
for (i in 1:ncol(train)) {  # the class column was already dropped above
  if(current.best.cor < abs(cor(train[,i], cls.train.numeric))) {
    current.best.cor <- abs(cor(train[,i], cls.train.numeric))
    features <- colnames(train)[i]
  }
}
print(features)

# select the 2 to 10 best features using knn as a wrapper classifier
for (j in 2:10) {
  selected.i <- selectFeature(train, test, cls.train, cls.test, features)
  print(selected.i)

  # add the best feature from current run
  features <- c(features, selected.i)
}