Can lime in R explain an xgboost model with the count:poisson objective function?

Asked: 2018-03-14 14:29:47

Tags: r xgboost lime

I built a model with xgb.train using the "count:poisson" objective function, and when I try to create the explainer I get the following error:

Error: Unsupported model type
When I replace the objective with something like reg:logistic, lime works.

Is there a way to explain a count:poisson model in lime? Thanks.

Reproducible example:

library(xgboost)
library(dplyr)
library(caret)
library(insuranceData) # example dataset https://cran.r-project.org/web/packages/insuranceData/insuranceData.pdf
library(lime) # Local Interpretable Model-Agnostic Explanations
set.seed(123)
data(dataCar)
mydb <- dataCar %>% select(clm, exposure, veh_value, veh_body,
                           veh_age, gender, area, agecat)

label_var <- "clm"  
offset_var <- "exposure"
feature_vars <- mydb %>% 
  select(-one_of(c(label_var, offset_var))) %>% 
  colnames()

#preparing data for xgboost (one-hot encoding of categorical (factor) data)
myformula <- paste0( "~", paste0( feature_vars, collapse = " + ") ) %>% as.formula()
dummyFier <- caret::dummyVars(myformula, data=mydb, fullRank = TRUE)
dummyVars.df <- predict(dummyFier,newdata = mydb)
mydb_dummy <- cbind(mydb %>% select(one_of(c(label_var, offset_var))), 
                    dummyVars.df)
rm(myformula, dummyFier, dummyVars.df)


feature_vars_dummy <-  mydb_dummy  %>% select(-one_of(c(label_var, offset_var))) %>% colnames()

xgbMatrix <- xgb.DMatrix(
  data = mydb_dummy %>% select(feature_vars_dummy) %>% as.matrix, 
  label = mydb_dummy %>% pull(label_var),
  missing = NA)


#model 1: this does not work
myParam <- list(max.depth = 2,
                eta = .01,
                gamma = 0.001,
                objective = 'count:poisson',
                eval_metric = "poisson-nloglik")


booster <- xgb.train(
  params = myParam, 
  data = xgbMatrix, 
  nrounds = 50)

explainer <- lime(mydb_dummy %>% select(feature_vars_dummy), 
                  model = booster)

explanation <- explain(mydb_dummy %>% select(feature_vars_dummy) %>% head,
                       explainer,
                       n_labels = 1, 
                       n_features = 2)
#Error: Unsupported model type
#model 2: this works
myParam2 <- list(max.depth = 2,
                eta = .01,
                gamma = 0.001,
                objective = 'reg:logistic',
                eval_metric = "logloss")


booster2 <- xgb.train(
  params = myParam2, 
  data = xgbMatrix, 
  nrounds = 50)

explainer <- lime(mydb_dummy %>% select(feature_vars_dummy), 
                  model = booster2)

explanation <- explain(mydb_dummy %>% select(feature_vars_dummy) %>% head,
                       explainer,
                       n_features = 2)


plot_features(explanation)
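
Update: one idea I have not tried yet is the approach the lime documentation describes for adding support for unsupported models, i.e. wrapping the booster in my own class and supplying model_type() and predict_model() methods for it, treating the Poisson counts as a regression problem. This is only a sketch: the poisson_xgb class and the wrapper object are my own names (not part of lime or xgboost), and I am assuming predict_model() should return a data.frame with a single Response column for regression models, as the lime docs suggest. Please correct me if that is off:

# Untested sketch: wrap the booster in a custom class so lime dispatches to the
# methods below instead of its built-in xgb.Booster handling.
# "poisson_xgb" is a made-up class name, not part of lime or xgboost.
poisson_booster <- structure(list(booster = booster), class = "poisson_xgb")

# Treat Poisson count predictions as a regression problem for lime.
model_type.poisson_xgb <- function(x, ...) "regression"

# Assumption: for regression models, lime expects predict_model() to return a
# data.frame with a single column named Response.
predict_model.poisson_xgb <- function(x, newdata, type, ...) {
  data.frame(Response = predict(x$booster, as.matrix(newdata)))
}

explainer_pois <- lime(mydb_dummy %>% select(feature_vars_dummy),
                       model = poisson_booster)

explanation_pois <- explain(mydb_dummy %>% select(feature_vars_dummy) %>% head,
                            explainer_pois,
                            n_features = 2)

plot_features(explanation_pois)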

0 Answers

There are no answers yet.