I am trying to replicate this approach in R with a different dataset:
https://www.kaggle.com/btyuhas/bayesian-optimization-with-xgboost
My data has missing values, but no Inf values. However, it gives me this error:

Error in xgb.iter.update(fd$bst, fd$dtrain, iteration - 1, obj) : could not parse some trailing characters: 'Inf'
Timing stopped at: 0.39 0.19 0.58
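
For reference, this is roughly how I checked for missing and infinite values (x_train and y_train are my raw predictors and label before conversion):

# count missing and infinite entries in the training data
sum(is.na(as.matrix(x_train)))        # > 0, so missing values are present
sum(is.infinite(as.matrix(x_train)))  # 0, so no Inf in the features
sum(is.infinite(y_train))             # 0, so no Inf in the label either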
I searched online but could not find a relevant answer.
Thanks!
library(xgboost)
library(rBayesianOptimization)

# xgboost handles NAs natively; flag them explicitly when building the matrices
dtrain = xgb.DMatrix(data = as.matrix(x_train), label = y_train, missing = NA)
dtest = xgb.DMatrix(as.matrix(x_test), missing = NA)

xgb_evaluate = function(max_depth, gamma, colsample_bytree) {
  params = list(max_depth = max_depth,
                subsample = 0.8,
                eta = 0.1,
                gamma = gamma,
                colsample_bytree = colsample_bytree,
                eval_metric = "rmse")
  # Used around 1000 boosting rounds in the full model
  cv_result = xgb.cv(params, dtrain, nrounds = 100, nfold = 3)
  # Bayesian optimization only knows how to maximize, not minimize, so return
  # the negative RMSE; rBayesianOptimization expects a named list with a
  # Score component and a Pred component (a placeholder here)
  list(Score = -1.0 * tail(cv_result$evaluation_log$test_rmse_mean, 1),
       Pred = 0)
}

xgb_bo = BayesianOptimization(xgb_evaluate,
                              bounds = list(max_depth = c(4L, 6L),
                                            gamma = c(0, 1),
                                            colsample_bytree = c(0.3, 0.9)),
                              init_points = 10,  # random evaluations before fitting the GP
                              n_iter = 20)
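
Once this runs, the plan is to refit the full model with the best parameters along these lines (a rough sketch; Best_Par is the named vector of tuned values that rBayesianOptimization returns):

best = as.list(xgb_bo$Best_Par)
# reuse the fixed settings from above plus the tuned values
final_params = list(max_depth = best$max_depth,
                    gamma = best$gamma,
                    colsample_bytree = best$colsample_bytree,
                    subsample = 0.8,
                    eta = 0.1,
                    eval_metric = "rmse")
# ~1000 boosting rounds, as in the full model mentioned in the comment above
final_model = xgb.train(final_params, dtrain, nrounds = 1000)
preds = predict(final_model, dtest)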