This question builds on:
xgboost in R: how does xgb.cv pass the optimal parameters into xgb.train
I am trying to loop over randomly drawn parameter sets:
for (iter in 1:100){
  param <- list(objective = "binary:logistic",
                eval_metric = "auc",
                max_depth = sample(2:6, 1),
                eta = runif(.01, .1, .05),
                gamma = runif(.01, .05, .1),
                subsample = runif(.9, .8, .7),
                colsample_bytree = runif(.8, .9, .5),
                min_child_weight = sample(30:100, 1),
                max_delta_step = sample(1:10, 1)
  )
But it throws an error, because on the first iteration param holds values like:
max_depth        : int 6
eta              : num(0)
gamma            : num(0)
subsample        : num(0)
colsample_bytree : num(0)
min_child_weight : int 63
max_delta_step   : int 2
What could be causing this behavior?
Answer (score: 0)
It turns out my runif calls were wrong. The signature is runif(n, min, max): the first argument is the number of draws, not a bound. So runif(.01, .1, .05) asks for n = .01 draws, which truncates to 0 and returns the empty vector numeric(0). This works instead:
param <- list(objective = "binary:logistic",
              eval_metric = "auc",
              max_depth = sample(2:6, 1),
              eta = runif(1, .01, .05),
              gamma = runif(1, .01, .1),
              subsample = runif(1, 0.6, 0.9),
              colsample_bytree = runif(1, .5, 1),
              min_child_weight = sample(30:100, 1),
              max_delta_step = sample(1:10, 1)
)
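The difference is easy to check in an R session; a minimal sketch contrasting the wrong and corrected calls:

```r
# runif(n, min, max): the first argument is the COUNT of draws.
# Passing a fraction as n truncates toward zero, so nothing is drawn.
wrong <- runif(.01, .1, .05)   # n truncates to 0
length(wrong)                  # 0 -> the empty vector numeric(0)

right <- runif(1, .01, .05)    # one draw between .01 and .05
length(right)                  # 1
stopifnot(right >= .01, right <= .05)
```

An empty eta, gamma, subsample, or colsample_bytree in the param list is what produced the num(0) entries shown above.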