I am running the same code with regular XGBoost and with Dask XGBoost, and I get different probabilities from the two models.
Regular XGBoost code:
params = {'objective': 'binary:logistic',
          'max_depth': 16, 'learning_rate': 0.01, 'subsample': 0.5,
          'min_child_weight': 1, 'tree_method': 'hist',
          'grow_policy': 'lossguide'}

# 'nround' is not a scikit-learn wrapper parameter; the number of boosting
# rounds is n_estimators. Pass the dict with **params, not params=params.
model = XGBClassifier(n_estimators=1000, **params)
model.fit(X_train, y_train)
y_pred = model.predict(X_test)                 # class labels (0/1)
y_prob = model.predict_proba(X_test)[:, 1]     # probabilities being compared
Output: [Normal XGBoost code output]
Dask XGBoost code:
params = {'objective': 'binary:logistic',
          'max_depth': 16, 'eta': 0.01, 'subsample': 0.5,
          'min_child_weight': 1, 'tree_method': 'hist',
          'grow_policy': 'lossguide'}

# xgboost.dask.train expects a DaskDMatrix and returns a dict with the
# booster; the round count is passed as num_boost_round, not 'nround'.
dtrain = dxgb.DaskDMatrix(client, X_train, y_train)
output = dxgb.train(client, params, dtrain, num_boost_round=1000)
bst = output['booster']
predictions2 = dxgb.predict(client, bst, X_test).persist()
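One likely source of the mismatch, independent of any API differences: with subsample=0.5 and no fixed seed, each training run draws a different random half of the rows for every tree, so even two plain XGBoost runs will not produce identical probabilities. A stdlib-only sketch of that effect (the data and seeds here are hypothetical; this mimics, but does not reproduce, XGBoost's row sampler):

```python
import random

rows = list(range(1000))  # stand-in for training-row indices

# Two trainers that do not share a seed draw different halves (subsample=0.5),
# so they fit on different data and end up with different probabilities.
sample_a = random.Random(1).sample(rows, k=len(rows) // 2)
sample_b = random.Random(2).sample(rows, k=len(rows) // 2)
assert sample_a != sample_b

# Fixing the same seed on both sides makes the draw, and thus the fit, repeatable.
sample_c = random.Random(42).sample(rows, k=len(rows) // 2)
sample_d = random.Random(42).sample(rows, k=len(rows) // 2)
assert sample_c == sample_d
```

If the goal is comparable outputs, fixing the seed on both sides (random_state in the scikit-learn wrapper, seed in the params dict for the dask path) and making sure both runs use the same number of boosting rounds should narrow the gap considerably.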
Can someone help me figure out why the probabilities differ?