Fitting XGBoost to my ML model causes a MemoryError

Asked: 2020-09-30 14:08:57

Tags: python machine-learning xgboost

I am trying to fit a dataset with more than 800,000 rows and I am running out of memory. Does anyone know how I can solve this problem? The error and code (I am using xgboost's XGBClassifier):

Exception has occurred: MemoryError
Unable to allocate array with shape (819461, 30) and data type float64
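For scale, the array named in the traceback is a single float64 copy of the training matrix; its size can be checked directly (shape and dtype taken from the error message above):

```python
import numpy as np

# Shape (819461, 30) and dtype float64 come from the error message above
n_bytes = 819461 * 30 * np.dtype(np.float64).itemsize
print(f"{n_bytes / 2**20:.1f} MiB")  # 187.6 MiB per copy
```

Note that with `n_jobs=-1`, `RandomizedSearchCV` can hold several such copies at once (roughly one per worker process), which multiplies the footprint well beyond a single array.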



import numpy as np
from sklearn.model_selection import RandomizedSearchCV

# xgb (the XGBClassifier instance), peso, cv2, SCORING_METHOD and
# N_ITER_SEARCH are defined earlier in my script
gsc = RandomizedSearchCV(
    xgb,
    param_distributions={
        'objective': ['binary:logistic'],
        'eval_metric': ['error'],
        'colsample_bytree': np.arange(0.8, 0.95, 0.02).tolist(),
        'scale_pos_weight': [peso],
        'gamma': np.arange(0.15, 0.4, 0.02).tolist(),
    },
    cv=cv2,
    scoring=SCORING_METHOD,
    refit=SCORING_METHOD,
    verbose=1,
    n_jobs=-1,
    n_iter=int(N_ITER_SEARCH),
    return_train_score=True,
)

grid_result = gsc.fit(X_train, y_train)
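One common mitigation (a sketch, not a guaranteed fix for this setup) is to downcast the feature matrix to float32 before fitting, which halves its memory footprint; XGBoost converts input to float32 internally anyway, so precision is normally unaffected. The array below is a hypothetical stand-in with the same shape as the data in the question:

```python
import numpy as np

# Hypothetical stand-in matching the shape from the error message
X = np.zeros((819461, 30))        # NumPy defaults to float64
X32 = X.astype(np.float32)        # half the bytes per element
print(X.nbytes // 2**20, "->", X32.nbytes // 2**20, "MiB")  # 187 -> 93 MiB
```

Other knobs worth trying (untested on this dataset): `tree_method='hist'` on the classifier, which builds compressed histogram bins instead of holding sorted feature values, and `n_jobs=1` on `RandomizedSearchCV` so only one search worker holds the data at a time, letting the classifier's own `n_jobs` parallelize tree construction instead.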

0 Answers:

No answers yet.