XGBoost feature importance mismatch

Date: 2018-03-30 17:17:06

Tags: python xgboost

When I use model.feature_importances_ and xgb.plot_importance(model), I get different importance values.

Also, the numpy array from feature_importances_ does not correspond directly to the indices returned by the plot_importance function.
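For reference, here is a minimal, self-contained sketch of the comparison I am making (the toy data and model settings below are illustrative assumptions, not my actual setup):

import numpy as np
import xgboost as xgb

# Toy data just to have a fitted model to inspect.
rng = np.random.RandomState(0)
X = rng.rand(200, 10)
y = (X[:, 3] + 0.5 * X[:, 7] > 1.0).astype(int)

model = xgb.XGBClassifier(n_estimators=20, max_depth=3)
model.fit(X, y)

# Route 1: the sklearn-wrapper attribute -- a normalized numpy array
# indexed by column position.
print(model.feature_importances_)

# Route 2: the booster scores that plot_importance() draws from.
# get_score() returns a dict keyed by feature name ('f0', 'f1', ...);
# with importance_type='weight' the values are split counts rather than
# normalized shares, and features that were never used are simply absent.
print(model.get_booster().get_score(importance_type='weight'))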

Here is what the plot looks like:

xgb.plot_importance(model)

But here is the output of model.feature_importances_, which gives completely different values:

array([ 0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
        0.        ,  0.00568182,  0.        ,  0.        ,  0.        ,
        0.13636364,  0.        ,  0.        ,  0.        ,  0.01136364,
        0.        ,  0.        ,  0.        ,  0.        ,  0.07386363,
        0.03409091,  0.        ,  0.00568182,  0.        ,  0.        ,
        0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
        0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
        0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
        0.        ,  0.00568182,  0.        ,  0.        ,  0.        ,
        0.        ,  0.        ,  0.00568182,  0.        ,  0.        ,
        0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
        0.01704546,  0.        ,  0.        ,  0.        ,  0.        ,
        0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
        0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
        0.        ,  0.05681818,  0.15909091,  0.0625    ,  0.        ,
        0.        ,  0.        ,  0.10227273,  0.        ,  0.07386363,
        0.01704546,  0.05113636,  0.00568182,  0.        ,  0.        ,
        0.02272727,  0.        ,  0.01136364,  0.        ,  0.        ,
        0.11363637,  0.        ,  0.01704546,  0.01136364,  0.        ,
        0.        ,  0.        ,  0.        ,  0.        ,  0.        ,
        0.        ,  0.        ,  0.        ], dtype=float32)

If I simply try to grab feature 81 (model.feature_importances_[81]), I get 0.051136363, yet model.feature_importances_.argmax() returns 72.
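One check I have been considering, continuing from the toy model in the sketch above (the mapping from the booster's 'fNN' names back to column positions is my assumption):

scores = model.get_booster().get_score(importance_type='weight')
by_index = {int(name[1:]): value for name, value in scores.items()}  # 'f81' -> 81

top_plot = max(by_index, key=by_index.get)            # top feature per the plot's scores
top_attr = int(model.feature_importances_.argmax())   # top feature per the array

print(top_plot, top_attr)  # if these differ, the two routes rank features differently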

Any idea why these two methods give me qualitatively different results?

0 Answers:

No answers yet.