XGBoost: what does the 'objective' parameter set?

Date: 2016-10-25 05:00:37

Tags: xgboost

I want to solve a regression problem with XGBoost. I am confused by the learning-task parameter objective [default = reg:linear] (XGBoost): **it seems that 'objective' is used to set the loss function**, but I cannot understand how 'reg:linear' determines the loss function. In the logistic regression demo (XGBoost logistic regression demo), objective = binary:logistic means the loss function is the logistic loss. So which loss function does 'objective = reg:linear' correspond to?

1 answer:

Answer 0 (score: 3)


> So which loss function does 'objective = reg:linear' correspond to?

Squared error.

You can look at the loss functions of logistic regression and linear regression (in terms of their gradients and Hessians) here:

https://github.com/dmlc/xgboost/blob/master/src/objective/regression_obj.cc

Note that the two loss functions are very similar; the only difference is that SecondOrderGradient is a constant for the squared loss:

```cpp
// common regressions
// linear regression
struct LinearSquareLoss {
  static float PredTransform(float x) { return x; }
  static bool CheckLabel(float x) { return true; }
  static float FirstOrderGradient(float predt, float label) { return predt - label; }
  static float SecondOrderGradient(float predt, float label) { return 1.0f; }
  static float ProbToMargin(float base_score) { return base_score; }
  static const char* LabelErrorMsg() { return ""; }
  static const char* DefaultEvalMetric() { return "rmse"; }
};
// logistic loss for probability regression task
struct LogisticRegression {
  static float PredTransform(float x) { return common::Sigmoid(x); }
  static bool CheckLabel(float x) { return x >= 0.0f && x <= 1.0f; }
  static float FirstOrderGradient(float predt, float label) { return predt - label; }
  static float SecondOrderGradient(float predt, float label) {
    const float eps = 1e-16f;
    return std::max(predt * (1.0f - predt), eps);
  }
  // ...
};
```
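The C++ structs above translate directly into Python. The sketch below (the function names are mine, not XGBoost's API) reproduces just the first- and second-order gradients, to make the point concrete: the Hessian of the squared loss is the constant 1, while the logistic Hessian varies with the prediction.

```python
import math

# Squared-error loss (objective "reg:linear"): L = 0.5 * (pred - label)^2
def square_grad(pred, label):
    return pred - label            # dL/dpred

def square_hess(pred, label):
    return 1.0                     # d2L/dpred2 is constant

# Logistic loss (objective "binary:logistic"); pred is a probability
def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def logistic_grad(pred, label):
    return pred - label            # same first-order form as squared loss

def logistic_hess(pred, label):
    eps = 1e-16                    # floor, as in the C++ source
    return max(pred * (1.0 - pred), eps)

print(square_hess(0.3, 1.0))       # always 1.0, regardless of inputs
print(logistic_hess(0.3, 1.0))     # ≈ 0.21, depends on the prediction
```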

The author mentions this here: https://github.com/dmlc/xgboost/tree/master/demo/regression
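As a sanity check on why both objectives share the first-order form predt - label: XGBoost differentiates the logistic loss with respect to the raw margin (before the sigmoid), and the chain rule collapses the derivative to p - label. The finite-difference sketch below (my own check, not part of XGBoost) verifies both derivatives numerically.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def logistic_loss(margin, label):
    # negative log-likelihood of Bernoulli(p) with p = sigmoid(margin)
    p = sigmoid(margin)
    return -label * math.log(p) - (1.0 - label) * math.log(1.0 - p)

margin, label = 0.7, 1.0
p = sigmoid(margin)

# first derivative: central difference vs analytic p - label
h = 1e-6
num_grad = (logistic_loss(margin + h, label) - logistic_loss(margin - h, label)) / (2 * h)
print(abs(num_grad - (p - label)) < 1e-6)    # True

# second derivative: central difference vs analytic p * (1 - p)
h = 1e-4
num_hess = (logistic_loss(margin + h, label)
            - 2 * logistic_loss(margin, label)
            + logistic_loss(margin - h, label)) / (h * h)
print(abs(num_hess - p * (1.0 - p)) < 1e-4)  # True
```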