
From other posts (see Unbalanced multiclass data with XGBoost) and the documentation, scale_pos_weight in XGBoost appears to balance positive and negative cases, which seems to apply only to classification. However, it also appears to be an option in XGBRegressor (see documentation). Before I dive into the source code, can someone explain what this does for regression?
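A quick sanity check (just to show the parameter is exposed; exact behaviour may vary by xgboost version) is to look at the regressor's parameters:

```python
# Sanity check: the sklearn wrapper's XGBRegressor lists scale_pos_weight
# among its parameters, even though the docs describe it as a class-balancing knob.
import xgboost as xgb

print("scale_pos_weight" in xgb.XGBRegressor().get_params())  # expected: True
```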

Robert Yi

1 Answer


There are a few unused or deprecated parameters in both XGBClassifier and XGBRegressor, so this might just be a matter of sloppy inheritance or copy-paste (a quick empirical check follows the list below).

A few possibilities:

  1. They copied over the params from scikit-learn's GradientBoostingClassifier.
  2. They copied over the params from XGBClassifier.
  3. They inherited the parameters from some base class that already had those attributes.
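One rough way to test the "vestigial" theory, rather than reading the source, is to fit two otherwise identical regressors with very different scale_pos_weight values and compare their predictions (a minimal sketch under default settings, not a definitive answer):

```python
# Minimal experiment: does scale_pos_weight change anything for a
# default regression objective? Train two identical models except
# for that parameter and compare predictions.
import numpy as np
import xgboost as xgb

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + rng.normal(scale=0.1, size=500)

preds = []
for w in (1.0, 100.0):
    model = xgb.XGBRegressor(
        n_estimators=50, max_depth=3, scale_pos_weight=w, random_state=0
    )
    model.fit(X, y)
    preds.append(model.predict(X))

# If the parameter is ignored for regression, the two models agree.
print(np.allclose(preds[0], preds[1]))
```

If the predictions match, the parameter is at least inert for the default regression objective; confirming why would still require a look at the objective code itself.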

There have been consistency issues between the native and sklearn APIs for a while now (I faintly recall an nthread versus n_jobs issue).

Dave Liu
  • Oh yeah, the only other question I've answered on this StackExchange was about these vestigial variables: https://datascience.stackexchange.com/questions/33885/eta-and-learning-rate-different-in-xgboost/50935#50935 – Dave Liu Sep 25 '19 at 17:26