
How can we get feature_importances_ when performing regression with XGBRegressor()?

Is there something like XGBClassifier().feature_importances_?

Simone

3 Answers

from xgboost import XGBClassifier

# Instantiate the classifier before calling fit
model = XGBClassifier().fit(X, y)

# importance_type = ['weight', 'gain', 'cover', 'total_gain', 'total_cover']
model.get_booster().get_score(importance_type='weight')

However, the attribute below also returns feature importances, and its values differ from every importance_type option in the method above. This was raised in this GitHub issue, but there is no answer [as of Jan 2019].

model.feature_importances_
BenP
  • The values in the list feature_importances_ equal the values in the dict get_score(importance_type='weight'), with each element divided by the sum of all elements. – Anton Tarasenko Nov 22 '18 at 11:33
  • Which importance_type is equivalent to the sklearn.ensemble.GradientBoostingRegressor version of feature_importances_? My suspicion is total_gain. – Keith Jan 18 '19 at 18:49
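
A quick way to check the relationship described in the first comment above is to normalize the 'weight' scores yourself and compare them with feature_importances_. This is only a sketch on made-up data, and the exact behaviour depends on your xgboost version (newer releases may compute feature_importances_ from a different importance type):

import numpy as np
from xgboost import XGBClassifier

# Toy data just to have a fitted model to inspect
X = np.random.rand(100, 5)
y = (X[:, 0] + X[:, 3] > 1).astype(int)

model = XGBClassifier(n_estimators=10).fit(X, y)

# Normalize the 'weight' scores so they sum to 1
scores = model.get_booster().get_score(importance_type='weight')
total = sum(scores.values())
normalized = {feat: score / total for feat, score in scores.items()}

print(normalized)                  # keys like 'f0', 'f3' (only features used in splits)
print(model.feature_importances_)  # one value per input column; compare with the dict above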

Finally, I solved this issue with:

model.booster().get_score(importance_type='weight')
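
For the regression case in the question, the same idea works through the scikit-learn wrapper. A small sketch on made-up data (note that recent xgboost versions expose the underlying Booster via get_booster() rather than booster()):

import numpy as np
from xgboost import XGBRegressor

# Made-up regression data
X = np.random.rand(200, 4)
y = 3 * X[:, 0] + np.random.rand(200)

reg = XGBRegressor(n_estimators=20).fit(X, y)

# Per-feature split counts from the underlying Booster
print(reg.get_booster().get_score(importance_type='weight'))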

Simone

In the past, the scikit-learn wrappers XGBRegressor and XGBClassifier had to retrieve feature importance via model.booster().get_score(). I'm not sure from which version onward, but as of xgboost 0.71 we can access it directly with

model.feature_importances_
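
So, for the original question, something along these lines should work on xgboost >= 0.71 (a minimal sketch with made-up data):

import numpy as np
from xgboost import XGBRegressor

X = np.random.rand(200, 4)
y = 2 * X[:, 1] - X[:, 2] + np.random.rand(200)

reg = XGBRegressor(n_estimators=20).fit(X, y)

# One importance value per input column
print(reg.feature_importances_)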
byrony
  • I'm using from xgboost.sklearn import XGBRegressor in version 0.72.1 and this worked for me. Thanks! – Adam Jul 20 '18 at 17:57