How can we get feature importances when performing regression with XGBRegressor()? Is there something like XGBClassifier().feature_importances_?
from xgboost import XGBClassifier

# The estimator must be instantiated before fitting
model = XGBClassifier().fit(X, y)

# importance_type options: 'weight', 'gain', 'cover', 'total_gain', 'total_cover'
model.get_booster().get_score(importance_type='weight')
However, the attribute below also returns feature importances, and its values differ from every "importance_type" option in the method above. This was raised in this GitHub issue, but there is no answer [as of Jan 2019].

model.feature_importances_
feature_importances_ equals the values in the dict get_score(importance_type='weight'), where each element is divided by the sum of elements. – Anton Tarasenko Nov 22 '18 at 11:33
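(A minimal sketch to check Anton Tarasenko's point empirically; the synthetic dataset is a placeholder, and it assumes the 0.7x-era behavior where feature_importances_ is weight-based. Newer releases default to 'gain', so the two outputs can legitimately differ there.)

from sklearn.datasets import make_classification
from xgboost import XGBClassifier

X, y = make_classification(n_samples=200, n_features=5, random_state=0)
model = XGBClassifier().fit(X, y)

# Raw split counts per feature; features never used in a split are omitted here
scores = model.get_booster().get_score(importance_type='weight')

# Normalize so the values sum to 1, matching the claim in the comment above
total = sum(scores.values())
print({k: v / total for k, v in scores.items()})

# Should show the same values, as an array ordered f0..f4 (zeros for unused features)
print(model.feature_importances_)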
Finally I solved this issue by calling model.get_booster().get_score(importance_type='weight') instead of model.booster().get_score(importance_type='weight'). – Sndn Jan 26 '19 at 06:46
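(If your code has to run across both old and new releases, a small compatibility shim like the one below should work; get_weight_scores is a hypothetical helper name, and the exact version where booster() became get_booster() isn't stated in this thread.)

def get_weight_scores(model):
    # Newer xgboost exposes get_booster(); older releases used booster()
    try:
        booster = model.get_booster()
    except AttributeError:
        booster = model.booster()
    return booster.get_score(importance_type='weight')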
In the past, the scikit-learn wrappers XGBRegressor and XGBClassifier got the feature importances via model.booster().get_score(). I'm not sure from which version exactly, but as of xgboost 0.71 we can access it using

model.feature_importances_
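For example, a minimal regression sketch (assuming xgboost >= 0.71; the synthetic dataset is just a placeholder):

from sklearn.datasets import make_regression
from xgboost import XGBRegressor

X, y = make_regression(n_samples=200, n_features=5, random_state=0)
model = XGBRegressor().fit(X, y)

# One importance value per input column, in column order
print(model.feature_importances_)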
I used from xgboost.sklearn import XGBRegressor in version 0.72.1 and this worked for me. Thanks! – Adam Jul 20 '18 at 17:57
I can't use feature_importances_ with XGBRegressor(), because it works only with XGBClassifier(). – Simone Jun 21 '17 at 16:46