Just what the title says. Suppose I know the feature that I want to be used for splitting for the root node in a tree model in XGBoost; is there a way for me to tell XGBoost to split on this feature at the root node?
- I don't know the answer to your question, but I'm curious why you want to do this. – tom Oct 22 '19 at 20:42
- @tom I have a categorical feature that can take one of several values, and I'm considering whether to build a single model and let XGBoost handle this feature on its own, or to build a separate model for each possible value of the feature. The latter option is tedious and time-consuming, so if I can tell XGBoost to always split on that feature first, that would be an easy way of testing the "many different models" case. – bgav Oct 22 '19 at 21:18
3 Answers
I do not have the whole answer, but I believe that xgboost supports warm-starting, i.e. continuing training from a previously fitted model:
    xgb.train(..., xgb_model=model_to_start_from)
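A minimal sketch of what that looks like, assuming a regression task; the synthetic data and the number of boosting rounds here are illustrative, not from the question:

    import numpy as np
    import xgboost as xgb

    # Illustrative synthetic data, not from the question.
    rng = np.random.default_rng(42)
    X = rng.normal(size=(500, 5))
    y = X[:, 0] + rng.normal(scale=0.1, size=500)
    dtrain = xgb.DMatrix(X, label=y)

    params = {"objective": "reg:squarederror", "max_depth": 3}

    # Fit an initial booster, then continue boosting from it.
    model_to_start_from = xgb.train(params, dtrain, num_boost_round=10)
    continued = xgb.train(params, dtrain, num_boost_round=10,
                          xgb_model=model_to_start_from)

(As the comment below notes, this appends new trees to the existing ensemble; it does not let you control the root split of those trees.)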

Stephen Rauch

Martin Alley
- Warm starting in xgb, AFAIK, only works for adding more base learners, not extending existing ones. – Ben Reiniger Dec 18 '19 at 14:09
It's hard to prove a negative, but I think the answer is no. Compared with other tree-model packages, xgboost makes it rather hard even to access the base trees, let alone to modify how they are built.
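For what it's worth, the fitted trees can at least be inspected after the fact, so you can check which feature each root actually used, even if you cannot dictate it. A sketch on a toy model (the data is illustrative; `trees_to_dataframe()` requires pandas):

    import numpy as np
    import xgboost as xgb

    # Illustrative toy model.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 4))
    y = (X[:, 0] > 0).astype(float)
    booster = xgb.train({"objective": "reg:squarederror"},
                        xgb.DMatrix(X, label=y), num_boost_round=5)

    # One row per node; Node 0 is the root of each tree.
    df = booster.trees_to_dataframe()
    roots = df[df["Node"] == 0]
    print(roots[["Tree", "Feature", "Split"]])

This is strictly read-only: there is no supported way to feed a modified tree structure back into training.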

Ben Reiniger
Could you enforce this using feature interaction constraints? Maybe create two sets of features with only that one feature in common? I'm not sure it would guarantee that the shared feature is the one on which the first split is made. A sketch of the construction follows below.
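XGBoost exposes this as the `interaction_constraints` parameter, where each group lists features that are allowed to appear together along a branch. A sketch of the suggested construction, with feature 0 standing in for the shared feature (the data and the grouping are illustrative):

    import numpy as np
    import xgboost as xgb

    # Illustrative data; feature 0 stands in for the shared feature.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 5))
    y = X[:, 0] * X[:, 1] + rng.normal(scale=0.1, size=500)
    dtrain = xgb.DMatrix(X, label=y)

    params = {
        "objective": "reg:squarederror",
        "tree_method": "hist",  # interaction constraints need a supported tree method
        # Two groups whose only common member is feature 0; splits along
        # any single branch must stay within one group.
        "interaction_constraints": "[[0, 1, 2], [0, 3, 4]]",
    }
    booster = xgb.train(params, dtrain, num_boost_round=10)

As noted above, this only limits which features may co-occur on a path; it does not force the root split to use feature 0.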

Just trying