Describe the bug
The current `XGBoostLSS` implementation forces the default parameters to be passed to XGBoost downstream whenever `n_trials == 0`. The library's embedded hyperparameter tuning is nice, but it cannot be used for time series, because the k-fold shuffling breaks the serial dependence of the data. A user who wants to build a forecaster therefore has to set `n_trials == 0`, but is then unable to change the default parameters, which prevents tuning outside of the model in a more composable manner.
The solution could be to add an `xgb_params` argument to `__init__`, and then:

```python
if self.n_trials == 0:
    opt_params = self.xgb_params if self.xgb_params is not None else {}
```
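For context, a minimal sketch of how this fallback could sit inside the regressor; the constructor signature and the helper names used here (`_tune_hyperparameters`, `_train`) are hypothetical simplifications for illustration, not the actual implementation:

```python
class XGBoostLSS:
    # Hypothetical, simplified constructor: only the arguments relevant
    # to this proposal are shown.
    def __init__(self, n_trials=10, xgb_params=None):
        self.n_trials = n_trials
        self.xgb_params = xgb_params  # proposed: user-supplied XGBoost parameters

    def fit(self, X, y):
        if self.n_trials == 0:
            # Skip the embedded tuning and honour user-supplied parameters
            # instead of silently falling back to the hard-coded defaults.
            opt_params = self.xgb_params if self.xgb_params is not None else {}
        else:
            # Existing behaviour: embedded hyperparameter search.
            opt_params = self._tune_hyperparameters(X, y)
        self._booster = self._train(X, y, opt_params)
        return self
```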
After this change, the following pattern becomes possible:

```python
param_grid = {... xgb_params ...}
cv = ...
estimator = XGBoostLSS()
forecaster = DirectTabularRegressionForecaster(estimator=estimator)
tunable_forecaster = ForecastingGridSearchCV(forecaster, cv, param_grid)
```
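A hedged, fuller sketch of this composable-tuning pattern with sktime is below; the import paths, the splitter settings, the candidate parameter values, and the `XGBoostLSS(n_trials=0)` call are assumptions for illustration (they may differ by library version), and it relies on the proposed `xgb_params` being exposed through `get_params`/`set_params` so that the grid search can reach it as `estimator__xgb_params`:

```python
from sktime.forecasting.compose import DirectTabularRegressionForecaster
from sktime.forecasting.model_selection import (
    ExpandingWindowSplitter,
    ForecastingGridSearchCV,
)

# Candidate XGBoost parameter sets, tuned outside the model (hypothetical values).
param_grid = {
    "estimator__xgb_params": [
        {"eta": 0.05, "max_depth": 3},
        {"eta": 0.10, "max_depth": 5},
    ],
}

# Time-series-aware splitter instead of a shuffled k-fold.
cv = ExpandingWindowSplitter(initial_window=100, step_length=10, fh=[1, 2, 3])

estimator = XGBoostLSS(n_trials=0)  # disable the embedded tuning
forecaster = DirectTabularRegressionForecaster(estimator=estimator)
tunable_forecaster = ForecastingGridSearchCV(forecaster, cv=cv, param_grid=param_grid)
```

Tuning `xgb_params` at the forecaster level keeps the data splitting time-series-aware (expanding or sliding windows) while leaving the XGBoostLSS internals untouched.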