[BUG] XGBoostLSS wrapper should expose the underlying XGBoost params #671

@marrov

Description

Describe the bug
The current XGBoostLSS implementation forces the default parameters to be passed downstream to XGBoost whenever n_trials == 0. The original library ships with embedded hyperparameter tuning, which is nice, but it cannot be used for time series: its k-fold shuffling breaks the serial dependence of the data. A user building a forecaster therefore has to set n_trials == 0, but is then stuck with the default parameters and cannot perform tuning outside of the model in a more composable manner.

A solution could be to add an xgb_params argument to __init__, and then:

        if self.n_trials == 0:
            # fall back to user-supplied params instead of the hard-coded defaults
            opt_params = self.xgb_params if self.xgb_params is not None else {}
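For concreteness, here is a minimal sketch of what the changed wrapper could look like. The class body is abbreviated, the default n_trials value and the _tune helper are hypothetical; only the n_trials == 0 branch reflects the actual proposal:

    class XGBoostLSS:
        def __init__(self, n_trials=100, xgb_params=None):
            # xgb_params is the proposed new argument: a dict of XGBoost
            # parameters used verbatim when the embedded tuning is disabled
            self.n_trials = n_trials
            self.xgb_params = xgb_params

        def _fit(self, X, y):
            if self.n_trials == 0:
                # skip the embedded tuning and honour user-supplied parameters
                opt_params = self.xgb_params if self.xgb_params is not None else {}
            else:
                # embedded tuning path of the current implementation (abbreviated,
                # _tune is a hypothetical name for it)
                opt_params = self._tune(X, y)
            ...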

After this change, the following pattern becomes possible:

param_grid = {... xgb_params ...}
cv = ...
estimator = XGBoostLSS(n_trials=0)  # disable the embedded tuning
forecaster = DirectTabularRegressionForecaster(estimator=estimator)
tunable_forecaster = ForecastingGridSearchCV(forecaster, cv, param_grid)
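
Filled out with illustrative values, that pattern could look as follows. This is a sketch, not confirmed API: the sktime import paths, the "estimator__xgb_params" key (sklearn-style nested parameter naming), and all grid and splitter values are assumptions, and the XGBoostLSS import is omitted since it is the wrapper under discussion:

    from sktime.forecasting.compose import DirectTabularRegressionForecaster
    from sktime.forecasting.model_selection import (
        ExpandingWindowSplitter,
        ForecastingGridSearchCV,
    )

    # grid over the proposed xgb_params argument; values are illustrative
    param_grid = {
        "estimator__xgb_params": [
            {"max_depth": 3, "eta": 0.1},
            {"max_depth": 6, "eta": 0.05},
        ],
    }

    # time-series-aware splitter: expanding window, no shuffling
    cv = ExpandingWindowSplitter(fh=[1, 2, 3], initial_window=48, step_length=12)

    estimator = XGBoostLSS(n_trials=0)  # disable the embedded tuning
    forecaster = DirectTabularRegressionForecaster(estimator=estimator)
    tunable_forecaster = ForecastingGridSearchCV(forecaster, cv, param_grid)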
