Describe the feature and the current state.
In the XGBSEDebiasedBCE model, there is currently no way to replace the internal model with an already-trained one. This would be useful because XGBSE does not expose some of XGBoost's low-level APIs, for example sample/feature weighting. With this feature, a user could supply the data together with an already-trained XGBoost model and then fit the LR step on top of that model.
Will this change a current behavior? How?
No. It would only add a method to the XGBSEDebiasedBCE model that replaces `self.bst` and runs the multi-task LR step on top of it.
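A minimal sketch of the workflow this request has in mind, assuming xgbse's `convert_to_structured` helper and standard xgboost APIs; the `replace_booster` method is hypothetical and only illustrates the proposed addition, it does not exist in the current release:

```python
import numpy as np
import xgboost as xgb
from xgbse import XGBSEDebiasedBCE
from xgbse.converters import convert_to_structured

# Toy survival data: features, observed times, and event indicators.
X = np.random.rand(200, 5)
times = np.random.randint(1, 100, size=200)
events = np.random.randint(0, 2, size=200).astype(bool)
y = convert_to_structured(times, events)

# Train the first-step XGBoost model externally, so that low-level
# options such as per-row sample weights can be used (negative labels
# mark right-censored rows under the survival:cox convention).
cox_labels = np.where(events, times, -times)
dtrain = xgb.DMatrix(X, label=cox_labels, weight=np.random.rand(200))
bst = xgb.train({"objective": "survival:cox"}, dtrain, num_boost_round=50)

# Hypothetical API from this feature request: hand the pre-trained
# booster to XGBSE and fit only the multi-task logistic regressions.
model = XGBSEDebiasedBCE()
model.replace_booster(bst, X, y)  # hypothetical method, not in the current release
preds = model.predict(X)
```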
Thanks for the report, @BardiaKh.
We are currently working on a way to replace the first-step XGBoost model in both the XGBSEDebiasedBCE and XGBSEStackedWeibull modules. Both will be part of our next minor release.