hyperparamter_search not running over all folds? #19

Open
sebastianrosengren opened this issue Sep 1, 2021 · 2 comments
sebastianrosengren commented Sep 1, 2021

The for loop only iterates over 1:1, i.e. a single fold:

for (i in 1:1) {
  dtrain <- xgboost::xgb.DMatrix(train_feat[[i]]$data)
  attr(dtrain, "errors") <- train_feat[[i]]$errors

  bst <- xgboost::xgb.train(param, dtrain, nrounds)
  preds <- M4metalearning::predict_selection_ensemble(bst, test_feat[[i]]$data)
  er <- M4metalearning::summary_performance(preds,
                                            test_ds[[i]],
                                            print.summary = FALSE)

  final_error <- c(final_error, er$weighted_error)
  final_preds <- rbind(final_preds, preds)
}

Should it maybe be 1:length(folds)?

@pmontman (Collaborator)

pmontman commented Sep 2, 2021

Hi Sebastian, you are right, the loop could run over all the folds.
It is done this way for computational reasons: it performs a holdout validation (so just an approximation anyway) that runs inside a Bayesian optimization loop. If you want to find a better solution, you can also increase the number of Bayesian optimization rounds instead.
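For anyone who does want to average over all folds despite the extra cost, a minimal sketch of the modified loop (assuming the same `train_feat`, `test_feat`, `test_ds`, `param`, and `nrounds` objects as in the snippet above; untested, and the mean-of-fold-errors objective is just one reasonable choice):

```r
# Sketch: evaluate on every fold instead of only the first one.
# Assumes train_feat, test_feat, test_ds, param and nrounds exist
# as in the original snippet; this runs inside the Bayesian
# optimization objective, so it multiplies the cost by the fold count.
final_error <- c()
final_preds <- NULL

for (i in seq_along(train_feat)) {  # i.e. 1:length(train_feat)
  dtrain <- xgboost::xgb.DMatrix(train_feat[[i]]$data)
  attr(dtrain, "errors") <- train_feat[[i]]$errors

  bst <- xgboost::xgb.train(param, dtrain, nrounds)
  preds <- M4metalearning::predict_selection_ensemble(bst, test_feat[[i]]$data)
  er <- M4metalearning::summary_performance(preds,
                                            test_ds[[i]],
                                            print.summary = FALSE)

  final_error <- c(final_error, er$weighted_error)
  final_preds <- rbind(final_preds, preds)
}

# Average error across folds as the objective value
mean_cv_error <- mean(final_error)
```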

@sebastianrosengren
Author

Hi,

Sorry for the late reply. Thanks for taking the time to clarify this.

Much appreciated!
