Commit

release: 0.19.1 (#401)
* release: 0.19.1

* chore: update readme

* chore: remove remotes

* fix: cran omp thread limit

* fix: remove xgboost tests
be-marc authored Nov 20, 2023
1 parent 9fd3f4b commit a10c9a4
Showing 6 changed files with 28 additions and 24 deletions.
10 changes: 4 additions & 6 deletions DESCRIPTION
@@ -1,6 +1,6 @@
 Package: mlr3tuning
 Title: Hyperparameter Optimization for 'mlr3'
-Version: 0.19.0.9000
+Version: 0.19.1
 Authors@R: c(
     person("Marc", "Becker", , "[email protected]", role = c("cre", "aut"),
       comment = c(ORCID = "0000-0002-8115-0400")),
@@ -25,15 +25,15 @@ License: LGPL-3
 URL: https://mlr3tuning.mlr-org.com, https://github.com/mlr-org/mlr3tuning
 BugReports: https://github.com/mlr-org/mlr3tuning/issues
 Depends:
-    mlr3 (>= 0.14.1),
+    mlr3 (>= 0.17.0),
     paradox (>= 0.10.0),
     R (>= 3.1.0)
 Imports:
-    bbotk (>= 0.7.2),
+    bbotk (>= 0.7.3),
     checkmate (>= 2.0.0),
     data.table,
     lgr,
-    mlr3misc (>= 0.11.0),
+    mlr3misc (>= 0.13.0),
     R6
 Suggests:
     adagio,
@@ -47,8 +47,6 @@ Suggests:
     rpart,
     testthat (>= 3.0.0),
     xgboost
-Remotes:
-    mlr-org/mlr3
 Config/testthat/edition: 3
 Config/testthat/parallel: true
 Encoding: UTF-8
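The version bumps above (mlr3, bbotk, mlr3misc) can be sanity-checked from an R session. A minimal sketch, hypothetical user code rather than anything shipped in this commit:

```r
# Confirm the installed stack meets the new minimums from DESCRIPTION
stopifnot(
  packageVersion("mlr3")     >= "0.17.0",
  packageVersion("bbotk")    >= "0.7.3",
  packageVersion("mlr3misc") >= "0.13.0"
)
```

`packageVersion()` returns a `package_version` object, so the comparisons above are version-aware rather than lexicographic.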
3 changes: 2 additions & 1 deletion NEWS.md
@@ -1,5 +1,6 @@
-# mlr3tuning (development version)
+# mlr3tuning 0.19.1

 * refactor: Speed up the tuning process by minimizing the number of deep clones and parameter checks.
 * fix: Set `store_benchmark_result = TRUE` if `store_models = TRUE` when creating a tuning instance.
 * fix: Passing a terminator in `tune_nested()` did not work.
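The `store_benchmark_result` fix in the changelog can be exercised when constructing a tuning instance. A minimal sketch assuming the `ti()` shorthand and an rpart learner on the sonar task (illustrative choices, not taken from this commit):

```r
library(mlr3)
library(mlr3tuning)

instance = ti(
  task = tsk("sonar"),
  learner = lrn("classif.rpart", cp = to_tune(1e-4, 1e-1, logscale = TRUE)),
  resampling = rsmp("cv", folds = 3),
  measures = msr("classif.ce"),
  terminator = trm("evals", n_evals = 5),
  store_models = TRUE  # per the fix, this now also enables store_benchmark_result
)
```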
3 changes: 3 additions & 0 deletions R/zzz.R
@@ -8,6 +8,9 @@
 "_PACKAGE"

 .onLoad = function(libname, pkgname) {
+  # CRAN OMP THREAD LIMIT
+  Sys.setenv("OMP_THREAD_LIMIT" = 2)
+
   # nocov start
   x = utils::getFromNamespace("mlr_reflections", ns = "mlr3")
   x$tuner_properties = "dependencies"
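The `Sys.setenv()` call above caps OpenMP threading at load time, matching CRAN's policy of using at most two cores during checks. A hedged sketch of how a user could inspect or relax the cap in their own session (the override is hypothetical user code, not part of the package):

```r
library(mlr3tuning)

Sys.getenv("OMP_THREAD_LIMIT")  # "2" once .onLoad has run

# Outside of CRAN checks, the cap can be raised again, e.g. to all cores:
Sys.setenv(OMP_THREAD_LIMIT = parallel::detectCores())
```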
32 changes: 16 additions & 16 deletions README.md
@@ -1,7 +1,7 @@
 
 # mlr3tuning <img src="man/figures/logo.png" align="right" width = "120" />
 
-Package website: [release](https://mlr3tuning.mlr-org.com/) |
+Package website: [release](https://mlr3tuning.mlr-org.com/) \|
 [dev](https://mlr3tuning.mlr-org.com/dev/)
 
 <!-- badges: start -->
@@ -34,49 +34,49 @@ The package is built on the optimization framework
 
 mlr3tuning is extended by the following packages.
 
-- [mlr3tuningspaces](https://github.com/mlr-org/mlr3tuningspaces) is a
+- [mlr3tuningspaces](https://github.com/mlr-org/mlr3tuningspaces) is a
   collection of search spaces from scientific articles for commonly
   used learners.
-- [mlr3hyperband](https://github.com/mlr-org/mlr3hyperband) adds the
+- [mlr3hyperband](https://github.com/mlr-org/mlr3hyperband) adds the
   Hyperband and Successive Halving algorithm.
-- [mlr3mbo](https://github.com/mlr-org/mlr3mbo) adds Bayesian
+- [mlr3mbo](https://github.com/mlr-org/mlr3mbo) adds Bayesian
   Optimization methods.
 
 ## Resources
 
 There are several sections about hyperparameter optimization in the
 [mlr3book](https://mlr3book.mlr-org.com).
 
-- Getting started with [hyperparameter
+- Getting started with [hyperparameter
   optimization](https://mlr3book.mlr-org.com/chapters/chapter4/hyperparameter_optimization.html).
-- [Tune](https://mlr3book.mlr-org.com/chapters/chapter4/hyperparameter_optimization.html#sec-model-tuning)
+- [Tune](https://mlr3book.mlr-org.com/chapters/chapter4/hyperparameter_optimization.html#sec-model-tuning)
   a support vector machine on the Sonar data set.
-- Learn about [tuning
+- Learn about [tuning
   spaces](https://mlr3book.mlr-org.com/chapters/chapter4/hyperparameter_optimization.html#sec-defining-search-spaces).
-- Estimate the model performance with [nested
+- Estimate the model performance with [nested
   resampling](https://mlr3book.mlr-org.com/chapters/chapter4/hyperparameter_optimization.html#sec-nested-resampling).
-- Learn about [multi-objective
+- Learn about [multi-objective
   optimization](https://mlr3book.mlr-org.com/chapters/chapter5/advanced_tuning_methods_and_black_box_optimization.html#sec-multi-metrics-tuning).
 
 The [gallery](https://mlr-org.com/gallery-all-optimization.html)
 features a collection of case studies and demos about optimization.
 
-- Learn more advanced methods with the [Practical Tuning
+- Learn more advanced methods with the [Practical Tuning
   Series](https://mlr-org.com/gallery/series/2021-03-09-practical-tuning-series-tune-a-support-vector-machine/).
-- Optimize an rpart classification tree with only a [few lines of
+- Optimize an rpart classification tree with only a [few lines of
   code](https://mlr-org.com/gallery/optimization/2022-11-10-hyperparameter-optimization-on-the-palmer-penguins/).
-- Simultaneously optimize hyperparameters and use [early
+- Simultaneously optimize hyperparameters and use [early
   stopping](https://mlr-org.com/gallery/optimization/2022-11-04-early-stopping-with-xgboost/)
   with XGBoost.
-- Make us of proven [search
+- Make us of proven [search
   space](https://mlr-org.com/gallery/optimization/2021-07-06-introduction-to-mlr3tuningspaces/).
-- Learn about
+- Learn about
   [hotstarting](https://mlr-org.com/gallery/optimization/2023-01-16-hotstart/)
   models.
-- Run the [default hyperparameter
+- Run the [default hyperparameter
   configuration](https://mlr-org.com/gallery/optimization/2023-01-31-default-configuration/)
   of learners as a baseline.
-- Use the
+- Use the
   [Hyperband](https://mlr-org.com/gallery/series/2023-01-15-hyperband-xgboost/)
   optimizer with different budget parameters.
3 changes: 2 additions & 1 deletion inst/WORDLIST
@@ -52,16 +52,17 @@ hotstart
 hotstarting
 hyperband
 irace
 iteratively
 mbo
 mlr
 nloptr
 optimizers
 parallelization
 parallelize
 param
 parametrized
 rpart
 subclasses
 svm
 th
 trafo
 tuningspaces
1 change: 1 addition & 0 deletions tests/testthat/test_mlr_callbacks.R
@@ -1,4 +1,5 @@
 test_that("early stopping callback works", {
+  skip_on_cran()
   skip_if_not_installed("mlr3learners")
   skip_if_not_installed("xgboost")
   library(mlr3learners) # nolint
