Releases: JuliaAI/MLJTuning.jl

MLJTuning v0.7.3

01 Sep 22:10, commit bfa9723

Diff since v0.7.2

Merged pull requests:

  • Clean up tests and code to avoid private access of machine.report (#181) (@ablaom)
  • For a 0.7.3 release (#182) (@ablaom)

MLJTuning v0.7.2

17 Jun 02:50, commit f9ad59d

Diff since v0.7.1

Closed issues:

  • Checklist for 0.7.0 release (#152)

MLJTuning v0.7.1

24 May 08:45, commit d87788a

Diff since v0.7.0

  • (enhancement) Allow user to specify class weights when constructing TunedModel, for passing to measures (#172)
  • (convenience) Allow user to specify the model as an argument instead of a keyword, as in TunedModel(model, tuning=...) (#175)
  • Suppress some unnecessary "outer layer" caching of training data (#173)
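Taken together, the two interface changes above can be sketched as follows. This is a minimal illustration, not code from the release: the DecisionTreeClassifier, the max_depth range, and the "yes"/"no" class labels are all assumptions for the example.

```julia
using MLJ

# Assumes the DecisionTree.jl interface package is installed:
Tree = @load DecisionTreeClassifier pkg=DecisionTree

tree = Tree()
r = range(tree, :max_depth, lower=1, upper=10)

tuned = TunedModel(
    tree,                       # model now allowed as a positional argument (#175)
    tuning = RandomSearch(),
    range = r,
    measure = misclassification_rate,
    class_weights = Dict("yes" => 2.0, "no" => 1.0),  # forwarded to the measure (#172)
)
```

Previously the model had to be supplied as `TunedModel(model=tree, ...)`; both forms remain valid.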

Closed issues:

  • TunedModel interface doesn't have class_weights (#134)
  • Allow TunedModel(mymodel; kwargs...) in addition to TunedModel(model=mymodel; kwargs...) (#154)
  • Machines wrapping TunedModel instances should never cache data (#171)

Merged pull requests:

  • Add class weight support (#172) (@ablaom)
  • Suppress data caching in TunedModel machines, at outer level (#173) (@ablaom)
  • Allow user to specify model in TunedModel as arg instead of kwarg (#175) (@ablaom)
  • For a 0.7.1 release (#176) (@ablaom)

MLJTuning v0.7.0

06 Apr 05:14, commit ebf0983

Diff since v0.6.16

  • (breaking) Change default tuning strategy from Grid to RandomSearch() (#147)
  • (breaking) Remove deprecated learning_curve! method (#151)
  • (breaking) Adapt to serialization changes in MLJBase 0.20. In particular, this allows serialization of TunedModel(model=...) when model is not pure Julia (#165) (@olivierlabayle)
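A consequence of the first breaking change: a TunedModel constructed without an explicit tuning strategy now samples hyperparameters at random rather than searching a grid. A minimal sketch, in which the KNNClassifier, its K range, and the choice of n=25 iterations are illustrative assumptions:

```julia
using MLJ

# Assumes the NearestNeighborModels.jl interface package is installed:
Knn = @load KNNClassifier pkg=NearestNeighborModels

model = Knn()
r = range(model, :K, lower=1, upper=20)

# With no `tuning=...` keyword, the strategy is now RandomSearch(), not Grid():
tuned = TunedModel(model=model, range=r, measure=log_loss, n=25)
```

Code relying on the old Grid default should pass `tuning=Grid()` explicitly.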

Closed issues:

  • Make RandomSearch the default, instead of Grid (#147)
  • Remove learning_curve! which has been deprecated (#151)

Merged pull requests:

MLJTuning v0.6.16

29 Dec 01:20, commit 31f0255

Diff since v0.6.15

Closed issues:

  • Plotting results of tuning is not working for negative measurements (#162)

MLJTuning v0.6.15

23 Dec 20:07, commit 0b50093

Diff since v0.6.14

Closed issues:

  • Add warning in documentation about unpredictability of history order when using parallelization (#140)

Merged pull requests:

  • Use mean instead of average for UnivariateFinite in tests; extend MLJBase compat to include 0.19 (#160) (@ablaom)
  • For a 0.6.15 release (#161) (@ablaom)

MLJTuning v0.6.14

01 Nov 23:58, commit a9f5732

Diff since v0.6.13

Merged pull requests:

  • Fix invalid test tripped by julia 1.6.3 (reproducibility issue) (#156) (@ablaom)
  • Fix a confusing error when forgetting to instantiate models (#157) (@rikhuijzer)
  • Enable tuning of outlier detection models (#158) (@davnn)
  • For a 0.6.14 release (#159) (@ablaom)

MLJTuning v0.6.13

23 Sep 03:45, commit afcd719

Diff since v0.6.12

Closed issues:

  • Issue for triggering new releases (#59)
  • Add deprecation warning for learning_curve! (#60)
  • Add particle swarm strateg(ies) (#132)
  • Scores versus losses being addressed properly in TreeParzen? (#136)

MLJTuning v0.6.12

21 Sep 06:59, commit a47019b

Diff since v0.6.11

  • (enhancement) Allow specification of deterministic measures for probabilistic predictors (#148)
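A sketch of the enhancement above. The DecisionTreeClassifier and its min_samples_split range are assumptions for illustration; the point is that `accuracy`, a deterministic measure, may now be paired with a probabilistic predictor:

```julia
using MLJ

# Assumes the DecisionTree.jl interface package is installed:
Tree = @load DecisionTreeClassifier pkg=DecisionTree  # a probabilistic predictor

model = Tree()
r = range(model, :min_samples_split, lower=2, upper=10)

# `accuracy` is deterministic; pairing it with a probabilistic model
# is accepted as of this release (#148):
tuned = TunedModel(model=model, range=r, measure=accuracy)
```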

MLJTuning v0.6.11

10 Sep 03:28, commit a819b50

Diff since v0.6.10

Closed issues:

  • Non-thread safe use of resampling machines (#126)

Merged pull requests: