
Releases: sktime/pytorch-forecasting

Adding DeepAR

10 Nov 12:27
278daa0

Added

  • DeepAR by Amazon (#115)
    • First autoregressive model in PyTorch Forecasting
    • Distribution losses: normal, negative binomial, and log-normal distributions
    • Currently missing: handling of lagged variables and a tutorial (planned for 0.6.1)
  • Improved documentation on TimeSeriesDataSet and how to implement a new network (#145)
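DeepAR trains against a distribution loss rather than a point loss. As a minimal pure-Python sketch of the idea (not the library's implementation), a normal distribution loss scores the negative log-likelihood of the observed target under the predicted mean and scale:

```python
import math

def normal_nll(mean: float, sigma: float, target: float) -> float:
    # Negative log-likelihood of `target` under N(mean, sigma^2).
    # A distribution loss trains the network to output distribution
    # parameters (here mean and sigma) instead of a point forecast.
    return 0.5 * math.log(2 * math.pi * sigma**2) + (target - mean) ** 2 / (2 * sigma**2)

# A forecast whose mean is closer to the realised target scores a lower loss.
close = normal_nll(mean=10.0, sigma=1.0, target=10.2)
far = normal_nll(mean=10.0, sigma=1.0, target=13.0)
```

The same scheme extends to the negative binomial and log-normal cases by swapping in the corresponding log-density.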

Changed

  • Internals of encoders and how they store center and scale (#115)

Fixed

  • Update to PyTorch 1.7 and PyTorch Lightning 1.0.5, which came with breaking changes for CUDA handling and for optimizers (the PyTorch Forecasting Ranger version) (#143, #137, #115)

Contributors

  • jdb78
  • JakeForesey

Bug fixes

31 Oct 08:29
bfab49a

Fixes

  • Fix issue where hyperparameter verbosity controlled only part of the output (#118)
  • Fix occasional failure of .get_parameters() on a TimeSeriesDataSet (#117)
  • Remove redundant double pass through the LSTM in the temporal fusion transformer (#125)
  • Prevent installation of pytorch-lightning 1.0.4 as it breaks the code (#127)
  • Prevent modification of model defaults in-place (#112)

Fixes to interpretation and more control over hyperparameter verbosity

18 Oct 06:39
aa4f0d9

Added

  • Hyperparameter tuning with optuna to tutorial
  • Control over the verbosity of hyperparameter tuning

Fixes

  • Interpretation error when different batches had different maximum decoder lengths
  • Fix some typos (no changes to user API)

PyTorch Lightning 1.0 compatibility

14 Oct 05:04
5d5adcf

This release has only one purpose: allow usage of PyTorch Lightning 1.0. All tests have passed.

PyTorch Lightning 0.10 compatibility and classification

12 Oct 06:44
599047e

Added

  • Additional checks for TimeSeriesDataSet inputs: now flagging if series are lost due to a high min_encoder_length, and ensuring parameters are integers
  • Enable classification: simply change the target in the TimeSeriesDataSet to a non-float variable, use the CrossEntropy metric to optimize, and output as many classes as you want to predict
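To illustrate the classification objective with a hedged sketch in plain Python (not the library's CrossEntropy metric itself), cross-entropy is the negative log-probability that the model assigns to the true class:

```python
import math

def cross_entropy(probs: list, target_class: int) -> float:
    # Negative log-probability assigned to the true class; minimising it
    # pushes probability mass onto the correct label.
    return -math.log(probs[target_class])

# A confident, correct prediction is penalised less than an unsure one.
confident = cross_entropy([0.1, 0.8, 0.1], target_class=1)
unsure = cross_entropy([0.4, 0.3, 0.3], target_class=1)
```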

Changed

  • Ensured PyTorch Lightning 0.10 compatibility
    • Using LearningRateMonitor instead of LearningRateLogger
    • Use EarlyStopping callback in trainer callbacks instead of early_stopping argument
    • Update metric system update() and compute() methods
    • Use trainer.tuner.lr_find() instead of trainer.lr_find() in tutorials and examples
  • Update poetry to 1.1.0

Various fixes to models and data

01 Oct 08:58
e97189f

Fixes

Model

  • Removed attention to the current datapoint in the TFT decoder to generalise better over various sequence lengths
  • Allow resuming an optuna hyperparameter tuning study

Data

  • Fixed inconsistent naming and calculation of encoder_length in TimeSeriesDataSet when added as a feature

Contributors

  • jdb78

Metrics, performance, and subsequence detection

28 Sep 19:58
8c7277a

Added

Models

  • Backcast loss for N-BEATS network for better regularisation
  • logging_metrics as explicit arguments to models

Metrics

  • MASE (Mean absolute scaled error) metric for training and reporting
  • Metrics can be composed, e.g. 0.3 * metric1 + 0.7 * metric2
  • Aggregation metric that is computed on the mean prediction over all samples to reduce mean bias
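To make the metric additions concrete, here is a hedged pure-Python sketch of MASE and of weighted metric composition (the library implements these as metric classes; the function names here are illustrative):

```python
def mase(y_true, y_pred, y_train, m=1):
    # Mean absolute scaled error: forecast MAE divided by the in-sample MAE
    # of a naive lag-m forecast. Values below 1 beat the naive baseline.
    mae = sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)
    naive_mae = sum(
        abs(y_train[i] - y_train[i - m]) for i in range(m, len(y_train))
    ) / (len(y_train) - m)
    return mae / naive_mae

def compose(metric1, metric2, w=0.3):
    # Weighted combination, mirroring expressions like 0.3 * metric1 + 0.7 * metric2.
    return lambda y_true, y_pred: w * metric1(y_true, y_pred) + (1 - w) * metric2(y_true, y_pred)
```

A perfect forecast yields a MASE of 0, and a forecast no better than the naive lag-1 model yields a MASE of 1.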

Data

  • Increased speed of parsing data with missing datapoints: about 2 s per 1M data points, or 0.2 s if numba is installed
  • Time-synchronize samples in batches: ensure that all samples in each batch have the same time index in the decoder

Breaking changes

  • Improved subsequence detection in TimeSeriesDataSet ensures that there exists a subsequence starting and ending on each point in time.
  • Fix min_encoder_length = 0 being ignored and processed as min_encoder_length = max_encoder_length
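The subsequence guarantee can be pictured with a small enumeration sketch (a hypothetical helper, not the TimeSeriesDataSet internals): with a minimum window length of 1, every time point both starts and ends at least one window.

```python
def windows(n, min_len, max_len):
    # All half-open (start, stop) windows over range(n) whose length
    # lies within [min_len, max_len].
    return [
        (start, start + length)
        for start in range(n)
        for length in range(min_len, max_len + 1)
        if start + length <= n
    ]

# With min_len=1, each of the 5 time points starts and ends some window.
w = windows(5, 1, 3)
```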

Contributors

  • jdb78
  • dehoyosb

More tests and better docs

13 Sep 22:43
5e7c808
  • More tests driving coverage to ~90%
  • Performance tweaks for temporal fusion transformer
  • Reformatting with isort
  • Improve documentation, particularly expanding on hyperparameter tuning

Fixes

  • Fix PoissonLoss quantile calculation
  • Fix N-BEATS visualisations

More testing and interpretation features

02 Sep 22:18
6843748

Added

  • Calculating partial dependence for a variable
  • Improved documentation - in particular added FAQ section and improved tutorial
  • Data for examples and tutorials can now be downloaded; cloning the repo is no longer required
  • Added Ranger Optimizer from pytorch_ranger package and fixed its warnings (part of preparations for conda package release)
  • Use GPU for tests if available as part of preparation for GPU tests in CI
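Partial dependence for a variable can be sketched as follows (a hedged illustration; `predict`, `rows`, and the feature names are hypothetical, not the library's interpretation API): sweep one feature over a grid while leaving all other features at their observed values, then average the predictions at each grid point.

```python
def partial_dependence(predict, rows, feature, values):
    # For each candidate value, override `feature` in every observed row,
    # predict, and average: the resulting curve shows the marginal effect
    # of that one feature on the model output.
    out = []
    for v in values:
        preds = [predict({**row, feature: v}) for row in rows]
        out.append(sum(preds) / len(preds))
    return out
```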

Changes

  • BREAKING: Fix typo “add_decoder_length” to “add_encoder_length” in TimeSeriesDataSet

Bugfixes

  • Fix plotting of predictions vs. actuals by slicing variables

Fix edge case in prediction logging

26 Aug 21:03
3c517d6

Fixes

Fix a bug where predictions were not correctly logged when decoder_length == 1.

Additions

Add favicon to docs page