Releases: sktime/pytorch-forecasting
Adding DeepAR
Added
- DeepAR by Amazon (#115)
- First autoregressive model in PyTorch Forecasting
- Distribution loss: normal, negative binomial and log-normal distributions
- Currently missing: handling lag variables and tutorial (planned for 0.6.1)
- Improved documentation on TimeSeriesDataSet and how to implement a new network (#145)
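The normal variant of DeepAR's distribution loss minimizes the negative log-likelihood of each observation under a predicted Normal(mu, sigma). A minimal plain-Python sketch of that quantity (hypothetical helper for illustration; the library's `NormalDistributionLoss` operates on batched tensors, not scalars):

```python
import math

def normal_nll(y, mu, sigma):
    """Negative log-likelihood of observation y under Normal(mu, sigma).

    Illustrative scalar version of the normal distribution loss that an
    autoregressive model like DeepAR minimizes per time step.
    """
    return 0.5 * math.log(2 * math.pi * sigma ** 2) + (y - mu) ** 2 / (2 * sigma ** 2)
```

Averaged over all time steps and samples, this term forms the training objective; the negative binomial and log-normal variants swap in the corresponding log-densities.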
Changed
- Internals of encoders and how they store center and scale (#115)
Fixed
- Update to PyTorch 1.7 and PyTorch Lightning 1.0.5 which came with breaking changes for CUDA handling and with optimizers (PyTorch Forecasting Ranger version) (#143, #137, #115)
Contributors
- jdb78
- JakeForesey
Bug fixes
Fixes
- Fix issue where hyperparameter verbosity controlled only part of output (#118)
- Fix occasional error when `.get_parameters()` from `TimeSeriesDataSet` failed (#117)
- Remove redundant double pass through LSTM for temporal fusion transformer (#125)
- Prevent installation of pytorch-lightning 1.0.4 as it breaks the code (#127)
- Prevent modification of model defaults in-place (#112)
Fixes to interpretation and more control over hyperparameter verbosity
Added
- Hyperparameter tuning with optuna to tutorial
- Control over verbosity of hyperparameter tuning
Fixes
- Interpretation error when different batches had different maximum decoder lengths
- Fix some typos (no changes to user API)
PyTorch Lightning 1.0 compatibility
This release has only one purpose: Allow usage of PyTorch Lightning 1.0 - all tests have passed.
PyTorch Lightning 0.10 compatibility and classification
Added
- Additional checks for `TimeSeriesDataSet` inputs - now flagging if series are lost due to high `min_encoder_length` and ensure parameters are integers
- Enable classification - simply change the target in the `TimeSeriesDataSet` to a non-float variable, use the `CrossEntropy` metric to optimize and output as many classes as you want to predict
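For classification, the optimization target is the cross-entropy between predicted class logits and the true class. A minimal scalar sketch of that loss (hypothetical helper; the library's `CrossEntropy` metric works on batched tensors):

```python
import math

def cross_entropy(logits, target_class):
    """Cross-entropy for one prediction: -log(softmax(logits)[target_class])."""
    m = max(logits)  # subtract the max for numerical stability
    log_sum_exp = m + math.log(sum(math.exp(l - m) for l in logits))
    return log_sum_exp - logits[target_class]
```

The loss is near zero when the logit of the true class dominates and grows as probability mass shifts to wrong classes.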
Changed
- Ensured PyTorch Lightning 0.10 compatibility
- Using `LearningRateMonitor` instead of `LearningRateLogger`
- Use `EarlyStopping` callback in trainer `callbacks` instead of `early_stopping` argument
- Update metric system `update()` and `compute()` methods
- Use `trainer.tuner.lr_find()` instead of `trainer.lr_find()` in tutorials and examples
- Update poetry to 1.1.0
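For users migrating their own training scripts, the Lightning 0.10 changes above translate roughly to the following setup (a hedged sketch; exact import paths can differ between Lightning versions):

```python
# PyTorch Lightning 0.10-era trainer setup (illustrative sketch)
from pytorch_lightning import Trainer
from pytorch_lightning.callbacks import EarlyStopping, LearningRateMonitor  # replaces LearningRateLogger

early_stopping = EarlyStopping(monitor="val_loss", patience=10)
lr_monitor = LearningRateMonitor()

# the callbacks list replaces the removed early_stopping trainer argument
trainer = Trainer(max_epochs=30, callbacks=[early_stopping, lr_monitor])

# the learning-rate finder moved under trainer.tuner:
# res = trainer.tuner.lr_find(model, train_dataloader=train_dataloader)
```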
Various fixes models and data
Fixes
Model
- Removed attention to current datapoint in TFT decoder to generalise better over various sequence lengths
- Allow resuming optuna hyperparameter tuning study
Data
- Fixed inconsistent naming and calculation of `encoder_length` in `TimeSeriesDataSet` when added as feature
Contributors
- jdb78
Metrics, performance, and subsequence detection
Added
Models
- Backcast loss for N-BEATS network for better regularisation
- logging_metrics as explicit arguments to models
Metrics
- MASE (Mean absolute scaled error) metric for training and reporting
- Metrics can be composed, e.g. `0.3 * metric1 + 0.7 * metric2`
- Aggregation metric that is computed on mean prediction over all samples to reduce mean-bias
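Metric composition behaves like a weighted sum of the component metrics. A minimal plain-Python sketch of the idea (hypothetical class and function names, not the library's API):

```python
class ComposedMetric:
    """Weighted sum of metrics, mimicking expressions like
    0.3 * metric1 + 0.7 * metric2 (illustrative sketch only)."""

    def __init__(self, terms):
        self.terms = terms  # list of (weight, metric_fn) pairs

    def __call__(self, y_true, y_pred):
        return sum(w * m(y_true, y_pred) for w, m in self.terms)


def mae(y_true, y_pred):
    """Mean absolute error."""
    return sum(abs(a - b) for a, b in zip(y_true, y_pred)) / len(y_true)


def mse(y_true, y_pred):
    """Mean squared error."""
    return sum((a - b) ** 2 for a, b in zip(y_true, y_pred)) / len(y_true)


combined = ComposedMetric([(0.3, mae), (0.7, mse)])
```

In the library, the weighted expression itself builds the composite metric; the sketch makes the weighting explicit.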
Data
- Increased speed of parsing data with missing datapoints: about 2s for 1M data points, or 0.2s if `numba` is installed
- Time-synchronize samples in batches: ensure that all samples in each batch have the same time index in the decoder
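The MASE metric listed above scales the forecast error by the in-sample error of a one-step naive forecast. A minimal scalar sketch (hypothetical helper; the library's `MASE` works on tensors):

```python
def mase(y_true, y_pred, y_train):
    """Mean Absolute Scaled Error (illustrative scalar version).

    Divides the forecast MAE by the in-sample MAE of a one-step naive
    forecast; values below 1.0 beat the naive baseline.
    """
    mae_forecast = sum(abs(a - f) for a, f in zip(y_true, y_pred)) / len(y_true)
    mae_naive = sum(abs(y_train[i] - y_train[i - 1])
                    for i in range(1, len(y_train))) / (len(y_train) - 1)
    return mae_forecast / mae_naive
```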
Breaking changes
- Improved subsequence detection in TimeSeriesDataSet ensures that there exists a subsequence starting and ending on each point in time.
- Fix `min_encoder_length = 0` being ignored and processed as `min_encoder_length = max_encoder_length`
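The subsequence guarantee and the `min_encoder_length` fix can be illustrated with a simplified window-enumeration sketch (hypothetical, heavily simplified; not the `TimeSeriesDataSet` implementation):

```python
def enumerate_windows(n, max_encoder_length, max_prediction_length,
                      min_encoder_length=0):
    """Enumerate (encoder_start, decoder_start, decoder_length) triples so
    that a subsequence starts on every admissible time step of a series of
    length n (illustrative sketch of the subsequence-detection idea)."""
    windows = []
    for decoder_start in range(n):
        encoder_length = min(max_encoder_length, decoder_start)
        if encoder_length < min_encoder_length:
            continue  # min_encoder_length = 0 is honored, not coerced upward
        decoder_length = min(max_prediction_length, n - decoder_start)
        windows.append((decoder_start - encoder_length, decoder_start,
                        decoder_length))
    return windows
```

With `min_encoder_length=0`, every time step can anchor a decoder window; raising `min_encoder_length` prunes early windows that lack sufficient history.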
Contributors
- jdb78
- dehoyosb
More tests and better docs
- More tests driving coverage to ~90%
- Performance tweaks for temporal fusion transformer
- Reformatting with isort
- Improve documentation - particularly expand on hyperparameter tuning
Fixes
- Fix PoissonLoss quantiles calculation
- Fix N-Beats visualisations
More testing and interpretation features
Added
- Calculating partial dependency for a variable
- Improved documentation - in particular added FAQ section and improved tutorial
- Data for examples and tutorials can now be downloaded. Cloning the repo is not a requirement anymore
- Added Ranger Optimizer from `pytorch_ranger` package and fixed its warnings (part of preparations for conda package release)
- Use GPU for tests if available as part of preparation for GPU tests in CI
Changed
- BREAKING: Fix typo “add_decoder_length” to “add_encoder_length” in TimeSeriesDataSet
Bugfixes
- Fixing plotting predictions vs actuals by slicing variables
Fix edge case in prediction logging
Fixes
Fix bug where predictions were not correctly logged in case of `decoder_length == 1`.
Additions
Add favicon to docs page