
Commit

new
inikishev authored Dec 25, 2024
1 parent 9626ade commit 50ee49d
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion README.md
```diff
@@ -30,7 +30,7 @@ This closure will also work with all built in pytorch optimizers, including LBFGS

 # Contents
 There will be docs with a more exhaustive list and explanations. A preliminary list of all modules is available here https://torchzero.readthedocs.io/en/latest/autoapi/torchzero/modules/index.html#classes. For now I hope that everything should be reasonably straightforward to use.
-- SGD/RProp/RMSProp/AdaGrad/Adam as composable modules
+- SGD/Rprop/RMSProp/AdaGrad/Adam as composable modules. They are also tested to exactly match built in pytorch versions.
 - Cautious Optimizers (https://huggingface.co/papers/2411.16085)
 - Optimizer grafting (https://openreview.net/forum?id=FpKgG31Z_i9)
 - Laplacian smoothing (https://arxiv.org/abs/1806.06317)
```
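The changed README line describes optimizers like SGD/Rprop/RMSProp/AdaGrad/Adam being built from composable modules. As a rough illustration of what "composable" means here, the sketch below chains small gradient-transform objects into one update rule. This is a conceptual sketch only, not the actual torchzero API; the class and method names (`LR`, `Momentum`, `Chain`, `transform`, `step`) are invented for this example.

```python
# Conceptual sketch of composable optimizer modules (NOT the torchzero API):
# each module transforms the incoming update, and a chain feeds the output
# of one module into the next, so e.g. Chain(Momentum(), LR(0.1)) behaves
# like SGD with momentum.

class LR:
    """Scale the update by a fixed learning rate."""
    def __init__(self, lr):
        self.lr = lr

    def transform(self, update):
        return [self.lr * u for u in update]


class Momentum:
    """Accumulate a heavy-ball momentum buffer across calls."""
    def __init__(self, beta=0.9):
        self.beta = beta
        self.buf = None

    def transform(self, update):
        if self.buf is None:
            self.buf = list(update)
        else:
            self.buf = [self.beta * b + u for b, u in zip(self.buf, update)]
        return list(self.buf)


class Chain:
    """Compose modules: the output of one module is the input of the next."""
    def __init__(self, *modules):
        self.modules = modules

    def step(self, params, grads):
        update = grads
        for m in self.modules:
            update = m.transform(update)
        # Apply the final transformed update as a plain descent step.
        return [p - u for p, u in zip(params, update)]


opt = Chain(Momentum(beta=0.9), LR(0.1))
params = [1.0]
params = opt.step(params, [2.0])  # first step: momentum buffer = grad, update = 0.1 * 2.0
print(params)  # → [0.8]
```

Real modular-optimizer libraries follow this same shape (a pipeline of stateful gradient transformations), which is what makes it possible to test each module against the corresponding built-in PyTorch optimizer, as the commit's added sentence claims.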
