Commit

Merge pull request #292 from kozistr/refactor/utils
[Refactor] Remove direct import of many methods from `utils`, etc
kozistr authored Nov 24, 2024
2 parents 7868588 + ceb592a commit 131314b
Showing 128 changed files with 851 additions and 814 deletions.
36 changes: 20 additions & 16 deletions README.md
@@ -49,10 +49,7 @@ optimizer = load_optimizer(optimizer='adamp')(model.parameters())

# if you install `bitsandbytes`, you can use the `8-bit` optimizers from `pytorch-optimizer`.

from pytorch_optimizer import load_optimizer

opt = load_optimizer(optimizer='bnb_adamw8bit')
optimizer = opt(model.parameters())
optimizer = load_optimizer(optimizer='bnb_adamw8bit')(model.parameters())
```

Also, you can load the optimizer via `torch.hub`.
@@ -61,6 +58,7 @@ Also, you can load the optimizer via `torch.hub`.
import torch

model = YourModel()

opt = torch.hub.load('kozistr/pytorch_optimizer', 'adamp')
optimizer = opt(model.parameters())
```
@@ -93,11 +91,13 @@ supported_optimizers = get_supported_optimizers()
Or you can search for them with one or more wildcard filters.

```python
>>> get_supported_optimizers('adam*')
['adamax', 'adamg', 'adammini', 'adamod', 'adamp', 'adams', 'adamw']
from pytorch_optimizer import get_supported_optimizers

>>> get_supported_optimizers(['adam*', 'ranger*'])
['adamax', 'adamg', 'adammini', 'adamod', 'adamp', 'adams', 'adamw', 'ranger', 'ranger21']
get_supported_optimizers('adam*')
# ['adamax', 'adamg', 'adammini', 'adamod', 'adamp', 'adams', 'adamw']

get_supported_optimizers(['adam*', 'ranger*'])
# ['adamax', 'adamg', 'adammini', 'adamod', 'adamp', 'adams', 'adamw', 'ranger', 'ranger21']
```

| Optimizer | Description | Official Code | Paper | Citation |
@@ -197,11 +197,13 @@ supported_lr_schedulers = get_supported_lr_schedulers()
Or you can search for them with one or more wildcard filters.

```python
>>> get_supported_lr_schedulers('cosine*')
['cosine', 'cosine_annealing', 'cosine_annealing_with_warm_restart', 'cosine_annealing_with_warmup']
from pytorch_optimizer import get_supported_lr_schedulers

get_supported_lr_schedulers('cosine*')
# ['cosine', 'cosine_annealing', 'cosine_annealing_with_warm_restart', 'cosine_annealing_with_warmup']

>>> get_supported_lr_schedulers(['cosine*', '*warm*'])
['cosine', 'cosine_annealing', 'cosine_annealing_with_warm_restart', 'cosine_annealing_with_warmup', 'warmup_stable_decay']
get_supported_lr_schedulers(['cosine*', '*warm*'])
# ['cosine', 'cosine_annealing', 'cosine_annealing_with_warm_restart', 'cosine_annealing_with_warmup', 'warmup_stable_decay']
```

| LR Scheduler | Description | Official Code | Paper | Citation |
@@ -224,11 +226,13 @@ supported_loss_functions = get_supported_loss_functions()
Or you can search for them with one or more wildcard filters.

```python
>>> get_supported_loss_functions('*focal*')
['bcefocalloss', 'focalcosineloss', 'focalloss', 'focaltverskyloss']
from pytorch_optimizer import get_supported_loss_functions

get_supported_loss_functions('*focal*')
# ['bcefocalloss', 'focalcosineloss', 'focalloss', 'focaltverskyloss']

>>> get_supported_loss_functions(['*focal*', 'bce*'])
['bcefocalloss', 'bceloss', 'focalcosineloss', 'focalloss', 'focaltverskyloss']
get_supported_loss_functions(['*focal*', 'bce*'])
# ['bcefocalloss', 'bceloss', 'focalcosineloss', 'focalloss', 'focaltverskyloss']
```

| Loss Functions | Description | Official Code | Paper | Citation |
16 changes: 16 additions & 0 deletions docs/changelogs/v3.3.0.md
@@ -8,3 +8,19 @@
* [Modified Adam Can Converge with Any β2 with the Optimal Rate](https://arxiv.org/abs/2411.02853)
* Implement `FTRL` optimizer. (#291)
* [Follow The Regularized Leader](https://static.googleusercontent.com/media/research.google.com/en//pubs/archive/41159.pdf)
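
Like the other optimizers in the library, FTRL should be loadable by name via `load_optimizer`; a minimal sketch, assuming it is registered under the key `'ftrl'`:

```python
from pytorch_optimizer import load_optimizer

model = YourModel()  # your own torch.nn.Module

# 'ftrl' is an assumed registration key, following the library's lowercase naming convention.
optimizer = load_optimizer(optimizer='ftrl')(model.parameters())
```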

### Refactor

* Big refactoring: many utilities are no longer importable directly from the `pytorch_optimizer.*` root.
    * These methods were removed from the root namespace because they are rarely used and are not optimizers themselves, but utilities for specific optimizers. The new import paths are listed below; a short migration example follows this list.
    * `pytorch_optimizer.[Shampoo stuff]` -> `pytorch_optimizer.optimizer.shampoo_utils.[Shampoo stuff]`.
        * `shampoo_utils` includes `Graft`, `BlockPartitioner`, `PreConditioner`, etc. You can check the details [here](https://github.com/kozistr/pytorch_optimizer/blob/main/pytorch_optimizer/optimizer/shampoo_utils.py).
    * `pytorch_optimizer.GaLoreProjector` -> `pytorch_optimizer.optimizer.galore.GaLoreProjector`.
    * `pytorch_optimizer.gradfilter_ema` -> `pytorch_optimizer.optimizer.grokfast.gradfilter_ema`.
    * `pytorch_optimizer.gradfilter_ma` -> `pytorch_optimizer.optimizer.grokfast.gradfilter_ma`.
    * `pytorch_optimizer.l2_projection` -> `pytorch_optimizer.optimizer.alig.l2_projection`.
    * `pytorch_optimizer.flatten_grad` -> `pytorch_optimizer.optimizer.pcgrad.flatten_grad`.
    * `pytorch_optimizer.un_flatten_grad` -> `pytorch_optimizer.optimizer.pcgrad.un_flatten_grad`.
    * `pytorch_optimizer.reduce_max_except_dim` -> `pytorch_optimizer.optimizer.sm3.reduce_max_except_dim`.
    * `pytorch_optimizer.neuron_norm` -> `pytorch_optimizer.optimizer.nero.neuron_norm`.
    * `pytorch_optimizer.neuron_mean` -> `pytorch_optimizer.optimizer.nero.neuron_mean`.
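
For example, a minimal migration sketch for two of the moved helpers, using the new paths listed above:

```python
# Before v3.3.0, these helpers could be imported from the package root:
# from pytorch_optimizer import gradfilter_ema, neuron_norm

# From v3.3.0 on, import them from the submodules that define them:
from pytorch_optimizer.optimizer.grokfast import gradfilter_ema
from pytorch_optimizer.optimizer.nero import neuron_norm
```
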
36 changes: 20 additions & 16 deletions docs/index.md
@@ -49,10 +49,7 @@ optimizer = load_optimizer(optimizer='adamp')(model.parameters())

# if you install `bitsandbytes`, you can use the `8-bit` optimizers from `pytorch-optimizer`.

from pytorch_optimizer import load_optimizer

opt = load_optimizer(optimizer='bnb_adamw8bit')
optimizer = opt(model.parameters())
optimizer = load_optimizer(optimizer='bnb_adamw8bit')(model.parameters())
```

Also, you can load the optimizer via `torch.hub`.
@@ -61,6 +58,7 @@ Also, you can load the optimizer via `torch.hub`.
import torch

model = YourModel()

opt = torch.hub.load('kozistr/pytorch_optimizer', 'adamp')
optimizer = opt(model.parameters())
```
@@ -93,11 +91,13 @@ supported_optimizers = get_supported_optimizers()
Or you can search for them with one or more wildcard filters.

```python
>>> get_supported_optimizers('adam*')
['adamax', 'adamg', 'adammini', 'adamod', 'adamp', 'adams', 'adamw']
from pytorch_optimizer import get_supported_optimizers

>>> get_supported_optimizers(['adam*', 'ranger*'])
['adamax', 'adamg', 'adammini', 'adamod', 'adamp', 'adams', 'adamw', 'ranger', 'ranger21']
get_supported_optimizers('adam*')
# ['adamax', 'adamg', 'adammini', 'adamod', 'adamp', 'adams', 'adamw']

get_supported_optimizers(['adam*', 'ranger*'])
# ['adamax', 'adamg', 'adammini', 'adamod', 'adamp', 'adams', 'adamw', 'ranger', 'ranger21']
```

| Optimizer | Description | Official Code | Paper | Citation |
@@ -197,11 +197,13 @@ supported_lr_schedulers = get_supported_lr_schedulers()
Or you can search for them with one or more wildcard filters.

```python
>>> get_supported_lr_schedulers('cosine*')
['cosine', 'cosine_annealing', 'cosine_annealing_with_warm_restart', 'cosine_annealing_with_warmup']
from pytorch_optimizer import get_supported_lr_schedulers

get_supported_lr_schedulers('cosine*')
# ['cosine', 'cosine_annealing', 'cosine_annealing_with_warm_restart', 'cosine_annealing_with_warmup']

>>> get_supported_lr_schedulers(['cosine*', '*warm*'])
['cosine', 'cosine_annealing', 'cosine_annealing_with_warm_restart', 'cosine_annealing_with_warmup', 'warmup_stable_decay']
get_supported_lr_schedulers(['cosine*', '*warm*'])
# ['cosine', 'cosine_annealing', 'cosine_annealing_with_warm_restart', 'cosine_annealing_with_warmup', 'warmup_stable_decay']
```

| LR Scheduler | Description | Official Code | Paper | Citation |
@@ -224,11 +226,13 @@ supported_loss_functions = get_supported_loss_functions()
Or you can search for them with one or more wildcard filters.

```python
>>> get_supported_loss_functions('*focal*')
['bcefocalloss', 'focalcosineloss', 'focalloss', 'focaltverskyloss']
from pytorch_optimizer import get_supported_loss_functions

get_supported_loss_functions('*focal*')
# ['bcefocalloss', 'focalcosineloss', 'focalloss', 'focaltverskyloss']

>>> get_supported_loss_functions(['*focal*', 'bce*'])
['bcefocalloss', 'bceloss', 'focalcosineloss', 'focalloss', 'focaltverskyloss']
get_supported_loss_functions(['*focal*', 'bce*'])
# ['bcefocalloss', 'bceloss', 'focalcosineloss', 'focalloss', 'focaltverskyloss']
```

| Loss Functions | Description | Official Code | Paper | Citation |
4 changes: 2 additions & 2 deletions docs/loss.md
@@ -1,6 +1,6 @@
# Loss Function

::: pytorch_optimizer.loss.bi_tempered.bi_tempered_logistic_loss
::: pytorch_optimizer.bi_tempered_logistic_loss
:docstring:

::: pytorch_optimizer.BiTemperedLogisticLoss
@@ -35,7 +35,7 @@
:docstring:
:members:

::: pytorch_optimizer.loss.jaccard.soft_jaccard_score
::: pytorch_optimizer.soft_jaccard_score
:docstring:

::: pytorch_optimizer.JaccardLoss
20 changes: 8 additions & 12 deletions docs/optimizer.md
@@ -1,5 +1,13 @@
# Optimizers

::: pytorch_optimizer.optimizer.create_optimizer
:docstring:
:members:

::: pytorch_optimizer.optimizer.get_optimizer_parameters
:docstring:
:members:

::: pytorch_optimizer.A2Grad
:docstring:
:members:
@@ -168,10 +176,6 @@
:docstring:
:members:

::: pytorch_optimizer.GaLoreProjector
:docstring:
:members:

::: pytorch_optimizer.centralize_gradient
:docstring:
:members:
@@ -180,14 +184,6 @@
:docstring:
:members:

::: pytorch_optimizer.gradfilter_ema
:docstring:
:members:

::: pytorch_optimizer.gradfilter_ma
:docstring:
:members:

::: pytorch_optimizer.GrokFastAdamW
:docstring:
:members:
92 changes: 0 additions & 92 deletions docs/util.md
@@ -16,10 +16,6 @@
:docstring:
:members:

::: pytorch_optimizer.optimizer.utils.get_optimizer_parameters
:docstring:
:members:

::: pytorch_optimizer.optimizer.utils.is_valid_parameters
:docstring:
:members:
@@ -36,14 +32,6 @@
:docstring:
:members:

::: pytorch_optimizer.optimizer.utils.flatten_grad
:docstring:
:members:

::: pytorch_optimizer.optimizer.utils.un_flatten_grad
:docstring:
:members:

::: pytorch_optimizer.optimizer.utils.channel_view
:docstring:
:members:
@@ -68,14 +56,6 @@
:docstring:
:members:

::: pytorch_optimizer.optimizer.utils.neuron_norm
:docstring:
:members:

::: pytorch_optimizer.optimizer.utils.neuron_mean
:docstring:
:members:

::: pytorch_optimizer.optimizer.utils.disable_running_stats
:docstring:
:members:
@@ -84,82 +64,10 @@
:docstring:
:members:

::: pytorch_optimizer.optimizer.utils.l2_projection
:docstring:
:members:

::: pytorch_optimizer.optimizer.utils.get_global_gradient_norm
:docstring:
:members:

::: pytorch_optimizer.optimizer.utils.reduce_max_except_dim
:docstring:
:members:

::: pytorch_optimizer.optimizer.shampoo_utils.merge_small_dims
:docstring:
:members:

::: pytorch_optimizer.optimizer.utils.reg_noise
:docstring:
:members:

## Newton methods

::: pytorch_optimizer.optimizer.shampoo_utils.power_iteration
:docstring:
:members:

::: pytorch_optimizer.optimizer.shampoo_utils.compute_power_schur_newton
:docstring:
:members:

::: pytorch_optimizer.optimizer.shampoo_utils.compute_power_svd
:docstring:
:members:

## Grafting

::: pytorch_optimizer.optimizer.shampoo_utils.Graft
:docstring:
:members:

::: pytorch_optimizer.optimizer.shampoo_utils.LayerWiseGrafting
:docstring:
:members:

::: pytorch_optimizer.optimizer.shampoo_utils.SGDGraft
:docstring:
:members:

::: pytorch_optimizer.optimizer.shampoo_utils.SQRTNGraft
:docstring:
:members:

::: pytorch_optimizer.optimizer.shampoo_utils.AdaGradGraft
:docstring:
:members:

::: pytorch_optimizer.optimizer.shampoo_utils.RMSPropGraft
:docstring:
:members:

::: pytorch_optimizer.optimizer.shampoo_utils.build_graft
:docstring:
:members:

## Block Partitioner

::: pytorch_optimizer.optimizer.shampoo_utils.BlockPartitioner
:docstring:
:members:

## Pre-Conditioner

::: pytorch_optimizer.optimizer.shampoo_utils.PreConditionerType
:docstring:
:members:

::: pytorch_optimizer.optimizer.shampoo_utils.PreConditioner
:docstring:
:members:
Binary file modified docs/visualizations/rastrigin_AdaBelief.png
Binary file modified docs/visualizations/rastrigin_AdaBound.png
Binary file modified docs/visualizations/rastrigin_AdaMod.png
Binary file modified docs/visualizations/rastrigin_AdaPNM.png
Binary file modified docs/visualizations/rastrigin_Adai.png
Binary file modified docs/visualizations/rastrigin_AdamP.png
Binary file modified docs/visualizations/rastrigin_AdamW.png
Binary file modified docs/visualizations/rastrigin_Adan.png
Binary file modified docs/visualizations/rastrigin_AggMo.png
Binary file modified docs/visualizations/rastrigin_DAdaptAdaGrad.png
Binary file modified docs/visualizations/rastrigin_DAdaptAdam.png
Binary file modified docs/visualizations/rastrigin_DAdaptSGD.png
Binary file modified docs/visualizations/rastrigin_DiffGrad.png
Binary file modified docs/visualizations/rastrigin_Fromage.png
Binary file modified docs/visualizations/rastrigin_LARS.png
Binary file modified docs/visualizations/rastrigin_Lamb.png
Binary file modified docs/visualizations/rastrigin_MADGRAD.png
Binary file modified docs/visualizations/rastrigin_MSVAG.png
Binary file modified docs/visualizations/rastrigin_Nero.png
Binary file modified docs/visualizations/rastrigin_PID.png
Binary file modified docs/visualizations/rastrigin_PNM.png
Binary file modified docs/visualizations/rastrigin_QHAdam.png
Binary file modified docs/visualizations/rastrigin_QHM.png
Binary file modified docs/visualizations/rastrigin_RAdam.png
Binary file modified docs/visualizations/rastrigin_Ranger.png
Binary file modified docs/visualizations/rastrigin_Ranger21.png
Binary file modified docs/visualizations/rastrigin_SGDP.png
Binary file modified docs/visualizations/rastrigin_ScalableShampoo.png
Binary file modified docs/visualizations/rastrigin_Shampoo.png
Binary file modified docs/visualizations/rosenbrock_ASGD.png
Binary file modified docs/visualizations/rosenbrock_AccSGD.png
Binary file modified docs/visualizations/rosenbrock_AdaBelief.png
Binary file modified docs/visualizations/rosenbrock_AdaBound.png
Binary file modified docs/visualizations/rosenbrock_AdaDelta.png
Binary file modified docs/visualizations/rosenbrock_AdaFactor.png
Binary file modified docs/visualizations/rosenbrock_AdaHessian.png
Binary file modified docs/visualizations/rosenbrock_AdaMax.png
Binary file modified docs/visualizations/rosenbrock_AdaMod.png
Binary file modified docs/visualizations/rosenbrock_AdaNorm.png
Binary file modified docs/visualizations/rosenbrock_AdaPNM.png
Binary file modified docs/visualizations/rosenbrock_AdaSmooth.png
Binary file modified docs/visualizations/rosenbrock_Adai.png
Binary file modified docs/visualizations/rosenbrock_Adalite.png
Binary file modified docs/visualizations/rosenbrock_Adam.png
Binary file modified docs/visualizations/rosenbrock_AdamP.png
Binary file modified docs/visualizations/rosenbrock_AdamS.png
Binary file modified docs/visualizations/rosenbrock_AdamW.png
Binary file modified docs/visualizations/rosenbrock_Adan.png
Binary file modified docs/visualizations/rosenbrock_AggMo.png
Binary file modified docs/visualizations/rosenbrock_Aida.png
Binary file modified docs/visualizations/rosenbrock_Amos.png
Binary file modified docs/visualizations/rosenbrock_Apollo.png
Binary file modified docs/visualizations/rosenbrock_AvaGrad.png
Binary file modified docs/visualizations/rosenbrock_CAME.png
Binary file modified docs/visualizations/rosenbrock_DAdaptAdaGrad.png
Binary file modified docs/visualizations/rosenbrock_DAdaptAdam.png
Binary file modified docs/visualizations/rosenbrock_DAdaptAdan.png
Binary file modified docs/visualizations/rosenbrock_DAdaptLion.png
Binary file modified docs/visualizations/rosenbrock_DAdaptSGD.png
Binary file modified docs/visualizations/rosenbrock_DiffGrad.png
Binary file modified docs/visualizations/rosenbrock_FAdam.png
Binary file modified docs/visualizations/rosenbrock_Fromage.png
Binary file modified docs/visualizations/rosenbrock_GaLore.png
Binary file modified docs/visualizations/rosenbrock_Gravity.png
Binary file modified docs/visualizations/rosenbrock_GrokFastAdamW.png
Binary file modified docs/visualizations/rosenbrock_Kate.png
Binary file modified docs/visualizations/rosenbrock_LARS.png
Binary file modified docs/visualizations/rosenbrock_Lamb.png
Binary file modified docs/visualizations/rosenbrock_Lion.png
Binary file modified docs/visualizations/rosenbrock_MADGRAD.png
Binary file modified docs/visualizations/rosenbrock_MSVAG.png
Binary file modified docs/visualizations/rosenbrock_Nero.png
Binary file modified docs/visualizations/rosenbrock_NovoGrad.png
Binary file modified docs/visualizations/rosenbrock_PAdam.png
Binary file modified docs/visualizations/rosenbrock_PID.png
Binary file modified docs/visualizations/rosenbrock_PNM.png
Binary file modified docs/visualizations/rosenbrock_Prodigy.png
Binary file modified docs/visualizations/rosenbrock_QHAdam.png
Binary file modified docs/visualizations/rosenbrock_QHM.png
Binary file modified docs/visualizations/rosenbrock_RAdam.png
Binary file modified docs/visualizations/rosenbrock_Ranger.png
Binary file modified docs/visualizations/rosenbrock_Ranger21.png
Binary file modified docs/visualizations/rosenbrock_SGD.png
Binary file modified docs/visualizations/rosenbrock_SGDP.png
Binary file modified docs/visualizations/rosenbrock_SGDW.png
Binary file modified docs/visualizations/rosenbrock_SM3.png
Binary file modified docs/visualizations/rosenbrock_SRMM.png
Binary file modified docs/visualizations/rosenbrock_SWATS.png
Binary file modified docs/visualizations/rosenbrock_ScalableShampoo.png
Binary file modified docs/visualizations/rosenbrock_ScheduleFreeAdamW.png
Binary file modified docs/visualizations/rosenbrock_ScheduleFreeSGD.png
Binary file modified docs/visualizations/rosenbrock_Shampoo.png
Binary file modified docs/visualizations/rosenbrock_SignSGD.png
Binary file modified docs/visualizations/rosenbrock_SophiaH.png
Binary file modified docs/visualizations/rosenbrock_StableAdamW.png
Binary file modified docs/visualizations/rosenbrock_Tiger.png
Binary file modified docs/visualizations/rosenbrock_Yogi.png