
Commit

refactor: codes
kozistr committed Nov 23, 2024
1 parent 67682b7 commit cdb8153
Showing 2 changed files with 1 addition and 2 deletions.
1 change: 1 addition & 0 deletions pytorch_optimizer/__init__.py
@@ -109,6 +109,7 @@
     Nero,
     NovoGrad,
     PAdam,
+    PCGrad,
     Prodigy,
     QHAdam,
     RAdam,
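
For reference, the change above makes PCGrad importable from the package root. Below is a minimal usage sketch, not taken from this commit: it assumes the wrapper-style interface commonly used for PCGrad implementations (wrapping a base optimizer and calling pc_backward with the per-task losses); the exact constructor and method names in pytorch_optimizer may differ.

# Minimal sketch (not from this commit). Assumes PCGrad wraps a base optimizer
# and exposes pc_backward([...]) for multi-task gradient surgery; the exact
# signature in pytorch_optimizer may differ.
import torch
from pytorch_optimizer import AdamP, PCGrad

model = torch.nn.Linear(8, 2)
optimizer = PCGrad(AdamP(model.parameters(), lr=1e-3))  # assumed wrapper usage

x = torch.randn(4, 8)
loss_a = model(x).pow(2).mean()  # first task loss
loss_b = model(x).abs().mean()   # second task loss

optimizer.pc_backward([loss_a, loss_b])  # project conflicting gradients (assumed API)
optimizer.step()
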
2 changes: 0 additions & 2 deletions pytorch_optimizer/optimizer/adopt.py
@@ -14,8 +14,6 @@ class ADOPT(BaseOptimizer):
     :param weight_decay: float. weight decay (L2 penalty).
     :param weight_decouple: bool. the optimizer uses decoupled weight decay as in AdamW.
     :param fixed_decay: bool. fix weight decay.
-    :param adanorm: bool. whether to use the AdaNorm variant.
-    :param adam_debias: bool. Only correct the denominator to avoid inflating step sizes early in training.
     :param eps: float. term added to the denominator to improve numerical stability.
     """

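Since the surviving docstring documents ADOPT's constructor options, here is a minimal usage sketch, not taken from this commit. It assumes ADOPT follows the standard torch.optim-style interface with keyword arguments named as in the docstring (weight_decay, weight_decouple, fixed_decay, eps); defaults and the full signature may differ.

# Minimal sketch (not from this commit). Assumes a torch.optim-style constructor
# whose keyword arguments match the docstring above; defaults may differ.
import torch
from pytorch_optimizer import ADOPT

model = torch.nn.Linear(8, 2)
optimizer = ADOPT(
    model.parameters(),
    lr=1e-3,
    weight_decay=1e-2,
    weight_decouple=True,  # decoupled (AdamW-style) weight decay, per the docstring
    eps=1e-6,
)

x, y = torch.randn(4, 8), torch.randn(4, 2)
loss = torch.nn.functional.mse_loss(model(x), y)

optimizer.zero_grad()
loss.backward()
optimizer.step()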

