V2.7.0
Change Log
Feature
- Implement AdaNorm optimizer (#133)
- Implement RotoGrad optimizer (#124, #134)
- Implement D-Adapt Adan optimizer (#134)
- Support AdaNorm variant (#133, #134)
  - AdaBelief
  - AdamP
  - AdamS
  - AdaPNM
  - diffGrad
  - Lamb
  - RAdam
  - Ranger
  - Adan
- Support AMSGrad variant (#133, #134)
- Support degenerated_to_sgd (#133)
Refactor
- Rename adamd_debias_term to adam_debias (#133)
- Merge the rectified version with the original (#133)
  - diffRGrad + diffGrad -> diffGrad
  - RaLamb + Lamb -> Lamb
  - Now you can simply use them with rectify=True
Bug
- Fix previous_grad deepcopy issue in the Adan optimizer (#134)
Diff: 2.6.1...2.7.0