# v2.9.0

## Change Log

### Feature
- Implement AdaMax optimizer (#148)
  - A variant of Adam based on the infinity norm (see the sketch after this list)
- Implement Gravity optimizer (#151)
- Implement AdaSmooth optimizer (#153)
- Implement SRMM optimizer (#154)
- Implement AvaGrad optimizer (#155)
- Implement AdaShift optimizer (#157)
- Upgrade to D-Adaptation v3 (#158, #159)
- Implement AdaDelta optimizer (#160)
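
For context on the AdaMax addition: AdaMax (Kingma & Ba, 2015) replaces Adam's second raw-moment estimate with an exponentially weighted infinity norm of past gradients, which removes the need for bias correction on the denominator. Below is a minimal, functional sketch of the update rule; it is illustrative only, not the library's implementation, and the names and defaults are assumptions:

```python
import torch

def adamax_step(param, grad, m, u, step, lr=2e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    # first moment: the same exponential moving average Adam uses
    m = beta1 * m + (1.0 - beta1) * grad
    # infinity-norm accumulator: exponentially decayed max of |grad|
    u = torch.maximum(beta2 * u, grad.abs())
    # only the first moment needs bias correction; u does not
    step_size = lr / (1.0 - beta1 ** step)
    param = param - step_size * m / (u + eps)
    return param, m, u
```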

### Docs
- Fix readthedocs build issue (#156)
- Move citations into table (#156)

### Refactor
- Refactor validation logic (#149, #150)
- Rename `amsbound`, `amsgrad` terms into `ams_bound` (#149) (usage sketch after this list)
- Return the gradient instead of the parameter in AGC (#149) (see the sketch after this list)
- Refactor duplicated logic (e.g. rectified step size, AMSBound, AdamD, AdaNorm, weight decay) into reusable functions (#150)
- Move `pytorch_optimizer.experimental` under `pytorch_optimizer.*.experimental`
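
To make the `ams_bound` rename concrete, here is a hedged usage sketch; it assumes `AdaBound` is one of the optimizers affected by the rename:

```python
import torch
from pytorch_optimizer import AdaBound

model = torch.nn.Linear(4, 1)

# before v2.9.0 the flag was spelled `amsbound` (or `amsgrad` in Adam-style optimizers)
optimizer = AdaBound(model.parameters(), lr=1e-3, ams_bound=True)
```

And for the AGC change, a simplified tensor-wise sketch of Adaptive Gradient Clipping that returns the clipped gradient rather than the parameter (AGC is usually applied unit-wise; this version is condensed for illustration and is not the library's code):

```python
import torch

def agc(param: torch.Tensor, grad: torch.Tensor,
        clip: float = 1e-2, eps: float = 1e-3) -> torch.Tensor:
    # cap the gradient norm at `clip` times the parameter norm
    max_norm = param.norm().clamp(min=eps) * clip
    g_norm = grad.norm().clamp(min=1e-6)
    if g_norm > max_norm:
        grad = grad * (max_norm / g_norm)
    return grad  # the gradient is returned, not the parameter
```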

### Diff
2.8.0...2.9.0