V2.10.0
Change Log
Feature
- Implement Amos optimizer (#174)
- Implement SignSGD optimizer (#176) (thanks to @i404788)
- Implement AdaHessian optimizer (#176) (thanks to @i404788)
- Implement SophiaH optimizer (#173, #176) (thanks to @i404788)
- Implement re-usable functions to compute hessian in `BaseOptimizer` (#176, #177) (thanks to @i404788)
  - two types of distribution are supported (`gaussian`, `rademacher`); see the sketch after this list
- Support `AdamD` variant for AdaHessian optimizer (#177)
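
An illustrative sketch of the Hutchinson trick the hessian helpers are built on (this is not the library's internal code, just the underlying estimator): probe vectors `z` drawn from a unit-variance distribution give `E[z * (H z)] = diag(H)`, and the two distributions named above differ only in how `z` is sampled.

```python
# Illustrative sketch only (not the library's internal code): Hutchinson's
# estimator for the Hessian diagonal, with the two probe distributions
# mentioned in the notes above (`gaussian`, `rademacher`).
import torch


def hutchinson_hessian_diag(loss, params, distribution='rademacher'):
    """Estimate diag(H) as E[z * (H z)] using unit-variance probes z."""
    grads = torch.autograd.grad(loss, params, create_graph=True)
    if distribution == 'rademacher':
        zs = [torch.randint_like(p, 0, 2) * 2.0 - 1.0 for p in params]  # +/-1
    elif distribution == 'gaussian':
        zs = [torch.randn_like(p) for p in params]
    else:
        raise ValueError(f'unknown distribution: {distribution}')
    # second differentiation pass: Hessian-vector products H z
    hvps = torch.autograd.grad(grads, params, grad_outputs=zs)
    return [z * hvp for z, hvp in zip(zs, hvps)]


model = torch.nn.Linear(10, 1)
x, y = torch.randn(8, 10), torch.randn(8, 1)
loss = torch.nn.functional.mse_loss(model(x), y)
diag_estimate = hutchinson_hessian_diag(loss, list(model.parameters()))
```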
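
A hedged usage sketch for the `AdamD` variant of AdaHessian: the `adamd_debias_term` keyword is assumed here from the naming the library uses for other optimizers and may differ for this one, and the learning rate is arbitrary. The `create_graph=True` backward pass keeps the graph alive so the optimizer can form Hessian-vector products.

```python
# Hedged sketch: `adamd_debias_term` is an assumed keyword, borrowed from the
# library's convention elsewhere; check the AdaHessian signature before use.
import torch
from pytorch_optimizer import AdaHessian

model = torch.nn.Linear(10, 1)
optimizer = AdaHessian(model.parameters(), lr=1e-1, adamd_debias_term=True)

x, y = torch.randn(8, 10), torch.randn(8, 1)
loss = torch.nn.functional.mse_loss(model(x), y)
loss.backward(create_graph=True)  # graph needed for Hessian-vector products
optimizer.step()
```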