Commit Graph

21 Commits (11f5f8802183affbfbd07323e03dafbedaba2c6f)

| Author | SHA1 | Message | Date |
|---|---|---|---|
| kingfo | 9708e58259 | fix TupleToArray & Cast operator issue | 5 years ago |
| mindspore-ci-bot | 3536185f5b | !2007 add lazy adam optimizer and support sparse adam&ftrl for cpu backend | 6 years ago |
| wangnan39@huawei.com | 4042f16ce4 | add lazy adam optim and support sparse adam & ftrl for cpu backend | 6 years ago |
| mindspore-ci-bot | f859dfecc8 | !1920 SupportPynativeIndexing | 6 years ago |
| huangdongrun | 9522f59b87 | support for tensor indexing in pynative | 6 years ago |
| mindspore-ci-bot | 9dfb1011fe | !1854 add SparseApplyAdam and SparseApplyLazyAdam ops | 6 years ago |
| wangnan39@huawei.com | de21dbdaef | add ops SparseApplyAdam and SparseApplyLazyAdam | 6 years ago |
| mindspore-ci-bot | 3b8edd5a5b | !1918 sparse grad for gatherv2 | 6 years ago |
| panyifeng | acaa66a738 | sparse grad for gatherv2 | 6 years ago |
| lilei | 36d9e353a5 | add proximal_ada_grad optimizer | 6 years ago |
| wangnan39@huawei.com | c9b7d95c2c | fix lr check bug in AdamWeightDecayDynamicLR | 6 years ago |
| wangnan39@huawei.com | 810ccf80d8 | fix_bug_in_check_lamb_warmup_step | 6 years ago |
| “liuxiao” | ebbccca78b | pylint clean | 6 years ago |
| guohongzilong | 2d2f9ba8fd | fix group parameter code for check | 6 years ago |
| jinyaohui | 5a914994ba | clean pylint | 6 years ago |
| jinyaohui | 26fd75895d | pylint waring clean | 6 years ago |
| guohongzilong | 824bc30a94 | learning rate and weight decay support group mode | 6 years ago |
| wangnan39@huawei.com | 7f602016f4 | add parameter verification for rmsprop, and modify default value in annotation | 6 years ago |
| root | 7d700295f8 | add dynamic lr and enhance optim | 6 years ago |
| Ziyan | 4cbcd8e907 | enable use float type learning rate in lars optimizer | 6 years ago |
| zhunaipan | 930a1fb0a8 | initial version | 6 years ago |