Commit Graph

50 Commits (516b56cb64c372801522084151731ec5327ef5a7)

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| jinyaohui | 86d197dfeb | clean pylint | 5 years ago |
| mindspore-ci-bot | 19ce0c372a | !1257 Implicit type conversion | 5 years ago |
| wangnan39@huawei.com | 810ccf80d8 | fix_bug_in_check_lamb_warmup_step | 5 years ago |
| candanzg | 2429da19fb | implicit type conversion | 5 years ago |
| chenhaozhe | b6aceddeab | update bert scripts according to rules of modelzoo | 5 years ago |
| “liuxiao” | ebbccca78b | pylint clean | 5 years ago |
| “liuxiao” | f4542f810b | pylint clean | 5 years ago |
| “liuxiao” | 382a0124c3 | pylint clean | 5 years ago |
| mindspore-ci-bot | 62c716b68e | !1349 Fix some functions in group parameters and optimizer | 5 years ago |
| jinyaohui | fbdba6e4da | clean pylint | 5 years ago |
| guohongzilong | 2d2f9ba8fd | fix group parameter code for check | 5 years ago |
| jinyaohui | 5a914994ba | clean pylint | 5 years ago |
| jiangjinsheng | e45532b78c | fixed transpose | 5 years ago |
| candanzg | 2cc85bdc93 | Support weight compile according to shape | 5 years ago |
| jinyaohui | 26fd75895d | pylint waring clean | 5 years ago |
| mindspore-ci-bot | 66667d727e | !1034 Gpu Support Dropout operator | 5 years ago |
| chenzomi | 661f9dfaf8 | add dropout primtive | 5 years ago |
| mindspore-ci-bot | 8a45ab1125 | !906 fix a bug that support dilation greater than 1 in conv2dTranspose ops | 5 years ago |
| mindspore-ci-bot | deae380969 | !637 Learning rate and weight decay making group params | 5 years ago |
| guohongzilong | 824bc30a94 | learning rate and weight decay support group mode | 5 years ago |
| mindspore-ci-bot | 3d3b9d5474 | !840 [BUG] fix conv2dtranspose bug | 5 years ago |
| jinyaohui | 88e763a98f | modify conv2dtranspose | 5 years ago |
| zhaojichen | 039c75af8e | fix avgpool and add check | 5 years ago |
| yangyongjie | 3f1c6b7b47 | solve the problem when dialtion greater than 1 in Conv2dTranspose ops | 5 years ago |
| zhaozhenlong | 66e7a36846 | ImageGradients check 4d | 5 years ago |
| wangnan39@huawei.com | 7f602016f4 | add parameter verification for rmsprop, and modify default value in annotation | 5 years ago |
| zhaozhenlong | c88edfb31d | psnr check two input same shape and type | 5 years ago |
| mindspore-ci-bot | 818acd46d4 | !569 fix bug in checkpoint when save scaler | 5 years ago |
| wangnan39@huawei.com | f38d18c665 | fix bug in checkpoint when save scaler | 5 years ago |
| mindspore-ci-bot | 67057d1309 | !541 add average pooling 1D | 5 years ago |
| mindspore-ci-bot | 72f42fc37c | !170 Add prim name to error message for operators in nn_ops.py | 5 years ago |
| wangnan39@huawei.com | b812b18c02 | support update parameter for vm | 5 years ago |
| fary86 | 6dd72f654a | Add prim name to error message for nn_ops.py | 5 years ago |
| zhaojichen | 94c99998ae | add AvgPooling layer | 5 years ago |
| fary86 | 8cbbbd950e | Add cell name to error message | 5 years ago |
| zhaozhenlong | aa8fbcc06e | add cell psnr | 5 years ago |
| zhaojichen | 04c522d0c6 | Add Group Normalization | 5 years ago |
| zhaojichen | 0b7de6968f | Add Group Normalization | 5 years ago |
| zhaojichen | ebe6efff71 | Add Group Normalization | 5 years ago |
| zhaozhenlong | 6a2cf4b6e6 | ssim impl code | 5 years ago |
| mindspore-ci-bot | 7ffb8bb19f | !250 Add nn.pad to support three modes | 5 years ago |
| gaojing | 2db3e64ff2 | add operation | 5 years ago |
| root | 7d700295f8 | add dynamic lr and enhance optim | 5 years ago |
| chenzomi | d64f662c76 | quantization aware training frontend operators define. | 5 years ago |
| buxue | 7541d3b067 | Develop op MaxPoolWithArgMax | 5 years ago |
| zhaozhenlong | f9d180d413 | add api image gradients | 5 years ago |
| mindspore-ci-bot | 352c6faf85 | !18 enable use float type learning rate in lars optimizer | 5 years ago |
| buxue | 0da0bdcf40 | Fix bug structure output when there is depend whose first input is constant in outputs | 5 years ago |
| Ziyan | 4cbcd8e907 | enable use float type learning rate in lars optimizer | 5 years ago |
| zhunaipan | 930a1fb0a8 | initial version | 5 years ago |