Author | Commit | Message | Age
seatea | 840280e784 | Correct the comments for `RandomChoiceWithMask` op. | 5 years ago
mindspore-ci-bot | 062b744b19 | !12 Fix dtype bug for loss_scale and weight_decay (merge of !12 from seatea/dynamic-loss-scale) | 5 years ago
mindspore-ci-bot | 3ab402e110 | !7 adapt relu6grad (merge of !7 from zhaozhenlong/adapte-relu6grad) | 5 years ago
lichenever | b4d34973bc | fix_cast_bug | 5 years ago
chenhaozhe | cab5503280 | Use find instead of equal to distinguish the training graph | 5 years ago
mindspore-ci-bot | 44cd0c1f90 | !13 Check input shape for `NMSWithMask` op (merge of !13 from seatea/NMSWithMask-check-shape) | 5 years ago
buxue | 0da0bdcf40 | Fix bug in structure output when there is a Depend whose first input is constant in the outputs | 5 years ago
Ziyan | 4cbcd8e907 | Enable use of a float-type learning rate in the LARS optimizer | 5 years ago
zhaozhenlong | 9862dea3cf | Adapt relu6grad to the modified graphengine | 5 years ago
shibeiji | 468e257a14 | Add log for kernel runtime in order to trace performance | 5 years ago
jonyguo | 34e42bd6f9 | 1. Add more log info for dataset & mindrecord; 2. add two new test cases for MindDataset | 5 years ago
seatea | 6c03542eec | Fix dtype bug for loss_scale and weight_decay: 1. change dtype of scale to dtype of grad in loss_scale.py; 2. change dtype of weight_decay to dtype of weight in optimizer.py | 5 years ago
seatea | 7b7a6a45a0 | Check if the shape of the input of NMSWithMask is (N, 5). | 5 years ago
jjfeing | 86f5c69995 | Change parallel compile num: 32 -> 16 | 5 years ago
GinFung | 468dbc3557 | Add matmul biasadd fusion pass | 5 years ago
zhunaipan | 930a1fb0a8 | Initial version (Signed-off-by: leonwanghui <leon.wanghui@huawei.com>) | 5 years ago
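Of these entries, the loss_scale/weight_decay dtype fix (6c03542eec) is the only one that spells out its mechanism: cast the scale to the gradient's dtype and the weight decay to the weight's dtype before applying them. Below is a minimal, hypothetical sketch of that idea in plain Python/NumPy; the function and variable names are illustrative only and are not MindSpore's actual loss_scale.py or optimizer.py code.

```python
import numpy as np

def unscale_grad(grad: np.ndarray, loss_scale: float) -> np.ndarray:
    # Cast the scale to the gradient's dtype so the division
    # does not silently promote (or truncate) the gradient.
    scale = np.asarray(loss_scale, dtype=grad.dtype)
    return grad / scale

def apply_weight_decay(weight: np.ndarray, grad: np.ndarray, weight_decay: float) -> np.ndarray:
    # Cast weight_decay to the weight's dtype for the same reason:
    # a float64 Python scalar would otherwise upcast a float16/float32 weight.
    decay = np.asarray(weight_decay, dtype=weight.dtype)
    return grad + decay * weight

# Example: float16 gradients keep their dtype after unscaling and decay.
g = np.ones((2, 2), dtype=np.float16)
w = np.ones((2, 2), dtype=np.float16)
print(unscale_grad(g, 1024.0).dtype)         # float16
print(apply_weight_decay(w, g, 1e-4).dtype)  # float16
```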