Commit Graph

21 Commits (b085ecc25896c0a4aea70bcfff316683a76ec5e4)

| Author | SHA1 | Message | Date |
|---|---|---|---|
| zhongpu | c4ede95c74 | open dygraph op test, test=develop (#19787) | 5 years ago |
| WangXi | 3c98ec90ce | Fix INF bug of softmax_cross_entropy_op (#21165) | 5 years ago |
| Tao Luo | 134d809e23 | fix softmax input error check on float16 (#20273) | 6 years ago |
| Tao Luo | 65a02fc114 | add input type and dtype check for softmax_op (#19975) | 6 years ago |
| Adam | cb65439da8 | Add support for other axes in MKLDNN softmax op (#19907) | 6 years ago |
| dengkaipeng | 93701dba50 | add jit kernel for softmax axis. test=develop | 6 years ago |
| dengkaipeng | 365e6cfd15 | add mkldnn support. test=develop | 6 years ago |
| dengkaipeng | 6cb66721d2 | add cudnn support. test=develop | 6 years ago |
| Krzysztof Binias | 851ea04dec | Add UTs to check whether primitives for activations and softmax already exist in backward | 6 years ago |
| Tao Luo | 5b9c62faee | Revert "Softmax op optimization for inference" | 6 years ago |
| Jacek Czaja | d332326847 | - Added unit tests for softmax is_test=True op | 6 years ago |
| chengduo | a9b5d42dd4 | Add fp16 backward support (#14202) | 6 years ago |
| minqiyang | 99d3f08920 | Add print_function for all python files | 7 years ago |
| fengjiayi | f7bd0b227b | Add unittests for softmax_op | 7 years ago |
| Wei Xu | 264e8305b0 | Fixed unittests for WITH_GPU=OFF and WITH_DISTRIBUTE=OFF build | 7 years ago |
| Kexin Zhao | b2a1c9e8b7 | Add float16 support to non-cudnn softmax op on GPU (#9686) | 7 years ago |
| Kexin Zhao | 4eaa789730 | resolve conflict | 7 years ago |
| Jacek Czaja | 3b95b55f07 | - Softmax MKLDNN primitive integration | 7 years ago |
| Kexin Zhao | 70e7122785 | initial commit | 7 years ago |
| dzhwinter | 128adf53cb | [Speed]implement cudnn sequence softmax cudnn (#8978) | 7 years ago |
| Luo Tao | b11956a0b5 | move Fluid API code out of V2 API code | 7 years ago |