Commit Graph

30 Commits (32ad4f90a4b9c5fc38f6480b5a024ba44f654ee2)

Author             | SHA1       | Message                                                                                    | Date
-------------------|------------|--------------------------------------------------------------------------------------------|------------
GaoWei8            | 11fb8a1c10 | Refine cudnn softmax (#25757)                                                              | 5 years ago
Qi Li              | 8d194524ba | hardtanh prelu softmax, test=develop (#26431)                                              | 5 years ago
zhupengyang        | b2034c2854 | softmax: imperative->static; fix doc examples (#26134)                                     | 5 years ago
zhupengyang        | 2214394edc | softmax: refine doc; input->x (#25976)                                                     | 5 years ago
YUNSHEN XIE        | 70554c9f97 | disable TestSoftmaxFP16Op2 in test_softmax_op (#25519)                                     | 5 years ago
littletomatodonkey | 64b4612290 | Fix softmax unittest (#25371)                                                              | 5 years ago
suytingwan         | a2c6d45080 | test=develop softmax op fp16 test case pass grad check (#24130)                            | 5 years ago
juncaipeng         | f4379a9149 | Update the precision of some op tests from fp32 to fp64 (#21847)                           | 5 years ago
Zhang Ting         | 3df13ab40c | fix PythonAPI test in Op unittest, test=develop (#21455)                                   | 5 years ago
zhongpu            | c4ede95c74 | open dygraph op test, test=develop (#19787)                                                | 5 years ago
WangXi             | 3c98ec90ce | Fix INF bug of softmax_cross_entropy_op (#21165)                                           | 5 years ago
Tao Luo            | 134d809e23 | fix softmax input error check on float16 (#20273)                                          | 6 years ago
Tao Luo            | 65a02fc114 | add input type and dtype check for softmax_op (#19975)                                     | 6 years ago
Adam               | cb65439da8 | Add support for other axes in MKLDNN softmax op (#19907)                                   | 6 years ago
dengkaipeng        | 93701dba50 | add jit kernel for softmax axis. test=develop                                              | 6 years ago
dengkaipeng        | 365e6cfd15 | add mkldnn support. test=develop                                                           | 6 years ago
dengkaipeng        | 6cb66721d2 | add cudnn support. test=develop                                                            | 6 years ago
Krzysztof Binias   | 851ea04dec | Add UTs to check whether primitives for activations and softmax already exist in backward  | 6 years ago
Tao Luo            | 5b9c62faee | Revert "Softmax op optimization for inference "                                            | 6 years ago
Jacek Czaja        | d332326847 | - Added unit tests for softmax is_test=True op                                             | 6 years ago
chengduo           | a9b5d42dd4 | Add fp16 backward support (#14202)                                                         | 6 years ago
minqiyang          | 99d3f08920 | Add print_function for all python files                                                    | 7 years ago
fengjiayi          | f7bd0b227b | Add unittests for softmax_op                                                               | 7 years ago
Wei Xu             | 264e8305b0 | Fixed unittests for WITH_GPU=OFF and WITH_DISTRIBUTE=OFF build                             | 7 years ago
Kexin Zhao         | b2a1c9e8b7 | Add float16 support to non-cudnn softmax op on GPU (#9686)                                 | 7 years ago
Kexin Zhao         | 4eaa789730 | resolve conflict                                                                           | 7 years ago
Jacek Czaja        | 3b95b55f07 | - Softmax MKLDNN primitive integration                                                     | 7 years ago
Kexin Zhao         | 70e7122785 | initial commit                                                                             | 7 years ago
dzhwinter          | 128adf53cb | [Speed]implement cudnn sequence softmax cudnn (#8978)                                      | 7 years ago
Luo Tao            | b11956a0b5 | move Fluid API code out of V2 API code                                                     | 7 years ago