Commit Graph

24 Commits (17030ff28b9a54bb57779e9b8448a6d222110ec5)

Author        SHA1        Date         Message
Zhang Ting    318dfa0d4f  5 years ago  remove eval in eigen function when dtype is fp16 (#23845)
Kaipeng Deng  3f021781a1  5 years ago  fix softmax CE time limit check failed (#19846)
Kaipeng Deng  99c78b772a  6 years ago  fix softmax axis!=-1. test=develop (#19800)
tensor-tang   7ae461eb13  6 years ago  [CPU] refine cpu softmax bwd (#17534)
tensor-tang   0600b370ea  6 years ago  [CPU] refine softmax op fwd on CPU (#17522)
dengkaipeng   90bd038d35  6 years ago  fix format. test=develop
dengkaipeng   93701dba50  6 years ago  add jit kernel for softmax axis. test=develop
dengkaipeng   6c64182709  6 years ago  refine softmax kernel. test=develop
tensor-tang   14a764c930  6 years ago  simplify the jitkernel templates and tests
tensor-tang   6e1ee7fb57  6 years ago  cache softmax kernel func
tensor-tang   d59f733551  6 years ago  refine softmax and use with cache
sneaxiy       a500dfa579  6 years ago  rewrite ddim
gongweibao    f1fb64b17f  6 years ago  Add reduce sparse tensor feature. (#14757)
Jacek Czaja   8bfa1fa9bb  6 years ago  - ASUM MKL integration
Jacek Czaja   9b0eae3023  6 years ago  - Removing partial specialization of softmax for inference for GPU
Jacek Czaja   513bb6c151  6 years ago  Squashing MKL based softmax for inference
Jacek Czaja   b361579f09  6 years ago  - Softmax for Inference is enabled when ON_INFER is set
Tao Luo       5b9c62faee  6 years ago  Revert "Softmax op optimization for inference"
Jacek Czaja   d332326847  6 years ago  - Added unit tests for softmax is_test=True op
Jacek Czaja   c1fccc29c1  6 years ago  - Noise adding removed for Test phase of softmax
Kexin Zhao    b2a1c9e8b7  7 years ago  Add float16 support to non-cudnn softmax op on GPU (#9686)
qingqing01    24509f4af9  7 years ago  Fix the grammar in copyright. (#8403)
Yi Wang       fc374821dd  7 years ago  Correct #include path
Yi Wang       90648f336d  7 years ago  Move file to fluid/; Edit CMakeLists.txt