Commit Graph

38 Commits (2b259bf14d396a1aaed9cf3ea519a1c07989060f)

| Author | SHA1 | Message | Date |
|---|---|---|---|
| fengjiayi | bff0cbfcd3 | Merge pull request #7025 from JiayiFeng/rename_output_of_softmax_and_activitions | 7 years ago |
| Luo Tao | 761b329793 | unify the indentation of license | 7 years ago |
| fengjiayi | e0be63bf09 | change activations | 7 years ago |
| Yu Yang | e445b3ff20 | Move framework.proto to proto namespace (#6718) | 7 years ago |
| QI JUN | 61ec0b9516 | Refine device context (#6433) | 7 years ago |
| Abhinav Arora | 113c026d12 | Swish activation operator (#6358) | 7 years ago |
| Abhinav Arora | 1d04b19ce8 | Fix the rendering of latex equation for adamax op (#6294) | 7 years ago |
| dzhwinter | 513b1e010f | "add floor, ceil, round op" (#5898) | 7 years ago |
| Yu Yang | a5e73f9eaf | Support many data types of several operators (#5731) | 7 years ago |
| kexinzhao | cb0118f3e5 | Polish Operator Doc (m) (#5375) | 7 years ago |
| Kexin Zhao | 81ba077e7b | small fix | 7 years ago |
| Yu Yang | be00b0c4d6 | Gradient check use graph (#5027) | 7 years ago |
| Yu Yang | 73a8b78a72 | Correct OpWithKernel's infershape (#4847) | 7 years ago |
| Abhinav Arora | 3b954e1ddc | Adding Hard Sigmoid Activation (#4771) | 7 years ago |
| Abhinav Arora | b504a2346c | Adding the Thresholded Relu Op (#4685) | 7 years ago |
| kexinzhao | 9995aed114 | Implementing Softplus operator (#4690) | 7 years ago |
| kavyasrinet | 1397e17f6b | Implemented the hardShrink activation (#4653) | 7 years ago |
| Siddharth Goyal | 6604d7cda2 | Add logsigmoid (numerically stable) and softshrink (#4663) | 7 years ago |
| zhouxiao-coder | e6421249d5 | update to latest | 7 years ago |
| Qiao Longfei | e12ec95ac1 | Merge pull request #4630 from jacquesqiao/merge-infershapecontext | 7 years ago |
| kavyasrinet | f30a1f42f0 | Adding relu6 activation function (#4607) | 7 years ago |
| zhouxiao-coder | 53574e54a1 | reslove merge conflict;reimplement ELU activation with functor | 7 years ago |
| Luo Tao | 707d144c93 | Unify Reduce functions and simplify register code | 7 years ago |
| qiaolongfei | c0a34e1c64 | rename InferShapeContextBase to InferShapeContext | 7 years ago |
| Kavya Srinet | 154a6ed29c | Implementing tanhshrink operator | 7 years ago |
| Kavya Srinet | 60af56c1b8 | Added Leaky Relu activation | 7 years ago |
| zhouxiao-coder | 4436ba0c56 | elu: Optimize gradient calculation;Add more comments | 7 years ago |
| zhouxiao-coder | a815d6abcf | elu: Optimize gradient calculation;Add more comments | 7 years ago |
| Yu Yang | a8c6ce9b4d | Merge branch 'develop' of github.com:baidu/Paddle into feature/BetterActivationKern | 7 years ago |
| Abhinav Arora | 0c3eee09ff | Implementing the SoftSign activation operator | 7 years ago |
| Yu Yang | 337b7ebe77 | Unify Activation functions and simplify register code | 7 years ago |
| Qiao Longfei | 9a9d50a6ee | Refactoring InferShape (#3946) | 7 years ago |
| dangqingqing | 6e2782e958 | update to develop branch. | 8 years ago |
| qijun | fd5aa2ada2 | merge baidu/develop | 8 years ago |
| qijun | 5824d85001 | add activation operators and python unittests | 8 years ago |
| qijun | 0957fa7b3c | fix relu functor and revert some codes | 8 years ago |
| qijun | c18ebc3022 | remove macros | 8 years ago |
| qijun | d736fc0e00 | add activation macro | 8 years ago |