Commit Graph

31 Commits (f086ebb8b9b8e21f7d0d2056f6a961da080edfb2)

| Author | SHA1 | Message | Date |
|---|---|---|---|
| Yang Yu | 5a4367bb16 | Update | 7 years ago |
| fengjiayi | bff0cbfcd3 | Merge pull request #7025 from JiayiFeng/rename_output_of_softmax_and_activitions | 7 years ago |
| Luo Tao | 761b329793 | unify the indentation of license | 7 years ago |
| fengjiayi | e0be63bf09 | change activations | 7 years ago |
| QI JUN | 61ec0b9516 | Refine device context (#6433) | 7 years ago |
| Abhinav Arora | 113c026d12 | Swish activation operator (#6358) | 7 years ago |
| dzhwinter | 513b1e010f | "add floor, ceil, round op" (#5898) | 7 years ago |
| Kexin Zhao | 81ba077e7b | small fix | 7 years ago |
| QI JUN | 669786bfe1 | refine square_error_cost layer (#5216) | 7 years ago |
| Yu Yang | be00b0c4d6 | Gradient check use graph (#5027) | 7 years ago |
| Abhinav Arora | 3b954e1ddc | Adding Hard Sigmoid Activation (#4771) | 7 years ago |
| Abhinav Arora | b504a2346c | Adding the Thresholded Relu Op (#4685) | 7 years ago |
| kexinzhao | 9995aed114 | Implementing Softplus operator (#4690) | 7 years ago |
| kavyasrinet | 1397e17f6b | Implemented the hardShrink activation (#4653) | 7 years ago |
| Siddharth Goyal | 6604d7cda2 | Add logsigmoid (numerically stable) and softshrink (#4663) | 7 years ago |
| zhouxiao-coder | e6421249d5 | update to latest | 7 years ago |
| kavyasrinet | f30a1f42f0 | Adding relu6 activation function (#4607) | 7 years ago |
| zhouxiao-coder | 53574e54a1 | reslove merge conflict;reimplement ELU activation with functor | 7 years ago |
| Kavya Srinet | 154a6ed29c | Implementing tanhshrink operator | 7 years ago |
| Kavya Srinet | 60af56c1b8 | Added Leaky Relu activation | 7 years ago |
| zhouxiao-coder | a815d6abcf | elu: Optimize gradient calculation;Add more comments | 7 years ago |
| Yu Yang | a8c6ce9b4d | Merge branch 'develop' of github.com:baidu/Paddle into feature/BetterActivationKern | 7 years ago |
| Abhinav Arora | 0c3eee09ff | Implementing the SoftSign activation operator | 7 years ago |
| Yu Yang | 337b7ebe77 | Unify Activation functions and simplify register code | 7 years ago |
| Yu Yang | 3a5693e0a8 | Add Skeleton of Double support | 7 years ago |
| qijun | 5824d85001 | add activation operators and python unittests | 8 years ago |
| qijun | dadace3178 | add more activation functors | 8 years ago |
| qijun | e515f18dd8 | add tanh and sqrt activation operators | 8 years ago |
| qijun | 0957fa7b3c | fix relu functor and revert some codes | 8 years ago |
| qijun | c18ebc3022 | remove macros | 8 years ago |
| qijun | d736fc0e00 | add activation macro | 8 years ago |