Commit Graph

32 Commits (2e907c3613abfd68ebe8bf4c9d7b2bc42816105a)

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Qiao Longfei | 59357f4fb9 | fix floor_op () | 7 years ago |
| Yang Yu | 5a4367bb16 | Update | 7 years ago |
| fengjiayi | bff0cbfcd3 | Merge pull request from JiayiFeng/rename_output_of_softmax_and_activitions | 7 years ago |
| Luo Tao | 761b329793 | unify the indentation of license | 7 years ago |
| fengjiayi | e0be63bf09 | change activations | 7 years ago |
| QI JUN | 61ec0b9516 | Refine device context () | 7 years ago |
| Abhinav Arora | 113c026d12 | Swish activation operator () | 7 years ago |
| dzhwinter | 513b1e010f | "add floor, ceil, round op" () | 7 years ago |
| Kexin Zhao | 81ba077e7b | small fix | 8 years ago |
| QI JUN | 669786bfe1 | refine square_error_cost layer () | 8 years ago |
| Yu Yang | be00b0c4d6 | Gradient check use graph () | 8 years ago |
| Abhinav Arora | 3b954e1ddc | Adding Hard Sigmoid Activation () | 8 years ago |
| Abhinav Arora | b504a2346c | Adding the Thresholded Relu Op () | 8 years ago |
| kexinzhao | 9995aed114 | Implementing Softplus operator () | 8 years ago |
| kavyasrinet | 1397e17f6b | Implemented the hardShrink activation () | 8 years ago |
| Siddharth Goyal | 6604d7cda2 | Add logsigmoid (numerically stable) and softshrink () | 8 years ago |
| zhouxiao-coder | e6421249d5 | update to latest | 8 years ago |
| kavyasrinet | f30a1f42f0 | Adding relu6 activation function () | 8 years ago |
| zhouxiao-coder | 53574e54a1 | reslove merge conflict;reimplement ELU activation with functor | 8 years ago |
| Kavya Srinet | 154a6ed29c | Implementing tanhshrink operator | 8 years ago |
| Kavya Srinet | 60af56c1b8 | Added Leaky Relu activation | 8 years ago |
| zhouxiao-coder | a815d6abcf | elu: Optimize gradient calculation;Add more comments | 8 years ago |
| Yu Yang | a8c6ce9b4d | Merge branch 'develop' of github.com:baidu/Paddle into feature/BetterActivationKern | 8 years ago |
| Abhinav Arora | 0c3eee09ff | Implementing the SoftSign activation operator | 8 years ago |
| Yu Yang | 337b7ebe77 | Unify Activation functions and simplify register code | 8 years ago |
| Yu Yang | 3a5693e0a8 | Add Skeleton of Double support | 8 years ago |
| qijun | 5824d85001 | add activation operators and python unittests | 8 years ago |
| qijun | dadace3178 | add more activation functors | 8 years ago |
| qijun | e515f18dd8 | add tanh and sqrt activation operators | 8 years ago |
| qijun | 0957fa7b3c | fix relu functor and revert some codes | 8 years ago |
| qijun | c18ebc3022 | remove macros | 8 years ago |
| qijun | d736fc0e00 | add activation macro | 8 years ago |