Commit Graph

27 Commits (f605f16726a2f481ca268fa56cac73c0aae1fe75)

| Author | SHA1 | Message | Date |
|---|---|---|---|
| QI JUN | 61ec0b9516 | Refine device context (#6433) | 7 years ago |
| Abhinav Arora | 113c026d12 | Swish activation operator (#6358) | 7 years ago |
| dzhwinter | 513b1e010f | "add floor, ceil, round op" (#5898) | 8 years ago |
| Kexin Zhao | 81ba077e7b | small fix | 8 years ago |
| QI JUN | 669786bfe1 | refine square_error_cost layer (#5216) | 8 years ago |
| Yu Yang | be00b0c4d6 | Gradient check use graph (#5027) | 8 years ago |
| Abhinav Arora | 3b954e1ddc | Adding Hard Sigmoid Activation (#4771) | 8 years ago |
| Abhinav Arora | b504a2346c | Adding the Thresholded Relu Op (#4685) | 8 years ago |
| kexinzhao | 9995aed114 | Implementing Softplus operator (#4690) | 8 years ago |
| kavyasrinet | 1397e17f6b | Implemented the hardShrink activation (#4653) | 8 years ago |
| Siddharth Goyal | 6604d7cda2 | Add logsigmoid (numerically stable) and softshrink (#4663) | 8 years ago |
| zhouxiao-coder | e6421249d5 | update to latest | 8 years ago |
| kavyasrinet | f30a1f42f0 | Adding relu6 activation function (#4607) | 8 years ago |
| zhouxiao-coder | 53574e54a1 | reslove merge conflict;reimplement ELU activation with functor | 8 years ago |
| Kavya Srinet | 154a6ed29c | Implementing tanhshrink operator | 8 years ago |
| Kavya Srinet | 60af56c1b8 | Added Leaky Relu activation | 8 years ago |
| zhouxiao-coder | a815d6abcf | elu: Optimize gradient calculation;Add more comments | 8 years ago |
| Yu Yang | a8c6ce9b4d | Merge branch 'develop' of github.com:baidu/Paddle into feature/BetterActivationKern | 8 years ago |
| Abhinav Arora | 0c3eee09ff | Implementing the SoftSign activation operator | 8 years ago |
| Yu Yang | 337b7ebe77 | Unify Activation functions and simplify register code | 8 years ago |
| Yu Yang | 3a5693e0a8 | Add Skeleton of Double support | 8 years ago |
| qijun | 5824d85001 | add activation operators and python unittests | 8 years ago |
| qijun | dadace3178 | add more activation functors | 8 years ago |
| qijun | e515f18dd8 | add tanh and sqrt activation operators | 8 years ago |
| qijun | 0957fa7b3c | fix relu functor and revert some codes | 8 years ago |
| qijun | c18ebc3022 | remove macros | 8 years ago |
| qijun | d736fc0e00 | add activation macro | 8 years ago |