Commit Graph

21 Commits (34aac18cb0e6ac25621ffda162035eb1029d5282)

Author           SHA1        Date         Message
Abhinav Arora    3b954e1ddc  7 years ago  Adding Hard Sigmoid Activation (#4771)
Abhinav Arora    b504a2346c  7 years ago  Adding the Thresholded Relu Op (#4685)
kexinzhao        9995aed114  7 years ago  Implementing Softplus operator (#4690)
kavyasrinet      1397e17f6b  7 years ago  Implemented the hardShrink activation (#4653)
Siddharth Goyal  6604d7cda2  7 years ago  Add logsigmoid (numerically stable) and softshrink (#4663)
zhouxiao-coder   e6421249d5  7 years ago  update to latest
kavyasrinet      f30a1f42f0  7 years ago  Adding relu6 activation function (#4607)
zhouxiao-coder   53574e54a1  7 years ago  resolve merge conflict; reimplement ELU activation with functor
Kavya Srinet     154a6ed29c  7 years ago  Implementing tanhshrink operator
Kavya Srinet     60af56c1b8  7 years ago  Added Leaky Relu activation
zhouxiao-coder   a815d6abcf  7 years ago  elu: Optimize gradient calculation; Add more comments
Yu Yang          a8c6ce9b4d  7 years ago  Merge branch 'develop' of github.com:baidu/Paddle into feature/BetterActivationKern
Abhinav Arora    0c3eee09ff  7 years ago  Implementing the SoftSign activation operator
Yu Yang          337b7ebe77  7 years ago  Unify Activation functions and simplify register code
Yu Yang          3a5693e0a8  7 years ago  Add Skeleton of Double support
qijun            5824d85001  8 years ago  add activation operators and python unittests
qijun            dadace3178  8 years ago  add more activation functors
qijun            e515f18dd8  8 years ago  add tanh and sqrt activation operators
qijun            0957fa7b3c  8 years ago  fix relu functor and revert some codes
qijun            c18ebc3022  8 years ago  remove macros
qijun            d736fc0e00  8 years ago  add activation macro
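
Of the activations added above, commit 6604d7cda2 specifically calls out a numerically stable logsigmoid. For background only, here is a minimal standalone C++ sketch of the standard stable formulation (not the repository's actual functor code, which is structured as an Eigen-based operator): naive evaluation of log(1 / (1 + exp(-x))) overflows exp(-x) for large negative x, so the identity log_sigmoid(x) = min(x, 0) - log1p(exp(-|x|)) is used instead.

    // Minimal illustrative sketch of a numerically stable log-sigmoid.
    // Not the PaddlePaddle functor from this history; a plain-C++ analogue.
    #include <algorithm>
    #include <cmath>
    #include <cstdio>

    // log_sigmoid(x) = log(1 / (1 + exp(-x)))
    //                = min(x, 0) - log1p(exp(-|x|))
    // The rewritten form never exponentiates a large positive argument,
    // so it is safe across the whole real line.
    double log_sigmoid(double x) {
      return std::min(x, 0.0) - std::log1p(std::exp(-std::fabs(x)));
    }

    int main() {
      // Spot-check extreme and moderate inputs; the naive form would
      // overflow at x = -1000.
      for (double x : {-1000.0, -1.0, 0.0, 1.0, 1000.0}) {
        std::printf("log_sigmoid(%+g) = %g\n", x, log_sigmoid(x));
      }
      return 0;
    }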