Commit Graph

19 Commits (f3818bd3358f5218e601b26c93c0a51cd015d74f)

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Yu Yang | be00b0c4d6 | Gradient check use graph (#5027) | 8 years ago |
| Yang Yang(Tony) | db157eda45 | New Op Test framework. (#4962) | 8 years ago |
| Abhinav Arora | 3b954e1ddc | Adding Hard Sigmoid Activation (#4771) | 8 years ago |
| Abhinav Arora | b504a2346c | Adding the Thresholded Relu Op (#4685) | 8 years ago |
| kexinzhao | 9995aed114 | Implementing Softplus operator (#4690) | 8 years ago |
| kavyasrinet | 1397e17f6b | Implemented the hardShrink activation (#4653) | 8 years ago |
| Siddharth Goyal | 6604d7cda2 | Add logsigmoid (numerically stable) and softshrink (#4663) | 8 years ago |
| zhouxiao-coder | e6421249d5 | update to latest | 8 years ago |
| kavyasrinet | f30a1f42f0 | Adding relu6 activation function (#4607) | 8 years ago |
| zhouxiao-coder | 53574e54a1 | reslove merge conflict;reimplement ELU activation with functor | 8 years ago |
| Kavya Srinet | 154a6ed29c | Implementing tanhshrink operator | 8 years ago |
| Kavya Srinet | 11070e5f36 | Updated the reltive error | 8 years ago |
| Kavya Srinet | 60af56c1b8 | Added Leaky Relu activation | 8 years ago |
| zhouxiao-coder | a815d6abcf | elu: Optimize gradient calculation;Add more comments | 8 years ago |
| Abhinav Arora | 0c3eee09ff | Implementing the SoftSign activation operator | 8 years ago |
| qijun | 87ba6cbf20 | merge baidu/develop | 8 years ago |
| qijun | 48f5f6bdd0 | refine some operators' python unittests | 8 years ago |
| qijun | 5824d85001 | add activation operators and python unittests | 8 years ago |
| qijun | 3110bf9a9a | merge activation operator python tests | 8 years ago |