Commit Graph

70 Commits (87e75a77c2e80bdf720d394f9b9e1ef3b0a731a9)

Author SHA1 Message Date

joejiong 87e75a77c2 Add tangent operator (#29207) (5 years ago)
joejiong 32b90b1c2d add log10 (#28576) (5 years ago)
joejiong 08d2413142 add log2 operator (#28319) (5 years ago)
Jack Zhou d37b3774fd register log double grad kernel for cpu and cuda (5 years ago)
Zhong Hui fab4e6d08f add abs support double grad (5 years ago)
Qi Li 6f69fbc8ea fix elu grad whne alpha less then zero, test=develop (#26543) (5 years ago)
zhupengyang f8863e0603 leaky_relu and LeakyReLU: alpha->negative_slope (#26216) (5 years ago)
zhupengyang 4ad504e7c7 hardshrink: support threshold < 0 (#26403) (5 years ago)
hong19860320 40d193ed17 Add the ReLU6, Tanhshrink, SELU, Softplus, Softshrink and Softsign for the api 2.0 (#26376) (5 years ago)
cnn 70cee22fde New features, add sinh and cosh op, test=develop (#25495) (5 years ago)
Zhang Ting b71abeee1b use 32 bit index to improve activation ops (#24206) (5 years ago)
danleifeng 222a5137b3 Add new tensor in API2.0: max,min,t,eye,log1p (#23228) (5 years ago)
Steffy-zxf ac4da77aa6 update error info of ops,add some test cases for raise message (#23750) (5 years ago)
Chen Weihang 16315d3d9e Delete Ref & VectorRef and add GetDataSafely (#22997) (5 years ago)
Feiyu Chan 01ab8a0619 add approximation for gelu, test=develop (#22961) (5 years ago)
Double_V fab4b0765a support elu_op double grad (#21822) (6 years ago)
SunAhong1993 7f4abaf2f5 register int/int64_t/float16 in pow/square kernel,test=develop (#22023) (6 years ago)
Leo Chen add62acfd1 remove kDepXOut for abs_grad op, test=develop (#21407) (6 years ago)
hong ac8546701d Add dygraph execution context (#20157) (6 years ago)
Adam d623e863c9 Fix GELU grad error (#21204) (6 years ago)
Zeng Jinle 10505faf4e polish codes, test=develop (#20672) (6 years ago)
Leo Chen 982e61f5ff Update elementwise double grad to save gpu memory (#19509) (6 years ago)
Zeng Jinle cabb9501bd fix leaky_relu op when alpha is zero, test=develop (#19833) (6 years ago)
liym27 677e714425 fix pow op, support tensor for agument factor. (#19313) (6 years ago)
Zeng Jinle 0daa5c9772 Make leaky relu inplacable (#19676) (6 years ago)
huangjun12 20f18930ae Add hard swish op (new op) (#19001) (6 years ago)
Zeng Jinle 88f111f885 remove unused inplace act codes, test=develop (#19079) (6 years ago)
Leo Zhao 86e494eb64 use mkl to accelerate gelu_grad (#18099) (6 years ago)
Tao Luo bd22453f20 Revert "Add LeakyRelu MKLDNN support (#18656)" (#18723) (6 years ago)
Adam d6b6a337a9 Add LeakyRelu MKLDNN support (#18656) (6 years ago)
qingqing01 80d2e66f9e Update backward appending stragety to support double backward and fix some bug. (#18104) (6 years ago)
lvmengsi 4ef631013c Double backward sqrt (#17387) (6 years ago)
Kaipeng Deng 11d3a38f25 add double grad for square op (#17173) (6 years ago)
Zeng Jinle 28d69d710a Refine dropout gpu memory (#17095) (6 years ago)
ceci3 258e000be6 test=develop, double backward leaky_relu (#17067) (6 years ago)
qingqing01 c1c2633a63 Support backward of backward for Relu and add a new gradient checker by comparing theoretical and numerical Jacobian. (#16862) (6 years ago)
zhoukunsheng b1c5820b3f fix merge conflict (6 years ago)
zhoukunsheng 2b2b4ca21e Merge branch 'develop' into rsqrt (6 years ago)
Zeng Jinle 9f7b027dce fix activation grad op desc maker (#16715) (6 years ago)
zhoukunsheng 91ba75000c fix type conversion problem in rsqrt functor (6 years ago)
zhoukunsheng c47f3cc7fe test=develop (6 years ago)
tink2123 837ad7f86f Add the inverse trigonometric function (6 years ago)
Tao Luo 4efdebc6f6 Merge pull request #15931 from yihuaxu/develop_2c5c7b2a7_gelu_mkl_opt (6 years ago)
dzhwinter 225c11a91f polish cudnn related code and fix bug. (#15164) (6 years ago)
Yihua Xu 7396788694 Optimize gelu operation with mkl erf. (6 years ago)
tensor-tang ee2321debd Revert 15770 develop a6910f900 gelu mkl opt (#15872) (6 years ago)
Yihua Xu 676995c86c Optimze Gelu with MKL Erf function (#15770) (6 years ago)
Yibing Liu 6951ef9a55 Fix the gelu backward to avoid nan (#14857) (7 years ago)
chengduo 04539d4c5d Fix clip.py (#14718) (7 years ago)
minqiyang a02ce58f2c Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into revert_vlog (7 years ago)