Commit Graph

60 Commits (c42d662e2afa05d3d20d026c3f5c1ba376aa53a4)

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Zhang Ting | b71abeee1b | use 32 bit index to improve activation ops (#24206) | 5 years ago |
| danleifeng | 222a5137b3 | Add new tensor in API2.0: max,min,t,eye,log1p (#23228) | 5 years ago |
| Steffy-zxf | ac4da77aa6 | update error info of ops,add some test cases for raise message (#23750) | 5 years ago |
| Chen Weihang | 16315d3d9e | Delete Ref & VectorRef and add GetDataSafely (#22997) | 5 years ago |
| Feiyu Chan | 01ab8a0619 | add approximation for gelu, test=develop (#22961) | 5 years ago |
| Double_V | fab4b0765a | support elu_op double grad (#21822) | 5 years ago |
| SunAhong1993 | 7f4abaf2f5 | register int/int64_t/float16 in pow/square kernel,test=develop (#22023) | 5 years ago |
| Leo Chen | add62acfd1 | remove kDepXOut for abs_grad op, test=develop (#21407) | 5 years ago |
| hong | ac8546701d | Add dygraph execution context (#20157) | 5 years ago |
| Adam | d623e863c9 | Fix GELU grad error (#21204) | 5 years ago |
| Zeng Jinle | 10505faf4e | polish codes, test=develop (#20672) | 5 years ago |
| Leo Chen | 982e61f5ff | Update elementwise double grad to save gpu memory (#19509) | 5 years ago |
| Zeng Jinle | cabb9501bd | fix leaky_relu op when alpha is zero, test=develop (#19833) | 5 years ago |
| liym27 | 677e714425 | fix pow op, support tensor for agument factor. (#19313) | 5 years ago |
| Zeng Jinle | 0daa5c9772 | Make leaky relu inplacable (#19676) | 6 years ago |
| huangjun12 | 20f18930ae | Add hard swish op (new op) (#19001) | 6 years ago |
| Zeng Jinle | 88f111f885 | remove unused inplace act codes, test=develop (#19079) | 6 years ago |
| Leo Zhao | 86e494eb64 | use mkl to accelerate gelu_grad (#18099) | 6 years ago |
| Tao Luo | bd22453f20 | Revert "Add LeakyRelu MKLDNN support (#18656)" (#18723) | 6 years ago |
| Adam | d6b6a337a9 | Add LeakyRelu MKLDNN support (#18656) | 6 years ago |
| qingqing01 | 80d2e66f9e | Update backward appending stragety to support double backward and fix some bug. (#18104) | 6 years ago |
| lvmengsi | 4ef631013c | Double backward sqrt (#17387) | 6 years ago |
| Kaipeng Deng | 11d3a38f25 | add double grad for square op (#17173) | 6 years ago |
| Zeng Jinle | 28d69d710a | Refine dropout gpu memory (#17095) | 6 years ago |
| ceci3 | 258e000be6 | test=develop, double backward leaky_relu (#17067) | 6 years ago |
| qingqing01 | c1c2633a63 | Support backward of backward for Relu and add a new gradient checker by comparing theoretical and numerical Jacobian. (#16862) | 6 years ago |
| zhoukunsheng | b1c5820b3f | fix merge conflict | 6 years ago |
| zhoukunsheng | 2b2b4ca21e | Merge branch 'develop' into rsqrt | 6 years ago |
| Zeng Jinle | 9f7b027dce | fix activation grad op desc maker (#16715) | 6 years ago |
| zhoukunsheng | 91ba75000c | fix type conversion problem in rsqrt functor | 6 years ago |
| zhoukunsheng | c47f3cc7fe | test=develop | 6 years ago |
| tink2123 | 837ad7f86f | Add the inverse trigonometric function | 6 years ago |
| Tao Luo | 4efdebc6f6 | Merge pull request #15931 from yihuaxu/develop_2c5c7b2a7_gelu_mkl_opt | 6 years ago |
| dzhwinter | 225c11a91f | polish cudnn related code and fix bug. (#15164) | 6 years ago |
| Yihua Xu | 7396788694 | Optimize gelu operation with mkl erf. | 6 years ago |
| tensor-tang | ee2321debd | Revert 15770 develop a6910f900 gelu mkl opt (#15872) | 6 years ago |
| Yihua Xu | 676995c86c | Optimze Gelu with MKL Erf function (#15770) | 6 years ago |
| Yibing Liu | 6951ef9a55 | Fix the gelu backward to avoid nan (#14857) | 6 years ago |
| chengduo | 04539d4c5d | Fix clip.py (#14718) | 6 years ago |
| minqiyang | a02ce58f2c | Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into revert_vlog | 6 years ago |
| Clementine | 6c71c1f8f9 | Add activation gelu (#14569) | 6 years ago |
| minqiyang | 53433d7f2e | Revert the changes of VLOG | 6 years ago |
| minqiyang | 0c3227a523 | Change the origin VLOG level to 10 times | 6 years ago |
| chengduo | a9b5d42dd4 | Add fp16 backward support (#14202) | 6 years ago |
| dzhwinter | e722f68318 | fix windows compile (#13147) | 7 years ago |
| dzhwinter | 4069262f0e | Revert ""cherry picked operators changes" (#12184)" (#12747) | 7 years ago |
| dzhwinter | bf3c34960f | "cherry picked operators changes" (#12184) | 7 years ago |
| dzhwinter | 7a517dc93e | merge develop | 7 years ago |
| dzhwinter | e54f203c55 | "move to a new PR" | 7 years ago |
| Kexin Zhao | 0f38bb4593 | add fp16 support to activation op (#9769) | 7 years ago |