Commit Graph

731 Commits (cc29bec6e6ba4059a53a193fd26cdcf6e4c1cbc1)

Author          SHA1        Date         Message
Jiabin Yang     e41d581304  6 years ago  test=develop, fix space_to_depth_doc (#16293)
Jiabin Yang     f735102eab  6 years ago  add layer norm to Layers, add transformer test in imperative mode (#16092)
whs             e9bec9369b  6 years ago  [slim] Add quantization strategy and distillation strategy. (#16408)
qingqing01      5d6737b5cb  6 years ago  Fix bug in affine_channel API (#16373)
phlrain         686b8935fe  6 years ago  Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into add_floordiv_and_mod
Hongyu Liu      8c81d9949e  6 years ago  Merge pull request #16347 from phlrain/fix_matmul_check
qingqing01      d2b938ef5a  6 years ago  Refine gradient proto maker and python API for affine_channel_op (#16340)
phlrain         0e40298949  6 years ago  fix matmul shape check; test=develop
phlrain         56c2d384c7  6 years ago  add elementwise floordiv, mod; test=develop
ceci3           27f7a72641  6 years ago  Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into doc
ceci3           3f5f5ed361  6 years ago  fix dropout doc
Xin Pan         3e9319f3ab  6 years ago  add more imperative layer tests.
Xin Pan         7458114b5b  6 years ago  Merge pull request #16228 from panyx0718/imperative
Tao Luo         38898c2808  6 years ago  Merge pull request #16212 from Aurelius84/develop
Xin Pan         50ff898378  6 years ago  graph neural network for imperative mode
Aurelius84      6cfd20dea8  6 years ago  fix words spell error test=develop
ceci3           cd82e2b03b  6 years ago  Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into doc
ceci3           ede33c6260  6 years ago  fix formula in dropout
qingqing01      8ad672a287  6 years ago  Support sync batch norm. (#16121)
Aurelius84      a59b7d47a8  6 years ago  improve layers.fc api doc test=develop
sneaxiy         3e03695629  6 years ago  fix numeric error
sneaxiy         5a92e4c097  6 years ago  revert revert 16144
Zeng Jinle      a91964c8fe  6 years ago  Revert "PaddingRNN model memory optimize"
Zeng Jinle      0b49e43d3a  6 years ago  Merge pull request #16144 from sneaxiy/rnn_mem_opt
ceci3           24fbe6d610  6 years ago  test=develop, replace sce
sneaxiy         d7407c90aa  6 years ago  refine cross_entropy mem
ceci3           0af00a0541  6 years ago  test=develop
ceci3           d3656ff304  6 years ago  test=develop
ceci3           5f343b0e3a  6 years ago  test=develop
ceci3           a80555a3a5  6 years ago  test=develop, change import
ceci3           60bfcb8b30  6 years ago  test=develop, change import
ceci3           8b86c12e46  6 years ago  test=develop, update API.spec
ceci3           23a9035b21  6 years ago  test=develop, update doc
chengduo        84e3adbe60  6 years ago  Fix reshape bug (#16069)
dengkaipeng     dbb8d07886  6 years ago  fix doc statement. test=develop
dengkaipeng     eeeebdd006  6 years ago  refine doc. test=develop
dengkaipeng     12416a24d2  6 years ago  add doc and test_layers. test=develop
dengkaipeng     63d322f07c  6 years ago  fix attr dim calc. test=develop
ceci3           3b96aa0839  6 years ago  conflict fix
Tink_Y          8949a94691  6 years ago  refine image_resize annotation (#15976)
jerrywgz        4f43e981c1  6 years ago  add comment for revise, test=develop
jerrywgz        b2ce832021  6 years ago  change default option related to softmax, test=develop
Tink_Y          31d830de9f  6 years ago  refine image_resize annotation (#15976)
ceci3           f6d186782a  6 years ago  test=develop
jerrywgz        b92ef45fe9  6 years ago  Merge pull request #15678 from jerrywgz/refine_softmax_with_cross_entropy
ceci3           4b7bf06e1f  6 years ago  test=develop
colourful-tree  7d8f639883  6 years ago  Merge pull request #15902 from colourful-tree/new_develop
jerrywgz        b53fdbed2c  6 years ago  add comment for revise, test=develop
colourful-tree  f2d6473ef8  6 years ago  Merge branch 'develop' into new_develop
heqiaozhi       04f876f5bc  6 years ago  remove mkl & fix commit