Commit Graph

2756 Commits (821899ccd4b89576fa40b663680019b5089b2bd5)

| Author | SHA1 | Message | Date |
|---|---|---|---|
| sweetsky0901 | 821899ccd4 | Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into my_unpool_max_2d | 8 years ago |
| sweetsky0901 | 6fc9a9fd69 | modify for del T2 and doc update | 8 years ago |
| Qiao Longfei | c975fe1bde | batch norm support matrix input (#5980) | 8 years ago |
| Yu Yang | 985e4ab62d | Add Python wrap of conv2d_transpose and its unittest (#5946) | 8 years ago |
| Yu Yang | 0aceeee1fa | Feature/remove g program (#5930) | 8 years ago |
| Tao Luo | 1e6f85e5ff | Merge pull request #5952 from luotao1/fix_lscpu_log | 8 years ago |
| sweetsky0901 | ee0a794c27 | Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into my_unpool_max_2d | 8 years ago |
| sweetsky0901 | 57e68e5740 | modify for code review by qingqing 2nd | 8 years ago |
| ranqiu92 | 217c6a363f | Merge pull request #5949 from ranqiu92/doc | 8 years ago |
| Wang Meng | 95cdbfec19 | Merge pull request #4859 from will-am/factorization_machine_layer | 8 years ago |
| Luo Tao | 966a442eb0 | fix grep socket error in lscpu command | 8 years ago |
| ranqiu | d4c2f2f219 | Refine the doc of layers.py | 8 years ago |
| wangmeng28 | 8a283dbc9e | Update docs for fm layer | 8 years ago |
| Cao Ying | ed516e0344 | Merge pull request #5890 from ranqiu92/doc | 8 years ago |
| QI JUN | b28b2f172b | refine test_recognize_digits_mlp and format codes (#5937) | 8 years ago |
| Yu Yang | d89ff5b614 | Restore the param infos in Program.clone() (#5873) | 8 years ago |
| Qiao Longfei | c9a96575d5 | py_test and test_image_classification_train support argument (#5934) | 8 years ago |
| sweetsky0901 | f9c2a5c38e | modify for code review zcd | 8 years ago |
| sweetsky0901 | 022b48e16f | Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into my_unpool_max_2d | 8 years ago |
| sweetsky0901 | 20654cf78a | modify for type check rewrite | 8 years ago |
| Guo Sheng | dbfc76ab29 | Merge pull request #5927 from guoshengCS/fix-addtolayer-check | 8 years ago |
| fengjiayi | 33fa2dfbde | Compelete max_sequence_len_op (#5913) | 8 years ago |
| Yu Yang | 0ac8c74e63 | Unify fluid submodules to fluid module (#5924) | 8 years ago |
| guosheng | f6e82bcf7e | Merge branch 'develop' of https://github.com/PaddlePaddle/paddle into fix-addtolayer-check | 8 years ago |
| guosheng | 5981918cfc | Fix the check in addto_layer | 8 years ago |
| dzhwinter | e6546baa62 | remove unused file (#5918) | 8 years ago |
| 武毅 | a06bec1287 | Conv cudnn 3d (#5783) | 8 years ago |
| dzhwinter | 52a735879c | "add asnumpy interface" (#5620) | 8 years ago |
| sweetsky0901 | 27cf7f3376 | Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into my_unpool_max_2d | 8 years ago |
| Yu Yang | a619695b06 | Feature/enhance evaluator (#5824) | 8 years ago |
| sweetsky0901 | a38bbc8610 | Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into my_unpool_max_2d | 8 years ago |
| dzhwinter | 513b1e010f | "add floor, ceil, round op" (#5898) | 8 years ago |
| dzhwinter | 45062fe5d7 | Feature/copytensor (#5455) | 8 years ago |
| wanghaox | 0690cca758 | Merge pull request #5831 from wanghaox/roi_pool | 8 years ago |
| wanghaox | cf5b598642 | fix some issues | 8 years ago |
| wanghaox | ef905598a2 | fix some code issues | 8 years ago |
| Qiao Longfei | 65c859db7a | beam_search_decode support multi data type (#5847) | 8 years ago |
| QI JUN | 3a76062c84 | support testing when training and handle dropout and batch_norm operator in testing mode (#5734) | 8 years ago |
| ranqiu | e4c8de9ef5 | Update the annotations of layers.py | 8 years ago |
| fengjiayi | 50d670ee06 | Unify dtype and datatype (#5869) | 8 years ago |
| Qiao Longfei | e1b26514a7 | revert print in test_layers (#5834) | 8 years ago |
| wangmeng28 | 89e63b138f | Merge remote-tracking branch 'upstream/develop' into factorization_machine_layer | 8 years ago |
| sweetsky0901 | ee4a5d2117 | Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into my_unpool_max_2d | 8 years ago |
| kavyasrinet | d883547bf0 | Adding the FTRL optimizer. (#5785) | 8 years ago |
| tensor-tang | e4397c4aa8 | Merge pull request #5828 from tensor-tang/develop | 8 years ago |
| Cao Ying | 657776012b | Merge pull request #5692 from peterzhang2029/add_bn_eq | 8 years ago |
| wanghaox | 36dd770a08 | add roi operator unittest | 8 years ago |
| tensor-tang | 32eb0a7fcf | fix v2 init issue on Mac | 8 years ago |
| sweetsky0901 | e553d5728d | format test code | 8 years ago |
| sweetsky0901 | 47bd0bb678 | del printf | 8 years ago |