Commit Graph

28618 Commits (baddedfdf1b13a76738cd30f7ee26ab4aac6bc0b)

Author SHA1 Message Date
lilong12 36c0410223
Revert "Initialize gloo for low level collective apis (#27356)", test=document_fix (#27665)
4 years ago
xiemoyuan 99e3337368
Optimize the error message of OP. (#27478)
4 years ago
Zeyu Chen aed0080181
Remove NLTK requirements to avoid dependency security alert from Github Dependabot. (#27628)
4 years ago
ShenLiang e8f873df88
optimize the speed & memory of matmul op (#27610)
4 years ago
Pei Yang ae6e40a7fd
Add unittests and OP version registry for tensorrt_subgraph_pass (#27544)
4 years ago
123malin 6822307745
test=develop, rm netifaces (#27581)
4 years ago
tangwei12 9704582eef
fix op error (#27599)
4 years ago
wanghuancoder c68a0313a5
add paddle.fluid._cuda_synchronize (#27595)
4 years ago
Li Fuchen 516d84b22a
fix tests warpctc (#27639)
4 years ago
yaoxuefeng c9a8801325
enhance error messages of lookup_table, merge_ids, data_norm (#27619)
4 years ago
whs 9cc5603d56
Make grid support stopping gradients. (#27630)
4 years ago
liym27 074a71bd25
Support assignment to a Variable in dynamic mode but do not deal with backward. (#27471)
4 years ago
lilong12 5218b7af6b
add ncclSend and ncclRecv (#27621)
4 years ago
lilong12 fa73e4a284
Initialize gloo for low level collective apis (#27356)
4 years ago
Tao Luo bf99bc4a62
update name_scope example code (#27594)
4 years ago
WangXi 5641ea2bf6
Remove optimizer which is in fleet, test=develop (#27606)
4 years ago
littletomatodonkey 68df20d2f2
fix pad2d example code (#27615)
4 years ago
Aurelius84 7c5162400f
[API 2.0]Migrate api example for gradients/append_backward/program_guard (#27570)
4 years ago
Dong Daxiang 4e8f18ab25
Get final strategy (#27602)
4 years ago
furnace d01f626944
update mv op according to PR#27024 (#27474)
4 years ago
Double_V 9d783aeddd
Error message opt, test=develop (#27467)
4 years ago
YUNSHEN XIE d1c2a3bc6f
disable ut test_warpctc_op,test=document_fix (#27632)
4 years ago
littletomatodonkey 6e41143ffe
remove paddle.metrics.cos_sim api (#27569)
4 years ago
whs 96daa2594e
Fix padding in conv1d op (#27590)
4 years ago
Li Fuchen 1501a80f74
add support to float64 input of warpctc op. (#27399)
4 years ago
liym27 3f170dd83d
[API 2.0] Fix example code of api 'switch_case' and add/delete alias (#27578)
4 years ago
Zhou Wei c5b6e44b4a
fix cholesky of test_math_op_patch_var_base (#27591)
4 years ago
liym27 9b7ebf1099
[API 2.0] Fix example code of api 'case' and add/delete alias (#27577)
4 years ago
QingshuChen 6b727e08b1
support elementwise add, activation, matmul on Baidu Kunlun (#27143)
4 years ago
Jack Zhou d37b3774fd
register log double grad kernel for CPU and CUDA
4 years ago
Chengmo d014e29fc6
fix error message (#27318)
4 years ago
Leo Chen 35074963e3
Refine error msg in paddle/fluid/framework/details [part 2] (#27429)
4 years ago
Zhou Wei 162b4d6c13
remove to_variable from 2.0 (#27528)
4 years ago
YUNSHEN XIE 9b12401434
modified storage address of block file (#27576)
4 years ago
Chengmo 0e101c4f6f
Fix test dist fleet heter ctr (#27513)
4 years ago
Double_V 42065ba37a
fix activate_nn_grad, test=develop (#27555)
4 years ago
Double_V b9d739a7ea
fix pool bug, test=develop (#27537)
4 years ago
gongweibao 86fa043205
init test=develop (#27554)
4 years ago
Zhong Hui a85592bcbf
fix cpplint error for the atomic max/min
4 years ago
Chen Weihang ecfdfc9c58
fix guard place set error (#27573)
4 years ago
joanna.wozna.intel b0ee1405f7
Add conv2d bfloat16 support (#27325)
4 years ago
LielinJiang b38e4f2840
Refine vision models (#27476)
4 years ago
Leo Chen 0b4bb023a7
Add static mode check on data() (#27495)
4 years ago
Leo Chen a5b3263782
Refine error msg in paddle/fluid/imperative (#27521)
4 years ago
chalsliu 09f1953296
Revert "Disable ut quickly."
4 years ago
Thunderbrook 6f69a4cb05
add xpu in heter mode (#27000)
4 years ago
ceci3 8daccc9ea7
Fix batch norm double grad compute (#27549)
4 years ago
Chen Weihang c143326df5
try to fix test_paddle_save_load unknown timeout (#27536)
4 years ago
ShenLiang 6fc74bbaf6
add fp16 for matmul (#27523)
4 years ago
Zhong Hui fab4e6d08f
add double grad support for abs
4 years ago