Commit Graph

70 Commits (fix_doc_bugs)

Author SHA1 Message Date
WangXi 739043c6a6 【paddle.fleet】fleet add _get_applied_meta_list and _get_applied_graph_list, test=develop (#27952) (#28053) (6 years ago)
danleifeng 61162497b4 raise error if use multi-cards in fleet non_distributed mode;test=develop (#28093) (6 years ago)
lilong12 316c97c7eb put gloo initialization log to file (#27969) (#28076) (6 years ago)
MRXLT e3513a6395 [cherry pick] Fix fleet (#28067) (6 years ago)
tangwei12 c0550b54a5 Feature/large scale kv save base/delta (#27470) (#27990) (6 years ago)
123malin c0061ff56f 【paddle.fleet】geo send sparse optimize (#27719) (#27979) (6 years ago)
Chengmo 328cb289ed 【paddle.fleet】fix sparse load (#27680) (6 years ago)
123malin a4f850748a 【paddle.fleet】bug fix for parameter_recv (#27838) (6 years ago)
Chen Weihang ed31dac6eb remove scale loss and coll grads, test=document_fix (#27874) (6 years ago)
WangXi 50619cd842 use floyd algorithm to find meta optimizer max path, test=develop (#27867) (6 years ago)
mapingshuo 8d2cb14f98 support gradient merge with recompute, test=develop (#27834) (6 years ago)
Chengmo c5f2802d56 【paddle.fleet】Update fleetrun & ps-heter (#27472) (6 years ago)
WangXi 0a1862d1d2 fleet combine amp dgc recompute meta optimizer (#27643) (6 years ago)
danleifeng a01bc6b31d 【paddle.fleet】fleet support non_distributed training in dygraph mode (#27714) (6 years ago)
lilong12 742cbe6660 [bug fix] avoiding multiple initialization of gloo for fleet in dygraph mode (#27706) (6 years ago)
lilong12 5132f5129d terminate http server used by gloo for fleet after init (#27698) (6 years ago)
lilong12 bbc2add703 Initialize gloo for low level collective apis (#27672) (6 years ago)
Qinghe JING 1539a23822 Fix bugs in hdfs download (#27344) (6 years ago)
yaoxuefeng 780140599f 【paddle.distributed.fleet】add data_generator in distributed.fleet.dataset (#27345) (6 years ago)
lilong12 36c0410223 Revert "Initialize gloo for low level collective apis (#27356)", test=document_fix (#27665) (6 years ago)
123malin 6822307745 test=develop, rm netifaces (#27581) (6 years ago)
lilong12 fa73e4a284 Initialize gloo for low level collective apis (#27356) (6 years ago)
Dong Daxiang 4e8f18ab25 Get final strategy (#27602) (6 years ago)
Chengmo 0e101c4f6f Fix test dist fleet heter ctr (#27513) (6 years ago)
WangXi e550fc02ae fleet2.0 add fp16 grad compression (#27480) (6 years ago)
123malin 32ad4f90a4 【paddle.fleet】 Usages Change: from fleet.util() to fleet.util (#27468) (6 years ago)
tangwei12 bc5f0246a8 large scale kv speedup (#26510) (6 years ago)
danleifeng 0721767ba9 fix server_num bug;test=develop (#27442) (6 years ago)
danleifeng 905e2346ac add endpoints log;test=develop (#27439) (6 years ago)
danleifeng fc61efd736 fix port env bug(int);test=develop (#27405) (6 years ago)
tangwei12 d6b54de467 【paddle.fleet】Fix/role maker api fix (#27326) (6 years ago)
tangwei12 99626502f7 【paddle.fleet】gloo and util (#27213) (6 years ago)
123malin f36b9a7f79 【Fleet2.0 Util】 add documents (#26698) (6 years ago)
danleifeng 8d05c00c67 fix paddle.fleet en-doc for apis in dynamic mode (#27354) (6 years ago)
ShenLiang 746a8ded29 fix comment of adaptive lsgd (#27362) (6 years ago)
gongweibao 11bcf0e21c Cleanup redundant code files (#27319) (6 years ago)
ShenLiang 54b81fa32c add adaptivelsgd in meta_optimizer (#27289) (6 years ago)
yaoxuefeng c67c391682 refine fleet dataset class api (#27133) (6 years ago)
danleifeng 389a9a7e0e fix ports conflict when use paddlecloud to launch analogue multi-nodes (#26191) (6 years ago)
mapingshuo 9dedafa0df fix strategy, test=develop (#27323) (6 years ago)
ShenLiang 2b6a5793fe remove auto mode from localsgd optimizer (#27237) (6 years ago)
123malin 60c3ef3ab8 【paddle.fleet】parameter_server_optimizer support auto_strategy (#27181) (6 years ago)
JZ-LIANG 5d039f4086 modified the implement of Lars optimizer (#26733) (6 years ago)
Dong Daxiang f7d08b7db8 【paddle.fleet】refine launch and distributed repr string for print (#27093) (6 years ago)
123malin f2d68d3ed5 【paddle.fleet】parameter_server_optimizer support auto_strategy (#26838) (6 years ago)
ShenLiang aca450f6fb fix the localsgd optimizer (#27094) (6 years ago)
Dong Daxiang 0443b480b8 【paddle.fleet】add auto parallel L1 implementations (#27090) (6 years ago)
Chengmo a72752263b support heter-xpu-ps (#27018) (6 years ago)
mapingshuo 9e4fe92303 fix strategy example (#26856) (6 years ago)
danleifeng 6b4ca0d7f1 【paddle.fleet】distributed_optimizer supports dygraph (#26541) (6 years ago)