Commit Graph

128 Commits (b6a26749dc1747d2378e4976366d18268841b74c)

Author | SHA1 | Message | Date
WangXi | 0c2a51d240 | optimizer amp, all use fp16 communication, overlap last comm and compute (#28957) | 5 years ago
123malin | 92817f8005 | test=develop, rm pathlib (#28658) | 5 years ago
ShenLiang | e2d01eb650 | Support dynamic graph distributed (#28997) | 5 years ago
Chen Long | d576d6ddeb | fix some docs test=develop;test=document_fix (#29159) | 5 years ago
lilong12 | 216e085605 | update, test=develop (#29139) | 5 years ago
lilong12 | a1add716bc | Add a flag to control whether to initialize gloo (#29150) | 5 years ago
ShenLiang | cddc70964d | fix InMemoryDataset doc (#28688) | 5 years ago
JZ-LIANG | 0dadacc4eb | [sharding] doc, api, bug fixed (#28983) | 5 years ago
lilong12 | 2a864c70c4 | fix the bug in gloo (#29112) | 5 years ago
WangXi | e931c7baf9 | Fix multi nccl comm & wait server ready (#28663) | 5 years ago
gongweibao | 1358397e97 | Clean up the redundant files and unify the launch interface. (#28928) | 5 years ago
Chen Weihang | bb16c2515d | Polish parallel api impl & doc details (#28980) | 5 years ago
Leo Chen | 3815d7aa40 | Upgrade string literals to raw string (#28989) | 5 years ago
123malin | fbf9564f6b | 【paddle.distributed.fleet】Optimize ParameterServer's Async Mode (#28442) | 5 years ago
lilong12 | f77a78cdee | enable pipeline to run with Executor.run() (#28373) | 5 years ago
Chen Weihang | bff4179cc7 | lazily init global group in collective (#28780) | 5 years ago
JZ-LIANG | 5a9f6889c1 | [Sharding] add new features (#28568) | 5 years ago
lilong12 | e4f9415338 | update doc, test=document_fix (#28498) | 5 years ago
danleifeng | a24d186814 | fix nccl init failed in parallel dygraph mode (#28497) | 5 years ago
Chengmo | 4dc8c44ba1 | 【Paddle.Fleet】Fix fleetrun heter (#28252) | 5 years ago
mapingshuo | 81244fbfab | add sharding strategy in fleet(#27900) | 5 years ago
WangXi | 11acbfae06 | refine auto strategy, test=document_fix (#28211) | 5 years ago
MRXLT | 55098b975e | fleet support paddle.optimzier (#28026) | 5 years ago
lilong12 | 5bb348a1c2 | add doc for ReduceOp (#28051) | 5 years ago
WangXi | fb641c915e | 【paddle.fleet】fleet add _get_applied_meta_list and _get_applied_graph_list (#27952) | 5 years ago
lilong12 | ff0ebefc1e | put gloo initialization log to file (#27969) | 5 years ago
tangwei12 | 202bfab1be | Feature/large scale kv save base/delta (#27470) | 5 years ago
123malin | aa3b4ed717 | 【paddle.fleet】geo send sparse optimize (#27719) | 5 years ago
danleifeng | 8d7908f3fd | 【paddle.fleet】raise error when using multi-cards in fleet non_distributed mode (#27854) | 5 years ago
chentianyu03 | d05058d268 | Remove and reorganize the alias of APIs (#27717) | 5 years ago
Chengmo | 328cb289ed | 【paddle.fleet】fix sparse load (#27680) | 5 years ago
123malin | a4f850748a | 【paddle.fleet】bug fix for parameter_recv (#27838) | 5 years ago
Chen Weihang | ed31dac6eb | remove scale loss and coll grads, test=document_fix (#27874) | 5 years ago
WangXi | 50619cd842 | use floyd algorithm to find meta optimizer max path, test=develop (#27867) | 5 years ago
mapingshuo | 8d2cb14f98 | support gradient merge with recompute, test=develop (#27834) | 5 years ago
Chengmo | c5f2802d56 | 【paddle.fleet】Update fleetrun & ps-heter (#27472) | 5 years ago
WangXi | 0a1862d1d2 | fleet combine amp dgc recompute meta optimizer (#27643) | 5 years ago
danleifeng | a01bc6b31d | 【paddle.fleet】fleet support non_distributed training in dygraph mode (#27714) | 5 years ago
lilong12 | 742cbe6660 | [bug fix] avoiding multiple initialization of gloo for fleet in dygraph mode (#27706) | 5 years ago
lilong12 | 5132f5129d | terminate http server used by gloo for fleet after init (#27698) | 5 years ago
lilong12 | bbc2add703 | Initialize gloo for low level collective apis (#27672) | 5 years ago
Qinghe JING | 1539a23822 | Fix bugs in hdfs download (#27344) | 5 years ago
yaoxuefeng | 780140599f | 【paddle.distributed.fleet】add data_generator in distributed.fleet.dataset (#27345) | 5 years ago
lilong12 | 36c0410223 | Revert "Initialize gloo for low level collective apis (#27356)", test=document_fix (#27665) | 5 years ago
123malin | 6822307745 | test=develop, rm netifaces (#27581) | 5 years ago
lilong12 | fa73e4a284 | Initialize gloo for low level collective apis (#27356) | 5 years ago
Dong Daxiang | 4e8f18ab25 | Get final strategy (#27602) | 5 years ago
Chengmo | 0e101c4f6f | Fix test dist fleet heter ctr (#27513) | 5 years ago
WangXi | e550fc02ae | fleet2.0 add fp16 grad compression (#27480) | 5 years ago
123malin | 32ad4f90a4 | 【paddle.fleet】 Usages Change: from fleet.util() to fleet.util (#27468) | 5 years ago