Commit Graph

116 commits in total at 06a3e3114899e4d6a5c621d34d38c401e071d1f0; the 50 most recent are listed below.

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Zhen Wang | 4a9de931a2 | Fix the bug in fleet amp_init. (#30606) | 4 years ago |
| huangxu96 | 138620084c | Add fleet amp_init() (#30572) | 4 years ago |
| lilong12 | 8126a41d73 | fix the bug of all_reduce pipeline gradient multiple times (#30437) | 4 years ago |
| tangwei12 | c9e78a22c5 | add trainers for pserver (#30523) | 4 years ago |
| hutuxian | 9fec1618d2 | Ascend Framework Part3: Ascend Parser (#30391) | 4 years ago |
| 123malin | 05f06d9ae1 | test=develop, fix fleet.metric (#30438) | 4 years ago |
| Chengmo | 859431aadb | fix ps init (#30397) | 4 years ago |
| 123malin | 2a98e9323a | test=develop, add distributed_infer (#30300) | 4 years ago |
| JZ-LIANG | 75936d838f | Recompute Offload (#30233) | 4 years ago |
| tangwei12 | 25f80fd304 | Fix/distributed proto (#29981) | 4 years ago |
| Chengmo | d479ae1725 | 【Paddle.Fleet】Support local save sparse param (#30175) | 4 years ago |
| Chengmo | 528e03fc08 | 【Paddle.Fleet】Fix tensor table (#30075) | 4 years ago |
| gongweibao | 4d2a4bb27a | fix logs info test=develop (#30071) | 4 years ago |
| WangXi | ab04997846 | [fleet] combine amp and gradient merge, test=develop (#30086) | 4 years ago |
| gongweibao | eea7090c26 | fix selected_gpus test=develop (#30044) | 4 years ago |
| lilong12 | b0bd93de00 | Disable gloo by default (#29805) | 4 years ago |
| lilong12 | 01950ceb42 | fix the bug in pipeline data parallelism (#29731) | 4 years ago |
| tangwei12 | 032414ca2a | [Feature] one ps (3/4) (#29604) | 4 years ago |
| ShenLiang | 01e2874a0e | Support multi-stream communication for dynamic graph distributed (#29525) | 4 years ago |
| WangXi | 9cbcc6cadc | fleet sync build strategy, test=develop (#29732) | 4 years ago |
| JZ-LIANG | d33d468f02 | [Sharding] add hybrid-dp feature (#29518) | 4 years ago |
| ShenLiang | 2ef9e0e23c | Rebuild group automatically in dynamic graph distributed (#29255) | 4 years ago |
| lilong12 | b122d0bb76 | Fix bug in gloo that gloo initialization hangs (#29447) | 4 years ago |
| ShenLiang | 4064354a01 | support dp run single card (#29358) | 4 years ago |
| gongweibao | 96de8b008f | cleanup enum test=develop (#29294) | 4 years ago |
| ShenLiang | 2d6aa1a5bb | fix warning of fleet (#29317) | 4 years ago |
| ShenLiang | 2cd0bf5764 | Fix doc of fleet api (#29282) | 4 years ago |
| ShenLiang | 46b73e6cd9 | Change the api of DataParallel and Fleet (#29224) | 4 years ago |
| 123malin | cc9c619679 | test=develop, fix doc (#29200) | 4 years ago |
| WangXi | 0c2a51d240 | optimizer amp, all use fp16 communication, overlap last comm and compute (#28957) | 4 years ago |
| 123malin | 92817f8005 | test=develop, rm pathlib (#28658) | 4 years ago |
| ShenLiang | e2d01eb650 | Support dynamic graph distributed (#28997) | 4 years ago |
| Chen Long | d576d6ddeb | fix some docs test=develop;test=document_fix (#29159) | 4 years ago |
| lilong12 | a1add716bc | Add a flag to control whether to initialize gloo (#29150) | 4 years ago |
| ShenLiang | cddc70964d | fix InMemoryDataset doc (#28688) | 4 years ago |
| JZ-LIANG | 0dadacc4eb | [sharding] doc, api, bug fixed (#28983) | 4 years ago |
| lilong12 | 2a864c70c4 | fix the bug in gloo (#29112) | 4 years ago |
| WangXi | e931c7baf9 | Fix multi nccl comm & wait server ready (#28663) | 4 years ago |
| gongweibao | 1358397e97 | Clean up the redundant files and unify the launch interface. (#28928) | 4 years ago |
| Leo Chen | 3815d7aa40 | Upgrade string literals to raw string (#28989) | 4 years ago |
| 123malin | fbf9564f6b | 【paddle.distributed.fleet】Optimize ParameterServer's Async Mode (#28442) | 4 years ago |
| lilong12 | f77a78cdee | enable pipeline to run with Executor.run() (#28373) | 4 years ago |
| JZ-LIANG | 5a9f6889c1 | [Sharding] add new features (#28568) | 4 years ago |
| Chengmo | 4dc8c44ba1 | 【Paddle.Fleet】Fix fleetrun heter (#28252) | 4 years ago |
| mapingshuo | 81244fbfab | add sharding strategy in fleet (#27900) | 4 years ago |
| WangXi | 11acbfae06 | refine auto strategy, test=document_fix (#28211) | 4 years ago |
| MRXLT | 55098b975e | fleet support paddle.optimzier (#28026) | 4 years ago |
| WangXi | fb641c915e | 【paddle.fleet】fleet add _get_applied_meta_list and _get_applied_graph_list (#27952) | 4 years ago |
| lilong12 | ff0ebefc1e | put gloo initialization log to file (#27969) | 4 years ago |
| tangwei12 | 202bfab1be | Feature/large scale kv save base/delta (#27470) | 4 years ago |
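
Several of the commits above center on collective training through `paddle.distributed.fleet` (e.g. #28997 "Support dynamic graph distributed" and #29224 "Change the api of DataParallel and Fleet"). For orientation, below is a minimal sketch of that workflow; it assumes the Paddle 2.0-era dygraph API (`fleet.init`, `fleet.distributed_model`, `fleet.distributed_optimizer`) and is an illustration of the subsystem these commits touch, not code taken from any single commit.

```python
# Minimal sketch of dynamic-graph collective (data-parallel) training
# with paddle.distributed.fleet. Assumption: the Paddle 2.0-era dygraph
# API shape; this illustrates the workflow, not any specific commit.
import paddle
from paddle.distributed import fleet

fleet.init(is_collective=True)  # join the collective communication context

model = paddle.nn.Linear(10, 1)
opt = paddle.optimizer.SGD(learning_rate=0.01,
                           parameters=model.parameters())

# Wrap the model and optimizer so gradients are synchronized across ranks.
model = fleet.distributed_model(model)
opt = fleet.distributed_optimizer(opt)

x = paddle.randn([8, 10])
loss = model(x).mean()
loss.backward()   # gradient all-reduce happens under the hood
opt.step()
opt.clear_grad()
```

A script like this would typically be started with the unified launcher from #28928, along the lines of `python -m paddle.distributed.launch train.py`.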