Commit Graph

153 commits; listing head 75936d838fdc8a92f61425fb32faeda6e40ef1f7 (the 50 most recent commits are shown below)

Author        SHA1        Date         Message
JZ-LIANG      75936d838f  5 years ago  Recompute Offload (#30233)
tangwei12     25f80fd304  5 years ago  Fix/distributed proto (#29981)
Chengmo       d479ae1725  5 years ago  【Paddle.Fleet】Support local save sparse param (#30175)
Chen Weihang  3016ba852e  5 years ago  remove distributed prepare context (#30219)
Chengmo       528e03fc08  5 years ago  【Paddle.Fleet】Fix tensor table (#30075)
Chen Weihang  8020e34e7c  5 years ago  Simplify the options of spawn based on fleetrun (#30144)
gongweibao    4d2a4bb27a  5 years ago  fix logs info test=develop (#30071)
WangXi        ab04997846  5 years ago  [fleet] combine amp and gradient merge, test=develop (#30086)
gongweibao    eea7090c26  5 years ago  fix selected_gpus test=develop (#30044)
Chen Weihang  46c4695421  5 years ago  Set FLAGS_selected_gpus for spawn (#29962)
lilong12      b0bd93de00  5 years ago  Disable gloo by default (#29805)
lilong12      2bc5121da8  5 years ago  add the paddle.distributed.split api (#29970)
lilong12      01950ceb42  5 years ago  fix the bug in pipeline data parallelism (#29731)
tangwei12     032414ca2a  5 years ago  [Feature] one ps (3/4) (#29604)
ShenLiang     01e2874a0e  5 years ago  Support multi-stream communication for dynamic graph distributed (#29525)
WangXi        9cbcc6cadc  5 years ago  fleet sync build strategy, test=develop (#29732)
JZ-LIANG      d33d468f02  5 years ago  [Sharding] add hybrid-dp feature (#29518)
ShenLiang     2ef9e0e23c  5 years ago  Rebuild group automatically in dynamic graph distributed (#29255)
lilong12      b122d0bb76  5 years ago  Fix bug in gloo that gloo initialization hangs (#29447)
ShenLiang     4064354a01  5 years ago  support dp run single card (#29358)
gongweibao    96de8b008f  5 years ago  cleanup enum test=develop (#29294)
ShenLiang     2d6aa1a5bb  5 years ago  fix warning of fleet (#29317)
ShenLiang     2cd0bf5764  5 years ago  Fix doc of fleet api (#29282)
ShenLiang     46b73e6cd9  5 years ago  Change the api of DataParallel and Fleet (#29224)
123malin      cc9c619679  5 years ago  test=develop, fix doc (#29200)
WangXi        0c2a51d240  5 years ago  optimizer amp, all use fp16 communication, overlap last comm and compute (#28957)
123malin      92817f8005  5 years ago  test=develop, rm pathlib (#28658)
ShenLiang     e2d01eb650  5 years ago  Support dynamic graph distributed (#28997)
Chen Long     d576d6ddeb  5 years ago  fix some docs test=develop;test=document_fix (#29159)
lilong12      216e085605  5 years ago  update, test=develop (#29139)
lilong12      a1add716bc  5 years ago  Add a flag to control whether to initialize gloo (#29150)
ShenLiang     cddc70964d  5 years ago  fix InMemoryDataset doc (#28688)
JZ-LIANG      0dadacc4eb  5 years ago  [sharding] doc, api, bug fixed (#28983)
lilong12      2a864c70c4  5 years ago  fix the bug in gloo (#29112)
WangXi        e931c7baf9  5 years ago  Fix multi nccl comm & wait server ready (#28663)
gongweibao    1358397e97  5 years ago  Clean up the redundant files and unify the launch interface. (#28928)
Chen Weihang  bb16c2515d  5 years ago  Polish parallel api impl & doc details (#28980)
Leo Chen      3815d7aa40  5 years ago  Upgrade string literals to raw string (#28989)
123malin      fbf9564f6b  5 years ago  【paddle.distributed.fleet】Optimize ParameterServer's Async Mode (#28442)
lilong12      f77a78cdee  5 years ago  enable pipeline to run with Executor.run() (#28373)
Chen Weihang  bff4179cc7  5 years ago  lazily init global group in collective (#28780)
JZ-LIANG      5a9f6889c1  5 years ago  [Sharding] add new features (#28568)
lilong12      e4f9415338  5 years ago  update doc, test=document_fix (#28498)
danleifeng    a24d186814  5 years ago  fix nccl init failed in parallel dygraph mode (#28497)
Chengmo       4dc8c44ba1  5 years ago  【Paddle.Fleet】Fix fleetrun heter (#28252)
mapingshuo    81244fbfab  5 years ago  add sharding strategy in fleet(#27900)
WangXi        11acbfae06  5 years ago  refine auto strategy, test=document_fix (#28211)
MRXLT         55098b975e  5 years ago  fleet support paddle.optimzier (#28026)
lilong12      5bb348a1c2  5 years ago  add doc for ReduceOp (#28051)
WangXi        fb641c915e  5 years ago  【paddle.fleet】fleet add _get_applied_meta_list and _get_applied_graph_list (#27952)