Commit Graph

122 Commits (a6edbc478beabaa9515bbadd7ecd5dd4d33b7ceb)

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| xiayanming | a6edbc478b | support parsing ascend rank table file (#31000) | 5 years ago |
| gongweibao | ebef6601d5 | Destroy session first. (#30954) | 5 years ago |
| gongweibao | de42d19336 | Add paddle ascend distribution training supported (#30796) | 5 years ago |
| OleNet | ebb5d181e8 | Ascendrc add converted op : [range/equal/range/uniform_random/expand/squeeze], fix cast op bug (#30797) | 5 years ago |
| dingsiyu | 4a26729540 | Merge ascend_optimizer and ascend_parser. (#30776) | 5 years ago |
| gongweibao | 636fefd9f8 | code style (#30781) | 5 years ago |
| Void Main | 904cc44349 | [Feature] Build parser to support distributed training (#30658) | 5 years ago |
| gongweibao | f5aca8fbb4 | Pass device_ids info from launch to trainer. (#30632) | 5 years ago |
| Void Main | d2404da768 | Build praser for Hcom* operators (#30627) | 5 years ago |
| gongweibao | f9c97dd728 | Add distribution supported (#30578) | 5 years ago |
| hutuxian | 6dd52c5b25 | Ascend rc (#30483) | 5 years ago |
| 123malin | 05f06d9ae1 | test=develop, fix fleet.metric (#30438) | 5 years ago |
| Chengmo | 859431aadb | fix ps init(#30397) | 5 years ago |
| 123malin | 2a98e9323a | test=develop, add distributed_infer (#30300) | 5 years ago |
| JZ-LIANG | 75936d838f | Recompute Offload (#30233) | 5 years ago |
| tangwei12 | 25f80fd304 | Fix/distributed proto (#29981) | 5 years ago |
| Chengmo | d479ae1725 | 【Paddle.Fleet】Support local save sparse param (#30175) | 5 years ago |
| Chengmo | 528e03fc08 | 【Paddle.Fleet】Fix tensor table (#30075) | 5 years ago |
| gongweibao | 4d2a4bb27a | fix logs info test=develop (#30071) | 5 years ago |
| WangXi | ab04997846 | [fleet] combine amp and gradient merge, test=develop (#30086) | 5 years ago |
| gongweibao | eea7090c26 | fix selected_gpus test=develop (#30044) | 5 years ago |
| lilong12 | b0bd93de00 | Disable gloo by default (#29805) | 5 years ago |
| lilong12 | 01950ceb42 | fix the bug in pipeline data parallelism (#29731) | 5 years ago |
| tangwei12 | 032414ca2a | [Feature] one ps (3/4) (#29604) | 5 years ago |
| ShenLiang | 01e2874a0e | Support multi-stream communication for dynamic graph distributed (#29525) | 5 years ago |
| WangXi | 9cbcc6cadc | fleet sync build strategy, test=develop (#29732) | 5 years ago |
| JZ-LIANG | d33d468f02 | [Sharding] add hybrid-dp feature (#29518) | 5 years ago |
| ShenLiang | 2ef9e0e23c | Rebuild group automatically in dynamic graph distributed (#29255) | 5 years ago |
| lilong12 | b122d0bb76 | Fix bug in gloo that gloo initialization hangs (#29447) | 5 years ago |
| ShenLiang | 4064354a01 | support dp run single card (#29358) | 5 years ago |
| gongweibao | 96de8b008f | cleanup enum test=develop (#29294) | 5 years ago |
| ShenLiang | 2d6aa1a5bb | fix warning of fleet (#29317) | 5 years ago |
| ShenLiang | 2cd0bf5764 | Fix doc of fleet api (#29282) | 5 years ago |
| ShenLiang | 46b73e6cd9 | Change the api of DataParallel and Fleet (#29224) | 5 years ago |
| 123malin | cc9c619679 | test=develop, fix doc (#29200) | 5 years ago |
| WangXi | 0c2a51d240 | optimizer amp, all use fp16 communication, overlap last comm and compute (#28957) | 5 years ago |
| 123malin | 92817f8005 | test=develop, rm pathlib (#28658) | 5 years ago |
| ShenLiang | e2d01eb650 | Support dynamic graph distributed (#28997) | 5 years ago |
| Chen Long | d576d6ddeb | fix some docs test=develop;test=document_fix (#29159) | 5 years ago |
| lilong12 | a1add716bc | Add a flag to control whether to initialize gloo (#29150) | 5 years ago |
| ShenLiang | cddc70964d | fix InMemoryDataset doc (#28688) | 5 years ago |
| JZ-LIANG | 0dadacc4eb | [sharding] doc, api, bug fixed (#28983) | 5 years ago |
| lilong12 | 2a864c70c4 | fix the bug in gloo (#29112) | 5 years ago |
| WangXi | e931c7baf9 | Fix multi nccl comm & wait server ready (#28663) | 5 years ago |
| gongweibao | 1358397e97 | Clean up the redundant files and unify the launch interface. (#28928) | 5 years ago |
| Leo Chen | 3815d7aa40 | Upgrade string literals to raw string (#28989) | 5 years ago |
| 123malin | fbf9564f6b | 【paddle.distributed.fleet】Optimize ParameterServer's Async Mode (#28442) | 5 years ago |
| lilong12 | f77a78cdee | enable pipeline to run with Executor.run() (#28373) | 5 years ago |
| JZ-LIANG | 5a9f6889c1 | [Sharding] add new features (#28568) | 5 years ago |
| Chengmo | 4dc8c44ba1 | 【Paddle.Fleet】Fix fleetrun heter (#28252) | 5 years ago |