Commit Graph

169 commits (HEAD: 387c1db4f1a1e400e9f684ffd93d6a47e61b8179); the 50 most recent are listed below.

| Author | SHA1 | Message | Date |
|---|---|---|---|
| xiayanming | 387c1db4f1 | Ascendrc (#31065) | 4 years ago |
| gongweibao | c687edecd8 | Fix reshape on GE graph. (#31084) | 4 years ago |
| xiayanming | a6edbc478b | support parsing ascend rank table file (#31000) | 4 years ago |
| gongweibao | ebef6601d5 | Destroy session first. (#30954) | 4 years ago |
| gongweibao | de42d19336 | Add paddle ascend distribution training supported (#30796) | 4 years ago |
| OleNet | ebb5d181e8 | Ascendrc add converted op : [range/equal/range/uniform_random/expand/squeeze], fix cast op bug (#30797) | 4 years ago |
| dingsiyu | 4a26729540 | Merge ascend_optimizer and ascend_parser. (#30776) | 4 years ago |
| gongweibao | 636fefd9f8 | code style (#30781) | 4 years ago |
| Void Main | 904cc44349 | [Feature] Build parser to support distributed training (#30658) | 4 years ago |
| gongweibao | f5aca8fbb4 | Pass device_ids info from launch to trainer. (#30632) | 4 years ago |
| Void Main | d2404da768 | Build praser for Hcom* operators (#30627) | 4 years ago |
| gongweibao | f9c97dd728 | Add distribution supported (#30578) | 4 years ago |
| hutuxian | 6dd52c5b25 | Ascend rc (#30483) | 4 years ago |
| 123malin | 05f06d9ae1 | test=develop, fix fleet.metric (#30438) | 4 years ago |
| Chengmo | 859431aadb | fix ps init(#30397) | 4 years ago |
| 123malin | 2a98e9323a | test=develop, add distributed_infer (#30300) | 4 years ago |
| JZ-LIANG | 75936d838f | Recompute Offload (#30233) | 4 years ago |
| tangwei12 | 25f80fd304 | Fix/distributed proto (#29981) | 4 years ago |
| Chengmo | d479ae1725 | 【Paddle.Fleet】Support local save sparse param (#30175) | 4 years ago |
| Chen Weihang | 3016ba852e | remove distributed prepare context (#30219) | 4 years ago |
| Chengmo | 528e03fc08 | 【Paddle.Fleet】Fix tensor table (#30075) | 4 years ago |
| Chen Weihang | 8020e34e7c | Simplify the options of spawn based on fleetrun (#30144) | 4 years ago |
| gongweibao | 4d2a4bb27a | fix logs info test=develop (#30071) | 4 years ago |
| WangXi | ab04997846 | [fleet] combine amp and gradient merge, test=develop (#30086) | 4 years ago |
| gongweibao | eea7090c26 | fix selected_gpus test=develop (#30044) | 4 years ago |
| Chen Weihang | 46c4695421 | Set FLAGS_selected_gpus for spawn (#29962) | 4 years ago |
| lilong12 | b0bd93de00 | Disable gloo by default (#29805) | 4 years ago |
| lilong12 | 2bc5121da8 | add the paddle.distributed.split api (#29970) | 4 years ago |
| lilong12 | 01950ceb42 | fix the bug in pipeline data parallelism (#29731) | 4 years ago |
| tangwei12 | 032414ca2a | [Feature] one ps (3/4) (#29604) | 4 years ago |
| ShenLiang | 01e2874a0e | Support multi-stream communication for dynamic graph distributed (#29525) | 4 years ago |
| WangXi | 9cbcc6cadc | fleet sync build strategy, test=develop (#29732) | 4 years ago |
| JZ-LIANG | d33d468f02 | [Sharding] add hybrid-dp feature (#29518) | 4 years ago |
| ShenLiang | 2ef9e0e23c | Rebuild group automatically in dynamic graph distributed (#29255) | 5 years ago |
| lilong12 | b122d0bb76 | Fix bug in gloo that gloo initialization hangs (#29447) | 5 years ago |
| ShenLiang | 4064354a01 | support dp run single card (#29358) | 5 years ago |
| gongweibao | 96de8b008f | cleanup enum test=develop (#29294) | 5 years ago |
| ShenLiang | 2d6aa1a5bb | fix warning of fleet (#29317) | 5 years ago |
| ShenLiang | 2cd0bf5764 | Fix doc of fleet api (#29282) | 5 years ago |
| ShenLiang | 46b73e6cd9 | Change the api of DataParallel and Fleet (#29224) | 5 years ago |
| 123malin | cc9c619679 | test=develop, fix doc (#29200) | 5 years ago |
| WangXi | 0c2a51d240 | optimizer amp, all use fp16 communication, overlap last comm and compute (#28957) | 5 years ago |
| 123malin | 92817f8005 | test=develop, rm pathlib (#28658) | 5 years ago |
| ShenLiang | e2d01eb650 | Support dynamic graph distributed (#28997) | 5 years ago |
| Chen Long | d576d6ddeb | fix some docs test=develop;test=document_fix (#29159) | 5 years ago |
| lilong12 | 216e085605 | update, test=develop (#29139) | 5 years ago |
| lilong12 | a1add716bc | Add a flag to control whether to initialize gloo (#29150) | 5 years ago |
| ShenLiang | cddc70964d | fix InMemoryDataset doc (#28688) | 5 years ago |
| JZ-LIANG | 0dadacc4eb | [sharding] doc, api, bug fixed (#28983) | 5 years ago |
| lilong12 | 2a864c70c4 | fix the bug in gloo (#29112) | 5 years ago |