Paddle/python/paddle/distributed

Latest commit: 4a8b8b4547 by liuyuhui, 5 years ago
[Kunlun] add gen_bkcl_id_op, support multi XPU cards training using multiprocess (#30858)
Name            Last commit                                                                                  Age
fleet           [Kunlun] add gen_bkcl_id_op, support multi XPU cards training using multiprocess (#30858)   5 years ago
__init__.py     remove distributed prepare context (#30219)                                                  5 years ago
cloud_utils.py  Clean up the redundant files and unify the launch interface. (#28928)                        5 years ago
collective.py   add the paddle.distributed.split api (#29970)                                                5 years ago
launch.py       Clean up the redundant files and unify the launch interface. (#28928)                        5 years ago
parallel.py     [Kunlun] dygraph supports multi xpu card training (#30671)                                   5 years ago
spawn.py        Simplify the options of spawn based on fleetrun (#30144)                                     5 years ago
utils.py        Clean up the redundant files and unify the launch interface. (#28928)                        5 years ago
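
The spawn.py and parallel.py modules listed above provide the multi-process entry point for dynamic-graph data-parallel training. The following is a minimal sketch of how that entry point is commonly used with the Paddle 2.x public API (paddle.distributed.spawn, init_parallel_env, DataParallel); the toy model, tensor shapes, and process count are illustrative assumptions, not code taken from this directory.

```python
# Minimal sketch of multi-process data-parallel training via spawn.
# Assumes Paddle 2.x; the Linear model and nprocs=2 are illustrative only.
import paddle
import paddle.distributed as dist


def train(print_result=False):
    # Initialize the parallel environment in each worker process
    # (NCCL on GPU; BKCL on Kunlun XPU, as referenced by the commits above).
    dist.init_parallel_env()

    # Toy model wrapped with DataParallel for gradient synchronization.
    layer = paddle.nn.Linear(10, 10)
    dp_layer = paddle.DataParallel(layer)

    opt = paddle.optimizer.SGD(learning_rate=0.01,
                               parameters=dp_layer.parameters())

    inputs = paddle.randn([4, 10], 'float32')
    loss = dp_layer(inputs).mean()
    if print_result:
        print("loss:", float(loss))

    loss.backward()
    opt.step()
    opt.clear_grad()


if __name__ == '__main__':
    # Start 2 worker processes; launch.py / fleetrun offers a
    # command-line alternative to this programmatic entry point.
    dist.spawn(train, args=(True,), nprocs=2)
```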