Paddle/python/paddle/distributed/fleet/meta_optimizers

Latest commit: 0c2a51d240 by WangXi: optimizer amp, all use fp16 communication, overlap last comm and compute (#28957), 5 years ago
| Name | Last commit message | Age |
|------|---------------------|-----|
| sharding/ | [sharding] doc, api, bug fixed (#28983) | 5 years ago |
| __init__.py | add sharding strategy in fleet (#27900) | 6 years ago |
| amp_optimizer.py | optimizer amp, all use fp16 communication, overlap last comm and compute (#28957) | 5 years ago |
| common.py | add sharding strategy in fleet (#27900) | 6 years ago |
| dgc_optimizer.py | add sharding strategy in fleet (#27900) | 6 years ago |
| fp16_allreduce_optimizer.py | fleet2.0 add fp16 grad compression (#27480) | 6 years ago |
| gradient_merge_optimizer.py | support gradient merge with recompute, test=develop (#27834) | 6 years ago |
| graph_execution_optimizer.py | Fix multi nccl comm & wait server ready (#28663) | 5 years ago |
| lamb_optimizer.py | fleet combine amp dgc recompute meta optimizer (#27643) | 6 years ago |
| lars_optimizer.py | fleet combine amp dgc recompute meta optimizer (#27643) | 6 years ago |
| localsgd_optimizer.py | fleet combine amp dgc recompute meta optimizer (#27643) | 6 years ago |
| meta_optimizer_base.py | 【paddle.fleet】parameter_server_optimizer support auto_strategy (#27181) | 6 years ago |
| parameter_server_graph_optimizer.py | 【paddle.fleet】Fix/role maker api fix (#27326) | 6 years ago |
| parameter_server_optimizer.py | Upgrade string literals to raw string (#28989) | 5 years ago |
| pipeline_optimizer.py | enable pipeline to run with Executor.run() (#28373) | 5 years ago |
| recompute_optimizer.py | fleet combine amp dgc recompute meta optimizer (#27643) | 6 years ago |
| sharding_optimizer.py | [sharding] doc, api, bug fixed (#28983) | 5 years ago |