Commit Graph

13 Commits (develop)

Author        SHA1         Date         Message
WangXi        31ed9c9eed   5 years ago  Fleet distributed strategy support pure fp16 (#30754)
WangXi        ab04997846   5 years ago  [fleet] combine amp and gradient merge, test=develop (#30086)
WangXi        0c2a51d240   5 years ago  optimizer amp, all use fp16 communication, overlap last comm and compute (#28957)
WangXi        fb641c915e   6 years ago  【paddle.fleet】fleet add _get_applied_meta_list and _get_applied_graph_list (#27952)
WangXi        0a1862d1d2   6 years ago  fleet combine amp dgc recompute meta optimizer (#27643)
ShenLiang     746a8ded29   6 years ago  fix comment of adaptive lsgd (#27362)
ShenLiang     54b81fa32c   6 years ago  add adaptivelsgd in meta_optimizer (#27289)
123malin      60c3ef3ab8   6 years ago  【paddle.fleet】parameter_server_optimizer support auto_strategy (#27181)
Dong Daxiang  0443b480b8   6 years ago  【paddle.fleet】add auto parallel L1 implementations (#27090)
Dong Daxiang  994217ea05   6 years ago  【paddle.fleet】fix api documents (#26777)
Dong Daxiang  83cd185947   6 years ago  【paddle.fleet】Meta from optimizer (#26392)
Dong Daxiang  4ec51e0205   6 years ago  【paddle.fleet】Clear disable (#26334)
Dong Daxiang  50a5bcfc9d   6 years ago  【paddle.fleet】paddle.fleet -> paddle.distributed.fleet. (#26186)