Commit Graph

19 Commits (adaec0073d02c0ea55bcabc4671ebfc8dbd3182c)

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| tangwei12 | ed856d254e | fix ut (#29989) | 5 years ago |
| tangwei12 | 032414ca2a | [Feature] one ps (3/4) (#29604) | 5 years ago |
| 123malin | f2d68d3ed5 | 【paddle.fleet】parameter_server_optimizer support auto_strategy (#26838) | 6 years ago |
| Chengmo | d0962abd20 | supplement bug fix of parameter server (#26217) | 6 years ago |
| 123malin | 57d434df5d | add save/load for parameter server (#26235) | 6 years ago |
| Chengmo | eeeef957c7 | Fix ps gpu (#26218) | 6 years ago |
| 123malin | 2191a08317 | 【paddle.fleet】fleet_util move to paddle.fleet (#25805) | 6 years ago |
| tangwei12 | caa90a6510 | Integrated Trainer of Parameter Server (API add `fluid.contrib.layers.sparse_embedding` only) (#22957) | 6 years ago |
| 123malin | 00594c1c88 | support dumping params/grads in transpiler mode (#22490) | 6 years ago |
| tangwei12 | b0675c8193 | fix bug with compiledProgram (#22495) | 6 years ago |
| tangwei12 | 82bc814a57 | integrated HALF_ASYNC to communicator (#21869) | 6 years ago |
| 123malin | 7fb817d447 | add distributed_strategy (#21710) | 6 years ago |
| silingtong123 | 3c33417905 | modify the method of skipping CI in distributed unittests (#21764) | 6 years ago |
| Chengmo | 940c6ff1c8 | Fix communicator slow bug & fix communicator stop bug (#20366) | 7 years ago |
| tangwei12 | c9139c3db3 | trainer from dataset fetch targets (#19760) | 7 years ago |
| tangwei12 | 8f0b3c0516 | the integrated communicator (#19849) | 7 years ago |
| tangwei12 | 65c7368400 | Fix the correctness of async mode at distributed training (#18863) | 7 years ago |
| guru4elephant | ebf9797ec3 | split different comm method for mnist distributed training (#18715) | 7 years ago |
| tangwei12 | 101f74cb19 | fix save/load in fleet (#17675) | 7 years ago |