Commit Graph

18 Commits (365c2c9c89152e546d0556b71f44dd8b1f003e5c)

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| 123malin | 54c368db1e | [API 2.0: doc] fix doc of nonzero (#27685) | 4 years ago |
| Chen Weihang | dec53a9c79 | Remove DataParallel.scale_loss & apply_collective_grads (#27603) | 4 years ago |
| yaoxuefeng | 6d9ae66096 | delete ExponentialMovingAverage in paddle/optimizer (#27683) | 4 years ago |
| WangXi | 5641ea2bf6 | Remove optimizer which in fleet, test=develop (#27606) | 4 years ago |
| MRXLT | f936adbd2d | fix adam (#27343) | 4 years ago |
| MRXLT | 9166307315 | add check for sparse parameters with weight_decay (#27141) | 4 years ago |
| MRXLT | 72f6e566be | fix sample code (#26962) | 5 years ago |
| Chen Weihang | 9cb57f94c6 | Update set_dict method name & add aliases (#26700) | 5 years ago |
| Yang Zhang | 6129b0e246 | Revert `no_grad` changes and add new implementation (#26826) | 5 years ago |
| MRXLT | 1f36d3cdcb | update optimizer (#26711) | 5 years ago |
| Jiawei Wang | a1b99fae07 | Adadelta Optimizer (#26590) | 5 years ago |
| Zhou Wei | 30aab17734 | [2.0API]support 2.0 lr_scheduler for 2.0 optimizer (#26737) | 5 years ago |
| ShenLiang | 33afeb315a | fix the tanh (#26657) | 5 years ago |
| Zhou Wei | 7af5cb9b32 | fix english doc of all lr_scheduler (#26619) | 5 years ago |
| Zhou Wei | 407de03905 | [2.0API] Reconstruct all API related to LR Scheduler, unify dygraph and static (#26550) | 5 years ago |
| MRXLT | eeda90d674 | [WIP] update optimizer for 2.0 (#26288) | 5 years ago |
| hong | 2b6d00496e | Api move 20a (#24559) | 5 years ago |
| XiaoguangHu | 194a22c5a8 | reorganize the paddle api test=develop (#23151) | 5 years ago |