Commit Graph

16 Commits (97b09687443cd336a782901cb0f44fbeeb64e847)

SHA1        Author        Date         Message
7ca836d3da  zhongpu       6 years ago  support if logic for Variable in dygraph (#22892)
d6f72c4fcc  Aurelius84    6 years ago  Add parameter(learning_rate) in NoamDecay (#23156)
08483a6819  hong          6 years ago  Add dygraph linear warm up decay (#21186)
c194b0c835  Zeng Jinle    6 years ago  Try to deprecate unstable python memory optimize (#18983)
602cb6a5b4  qingqing01    7 years ago  Enhance linear_lr_warmup (#18463)
1ebd7434d5  qingqing01    7 years ago  Add linear learning warmup method in learning rate scheduler. (#16563)
eb932f717a  shippingwang  7 years ago  add cosine decay op, test=develop
da87f7a698  typhoonzero   7 years ago  Revert "[Feature] Fp16 training for resnet50 (#14850)"
3d750f9c5a  Wu Yi         7 years ago  [Feature] Fp16 training for resnet50 (#14850)
99d3f08920  minqiyang     7 years ago  Add print_function for all python files
4cba5500d2  fengjiayi     8 years ago  Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into fix_lr_decay
381bacaa49  fengjiayi     8 years ago  Fix piecewise_decay and fix a unittest error
f7e9fe57d3  QI JUN        8 years ago  [Memory]More memory optimization policy (#8690)
b341bac7e1  QI JUN        8 years ago  Refine cast op (#8923)
5d9dbe1e33  qiaolongfei   8 years ago  Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into add-program-cache-for-executor
f45a82be4e  Qiao Longfei  8 years ago  change learning_rate_decay to learning_rate_scheduler (#8583)