Commit Graph

60 Commits (dcce54ea76be48cb3a6ac398b7d9569e996ac054)

| Author | SHA1 | Message | Date |
|---|---|---|---|
| Yiqun Liu | eb9ae55849 | Optimize the performance of piecewise_decay. (#29077) | 5 years ago |
| Leo Chen | 3815d7aa40 | Upgrade string literals to raw string (#28989) | 5 years ago |
| chentianyu03 | d05058d268 | Remove and reorganize the alias of APIs (#27717) | 5 years ago |
| Zhou Wei | c701588b14 | add base class of LearningRateEpochDecay, and API: MultiStepDecay, and API: StepDecay (#24821) | 5 years ago |
| swtkiwi | f5c6dd6def | test=develop (#24522) | 5 years ago |
| Bai Yifan | 4231d84077 | enhance some op/api error message (#23768) | 5 years ago |
| Aurelius84 | d6f72c4fcc | Add parameter(learning_rate) in NoamDecay (#23156) | 5 years ago |
| Zhang Ting | 0d8f40d2b2 | remove init_on_cpu and force_init_on_cpu APIs, test=develop (#22202) | 5 years ago |
| tianshuo78520a | d2ba91aad1 | fix typo words (#22653) | 5 years ago |
| xiaoting | 3f0ca61a82 | fix noam decay example, test=develop,test=document_fix (#22557) | 5 years ago |
| songyouwei | 99f5907e02 | Fix layer & dygraph circular dependent (#22334) | 6 years ago |
| hong | 08483a6819 | Add dygraph linear warm up decay (#21186) | 6 years ago |
| Kaipeng Deng | 3833b511a6 | refine en API doc (#20206) | 6 years ago |
| huangjun12 | b347ca084e | Fix document of APIs, test=develop, test=document_fix (#20314) | 6 years ago |
| Kaipeng Deng | 1f46253d4a | fix natural exp decay doc. test=develop (#19025) | 6 years ago |
| xsrobin | 8ce902541c | fix unalign of some examples (#18943) | 6 years ago |
| qingqing01 | 602cb6a5b4 | Enhance linear_lr_warmup (#18463) | 6 years ago |
| xsrobin | 47e2ef38e9 | add "import paddle.fluid as fluid" to examples lack of it | 6 years ago |
| Kaipeng Deng | cf60e5a2db | fix API python example (#17226) | 6 years ago |
| xiaoting | 50ad9046c9 | add import, test=develop (#17229) | 6 years ago |
| ruri | 39d6a985bc | fix some comments, include cosine_decay,l2_normalize,pixel_shuffle (#16763) | 6 years ago |
| Wu Yi | 8b58732013 | remove append_LARS not used api test=develop (#16703) | 6 years ago |
| Qiyang Min | d8d73ff3db | Merge pull request #15584 from velconia/imperative_lr_scheduler | 6 years ago |
| qingqing01 | 1ebd7434d5 | Add linear learning warmup method in learning rate scheduler. (#16563) | 6 years ago |
| minqiyang | 64b0929417 | Polish code | 6 years ago |
| minqiyang | fb7c787d34 | Fix conflicts | 6 years ago |
| minqiyang | 48f3cbdf55 | Polish code | 6 years ago |
| minqiyang | 99128a5c72 | Implement Cosine and Noam Decay | 6 years ago |
| minqiyang | ec9c0874bc | Implement Expotential NatureExp Inversetime and Polynomal Decay | 6 years ago |
| minqiyang | feb39028c6 | Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into imperative_lr_scheduler | 6 years ago |
| shippingwang | 733da7b2fc | fixed typo, test=develop | 6 years ago |
| shippingwang | eb932f717a | add cosine decay op, test=develop | 6 years ago |
| minqiyang | f8271649b4 | Add PiecewiseDecay implementation | 7 years ago |
| minqiyang | 0ec53f987c | Support imperative learning rate decay in optimizer | 7 years ago |
| minqiyang | 315b133e67 | Add single GPU support to imperative | 7 years ago |
| typhoonzero | da87f7a698 | Revert "[Feature] Fp16 training for resnet50 (#14850)" | 7 years ago |
| Wu Yi | 3d750f9c5a | [Feature] Fp16 training for resnet50 (#14850) | 7 years ago |
| Tink_Y | 6d04a9cf47 | fix api format and example (#14686) | 7 years ago |
| Wu Yi | 26200f2e42 | [1.1] [project] train imagenet using large batch size (#13766) | 7 years ago |
| sneaxiy | 3ad3635de0 | fix conflict | 7 years ago |
| sneaxiy | 3ee0a6489d | remove kwargs in python api | 7 years ago |
| Wu Yi | 29c63d180f | [Feature] dist op role and lr op role, to support memory optimize with dist training (#13220) | 7 years ago |
| minqiyang | 99d3f08920 | Add print_function for all python files | 7 years ago |
| minqiyang | 559d36328c | Apply 2to3 to current paddle main python code | 7 years ago |
| fengjiayi | 977764f28c | Fix the other lr_decay | 7 years ago |
| fengjiayi | 381bacaa49 | Fix piecewise_decay and fix a unittest error | 7 years ago |
| qiaolongfei | b77c886ed4 | Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into update-api-reference-1 | 7 years ago |
| Yu Yang | 1171c2c57d | Merge pull request #11457 from JiayiFeng/dev_add_doc | 7 years ago |
| qiaolongfei | bf3ff5b091 | Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into update-api-reference-1 | 7 years ago |
| Wu Yi | 53d1d0f0f2 | add LARS support (#10374) | 7 years ago |