Commit Graph

116 Commits (3e579812944a0ab9e45b949f292d7ed50dbe916b)

Author SHA1 Message Date
minqiyang 3e57981294 Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into imperative_lr_scheduler 6 years ago
lujun 1c9aaeebe0 move imperative to dygraph, test=develop 6 years ago
minqiyang 48f3cbdf55 Polish code 6 years ago
minqiyang 35c89f38c3 Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into imperative_lr_scheduler 6 years ago
gongweibao eb83abeac3 Add DGC(Deep Gradient Compression) interface. () 6 years ago
minqiyang 99128a5c72 Implement Cosine and Noam Decay 6 years ago
Xin Pan f8c279b11c Merge pull request from panyx0718/imperative2 6 years ago
minqiyang 4278be8c49 Merge branch 'imperative_lr_scheduler' of https://github.com/velconia/Paddle into imperative_lr_scheduler 6 years ago
minqiyang b5bbb13ac1 Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into imperative_lr_scheduler 6 years ago
Jiabin Yang f735102eab add layer norm to Layers, add transformer test in imperative mode () 6 years ago
Xin Pan fd24ab47ab polish 6 years ago
phlrain 77a08750e9 add var name in optimizer; test=develop 6 years ago
Qiyang Min 1f4aa7a202 Imperative remove all descs () 6 years ago
minqiyang 45c9f2a68a Fix bugs in piecewise decay 6 years ago
minqiyang feb39028c6 Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into imperative_lr_scheduler 6 years ago
Jiabin Yang 654825cfe3 test=develop, reconstruct layer helper to fit imperative usage () 6 years ago
xuezhong 46fcadec18 add parameter description 6 years ago
xuezhong 57294fa890 Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into fix_bug_adagrad 6 years ago
xuezhong 794b90c93f for backward compatibility 6 years ago
minqiyang 700495e11f Fix FtrlOptimizer's API comment 6 years ago
sneaxiy 7e399b0628 rename 6 years ago
sneaxiy f85245b409 test=develop 6 years ago
xuezhong 20e579ef2a add initial_accumulator_value for adagrad 6 years ago
minqiyang 1e0a78556d Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into imperative_lr_scheduler 6 years ago
minqiyang 0ec53f987c Support imperative learning rate decay in optimizer 6 years ago
minqiyang 3ce2d295c0 Refine stop_gradient 6 years ago
minqiyang c8965dc1ab Polish code 6 years ago
minqiyang 8ce198b2e1 Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into imperative_resnet 6 years ago
minqiyang dbd4d058af Add static implementation and fix fc layer 6 years ago
minqiyang 315b133e67 Add single GPU support to imperative 6 years ago
Qiao Longfei a6b3bf6069 add attr min_row_size_to_use_multithread in op config test=develop 6 years ago
Qiao Longfei 8c516a24e5 remote min_row_size_to_use_multithread in adam interface test=develop 6 years ago
Qiao Longfei 9b4fe283e1 Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into multithread-sparse-adam 6 years ago
minqiyang d0b640dca1 Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into imperative_shared_ptr 6 years ago
Wu Yi fd85418329 [Feature] support mix precision training for resnet () 6 years ago
minqiyang 7aab39af15 Change grads to VarBase 6 years ago
Qiao Longfei 44b300556d change min_row_size_to_use_multithread to parameter of adam 6 years ago
minqiyang 336160e651 Complete imperative optimizer implementation 6 years ago
minqiyang 28013a5048 Polish code 6 years ago
minqiyang 5822f7f1d8 Polish code 6 years ago
minqiyang fff44af83f Support simple optimizer 6 years ago
minqiyang 68e9b841ab Add support for optimizer 6 years ago
typhoonzero da87f7a698 Revert "[Feature] Fp16 training for resnet50 ()" 6 years ago
Wu Yi 3d750f9c5a [Feature] Fp16 training for resnet50 () 6 years ago
Qiao Longfei eb5d427d39 add comment for lazy_mode adam optimizer 6 years ago
Qiao Longfei c624417c6f change sparse mode to lazy mode 6 years ago
Qiao Longfei fc6ec6bd14 add sparse mode adam 6 years ago
Qiao Longfei d03cbd1b8c follow comment test=develop 6 years ago
Qiao Longfei 373f64986d add comment and unit test 6 years ago
Qiao Longfei 55edfca2b8 revert unused change 6 years ago
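
Several commits above (notably 99128a5c72, "Implement Cosine and Noam Decay") concern learning-rate schedules for the imperative (dygraph) mode. As an illustrative sketch only, the following shows the textbook formulas for the two schedules named in that commit; this is not PaddlePaddle's implementation, and all function names and parameters here are assumptions.

```python
import math

# Sketch of the two schedules named in commit 99128a5c72.
# Textbook formulas, not PaddlePaddle's API; names below are hypothetical.

def cosine_decay(base_lr, step, total_steps):
    """Cosine annealing: smoothly decays base_lr toward 0 over total_steps."""
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * step / total_steps))

def noam_decay(d_model, step, warmup_steps):
    """Noam schedule ("Attention Is All You Need"): linear warmup for
    warmup_steps, then inverse-square-root decay."""
    return d_model ** -0.5 * min(step ** -0.5, step * warmup_steps ** -1.5)
```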