Commit Graph

242 Commits (5d039f40866ea2d879483668c685c5f18c4fc37d)

| Author | SHA1 | Message | Date |
|---|---|---|---|
| JZ-LIANG | 5d039f4086 | modified the implement of Lars optimizer (#26733) | 5 years ago |
| Chen Weihang | 9cb57f94c6 | Update set_dict method name & add aliases (#26700) | 5 years ago |
| Yang Zhang | 6129b0e246 | Revert `no_grad` changes and add new implementation (#26826) | 5 years ago |
| Zhou Wei | 407de03905 | [2.0API] Reconstruct all API related to LR Scheduler, unify dygraph and static (#26550) | 5 years ago |
| MRXLT | eeda90d674 | [WIP] update optimizer for 2.0 (#26288) | 5 years ago |
| mapingshuo | 7ae10900fa | fix slow var initialize, test=develop (#26516) | 5 years ago |
| Dong Daxiang | cbf8ba1591 | add check approval (#26284) | 5 years ago |
| Yang Zhang | 617eb67f29 | Upgrade `no_grad` decorator (#25472) | 5 years ago |
| WangXi | 2c9d0f3cb9 | 【paddle.fleet】Add dgc to fleet meta optimizer (#25738) | 5 years ago |
| lilong12 | 8a68d2c213 | Revert "add device attr for regularizer, test=develop (#24981)" (#25375) | 5 years ago |
| mapingshuo | c70f592002 | add gradient Merge optimizer to meta (#25763) | 5 years ago |
| tangwei12 | caa90a6510 | Integrated Trainer of Parameter Server (API add `fluid.contrib.layers.sparse_embedding` only) (#22957) | 5 years ago |
| mapingshuo | 3e2a348886 | add string variable support for RecomputeOptimizer (#25728) | 5 years ago |
| mapingshuo | ed72406558 | add gradient Merge Optimizer (#25625) | 5 years ago |
| mapingshuo | ea60e64470 | correct the LookaheadOptimizer programDesc, test=develop (#25688) | 5 years ago |
| leesusu | 856e6d3348 | Correct parameter l2 passed to ftrl op (#25223) | 5 years ago |
| Zhou Wei | 914ff10a8f | fix state dict to save/load learning rate scheduler (#25403) | 5 years ago |
| hong | fed0588571 | Fix parameter list iterator bug (#25089) | 5 years ago |
| lilong12 | e39aa70ec7 | add the support for pipeline (#24560) | 5 years ago |
| lilong12 | 3d96601b82 | modify pipeline optimizer to only support the mode of sync pipeline training (#25065) | 5 years ago |
| Zhou Wei | c505c4dbea | add new API: `optimizer.set_lr` (#24455) | 5 years ago |
| lilong12 | ab5a1fb853 | add device attr for regularizer, test=develop (#24981) | 5 years ago |
| Zhou Wei | 98da8a295d | add new learing rate strategy to reduce lr when loss reach on plateau (#24322) | 5 years ago |
| swtkiwi | f5c6dd6def | test=develop (#24522) | 5 years ago |
| hong | 04e9d721a2 | unitize name in optimizer; test=develop (#24008) | 5 years ago |
| Zhou Wei | 8002b2beb4 | Avoid logging.info be printed many times in dygraph_mode, test=develop (#23932) | 5 years ago |
| Zhou Wei | 66dc8e30f0 | move the initialize position of grad_clip to optimizer(`__init__`), and speed up clip (#23782) | 5 years ago |
| mapingshuo | f0e743f136 | fix AMP and recompute (#23551) | 5 years ago |
| Zhou Wei | 629b6c7896 | add the prompt message of repeated settings of regularization, test=develop (#23355) | 5 years ago |
| qingqing01 | 6162cf2f2e | Make optimizer consistent in dygraph and static-graph and remove some LOG-INFO. (#23426) | 5 years ago |
| Zhou Wei | e8efaee92d | update gradient clip english doc for new gradient clipping strategy | 5 years ago |
| Leo Chen | a62599a888 | [feature] prune program by feed and fetch_list automatically (#22474) | 5 years ago |
| Zhou Wei | 7fda333ac1 | add new method of gradient_clip, better to use, test=develop (#23224) | 5 years ago |
| Leo Chen | 488b2387e2 | Feature/expand params in auto-generated pybind functions for dygraph operators (#23181) | 5 years ago |
| Zhang Ting | eec10aaba2 | set op_device for loss_op_desc (#23027) | 6 years ago |
| WangXi | f2265d9ffd | Fix problem use recompute and dgc same time (#23010) | 6 years ago |
| mapingshuo | 08a772cb46 | fix API param bug of recompute.backward() (#22582) | 6 years ago |
| WangXi | 62fd3209e1 | Fix dgc param regularizer, test=develop (#22888) | 6 years ago |
| Zhang Ting | 4e8bc02461 | add `fluid.device_guard` to specify the device type for Op (#22254) | 6 years ago |
| tianshuo78520a | 433cef03e5 | fix typo word (#22784) | 6 years ago |
| zhaoyuchen2018 | 72dde4abde | Refine adam op to improve performance, test=develop (#22346) | 6 years ago |
| tianshuo78520a | d2ba91aad1 | fix typo words (#22653) | 6 years ago |
| WangXi | d69df9bf26 | Add wrong info when use DGC in cpu (#22515) | 6 years ago |
| Aurelius84 | 50af6b5d79 | polish no_grad_set of gradient and append_backward (#22440) | 6 years ago |
| hong | 00c0139e6e | add learning rate api of optimizer (#22080) | 6 years ago |
| zhongpu | b1c081f4c7 | polish Optimizer's API description, test=develop (#22314) | 6 years ago |
| Aurelius84 | 60a6d68fb9 | remove _optimized_guard in dygrahpe_mode (#22143) | 6 years ago |
| Leo Chen | d4bdbf8cf0 | Polish nn code, test=develop (#22237) | 6 years ago |
| zhongpu | d0f0a2520c | test Optimizer in dygraph (#21949) | 6 years ago |
| zhongpu | 7d10edc5ee | add clear_gradients for Optimizer and add clear_gradients api description (#21948) | 6 years ago |