Commit Graph

380 Commits (dcce54ea76be48cb3a6ac398b7d9569e996ac054)

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| tangwei12 | ebbdf52557 | fix entry (#31079) | 4 years ago |
| Zhen Wang | 71acde9afc | Use correct master weights in AdamW. (#30895) | 4 years ago |
| yingshengBD | 0eea5d714f | post quantize support insert fake_quantize_dequantize node before the OPs that will be used in VIS's faceid models (#30659) | 5 years ago |
| guofei | 430f8449f1 | Fix the error of save_quantized_model (#30583) | 5 years ago |
| huangxu96 | 138620084c | Add fleet amp_init() (#30572) | 5 years ago |
| cc | ce6777fcdf | Fix bug of supporting channelwise dygraph quantized model, test=develop (#30531) | 5 years ago |
| cc | 5d8d463cf7 | Collect weight threshold for lstm op in post_training_quantization (#28701) | 5 years ago |
| cc | 8e3a294045 | skip quantizing ops in cpu inference (#30342) | 5 years ago |
| Bai Yifan | ad6fee2fa8 | fix quantize error in speical naming model (#30354) | 5 years ago |
| huangxu96 | 342d62de60 | add amp example document (#30314) | 5 years ago |
| huangxu96 | ee623bff64 | Implemented AddQuantDequantPass in imperative quantization. (#26692) | 5 years ago |
| tangwei12 | 5e839e4da5 | add sparse embedding & load vars for 2.0 & gloo bug fix (#30306) | 5 years ago |
| Zhen Wang | 7f7dfccf20 | Support pure fp16 training for AMP API. (#29544) | 5 years ago |
| guofei | 1bdf924217 | Quantization supports 2.0 APIs (#30036) | 5 years ago |
| wangchaochaohu | 7dd551e08b | refine the paddle place support using str (#28769) | 5 years ago |
| WangXi | ab04997846 | [fleet] combine amp and gradient merge, test=develop (#30086) | 5 years ago |
| cc | 1fa863da40 | Support dygraph quant model (#29927) | 5 years ago |
| cc | 62f455e023 | Support quantizing program_desc (#29526) | 5 years ago |
| guofei | 8212874f47 | Fix test_imperative_skip_out (#29939) | 5 years ago |
| XiaoguangHu | 726c78f293 | clean redundant API alias in 2.0 - part 1 (#29928) | 5 years ago |
| cc | 61820fd217 | add the time threshold of quantization tests, test=develop (#29786) | 5 years ago |
| huangxu96 | a29006d128 | Optimizer trans momentum (#29597) | 5 years ago |
| huangxu96 | 97e29411eb | fix a bug in multi_precision_fp16 unittest. (#29756) | 5 years ago |
| huangxu96 | b96dada4f0 | add static.amp into setup.pu.in (#29621) | 5 years ago |
| huangxu96 | c05170d3d8 | add alias for fluid.contrib.mixed_precision (#29562) | 5 years ago |
| Wojciech Uss | 917a11495f | fix ininite scale values (#29386) | 5 years ago |
| Aurelius84 | 5d530c9319 | fix amp support fleet (#29491) | 5 years ago |
| LoveAn | 03b42d9fa7 | fix unittest on windows, test=develop (#29365) | 5 years ago |
| Zhen Wang | be3777a50a | Add pure fp16 training with master weights. (#27712) | 5 years ago |
| furnace | 7584bb5096 | Layer norm fp16 (#29169) | 5 years ago |
| Leo Chen | 4556ad76b4 | Upgrade string literals to raw string, part 2 (#29217) | 5 years ago |
| WangXi | 0c2a51d240 | optimizer amp, all use fp16 communication, overlap last comm and compute (#28957) | 5 years ago |
| Wojciech Uss | 4fd4095d1b | Add quantization of multi_gru op and tests (#28615) | 5 years ago |
| guofei | 638402274a | Integrate ImperativeOutScale into ImperativeQuantAware. (#27956) | 5 years ago |
| Aurelius84 | 14013a2eba | Remove prettytable in requirements.txt (#29100) | 5 years ago |
| huangxu96 | 40f5453725 | Quant nn2.0 (#28764) | 5 years ago |
| Leo Chen | 3815d7aa40 | Upgrade string literals to raw string (#28989) | 5 years ago |
| furnace | 8ff3550658 | refactor momentum op to combine weight (#27414) | 5 years ago |
| Chen Weihang | 358d6bc90f | Fix test_weight_decay_extend random failed on windows (#28643) | 5 years ago |
| Bai Yifan | 5050e761b8 | Support user-defined activation/weight quantize and preprocess. (#28570) | 5 years ago |
| Leo Chen | 11e32baf1e | Add matmtl_v2 to amp list (#28693) | 5 years ago |
| cc | d1e84f3e9e | Add some ops for cacluating output scale, test=develop (#28644) | 5 years ago |
| YUNSHEN XIE | ba0756325a | exec ut no more than 15s 1 (#28439) | 5 years ago |
| Leo Chen | 71d6220772 | Skip reader op in mixed_precision decorator (#28353) | 5 years ago |
| Chen Weihang | 5d73bfdb98 | fix test_weight_decay_extend error (#28178) | 5 years ago |
| cnn | 7c1aa0d69d | 2.0rc api rename (#28088) | 5 years ago |
| guofei | 6bbb6e7f45 | Implement the function of OutScaleForTraining/OutScaleForInference in dygraph (#26601) | 5 years ago |
| WangXi | 0a1862d1d2 | fleet combine amp dgc recompute meta optimizer (#27643) | 5 years ago |
| cc | 8fabb1c32f | Add test attribute in channelwise_quant op, test=develop (#27742) | 5 years ago |
| Chen Weihang | 9b49f02441 | Polish jit.save/load design & remove paddle.SaveLoadConfig (#27623) | 5 years ago |