Commit Graph

44 Commits (3f816bc8b4136d21f501e1a3c090df192046c46f)

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| cc | 3f816bc8b4 | [Quantization] Conv2d_transpose and mul support channnelwise quantization (#25639) | 5 years ago |
| Pei Yang | 9e9a569dae | add trt int8 support for elementwise_mul and scale (#25676) | 5 years ago |
| cc | 42189be67b | [Quant] Remove the output for moving_average_abs_max_scale op (#25697) | 5 years ago |
| yukavio | c9285a18a0 | saving inference model when user define activation or weight preprocess function (#25749) | 5 years ago |
| cc | d8f4714bc1 | [Quantization] Save output threshold by argname_index (#25272) | 5 years ago |
| cc | 8fc31d501b | Support conv2d_traspose quantize, test=develop (#25084) | 5 years ago |
| Liufang Sang | b174b99764 | support user defined quantization func and preprocess (#24720) | 5 years ago |
| cc | dbcd7c69e9 | Update sigmoid output from Y to out, test=develop (#24765) | 5 years ago |
| cc | 88e9d74a75 | Collecting concat output threshold, test=develop (#24742) | 5 years ago |
| cc | 6c89ca2157 | Add output threshold for ops that have several output activations, test=develop (#24726) | 5 years ago |
| cc | 4d35112255 | [Fix bug] Init scale node in OutScaleForTrainingPass and enable test_quantization_scale_pass UT (#24393) | 5 years ago |
| cc | 25628587f1 | Collect output scale for quantized op and fused op (#23369) | 5 years ago |
| cc | bd80903333 | Add activation_type in AddQuantDequantPass to be compatible with paddleslim, test=develop (#23221) | 5 years ago |
| cc | 589cd8782f | Post_training_quantizaion supports min_max methon (#23078) | 5 years ago |
| tianshuo78520a | d2ba91aad1 | fix typo words (#22653) | 5 years ago |
| juncaipeng | 8f7372ca81 | add mul and matmul quantization, test=develop (#22054) | 5 years ago |
| juncaipeng | 1f57ac1241 | delete concat in AddQuantDequantPass, test=develop (#21454) | 5 years ago |
| itminner | 07e6a94268 | paddleslim quantization skip pattern support list of string (#21141) | 5 years ago |
| juncaipeng | 00b11a4a1e | Support more ops in post training quantization, test=develop (#21073) | 6 years ago |
| juncaipeng | fa522dffa0 | Fix bug in add_quant_dequant_pass, test=develop (#21018) | 6 years ago |
| juncaipeng | 175ba39c03 | Add post_training_quantization (#20800) | 6 years ago |
| juncaipeng | f201b465ec | Move pool2d to add_quant_dequant_pass, test=develop (#20586) | 6 years ago |
| juncaipeng | b0ceed6fb4 | add fake_quant_dequant_op for average pool2d, test=develop (#19880) | 6 years ago |
| Zhen Wang | 0fe72469ea | Add the max-pool2d quantization support and the partial quantization support. (#19310) | 6 years ago |
| Zhen Wang | 3398f99608 | Adding AddQuantDequantPass for TensorRT int8 (#17529) | 6 years ago |
| Zhen Wang | 65541d83b0 | add scale pass for calculating the output scales.test=develop (#17259) | 6 years ago |
| Zhen Wang | a40121e4c8 | fix the initialization process error. test=develop (#17213) | 6 years ago |
| Zhen Wang | 183bacebe3 | clean codes and fix some bugs. test=develop | 6 years ago |
| Zhen Wang | 1c11f817e9 | Use the resolve hazard method. | 6 years ago |
| Zhen Wang | 2ccbfd5e10 | Fix some bugs for quantization passes. | 6 years ago |
| Zhen Wang | 8965819fbb | rewrite the cuda kernels of channel_wise_quant_op and channe_wise_dequant_op. test=develop | 6 years ago |
| Zhen Wang | ec88b6cc5a | add channel wise quantization in ir pass. | 6 years ago |
| achao2013 | 81b4fad8b9 | add moving average absmax op and fix bug (#15155) | 6 years ago |
| Zhen Wang | 7c8f7df2fe | add some op_des funs to IrOpNode and add some var_des funs to IrVarNode. test=develop | 6 years ago |
| Zhen Wang | 33f99d6197 | add IrNode&IrVarNode&IrOpNode. test=develop | 6 years ago |
| WangZhen | 28dfad5e27 | fix some bugs about python3. test=develop | 6 years ago |
| WangZhen | a7efab7ec1 | add comments for public API. test=develop | 6 years ago |
| WangZhen | 0db41a9c44 | add op_role attr when creating op node. | 6 years ago |
| WangZhen | c64f22048a | add convert_to_int8 pass and transform_for_mobile pass and their UTs. | 6 years ago |
| WangZhen | c8095eeb82 | add freeze pass, and UT is passed. | 6 years ago |
| WangZhen | dde19a0ff8 | add quantization freeze pass. | 6 years ago |
| WangZhen | 3b668c1574 | Update some comments in the quantization transform pass. test=develop | 6 years ago |
| WangZhen | b913463e83 | Update according to the reviewers' suggestion. test=develop | 6 years ago |
| WangZhen | 59e5cc51d6 | Add quantization transform pass and UT. | 6 years ago |