Commit Graph

16 Commits (280a8784f7073619180d5e8af1b982fec3b14580)

Author | SHA1 | Message | Date
Michał Gallus | 0c39b97b4e | [MKL-DNN] Add Fully Connected Op for inference only(#15226) | 6 years ago
guomingz | 2281ebf0f3 | Enable the convolution/relu6(bounded_relu) fusion for FP32 on Intel platform. (#17130) | 6 years ago
guomingz | 2deac4e447 | Fix the bug of test_conv2d_int8_mkldnn case which raised by improper parameter passing (#17058) | 6 years ago
Leo Zhao | 1edcd73115 | remove unnecessary new line | 6 years ago
Leo Zhao | 61cc842a53 | disable test_elementwise_mul_mkldnn_op case | 6 years ago
Leo Zhao | a9694bd3d6 | convert output to nchw format to align with native version in avx512 mode | 6 years ago
xiaolil1 | e235882c18 | Enable MKL-DNN INT8 Concat Kernel. (#16156) | 6 years ago
xiaolil1 | e818fa1004 | Enable INT8 transpose kernel for MobileNet-SSD improvement. (#16159) | 6 years ago
xiaolil1 | a177d48217 | Add Requantize OP (#15318) | 6 years ago
lidanqing | 02c106c717 | MKLDNN: Add UT for conv_transpose_mkldnn op. (#16030) | 6 years ago
lidanqing | dd1c7ee604 | UT for conv2d_mkldnn_op with fuse_bias and fuse_residual (#16016) | 6 years ago
Krzysztof Binias | 54f21a5c47 | Add test for ceil mode | 6 years ago
Krzysztof Binias | 851ea04dec | Add UTs to check whether primitives for activations and softmax already exist in backward | 6 years ago
Krzysztof Binias | 309ea6f2de | Fix for pylint Failed | 6 years ago
Krzysztof Binias | 1578c60bdd | Add new ut and remove unnecessary code | 6 years ago
Krzysztof Binias | b1bdcd4de8 | Make separate folders for mkldnn codes | 6 years ago