Contents
- Manifold Dynamic Pruning Description
- Dataset
- Features
- Environment Requirements
- Script Description
- Model Description
- Description of Random Situation
- ModelZoo Homepage
Manifold Dynamic Pruning Description
Neural network pruning is an essential technique for reducing the computational complexity of deep models so that they can be deployed on resource-limited devices. Compared with conventional static methods, recently developed dynamic pruning methods determine the redundant filters separately for each input instance, which achieves higher acceleration. However, most existing methods discover an effective sub-network for each instance independently and do not exploit the relationships between different inputs. To maximally excavate the redundancy in a given network architecture, this paper proposes a new paradigm that dynamically removes redundant filters by embedding the manifold information of all instances into the space of pruned networks (dubbed ManiDP). We first investigate the recognition complexity of, and the feature similarity between, images in the training set. Then, the manifold relationship between instances and pruned sub-networks is aligned during training. The effectiveness of the proposed method is verified on several benchmarks, where it achieves a better trade-off between accuracy and computational cost than state-of-the-art methods. For example, our method can reduce the FLOPs of ResNet-34 by 55.3% with only a 0.57% top-1 accuracy drop on ImageNet.
Paper: Yehui Tang, Yunhe Wang, Yixing Xu, Yiping Deng, Chao Xu, Dacheng Tao, Chang Xu. Manifold Regularized Dynamic Network Pruning. CVPR 2021.
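At a high level, ManiDP adds two ingredients to dynamic pruning: a manifold-alignment term that encourages instances with similar features to select similar sub-networks, and a complexity-aware sparsity term that prunes easy instances more aggressively. The NumPy sketch below illustrates these two terms under assumed forms (cosine similarity, an L1 gate penalty weighted by per-instance loss); it is not the authors' implementation.

```python
# Illustrative sketch of the two ManiDP regularizers; the exact
# similarity measures and penalty forms here are assumptions.
import numpy as np

def cosine_similarity_matrix(x):
    """Pairwise cosine similarity between rows of x (N x D)."""
    x = x / (np.linalg.norm(x, axis=1, keepdims=True) + 1e-8)
    return x @ x.T

def manifold_alignment_loss(features, gates):
    """Instances with similar intermediate features should use similar
    sub-networks, i.e., similar channel-gate vectors."""
    sim_feat = cosine_similarity_matrix(features)  # input manifold (N x N)
    sim_gate = cosine_similarity_matrix(gates)     # sub-network manifold (N x N)
    return np.mean((sim_feat - sim_gate) ** 2)

def complexity_weighted_sparsity(gates, ce_losses, lam=1.0):
    """Prune 'easy' instances harder: the sparsity penalty on each
    instance's gates shrinks as its recognition difficulty (proxied
    here by the per-instance cross-entropy loss) grows."""
    weights = lam / (1.0 + ce_losses)              # easy -> large weight
    return np.mean(weights * np.abs(gates).sum(axis=1))

# Toy batch: 8 instances, 64-d features, 16 prunable channels.
rng = np.random.default_rng(0)
features = rng.normal(size=(8, 64))
gates = rng.uniform(size=(8, 16))       # soft channel saliencies in [0, 1]
ce = rng.uniform(0.1, 2.0, size=8)      # per-instance classification losses
total = manifold_alignment_loss(features, gates) + complexity_weighted_sparsity(gates, ce)
print(total)
```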
Dataset
Dataset used: CIFAR-10
- Dataset size: 60,000 color images in 10 classes
- Train: 50,000 images
- Test: 10,000 images
- Data format: RGB images
- Note: data is processed in src/dataset.py (a sketch follows below)
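For reference, a create_dataset helper for CIFAR-10 in MindSpore commonly looks like the sketch below. The concrete transforms and parameters used in src/dataset.py may differ; treat the crop size, normalization statistics, and batch settings here as assumptions. The APIs follow MindSpore 1.x.

```python
# Sketch of a typical MindSpore CIFAR-10 pipeline; src/dataset.py may differ.
import mindspore.dataset as ds
import mindspore.dataset.vision.c_transforms as C
import mindspore.dataset.transforms.c_transforms as C2
import mindspore.common.dtype as mstype

def create_dataset(data_path, batch_size=32, training=False):
    data = ds.Cifar10Dataset(data_path, shuffle=training)
    trans = []
    if training:
        # standard CIFAR-10 augmentation: padded random crop + flip
        trans += [C.RandomCrop((32, 32), (4, 4, 4, 4)), C.RandomHorizontalFlip()]
    trans += [
        C.Rescale(1.0 / 255.0, 0.0),
        C.Normalize((0.4914, 0.4822, 0.4465), (0.2023, 0.1994, 0.2010)),
        C.HWC2CHW(),
    ]
    data = data.map(operations=trans, input_columns="image")
    data = data.map(operations=C2.TypeCast(mstype.int32), input_columns="label")
    return data.batch(batch_size, drop_remainder=training)
```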
Features
Mixed Precision (Ascend)
The mixed precision training method accelerates deep neural network training by using both single-precision and half-precision data formats, while maintaining the accuracy achieved by pure single-precision training. Mixed precision training speeds up computation, reduces memory usage, and enables larger models or batch sizes to be trained on specific hardware. For FP16 operators, if the input data type is FP32, the MindSpore backend automatically handles it with reduced precision. Users can check the reduced-precision operators by enabling the INFO log and searching for 'reduce precision'.
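In MindSpore, mixed precision is usually enabled through the amp_level argument of the high-level Model API. A minimal sketch follows; the tiny network, loss, and optimizer are placeholders, not the ones used in this repository.

```python
# Minimal mixed-precision sketch; net/loss/optimizer are placeholders.
import mindspore.nn as nn
from mindspore import Model

net = nn.SequentialCell([nn.Flatten(), nn.Dense(3 * 32 * 32, 10)])
loss = nn.SoftmaxCrossEntropyWithLogits(sparse=True, reduction="mean")
opt = nn.Momentum(net.trainable_params(), learning_rate=0.1, momentum=0.9)

# "O2" casts the network to FP16 while keeping batch norm (and the loss)
# in FP32; "O0" disables mixed precision, "O3" casts everything to FP16.
model = Model(net, loss_fn=loss, optimizer=opt, metrics={"acc"}, amp_level="O2")
```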
Environment Requirements
- Hardware (Ascend/GPU/CPU)
- Prepare the hardware environment with an Ascend, GPU, or CPU processor. If you want to try Ascend, please send the application form to ascend@huawei.com. Once approved, you can get access to the resources.
- Framework
- For more information, please check the resources below:
Script description
Script and sample code
├── ManiDP
    ├── Readme.md        # descriptions about ManiDP
    ├── src
    │   ├── loss.py      # loss function definition
    │   ├── dataset.py   # dataset creation
    │   ├── resnet.py    # pruned ResNet architecture
    ├── eval.py          # evaluation script (CPU, GPU or Ascend)
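The core idea in resnet.py, per-instance channel gating, can be pictured as a small MindSpore cell: a squeeze-style branch predicts a saliency for every output channel, and channels with low saliency can be skipped for that instance. The block below is an illustrative reconstruction under those assumptions, not the repository's actual code.

```python
import mindspore.nn as nn
import mindspore.ops as ops

class GatedConvBlock(nn.Cell):
    """Conv block whose output channels are gated per input instance
    (illustrative sketch; the real block in src/resnet.py may differ)."""

    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, 3, pad_mode="same")
        self.bn = nn.BatchNorm2d(out_ch)
        self.relu = nn.ReLU()
        self.mean = ops.ReduceMean(keep_dims=False)
        self.fc = nn.Dense(in_ch, out_ch)
        self.sigmoid = nn.Sigmoid()

    def construct(self, x):
        # Squeeze branch: global average pool -> FC -> sigmoid gives one
        # saliency in [0, 1] per output channel, separately per instance.
        s = self.sigmoid(self.fc(self.mean(x, (2, 3))))   # (N, out_ch)
        y = self.relu(self.bn(self.conv(x)))              # (N, out_ch, H, W)
        # Gate the channels; at inference, channels with s near 0 can be skipped.
        return y * s.reshape(s.shape[0], -1, 1, 1)
```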
Training process
To Be Done
Eval process
Usage
After installing MindSpore via the official website, you can start evaluation as follows:
Launch
# infer example
Ascend: python eval.py --dataset_path path/to/cifar10 --platform Ascend --checkpoint_path [CHECKPOINT_PATH]
GPU: python eval.py --dataset_path path/to/cifar10 --platform GPU --checkpoint_path [CHECKPOINT_PATH]
CPU: python eval.py --dataset_path path/to/cifar10 --platform CPU --checkpoint_path [CHECKPOINT_PATH]
The checkpoint file can be produced during the training process.
Result
result: {'acc': 0.9204727564102564}
Model Description
Performance
Evaluation Performance
ResNet20 on CIFAR-10
| Parameters          |                             |
| ------------------- | --------------------------- |
| Model Version       | ResNet20                    |
| Uploaded Date       | 03/27/2021 (month/day/year) |
| MindSpore Version   | 0.6.0-alpha                 |
| Dataset             | CIFAR-10                    |
| Parameters (M)      | 0.27                        |
| FLOPs (M)           | 18.74                       |
| Accuracy (Top-1, %) | 92.05                       |
Description of Random Situation
In dataset.py, we set the seed inside the create_dataset function. We also use a random seed in train.py.
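In recent MindSpore versions, that seeding corresponds to calls like the following; the exact seed values and their placement in src/dataset.py and train.py are the repository's and are only assumed here.

```python
# Sketch of the seeding described above; exact values live in the scripts.
import mindspore.dataset as ds
from mindspore import set_seed

set_seed(1)             # fixes global seeds for weight init and ops
ds.config.set_seed(1)   # fixes the seed used for dataset shuffling
```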
ModelZoo Homepage
Please check the official homepage.