Email: dnn-energy at mit dot edu
New paper on "NetAdapt: Platform-Aware Neural Network Adaptation for Mobile Applications" will be presented at ECCV 2018. Available on arXiv. [ PDF ].
New paper on "Understanding the Limitations of Existing Energy-Efficient Design Approaches for Deep Neural Networks" [ PDF ].
Energy-Aware Pruned DNN Models released here.
Energy estimation website goes live!
Deep convolutional neural networks (CNNs) are indispensable to state-of-the-art computer vision algorithms. However, they are still rarely deployed on battery-powered mobile devices, such as smartphones and wearable gadgets, where vision algorithms can enable many revolutionary real-world applications. The key limiting factor is the high energy consumption of CNN processing due to its high computational complexity. While many previous efforts try to reduce the CNN model size or the amount of computation, we find that smaller models and fewer operations do not necessarily translate into lower energy consumption; these metrics therefore do not serve as good proxies for energy cost.
To close the gap between CNN design and energy consumption optimization, we propose an energy-aware pruning algorithm for CNNs that directly uses the energy consumption estimate of a CNN to guide the pruning process. The energy estimation methodology uses parameters extrapolated from actual hardware measurements that target realistic battery-powered system setups. The proposed layer-by-layer pruning algorithm also prunes more aggressively than previously proposed pruning methods by minimizing the error in the output feature maps instead of in the filter weights. For each layer, the weights are first pruned and then locally fine-tuned with a closed-form least-squares solution to quickly restore the accuracy. After all layers are pruned, the entire network is further globally fine-tuned using back-propagation. With the proposed pruning method, the energy consumption of AlexNet and GoogLeNet is reduced by 3.7x and 1.6x, respectively, with less than 1% top-5 accuracy loss. Finally, we show that pruning AlexNet for a reduced number of target classes can greatly decrease the number of weights, but the energy reduction is limited.
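To make the layer-by-layer procedure concrete, below is a minimal NumPy sketch of the local pruning and closed-form least-squares fine-tuning step for a single layer. The function name, the magnitude-based selection of which weights to keep, the keep_ratio parameter, and the calibration-data interface are illustrative assumptions rather than the authors' exact implementation; in the actual method, the pruning order and amount are guided by the per-layer energy estimate.

```python
# Minimal sketch of one layer-by-layer pruning step, loosely following the
# description above. The pruning heuristic and calibration interface are
# placeholder assumptions, not the authors' exact implementation.
import numpy as np

def prune_layer_least_squares(W, X, keep_ratio=0.3):
    """Prune a fully-connected (or im2col-flattened conv) layer W and locally
    fine-tune the surviving weights with a closed-form least-squares fit.

    W: (in_dim, out_dim) weight matrix of the layer.
    X: (num_samples, in_dim) calibration inputs to this layer.
    keep_ratio: fraction of weights kept per output unit.
    """
    Y_ref = X @ W                      # reference output feature maps (pre-pruning)
    W_pruned = np.zeros_like(W)

    for j in range(W.shape[1]):        # handle each output unit independently
        w = W[:, j]
        k = max(1, int(np.ceil(keep_ratio * w.size)))
        keep = np.argsort(np.abs(w))[-k:]   # keep the largest-magnitude weights
        # Closed-form least squares: refit the kept weights so that
        # X[:, keep] @ w_kept approximates Y_ref[:, j], i.e. the error in the
        # output feature maps (not the weights) is minimized locally.
        w_kept, *_ = np.linalg.lstsq(X[:, keep], Y_ref[:, j], rcond=None)
        W_pruned[keep, j] = w_kept

    return W_pruned

# Toy usage: a random layer and random calibration inputs.
rng = np.random.default_rng(0)
W = rng.standard_normal((256, 64))
X = rng.standard_normal((512, 256))
W_new = prune_layer_least_squares(W, X, keep_ratio=0.3)
err = np.linalg.norm(X @ W_new - X @ W) / np.linalg.norm(X @ W)
print(f"relative output error after pruning: {err:.3f}")
```

The key point the sketch illustrates is that the surviving weights are refit against the original output feature maps, so each layer's error is restored locally before the whole network is globally fine-tuned with back-propagation.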
Energy Estimation Methodology
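Below is a minimal sketch of the kind of energy model this methodology implies: per-layer energy is accumulated from computation (MACs) and data movement, weighted by per-operation energy costs extrapolated from hardware measurements. The data structures, relative cost numbers, and layer statistics are placeholder assumptions for illustration only; the actual methodology uses parameters derived from measurements of a realistic battery-powered system setup.

```python
# Sketch of a hardware-based energy estimate: computation energy plus
# data-movement energy, with per-operation costs as placeholder assumptions.
from dataclasses import dataclass

@dataclass
class EnergyCosts:
    mac: float          # energy per multiply-accumulate (arbitrary units)
    dram_access: float  # energy per DRAM access
    sram_access: float  # energy per on-chip buffer access

@dataclass
class ConvLayer:
    macs: int           # number of multiply-accumulates in the layer
    dram_accesses: int  # weights and feature maps moved to/from DRAM
    sram_accesses: int  # data moved within on-chip buffers

def estimate_layer_energy(layer: ConvLayer, costs: EnergyCosts) -> float:
    """Computation energy plus data-movement energy for one layer."""
    return (layer.macs * costs.mac
            + layer.dram_accesses * costs.dram_access
            + layer.sram_accesses * costs.sram_access)

# Placeholder relative costs: a DRAM access is far more expensive than a MAC,
# which is why fewer weights alone need not mean lower energy.
costs = EnergyCosts(mac=1.0, dram_access=200.0, sram_access=6.0)
layer = ConvLayer(macs=105_415_200, dram_accesses=1_200_000, sram_accesses=40_000_000)
print(f"estimated layer energy: {estimate_layer_energy(layer, costs):.3e}")
```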
Energy versus Accuracy Tradeoff
(Note: Results based on SqueezeNet v.1.0)
@inproceedings{cvpr_2017_yang_energy,
  author    = {Yang, Tien-Ju and Chen, Yu-Hsin and Sze, Vivienne},
  title     = {Designing Energy-Efficient Convolutional Neural Networks using Energy-Aware Pruning},
  booktitle = {IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  year      = {2017},
}