Deep Compression/Acceleration: A Survey of Model Compression and Acceleration Papers



Author | Mars_WH

Source | blog.csdn.net/hw5226349/article/details/84888416

Original | http://bbs.cvmart.net/topics/352

Deep learning models are often hard to deploy in certain scenarios and on certain devices because of their computational complexity and parameter redundancy; breaking through these bottlenecks requires model compression, inference acceleration, and heterogeneous computing. Model compression algorithms in particular reduce parameter redundancy, which cuts storage footprint, communication bandwidth, and computational cost, and thereby eases deployment. This post collects research papers on model compression from recent years. Happy reading!
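As a back-of-the-envelope illustration of the storage argument above, here is a tiny sketch comparing the footprint of 32-bit and 8-bit parameters (the ~61M parameter count is an illustrative AlexNet-scale assumption, not a figure from any specific paper here):

```python
def model_size_mb(num_params, bits_per_param):
    """Storage footprint in megabytes for a dense parameter tensor."""
    return num_params * bits_per_param / 8 / 1e6

# An AlexNet-scale model of roughly 61M parameters (illustrative figure)
fp32 = model_size_mb(61_000_000, 32)
int8 = model_size_mb(61_000_000, 8)
print(f"fp32: {fp32:.0f} MB, int8: {int8:.0f} MB ({fp32 / int8:.0f}x smaller)")
```

Quantization alone gives the bit-width ratio; the papers below combine it with pruning and coding for much larger savings.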

Structure

Searching for MobileNetV3

arxiv: https://arxiv.org/abs/1905.02244v1

Chinese-language explainer: 重磅!MobileNetV3 来了!

[BMVC2018] IGCV3: Interleaved Low-Rank Group Convolutions for Efficient Deep Neural Networks

arxiv: https://arxiv.org/abs/1806.00178

github: https://github.com/homles11/IGCV3

[CVPR2018] IGCV2: Interleaved Structured Sparse Convolutional Neural Networks

arxiv: https://arxiv.org/abs/1804.06202

[CVPR2018] MobileNetV2: Inverted Residuals and Linear Bottlenecks

arxiv: https://arxiv.org/abs/1801.04381

github: https://github.com/tensorflow/models/tree/master/research/slim/nets/mobilenet

[ECCV2018] ShuffleNet V2: Practical Guidelines for Efficient CNN Architecture Design

arxiv: https://arxiv.org/abs/1807.11164
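A common thread in this section is the depthwise separable convolution used by the MobileNet family. A minimal sketch of the multiply-accumulate comparison behind that design choice (the layer dimensions below are illustrative assumptions, not from any one paper):

```python
def conv_flops(h, w, c_in, c_out, k):
    """Multiply-accumulates of a standard k x k convolution (stride 1, same padding)."""
    return h * w * c_in * c_out * k * k

def dw_separable_flops(h, w, c_in, c_out, k):
    """Depthwise k x k convolution followed by a 1x1 pointwise convolution."""
    return h * w * c_in * k * k + h * w * c_in * c_out

std = conv_flops(56, 56, 128, 128, 3)
sep = dw_separable_flops(56, 56, 128, 128, 3)
print(f"standard: {std:,} MACs, separable: {sep:,} MACs ({std / sep:.1f}x fewer)")
```

The cost ratio is roughly 1/c_out + 1/k^2, so with a 3x3 kernel the separable form is about 8 to 9 times cheaper at typical channel counts.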

Quantization

Binarized Neural Networks: Training Deep Neural Networks with Weights and Activations Constrained to +1 or -1

intro: binary networks (weights and activations constrained to ±1)

arxiv: https://arxiv.org/abs/1602.02830

github: https://github.com/MatthieuCourbariaux/BinaryNet

https://github.com/itayhubara/BinaryNet
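A toy pure-Python sketch of the two ideas at the heart of binarized networks: deterministic sign binarization, and replacing floating-point dot products with an XNOR-plus-popcount count over {-1, +1} vectors (this is a conceptual illustration, not the papers' optimized kernels):

```python
def binarize(weights):
    """Deterministic binarization as in the BNN paper: sign(w), with sign(0) -> +1."""
    return [1 if w >= 0 else -1 for w in weights]

def xnor_dot(a_bits, b_bits):
    """Dot product of two {-1, +1} vectors via XNOR + popcount:
    each matching position contributes +1, each mismatch -1."""
    n = len(a_bits)
    matches = sum(1 for a, b in zip(a_bits, b_bits) if a == b)
    return 2 * matches - n

w = binarize([0.7, -0.3, 0.0, -1.2])
x = binarize([1.0, 1.0, -1.0, -1.0])
print(w, xnor_dot(w, x))
```

On real hardware the {-1, +1} values are packed into machine words, so one XNOR plus a popcount instruction replaces dozens of floating-point multiply-adds.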

[FPGA2017] FINN: A Framework for Fast, Scalable Binarized Neural Network Inference

intro: binarized-network inference on FPGAs (Xilinx)

paper: http://www.idi.ntnu.no/~yamanu/2017-fpga-finn-preprint.pdf

github: https://github.com/Xilinx/FINN

DoReFa-Net: Training Low Bitwidth Convolutional Neural Networks with Low Bitwidth Gradients

intro: low-bitwidth weights, activations, and gradients

arxiv: https://arxiv.org/abs/1606.06160

github: https://github.com/tensorpack/tensorpack/tree/master/examples/DoReFa-Net

[ECCV2016] XNOR-Net: ImageNet Classification Using Binary Convolutional Neural Networks

intro: from the Allen Institute for AI, co-authored by the darknet/YOLO author

arxiv: https://arxiv.org/abs/1603.05279

github: https://github.com/allenai/XNOR-Net

Ternary Weight Networks

arxiv: https://arxiv.org/abs/1605.04711

github: https://github.com/fengfu-chris/caffe-twns
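A minimal sketch of TWN-style ternarization: weights are thresholded into {-1, 0, +1} at roughly 0.7 times the mean magnitude, and a scale alpha is set to the mean magnitude of the surviving weights (this uses the paper's approximate closed-form solution; the exact optimization is layer-wise):

```python
def ternarize(weights):
    """TWN-style ternarization: threshold at 0.7 * mean(|w|); the scale alpha
    is the mean magnitude of the weights that survive the threshold."""
    n = len(weights)
    delta = 0.7 * sum(abs(w) for w in weights) / n
    ternary = [0 if abs(w) <= delta else (1 if w > 0 else -1) for w in weights]
    survivors = [abs(w) for w in weights if abs(w) > delta]
    alpha = sum(survivors) / len(survivors) if survivors else 0.0
    return ternary, alpha

t, a = ternarize([0.9, -0.05, 0.02, -0.8])
print(t, a)
```

Compared with pure binarization, the extra zero state lets small weights drop out entirely while alpha preserves the dynamic range of the large ones.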

[CVPR2018] Quantization and Training of Neural Networks for Efficient Integer-Arithmetic-Only Inference

intro: from Google; the quantization scheme used in TensorFlow Lite

arxiv: https://arxiv.org/abs/1712.05877

github: https://github.com/tensorflow/tensorflow/tree/master/tensorflow/contrib/quantize
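The core of this scheme is the affine mapping real_value ≈ scale * (q - zero_point), with q an 8-bit integer. A toy sketch of the forward and inverse maps (the scale and zero-point values below are illustrative; real deployments calibrate them from weight and activation statistics):

```python
def quantize(x, scale, zero_point, qmin=0, qmax=255):
    """Affine 8-bit quantization: map a real value to an integer q such that
    real_value ~= scale * (q - zero_point), clamped to the representable range."""
    q = round(x / scale) + zero_point
    return max(qmin, min(qmax, q))

def dequantize(q, scale, zero_point):
    """Recover the approximate real value from its quantized representation."""
    return scale * (q - zero_point)

scale, zp = 0.05, 128            # illustrative parameters, roughly covering [-6.4, 6.35]
q = quantize(1.0, scale, zp)
print(q, dequantize(q, scale, zp))
```

Because the zero point is itself an integer, the real value 0.0 is represented exactly, which the paper notes is important for zero-padding.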

Quantized Neural Networks: Training Neural Networks with Low Precision Weights and Activations

intro: QNNs

arxiv: https://arxiv.org/abs/1609.07061

github: https://github.com/peisuke/qnn

[CVPR2018] Two-Step Quantization for Low-bit Neural Networks

paper: http://openaccess.thecvf.com/content_cvpr_2018/papers/Wang_Two-Step_Quantization_for_CVPR_2018_paper.pdf

Pruning

Channel pruning

[NIPS2018] Discrimination-aware Channel Pruning for Deep Neural Networks

arxiv: https://arxiv.org/abs/1810.11809

github: https://github.com/Tencent/PocketFlow (the DisChnPrunedLearner implements this method)

[ICCV2017] Channel Pruning for Accelerating Very Deep Neural Networks

intro: LASSO-based channel selection with least-squares reconstruction

arxiv: https://arxiv.org/abs/1707.06168

github: https://github.com/yihui-he/channel-pruning

[ECCV2018] AMC: AutoML for Model Compression and Acceleration on Mobile Devices

intro: reinforcement-learning-based automated compression

arxiv: https://arxiv.org/abs/1802.03494

Chinese translation: https://www.jiqizhixin.com/articles/AutoML-for-Model-Compression-and-Acceleration-on-Mobile-Devices

github: https://github.com/Tencent/PocketFlow

[ICCV2017] Learning Efficient Convolutional Networks through Network Slimming

intro: by Zhuang Liu et al.; L1 regularization on BatchNorm scaling factors

arxiv: https://arxiv.org/abs/1708.06519

github: https://github.com/Eric-mingjie/network-slimming

https://github.com/foolwood/pytorch-slimming
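Network slimming ranks channels globally by the magnitude of their BatchNorm scaling factor (gamma) and prunes those below a global percentile threshold. A toy sketch of that selection step (the gamma values and prune ratio are illustrative):

```python
def slim_channels(bn_gammas, prune_ratio):
    """Network-slimming-style selection: rank channels by the magnitude of
    their BatchNorm scale gamma and return the indices of the kept channels."""
    ranked = sorted(abs(g) for g in bn_gammas)
    cut_idx = min(int(len(ranked) * prune_ratio), len(ranked) - 1)
    threshold = ranked[cut_idx]
    return [i for i, g in enumerate(bn_gammas) if abs(g) >= threshold]

gammas = [0.9, 0.01, 0.4, 0.002, 0.7, 0.05]
print(slim_channels(gammas, 0.5))  # indices of the surviving channels
```

During training, an L1 penalty on the gammas pushes unimportant channels toward zero, so this magnitude ranking becomes meaningful; after pruning, the network is fine-tuned.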

[ICLR2018] Rethinking the Smaller-Norm-Less-Informative Assumption in Channel Pruning of Convolution Layers

arxiv: https://arxiv.org/abs/1802.00124

github (PyTorch): https://github.com/jack-willturner/batchnorm-pruning

github (TensorFlow): https://github.com/bobye/batchnorm_prune

[CVPR2018] NISP: Pruning Networks using Neuron Importance Score Propagation

arxiv: https://arxiv.org/abs/1711.05908

[ICCV2017] ThiNet: A Filter Level Pruning Method for Deep Neural Network Compression

web: http://lamda.nju.edu.cn/luojh/project/ThiNet_ICCV17/ThiNet_ICCV17_CN.html

github: https://github.com/Roll920/ThiNet

https://github.com/Roll920/ThiNet_Code

Sparsity

SBNet: Sparse Blocks Network for Fast Inference

intro: Uber

arxiv: https://arxiv.org/abs/1801.02108

github: https://github.com/uber/sbnet

To Prune, or Not to Prune: Exploring the Efficacy of Pruning for Model Compression

intro: gradual magnitude pruning; compares large-sparse against small-dense models

arxiv: https://arxiv.org/abs/1710.01878

github: https://github.com/tensorflow/tensorflow/tree/master/tensorflow/contrib/model_pruning
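This paper's gradual pruning schedule ramps sparsity along a cubic curve, pruning aggressively early and tapering off as the target is approached. A sketch of the schedule (the initial and final sparsity values are illustrative defaults):

```python
def sparsity_at(step, total_steps, s_initial=0.0, s_final=0.9):
    """Cubic schedule from the paper: s_t = s_f + (s_i - s_f) * (1 - t/n)^3,
    ramping sparsity from s_initial to s_final over total_steps."""
    frac = min(step / total_steps, 1.0)
    return s_final + (s_initial - s_final) * (1.0 - frac) ** 3

print([round(sparsity_at(t, 100), 3) for t in (0, 25, 50, 100)])
```

At each step, the weights with the smallest magnitudes are masked to zero until the scheduled sparsity is reached, while training continues on the survivors.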

Submanifold Sparse Convolutional Networks

intro: Facebook AI Research

arxiv: https://arxiv.org/abs/1706.01307

github: https://github.com/facebookresearch/SparseConvNet

Fusion

Distillation

[NIPS2014 Workshop] Distilling the Knowledge in a Neural Network

intro: by Hinton et al.

arxiv: https://arxiv.org/abs/1503.02531

github: https://github.com/peterliht/knowledge-distillation-pytorch
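A minimal sketch of the soft-target idea: raising the softmax temperature flattens the teacher's output distribution, so the relative probabilities of the wrong classes (the "dark knowledge") become visible to the student (the logits and temperature below are illustrative):

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax with a temperature: higher T produces a softer distribution."""
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_targets(teacher_logits, temperature=4.0):
    """Hinton-style soft targets: the teacher's softmax at raised temperature."""
    return softmax(teacher_logits, temperature)

print([round(p, 3) for p in softmax([5.0, 2.0, 0.1])])
print([round(p, 3) for p in distillation_targets([5.0, 2.0, 0.1])])
```

The student is trained on a weighted mix of cross-entropy against these soft targets (computed at the same temperature) and cross-entropy against the true labels.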

Comprehensive

[ICLR2016] Deep Compression: Compressing Deep Neural Network with Pruning, Trained Quantization and Huffman Coding

intro: the pioneering work in this area (ICLR 2016 Best Paper)

arxiv: https://arxiv.org/abs/1510.00149

github: https://github.com/songhan
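A rough storage estimate for the prune-then-quantize pipeline: keep a fraction of the weights, store each survivor as a small codebook index plus sparse-index overhead, and compare against dense 32-bit floats. The index-overhead figure is a simplifying assumption, and the paper's Huffman coding stage would shrink the result further:

```python
def deep_compression_ratio(num_params, keep_ratio, codebook_bits, index_bits=4):
    """Rough compression ratio of pruning + weight sharing versus dense fp32:
    each surviving weight costs codebook_bits (shared-value index) plus
    index_bits (sparse position encoding, a simplifying assumption here)."""
    dense_bits = num_params * 32
    kept = num_params * keep_ratio
    compressed_bits = kept * (codebook_bits + index_bits)
    return dense_bits / compressed_bits

# e.g. keep 10% of weights with a 5-bit codebook, as the paper uses
# for fully-connected layers
print(f"{deep_compression_ratio(1_000_000, 0.1, 5):.1f}x")
```

Even this crude estimate lands in the tens-of-times range, consistent with the order of magnitude the paper reports once all three stages are combined.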

Model Distillation with Knowledge Transfer from Face Classification to Alignment and Verification

intro: extensive experiments; well suited to engineering practice

arxiv: https://arxiv.org/abs/1709.02929

Continuously updated; stay tuned.


