

Tag: model compression

Paper Reading - Contrastive Representation Distillation

05-05

Paper - Quantization and Training of Neural Networks for Efficient Integer-Arithmetic-Only Inference

04-15

Paper - MetaPruning: Meta Learning for Automatic Neural Network Channel Pruning

10-26

Paper - Rethinking the Value of Network Pruning

10-22

Paper Summary - Model Pruning

10-03

Paper - Like What You Like: Knowledge Distill via Neuron Selectivity Transfer

10-02

Paper - Distilling the Knowledge in a Neural Network

06-07

Paper - SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and <0.5MB model size

03-24

Paper - MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications

03-23

Paper - Xception: Deep Learning with Depthwise Separable Convolutions

03-22
一个脱离了高级趣味的人

We lay pillowed against one another in the boat, unaware that the east had already grown light.

91 posts
36 tags
© 2021 一个脱离了高级趣味的人
Powered by Hexo
Theme - NexT.Muse