

[1503.02531] Distilling the Knowledge in a Neural Network
source link: https://arxiv.org/abs/1503.02531

[Submitted on 9 Mar 2015]
Distilling the Knowledge in a Neural Network
A very simple way to improve the performance of almost any machine learning algorithm is to train many different models on the same data and then to average their predictions. Unfortunately, making predictions using a whole ensemble of models is cumbersome and may be too computationally expensive to allow deployment to a large number of users, especially if the individual models are large neural nets. Caruana and his collaborators have shown that it is possible to compress the knowledge in an ensemble into a single model which is much easier to deploy and we develop this approach further using a different compression technique. We achieve some surprising results on MNIST and we show that we can significantly improve the acoustic model of a heavily used commercial system by distilling the knowledge in an ensemble of models into a single model. We also introduce a new type of ensemble composed of one or more full models and many specialist models which learn to distinguish fine-grained classes that the full models confuse. Unlike a mixture of experts, these specialist models can be trained rapidly and in parallel.
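The core recipe behind this abstract is to train the small model on the temperature-softened output distribution of the cumbersome teacher (or ensemble), in addition to the usual hard labels. Below is a minimal PyTorch-style sketch of such a distillation loss; it is an illustration of the general technique, not the authors' code, and the function name, the temperature of 4.0, and the 50/50 weighting `alpha` are assumed hyperparameters rather than values prescribed by the paper.

```python
# Minimal sketch of a knowledge-distillation loss (illustrative, not the paper's code).
# The student is trained to match the teacher's temperature-softened class
# probabilities while also fitting the true hard labels.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Combine soft-target KL divergence with hard-label cross-entropy."""
    # Soften both distributions with the same temperature T.
    soft_student = F.log_softmax(student_logits / temperature, dim=1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=1)

    # Scale the soft-target term by T^2 so its gradient magnitude stays
    # comparable to the hard-label term as T changes (a convention discussed
    # in the paper).
    soft_loss = F.kl_div(soft_student, soft_teacher,
                         reduction="batchmean") * temperature ** 2

    # Standard cross-entropy against the true labels.
    hard_loss = F.cross_entropy(student_logits, labels)

    return alpha * soft_loss + (1.0 - alpha) * hard_loss

# Example usage: 10-way classification, batch of 8 (random tensors just to show shapes).
student_out = torch.randn(8, 10)
teacher_out = torch.randn(8, 10)
targets = torch.randint(0, 10, (8,))
loss = distillation_loss(student_out, teacher_out, targets)
```

In practice the teacher logits would come from a trained ensemble or large network run in evaluation mode, and only the student's parameters would be updated with this loss.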