FTL: A universal framework for training low-bit DNNs via Feature Transfer

Nov 20, 2020
K. Du, Y. Zhang, H. Guan, Q. Tian, Y. Wang*, S. Cheng, J. Lin
Abstract
Low-bit Deep Neural Networks (low-bit DNNs) have recently received significant attention for their high efficiency. However, low-bit DNNs are often difficult to optimize due to saddle points in their loss surfaces. Here we introduce a novel feature-based knowledge transfer framework that uses a 32-bit DNN to guide the training of a low-bit DNN via feature maps. This is challenging because the feature maps of the two branches lie in continuous and discrete spaces, respectively, and such a mismatch has not been handled properly by existing feature transfer frameworks. In this paper, we propose to directly transfer information-rich continuous-space features to the low-bit branch…
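
As a rough illustration of the general idea, the PyTorch sketch below computes a feature-transfer loss between a full-precision teacher branch and a low-bit student branch. The `FeatureTransferLoss` module, the 1x1 channel adapter, the MSE objective, and the `lambda_ft` weight are illustrative assumptions only, not the paper's actual transfer design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FeatureTransferLoss(nn.Module):
    """Hypothetical feature-map transfer term: align the low-bit branch's
    (discrete-valued) feature map with the 32-bit teacher's continuous one."""

    def __init__(self, student_channels: int, teacher_channels: int):
        super().__init__()
        # Assumed 1x1 adapter to match channel dimensions between branches.
        self.adapter = nn.Conv2d(student_channels, teacher_channels, kernel_size=1)

    def forward(self, student_feat: torch.Tensor, teacher_feat: torch.Tensor) -> torch.Tensor:
        # MSE in the teacher's continuous feature space (an assumption;
        # the paper's actual distance measure may differ).
        return F.mse_loss(self.adapter(student_feat), teacher_feat)

# Usage sketch with dummy tensors standing in for real feature maps.
ft_loss = FeatureTransferLoss(student_channels=64, teacher_channels=64)
student_feat = torch.randn(8, 64, 16, 16)           # from the low-bit branch
teacher_feat = torch.randn(8, 64, 16, 16).detach()  # from the frozen 32-bit branch
task_loss = torch.tensor(0.0)                       # placeholder for cross-entropy, etc.
lambda_ft = 0.1                                     # assumed weighting hyperparameter
total_loss = task_loss + lambda_ft * ft_loss(student_feat, teacher_feat)
```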
Type
Publication
ECCV 2020