Distilling with performance enhanced students

Turner, J., Crowley, E. J., Radu, V., Cano Reyes, J., Storkey, A. and O'Boyle, M. (2019) Distilling with performance enhanced students. arXiv, (Unpublished)

226526.pdf - Submitted Version
Available under License Creative Commons Attribution Non-commercial Share Alike.


Publisher's URL: https://arxiv.org/abs/1810.10460


The task of accelerating large neural networks on general purpose hardware has, in recent years, prompted the use of channel pruning to reduce network size. However, the efficacy of pruning based approaches has since been called into question. In this paper, we turn to distillation for model compression---specifically, attention transfer---and develop a simple method for discovering performance enhanced student networks. We combine channel saliency metrics with empirical observations of runtime performance to design more accurate networks for a given latency budget. We apply our methodology to residual and densely-connected networks, and show that we are able to find resource-efficient student networks on different hardware platforms while maintaining very high accuracy. These performance-enhanced student networks achieve up to 10% boosts in top-1 ImageNet accuracy over their channel-pruned counterparts for the same inference time.
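The abstract's central technique, attention transfer, matches the student's attention maps to the teacher's during training. As a rough illustration of the standard formulation (not code from this paper): each attention map is the channel-wise sum of squared activations, flattened and L2-normalised, and the transfer loss is the L2 distance between student and teacher maps summed over chosen layer pairs. A minimal NumPy sketch, with all function names hypothetical:

```python
import numpy as np

def attention_map(activations):
    # activations: (C, H, W) feature map from one layer.
    # Attention map = channel-wise sum of squared activations,
    # flattened and L2-normalised (standard attention-transfer form).
    a = np.sum(activations ** 2, axis=0).ravel()
    return a / (np.linalg.norm(a) + 1e-8)

def attention_transfer_loss(student_acts, teacher_acts):
    # L2 distance between normalised attention maps, summed over
    # corresponding (student, teacher) layer pairs. The spatial sizes
    # must match; channel counts may differ (the student is smaller).
    return sum(
        np.linalg.norm(attention_map(s) - attention_map(t))
        for s, t in zip(student_acts, teacher_acts)
    )

rng = np.random.default_rng(0)
teacher = [rng.standard_normal((64, 8, 8))]   # wide teacher layer
student = [rng.standard_normal((16, 8, 8))]   # narrow student layer
loss = attention_transfer_loss(student, teacher)
```

In practice this loss would be added to the usual cross-entropy objective while training the resized student; the paper's contribution is choosing the student's per-layer channel widths using saliency metrics and measured runtimes, which this sketch does not cover.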

Item Type: Articles
Additional Information: This project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No. 732204 (Bonseyes). This work is supported by the Swiss State Secretariat for Education, Research and Innovation (SERI) under contract number 16.0159.
Glasgow Author(s) Enlighten ID: Cano Reyes, Dr Jose
Authors: Turner, J., Crowley, E. J., Radu, V., Cano Reyes, J., Storkey, A., and O'Boyle, M.
College/School: College of Science and Engineering > School of Computing Science
Journal Name: arXiv
Copyright Holders: Copyright © 2020 The Authors
Publisher Policy: Reproduced under a Creative Commons license
