Fast and Robust Compression of Deep Convolutional Neural Networks

Jia Wen, Liu Yang, Chenyang Shen

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Scopus citations


Deep convolutional neural networks (CNNs) currently achieve state-of-the-art performance in several domains. However, the commonly used CNN models require large amounts of memory and computing resources, which poses challenges for training and deployment, especially on devices with limited computational resources. Inspired by recent advances in random tensor decomposition, we introduce a Hierarchical Framework for Fast and Robust Compression (HFFRC), which significantly reduces the number of parameters needed to represent a convolution layer via a fast low-rank Tucker decomposition algorithm, while preserving its expressive power. Owing to the randomized algorithm, the proposed compression framework is robust to noise in the parameters. In addition, it is a general framework into which any tensor decomposition method can easily be adopted. The efficiency and effectiveness of the proposed approach are demonstrated through comprehensive experiments on the benchmark CIFAR-10 and CIFAR-100 image classification datasets.
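The abstract's core idea, replacing a full 4-D convolution kernel with a small Tucker core and per-mode factor matrices computed by a randomized algorithm, can be sketched as follows. This is a minimal illustration of randomized HOSVD-style Tucker compression, not the paper's HFFRC implementation; the kernel shape (64 output channels, 32 input channels, 3x3 window) and the ranks are hypothetical:

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: move axis `mode` first, flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def randomized_range(M, rank, oversample=5, seed=0):
    """Randomized range finder: sketch M with a Gaussian test matrix,
    then orthonormalize the sketch (Halko-Martinsson-Tropp style)."""
    rng = np.random.default_rng(seed)
    Y = M @ rng.standard_normal((M.shape[1], rank + oversample))
    Q, _ = np.linalg.qr(Y)
    return Q[:, :rank]

def mode_multiply(T, U, mode):
    """Multiply tensor T by matrix U along the given mode."""
    return np.moveaxis(np.tensordot(U, np.moveaxis(T, mode, 0), axes=1), 0, mode)

def randomized_tucker(T, ranks):
    """Low-rank Tucker approximation via randomized HOSVD:
    one randomized range sketch per mode, then project to get the core."""
    factors = [randomized_range(unfold(T, n), r) for n, r in enumerate(ranks)]
    core = T
    for n, U in enumerate(factors):
        core = mode_multiply(core, U.T, n)
    return core, factors

def tucker_reconstruct(core, factors):
    """Expand a Tucker (core, factors) pair back into a full tensor."""
    T = core
    for n, U in enumerate(factors):
        T = mode_multiply(T, U, n)
    return T

# Hypothetical conv kernel, built to have exact multilinear rank (8, 8, 3, 3)
# so the randomized sketch recovers it almost exactly.
rng = np.random.default_rng(1)
true_core = rng.standard_normal((8, 8, 3, 3))
true_factors = [np.linalg.qr(rng.standard_normal((d, r)))[0]
                for d, r in [(64, 8), (32, 8), (3, 3), (3, 3)]]
kernel = tucker_reconstruct(true_core, true_factors)

core, factors = randomized_tucker(kernel, (8, 8, 3, 3))
err = (np.linalg.norm(tucker_reconstruct(core, factors) - kernel)
       / np.linalg.norm(kernel))
full_params = kernel.size                                   # 64*32*3*3 = 18432
compressed_params = core.size + sum(U.size for U in factors)
```

For this hypothetical kernel the compressed representation stores roughly 1.4k parameters instead of 18.4k; the randomized range finder touches each mode unfolding only through a single sketched matrix product, which is what makes this family of methods fast compared to a full per-mode SVD.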

Original language: English (US)
Title of host publication: Artificial Neural Networks and Machine Learning – ICANN 2020 - 29th International Conference on Artificial Neural Networks, Proceedings
Editors: Igor Farkaš, Paolo Masulli, Stefan Wermter
Publisher: Springer Science and Business Media Deutschland GmbH
Number of pages: 12
ISBN (Print): 9783030616151
State: Published - 2020
Event: 29th International Conference on Artificial Neural Networks, ICANN 2020 - Bratislava, Slovakia
Duration: Sep 15, 2020 - Sep 18, 2020

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 12397 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Conference: 29th International Conference on Artificial Neural Networks, ICANN 2020


Keywords

  • Deep convolutional neural networks
  • Model compression
  • Random Tucker decomposition

ASJC Scopus subject areas

  • Theoretical Computer Science
  • General Computer Science


