In:
PLOS ONE, Public Library of Science (PLoS), Vol. 18, No. 10 (2023-10-09), p. e0292517
Abstract:
Previous studies have shown that deep models are often over-parameterized, and this parameter redundancy makes deep compression possible. The redundancy of model weights often manifests as low rank and sparsity. Ignoring either of these two characteristics, or their different distributions across the model, leads to low accuracy and a low compression ratio. To make full use of the difference between low rank and sparsity, a unified framework combining low-rank tensor decomposition and structured pruning is proposed: a hybrid model compression method based on sensitivity grouping (HMC). This framework unifies the existing additive hybrid compression method (AHC) and our proposed non-additive hybrid compression method (NaHC) in one model. The latter groups the convolutional layers of the network according to their differing sensitivities to the two compression methods, and thus integrates the low-rank and sparse structure of the model better than the former. Experiments show that, when compressing the ResNet family of models, our approach achieves a better trade-off between test accuracy and compression ratio than other recent compression methods that use a single strategy or additive hybrid compression.
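The sensitivity-grouping idea in the abstract can be illustrated with a minimal sketch. The code below is an assumption-laden toy, not the paper's implementation: it scores each convolutional weight tensor by the relative reconstruction error of a truncated-SVD low-rank approximation versus magnitude pruning, and assigns the layer to whichever strategy it is less sensitive to. The function names (`lowrank_error`, `pruning_error`, `group_layers`) and the fixed `rank`/`sparsity` budgets are hypothetical choices for illustration only.

```python
import numpy as np

def lowrank_error(w, rank):
    # Unfold the 4-D conv weight (out, in, kh, kw) into a matrix
    # and measure the relative error of a rank-truncated SVD.
    m = w.reshape(w.shape[0], -1)
    u, s, vt = np.linalg.svd(m, full_matrices=False)
    approx = (u[:, :rank] * s[:rank]) @ vt[:rank]
    return np.linalg.norm(m - approx) / np.linalg.norm(m)

def pruning_error(w, sparsity):
    # Zero the smallest-magnitude fraction of weights and
    # measure the relative error of the sparse approximation.
    flat = np.abs(w).ravel()
    k = int(sparsity * flat.size)
    thresh = np.partition(flat, k)[k] if k > 0 else 0.0
    pruned = np.where(np.abs(w) >= thresh, w, 0.0)
    return np.linalg.norm(w - pruned) / np.linalg.norm(w)

def group_layers(weights, rank=4, sparsity=0.5):
    # Assign each layer to the compression strategy it is
    # least sensitive to (smaller reconstruction error).
    groups = {}
    for name, w in weights.items():
        lr = lowrank_error(w, rank)
        pr = pruning_error(w, sparsity)
        groups[name] = "low_rank" if lr < pr else "pruning"
    return groups
```

Under this toy criterion, a layer whose unfolded weight matrix is close to low rank is routed to tensor decomposition, while a layer with dense, unstructured weights is routed to pruning; the actual paper measures sensitivity against task accuracy rather than raw reconstruction error.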
Type of Medium:
Online Resource
ISSN:
1932-6203
DOI:
10.1371/journal.pone.0292517
Component DOIs:
10.1371/journal.pone.0292517.g001 – .g008 (figures)
10.1371/journal.pone.0292517.t001 – .t005 (tables)
Language:
English
Publisher:
Public Library of Science (PLoS)
Publication Date:
2023
ZDB ID:
2267670-3