
Deep Neural Network Pruning with Progressive Regularizer

Zhou, Yexu 1,2; Zhao, Haibin 1,2; Hefenbrock, Michael; Li, Siyan; Beigl, Michael 1,2
1 Institut für Telematik (TM), Karlsruher Institut für Technologie (KIT)
2 Fakultät für Informatik (INFORMATIK), Karlsruher Institut für Technologie (KIT)

Abstract (English):

Pruning is a pivotal approach in network compression. It not only encourages lightweight deep neural networks, but also helps to mitigate overfitting. Generally, regularization is used to guide more parameters towards zero and thus reduce the overall model complexity. Unfortunately, two issues remain unsolved in regularization-based pruning. One is that the optimal trade-off between regularization and loss minimization, often expressed via a scaling hyperparameter, needs to be found through extensive experimentation. The other is the choice of an importance criterion that reflects the true relative importance of parameters. The most widely used criterion is magnitude-based, which has been argued to be inaccurate. To address these two issues, we propose in this paper a progressive regularization scheme, in which the factor scaling the regularization term is gradually increased during training until the target sparsity for filter pruning is reached. Compared to previous approaches, the scaling factor is no longer a hyperparameter that needs to be tuned, but is replaced with a sparsity-aware parameter that increases progressively. In this way, the value of the scaling factor can be automatically aligned with the target sparsity, avoiding the drawbacks of its tuning. ...
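
The abstract describes the core mechanism only at a high level: a structured regularizer whose scaling factor grows during training until the network reaches the target filter sparsity, so that the factor no longer needs manual tuning. Since the full text is not reproduced here, the following PyTorch snippet is a minimal illustrative sketch of such a sparsity-driven schedule, assuming a group-L1 penalty over convolutional filters and a simple epoch-wise increase of the scaling factor; the function names, the penalty choice, and the additive update rule are assumptions for illustration, not the authors' implementation.

    # Minimal sketch (not the paper's implementation): the regularization
    # weight `lam` is increased after each epoch until the measured filter
    # sparsity reaches `target_sparsity`. The group-L1 penalty and the
    # additive update `lam += lambda_step` are illustrative assumptions.
    import torch
    import torch.nn as nn

    def filter_sparsity(model, eps=1e-3):
        # Fraction of Conv2d filters whose per-filter L1 norm is (near) zero.
        zero, total = 0, 0
        for m in model.modules():
            if isinstance(m, nn.Conv2d):
                norms = m.weight.abs().sum(dim=(1, 2, 3))  # one value per output filter
                zero += (norms < eps).sum().item()
                total += norms.numel()
        return zero / max(total, 1)

    def group_l1(model):
        # Structured (group-L1) penalty that pushes whole filters towards zero.
        penalty = 0.0
        for m in model.modules():
            if isinstance(m, nn.Conv2d):
                penalty = penalty + m.weight.abs().sum(dim=(1, 2, 3)).sum()
        return penalty

    def train_progressive(model, loader, target_sparsity=0.5,
                          lambda_step=1e-5, epochs=10, lr=1e-2):
        opt = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
        criterion = nn.CrossEntropyLoss()
        lam = 0.0  # scaling factor starts at zero and grows over training
        for _ in range(epochs):
            for x, y in loader:
                opt.zero_grad()
                loss = criterion(model(x), y) + lam * group_l1(model)
                loss.backward()
                opt.step()
            # Progressively raise the regularization weight until the
            # target filter sparsity is reached, then keep it fixed.
            if filter_sparsity(model) < target_sparsity:
                lam += lambda_step
        return model

In this sketch the scaling factor is tied to the observed sparsity rather than fixed up front, which mirrors the abstract's point that the factor becomes a sparsity-aware quantity instead of a hyperparameter to tune; the concrete schedule used in the paper may differ.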


Preprint
DOI: 10.5445/IR/1000169627
Published on 27.03.2024
Associated institution(s) at KIT: Institut für Telematik (TM)
Publication type: Conference proceedings contribution
Publication month/year: 07.2024
Language: English
Identifier: KITopen-ID: 1000169627
Published in: 2024 IEEE International Joint Conference on Neural Networks (IJCNN 2024), Yokohama, 30 June – 05 July 2024
Event: International Joint Conference on Neural Networks (IJCNN 2024), Yokohama, Japan, 30.06.2024 – 05.07.2024
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Publication note: in press
Keywords: progressive regularizer, neural network, tiny machine learning, lightweight