
A Comparative Study of Pruning Methods in Transformer-Based Time Series Forecasting

Kuhn, Nicholas 1; Weyrauch, Arvid 1; Öz, Muhamed 1; Streit, Achim 1; Götz, Markus 1; Debus, Charlotte 1
1 Scientific Computing Center (SCC), Karlsruher Institut für Technologie (KIT)

Abstract:

The high parameter count and corresponding computational demand of transformer-based models for time series forecasting pose a challenge to training and real-world deployment. Pruning is an established approach to reducing compute, but its effects have not yet been studied on transformer-based models for time series forecasting. To close this gap, we provide a comparative benchmark study, evaluating unstructured and structured pruning on several state-of-the-art multivariate time series models. Our results show that fine-tuning is necessary for some pruned models to recover predictive performance. Furthermore, we demonstrate that even with corresponding hardware and software support, structured pruning is unable to provide significant time savings.
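The two pruning families compared in the abstract can be illustrated with a toy sketch (this is purely illustrative and not the paper's implementation): unstructured pruning zeroes out individual weights by magnitude, keeping the tensor shape unchanged, whereas structured pruning removes whole rows (e.g. neurons), which is what could in principle shrink the matrix multiplications — the function names and the plain list-of-lists weight representation below are assumptions made for brevity.

```python
# Illustrative toy sketch of magnitude-based pruning on a plain weight
# matrix (list of lists). Not the paper's implementation.

def prune_unstructured(weights, sparsity):
    """Zero out the smallest-magnitude individual weights.

    The matrix keeps its shape; only some entries become 0.0, so dense
    hardware sees no speed-up without sparse kernel support.
    """
    flat = sorted(abs(w) for row in weights for w in row)
    k = int(len(flat) * sparsity)
    threshold = flat[k - 1] if k > 0 else -1.0
    return [[0.0 if abs(w) <= threshold else w for w in row]
            for row in weights]

def prune_structured(weights, sparsity):
    """Remove entire rows with the smallest L1 norm.

    The matrix actually shrinks, so subsequent matmuls are smaller --
    the kind of pruning that could yield wall-clock savings.
    """
    order = sorted(range(len(weights)),
                   key=lambda i: sum(abs(w) for w in weights[i]))
    drop = set(order[: int(len(weights) * sparsity)])
    return [row for i, row in enumerate(weights) if i not in drop]

W = [[0.10, -2.00, 0.30],
     [1.50,  0.20, -0.40],
     [0.05,  0.10,  0.02]]

sparse = prune_unstructured(W, 0.5)   # same 3x3 shape, small weights zeroed
small = prune_structured(W, 0.34)     # row with smallest L1 norm removed
```

The sketch also hints at the abstract's negative result: unstructured pruning only introduces zeros, and even structured pruning's smaller matrices translate into time savings only if the surrounding hardware and software stack actually exploits them.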


Original publication
DOI: 10.1109/ICDMW69685.2025.00032
Affiliated KIT institution(s): Scientific Computing Center (SCC)
Publication type: Conference proceedings contribution
Publication date: 12.11.2025
Language: English
Identifiers: ISBN 979-8-3315-8132-9; KITopen-ID 1000192546
Published in: 2025 IEEE International Conference on Data Mining Workshops (ICDMW)
Event: IEEE International Conference on Data Mining Workshop (ICDMW 2025), Washington, DC, USA, 12.11.2025 – 15.11.2025
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Pages: 223–228
Keywords: Time Series Forecasting, Pruning, Transformers, Deep Learning
Indexed in: Scopus, OpenAlex