
Distilling particle knowledge for fast reconstruction at high-energy physics experiments

Bal, A. 1; Brandes, T. 2; Iemmi, F.; Klute, M. 1; Maier, B. 1; Mikuni, V.; Årrestad, T. K.
1 Institut für Experimentelle Teilchenphysik (ETP), Karlsruher Institut für Technologie (KIT)
2 Karlsruher Institut für Technologie (KIT)

Abstract:

Knowledge distillation is a form of model compression that allows artificial neural networks of different sizes to learn from one another. Its main application is the compactification of large deep neural networks to free up computational resources, in particular on edge devices. In this article, we consider proton-proton collisions at the High-Luminosity Large Hadron Collider (HL-LHC) and demonstrate a successful knowledge transfer from an event-level graph neural network (GNN) to a particle-level small deep neural network (DNN). Our algorithm, DistillNet, is a DNN trained to learn the provenance of particles from the soft labels provided by the GNN outputs, predicting whether or not a particle originates from the primary interaction vertex. The results indicate that for this problem, one of the main challenges at the HL-LHC, there is minimal loss during the transfer of knowledge to the small student network, while the computational resource needs are reduced significantly compared to the teacher. This is demonstrated for the distilled student network on a CPU, as well as for a quantized and pruned student network deployed on a field-programmable gate array (FPGA).
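The per-particle distillation setup described in the abstract lends itself to a compact illustration. Below is a minimal sketch, assuming a PyTorch environment: a small student MLP is fit to a teacher GNN's per-particle soft labels with a binary cross-entropy loss. All names, layer widths, and the feature count are hypothetical placeholders, not the published DistillNet architecture.

```python
# Minimal sketch of per-particle knowledge distillation, assuming PyTorch.
# The feature count, widths, and teacher outputs are illustrative stand-ins.
import torch
import torch.nn as nn

class StudentDNN(nn.Module):
    """Small per-particle MLP predicting the probability that a particle
    originates from the primary interaction vertex."""
    def __init__(self, n_features: int = 16, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, hidden),
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
            nn.Sigmoid(),  # output is a soft label in [0, 1]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x).squeeze(-1)

def distillation_step(student, optimizer, features, teacher_soft_labels):
    """One training step: regress the teacher GNN's per-particle soft
    labels with a binary cross-entropy loss."""
    optimizer.zero_grad()
    pred = student(features)
    loss = nn.functional.binary_cross_entropy(pred, teacher_soft_labels)
    loss.backward()
    optimizer.step()
    return loss.item()

# Toy usage with random stand-ins for particle features and GNN outputs.
student = StudentDNN()
opt = torch.optim.Adam(student.parameters(), lr=1e-3)
features = torch.randn(512, 16)         # per-particle input features
teacher_soft_labels = torch.rand(512)   # teacher GNN outputs in [0, 1]
print(distillation_step(student, opt, features, teacher_soft_labels))
```

The key design choice mirrored here is that the student trains on the teacher's continuous soft labels rather than on hard 0/1 vertex assignments, which is what allows the much smaller per-particle network to retain most of the event-level GNN's discrimination power.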


Publisher's version
DOI: 10.5445/IR/1000171053
Published on 29.05.2024
Associated KIT institution(s) Institut für Experimentelle Teilchenphysik (ETP)
Publication type Journal article
Publication month/year 06.2024
Language English
Identifier ISSN: 2632-2153
KITopen-ID: 1000171053
Published in Machine Learning: Science and Technology
Publisher Institute of Physics Publishing Ltd (IOP Publishing Ltd)
Volume 5
Issue 2
Pages Art. no. 025033
Published online ahead of print on 07.05.2024
Keywords knowledge distillation, model compression, pattern recognition
Indexed in Dimensions, Web of Science, Scopus