
Fit2Ear: Generating Personalized Earplugs from Smartphone Depth Camera Images

Zhao, Haibin 1,2; Röddiger, Tobias 1,2; Yufei, Feng; Beigl, Michael 1,2
1 Institut für Telematik (TM), Karlsruher Institut für Technologie (KIT)
2 Fakultät für Informatik (INFORMATIK), Karlsruher Institut für Technologie (KIT)

Abstract (English):

Earphones, owing to their deep integration into daily life, have been developed for unobtrusive and ubiquitous health monitoring. However, these advanced algorithms rely heavily on high-quality sensing data, and data collected with universal, one-size-fits-all earplugs can contain undesirable noise, such as vibrations or even the device falling off, which limits algorithm performance. To address this, we build a dataset containing RGBD and IMU data captured by a smartphone. To provide a precise and solid ground truth, we employ additional control information from a robotic arm that holds the smartphone and scans ears along a pre-defined trajectory. With this dataset, we propose a tightly coupled information fusion algorithm for ground-truth ear modeling. Finally, we fabricate the earplugs and conduct an end-to-end evaluation of the wearability of the modeled earplugs in a user study.


Publisher's version
DOI: 10.5445/IR/1000172963
Published on 10.10.2024
Affiliated institution(s) at KIT: Institut für Telematik (TM)
Publication type: Conference proceedings paper
Publication date: 09.10.2024
Language: English
Identifier: KITopen-ID: 1000172963
Published in: ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp ’24)
Event: ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp 2024), Melbourne, Australia, 05.10.2024 – 09.10.2024
Publication note: in press
Published online ahead of print on 29.07.2024