  Recognition of Hearing Needs From Body and Eye Movements to Improve Hearing Instruments

Tessendorf, B., Bulling, A., Roggen, D., Stiefmeier, T., Feilner, M., Derleth, P., et al. (2011). Recognition of Hearing Needs From Body and Eye Movements to Improve Hearing Instruments. In K. Lyons, J. Hightower, & E. M. Huang (Eds.), Pervasive Computing (pp. 314-331). Berlin: Springer.

Creators:
Tessendorf, Bernd1, Author
Bulling, Andreas2, Author
Roggen, Daniel1, Author
Stiefmeier, Thomas1, Author
Feilner, Manuela1, Author
Derleth, Peter1, Author
Tröster, Gerhard1, Author
Affiliations:
1External Organizations, ou_persistent22              
2Computer Vision and Multimodal Computing, MPI for Informatics, Max Planck Society, ou_1116547              

Content

Keywords: -
Abstract: Hearing instruments (HIs) have emerged as true pervasive computers as they continuously adapt the hearing program to the user's context. However, current HIs are not able to distinguish different hearing needs in the same acoustic environment. In this work, we explore how information derived from body and eye movements can be used to improve the recognition of such hearing needs. We conduct an experiment to provoke an acoustic environment in which different hearing needs arise: active conversation and working while colleagues are having a conversation in a noisy office environment. We record body movements on nine body locations, eye movements using electrooculography (EOG), and sound using commercial HIs for eleven participants. Using a support vector machine (SVM) classifier and person-independent training we improve the accuracy from 77% based on sound to 92% using body movements. With a view to a future implementation into a HI we then perform a detailed analysis of the sensors attached to the head. We achieve the best accuracy of 86% using eye movements compared to 84% for head movements. Our work demonstrates the potential of additional sensor modalities for future HIs and motivates investigation of the wider applicability of this approach to further hearing situations and needs.
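The "person-independent training" mentioned in the abstract refers to a leave-one-person-out evaluation: the classifier is trained on all participants except one and tested on the held-out person. A minimal sketch of that protocol, using a hypothetical nearest-centroid classifier and made-up two-class data in place of the paper's SVM on body/eye-movement features:

```python
from collections import defaultdict

def nearest_centroid_predict(train, sample):
    # train: list of (features, label) pairs; predict the label whose
    # class centroid is closest (squared Euclidean distance) to sample.
    sums = defaultdict(lambda: [0.0] * len(sample))
    counts = defaultdict(int)
    for feats, label in train:
        counts[label] += 1
        for i, v in enumerate(feats):
            sums[label][i] += v
    best, best_dist = None, float("inf")
    for label, s in sums.items():
        centroid = [v / counts[label] for v in s]
        dist = sum((a - b) ** 2 for a, b in zip(sample, centroid))
        if dist < best_dist:
            best, best_dist = label, dist
    return best

def leave_one_person_out(data):
    # data: {person_id: [(features, label), ...]}.  For each participant,
    # train on everyone else and test on the held-out person; return the
    # overall person-independent accuracy.
    correct = total = 0
    for held_out in data:
        train = [x for p, xs in data.items() if p != held_out for x in xs]
        for feats, label in data[held_out]:
            correct += nearest_centroid_predict(train, feats) == label
            total += 1
    return correct / total

# Illustrative data: three hypothetical participants, two hearing needs.
data = {
    "p1": [([0.0, 0.0], "work"), ([1.0, 1.0], "conversation")],
    "p2": [([0.1, 0.0], "work"), ([0.9, 1.0], "conversation")],
    "p3": [([0.0, 0.1], "work"), ([1.0, 0.9], "conversation")],
}
print(leave_one_person_out(data))  # → 1.0
```

The participant IDs, feature vectors, and class names are illustrative assumptions; the paper itself uses an SVM over features from nine body-worn motion sensors and EOG.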

Details

Language(s): eng - English
Date: 2011-06
Publication status: Published
Pages: -
Place, publisher, edition: -
Table of contents: -
Type of review: -
Identifiers: DOI: 10.1007/978-3-642-21726-5_20
BibTeX citekey: tessendorf11_pervasive
Degree type: -

Event

Title: 9th International Conference on Pervasive Computing
Venue: San Francisco, CA
Start / end date: 2011-06-12 - 2011-06-14


Source 1

Title: Pervasive Computing
Subtitle: 9th International Conference, Pervasive 2011
Short title: Pervasive 2011
Source genre: Proceedings
Creators:
Lyons, Kent1, Editor
Hightower, Jeffrey1, Editor
Huang, Elaine M.1, Editor
Affiliations:
1 External Organizations, ou_persistent22
Place, publisher, edition: Berlin : Springer
Pages: -
Volume / Issue: -
Article number: -
Start / end page: 314 - 331
Identifier: ISBN: 978-3-642-21725-8

Source 2

Title: Lecture Notes in Computer Science
Short title: LNCS
Source genre: Series
Place, publisher, edition: -
Pages: -
Volume / Issue: 6696
Article number: -
Start / end page: -
Identifier: -