  Recognition of Hearing Needs From Body and Eye Movements to Improve Hearing Instruments

Tessendorf, B., Bulling, A., Roggen, D., Stiefmeier, T., Feilner, M., Derleth, P., et al. (2011). Recognition of Hearing Needs From Body and Eye Movements to Improve Hearing Instruments. In K. Lyons, J. Hightower, & E. M. Huang (Eds.), Pervasive Computing (pp. 314-331). Berlin: Springer.


Creators:
Tessendorf, Bernd1, Author
Bulling, Andreas2, Author           
Roggen, Daniel1, Author
Stiefmeier, Thomas1, Author
Feilner, Manuela1, Author
Derleth, Peter1, Author
Tröster, Gerhard1, Author
Affiliations:
1External Organizations, ou_persistent22              
2Computer Vision and Multimodal Computing, MPI for Informatics, Max Planck Society, ou_1116547              

Content

Free keywords: -
Abstract: Hearing instruments (HIs) have emerged as true pervasive computers as they continuously adapt the hearing program to the user's context. However, current HIs are not able to distinguish different hearing needs in the same acoustic environment. In this work, we explore how information derived from body and eye movements can be used to improve the recognition of such hearing needs. We conduct an experiment to provoke an acoustic environment in which different hearing needs arise: active conversation and working while colleagues are having a conversation in a noisy office environment. We record body movements at nine body locations, eye movements using electrooculography (EOG), and sound using commercial HIs for eleven participants. Using a support vector machine (SVM) classifier and person-independent training, we improve the accuracy from 77% based on sound alone to 92% using body movements. With a view to a future implementation in a HI, we then perform a detailed analysis of the sensors attached to the head. We achieve the best accuracy of 86% using eye movements, compared to 84% for head movements. Our work demonstrates the potential of additional sensor modalities for future HIs and motivates investigating the wider applicability of this approach to further hearing situations and needs.
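
As context for the evaluation scheme the abstract describes, the following is a minimal sketch of person-independent (leave-one-participant-out) SVM classification using scikit-learn. The feature, label, and participant arrays here are hypothetical placeholders, not the paper's actual sensor features or data.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import LeaveOneGroupOut

# Hypothetical stand-in data: one feature vector per analysis window.
# X: (n_windows, n_features) features derived from body/eye-movement sensors
# y: (n_windows,) hearing-need label (0 = active conversation, 1 = working)
# groups: (n_windows,) participant id, used to form person-independent folds
rng = np.random.default_rng(0)
X = rng.normal(size=(1100, 24))
y = rng.integers(0, 2, size=1100)
groups = np.repeat(np.arange(11), 100)  # 11 participants, as in the study

# Standardize features, then classify with an SVM.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))

# Person-independent training: each fold tests on one held-out participant
# who contributed no training data.
accuracies = []
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups):
    clf.fit(X[train_idx], y[train_idx])
    accuracies.append(clf.score(X[test_idx], y[test_idx]))

print(f"Mean person-independent accuracy: {np.mean(accuracies):.2%}")
```

Averaging the per-participant fold accuracies mirrors the person-independent evaluation the abstract reports, where the reported figures (77% sound, 92% body movements, 86% eye movements, 84% head movements) come from the authors' actual features and recordings.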

Details

Language(s): eng - English
Dates: 2011-06
Publication Status: Issued
Pages: -
Publishing info: -
Table of Contents: -
Rev. Type: -
Identifiers: DOI: 10.1007/978-3-642-21726-5_20
BibTex Citekey: tessendorf11_pervasive
Degree: -

Event

Title: 9th International Conference on Pervasive Computing
Place of Event: San Francisco, CA
Start-/End Date: 2011-06-12 - 2011-06-14

Source 1

Title: Pervasive Computing
Subtitle: 9th International Conference, Pervasive 2011
Abbreviation: Pervasive 2011
Source Genre: Proceedings
 Creator(s):
Lyons, Kent1, Editor
Hightower, Jeffrey1, Editor
Huang, Elaine M.1, Editor
Affiliations:
1 External Organizations, ou_persistent22            
Publ. Info: Berlin : Springer
Pages: -
Volume / Issue: -
Sequence Number: -
Start / End Page: 314 - 331
Identifier: ISBN: 978-3-642-21725-8

Source 2

Title: Lecture Notes in Computer Science
Abbreviation: LNCS
Source Genre: Series
Publ. Info: -
Pages: -
Volume / Issue: 6696
Sequence Number: -
Start / End Page: -
Identifier: -