  Forecasting User Attention During Everyday Mobile Interactions Using Device-Integrated and Wearable Sensors

Steil, J., Müller, P., Sugano, Y., & Bulling, A. (2018). Forecasting User Attention During Everyday Mobile Interactions Using Device-Integrated and Wearable Sensors. Retrieved from http://arxiv.org/abs/1801.06011.

Files

arXiv:1801.06011.pdf (Preprint), 6MB
Name: arXiv:1801.06011.pdf
Description: File downloaded from arXiv at 2018-04-09 11:39
OA-Status: -
Visibility: Public
MIME-Type / Checksum: application/pdf / [MD5]
Technical Metadata:
  Copyright Date: -
  Copyright Info: -

Creators

Creators:
Steil, Julian¹, Author
Müller, Philipp¹, Author
Sugano, Yusuke², Author
Bulling, Andreas¹, Author
Affiliations:
¹ Computer Vision and Multimodal Computing, MPI for Informatics, Max Planck Society, ou_1116547
² External Organizations, ou_persistent22

Content

Free keywords: Computer Science, Human-Computer Interaction, cs.HC
 Abstract: Users' visual attention is highly fragmented during mobile interactions but the erratic nature of these attention shifts currently limits attentive user interfaces to adapt after the fact, i.e. after shifts have already happened, thereby severely limiting the adaptation capabilities and user experience. To address these limitations, we study attention forecasting -- the challenging task of predicting whether users' overt visual attention (gaze) will shift between a mobile device and environment in the near future or how long users' attention will stay in a given location. To facilitate the development and evaluation of methods for attention forecasting, we present a novel long-term dataset of everyday mobile phone interactions, continuously recorded from 20 participants engaged in common activities on a university campus over 4.5 hours each (more than 90 hours in total). As a first step towards a fully-fledged attention forecasting interface, we further propose a proof-of-concept method that uses device-integrated sensors and body-worn cameras to encode rich information on device usage and users' visual scene. We demonstrate the feasibility of forecasting bidirectional attention shifts between the device and the environment as well as for predicting the first and total attention span on the device and environment using our method. We further study the impact of different sensors and feature sets on performance and discuss the significant potential but also remaining challenges of forecasting user attention during mobile interactions.
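
Note: the record contains no code. Purely as an illustrative sketch, and assuming a generic machine-learning setup rather than the authors' actual pipeline, the forecasting task described in the abstract could be framed as sliding-window binary classification over fused sensor features, where each window is labelled by whether an attention shift occurs within the forecast horizon. The feature layout, data, and classifier below are synthetic placeholders (Python with scikit-learn):

    # Hypothetical sketch only: NOT the method from the paper.
    # Each one-second window of fused sensor features (e.g. IMU statistics,
    # touch/app-usage flags, scene descriptors from a body-worn camera) is
    # labelled 1 if an attention shift occurs within the forecast horizon.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import f1_score

    rng = np.random.default_rng(0)
    n_windows, n_features = 2000, 64          # synthetic stand-in data
    X = rng.normal(size=(n_windows, n_features))
    y = rng.integers(0, 2, size=n_windows)    # 1 = shift within horizon, 0 = no shift

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0, stratify=y)

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_train, y_train)
    print("F1 on held-out windows:", f1_score(y_test, clf.predict(X_test)))

With real recordings, the random arrays would be replaced by per-window features extracted from the device-integrated and body-worn sensors, and evaluation would use participant-wise splits to avoid identity leakage.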

Details

Language(s): eng - English
 Dates: 2018-01-18, 2018
 Publication Status: Published online
 Pages: 24 p.
 Publishing info: -
 Table of Contents: -
 Rev. Type: -
 Identifiers: arXiv: 1801.06011
URI: http://arxiv.org/abs/1801.06011
BibTeX Citekey: steil2018_arxiv2
 Degree: -
