  Referring to Objects with Spoken and Haptic Modalities

Landragin, F., Bellalem, N., & Romary, L. (2002). Referring to Objects with Spoken and Haptic Modalities. In Fourth IEEE International Conference on Multimodal Interfaces (pp. 99-104). Los Alamitos, Calif.: IEEE Computer Society.

Files

Referring to Objects with Spoken and Haptic Modalities.pdf (Any fulltext), 566 KB
Visibility: Public
MIME-Type: application/pdf

Creators

Creators:
Landragin, Frédéric, Author
Bellalem, Nadia, Author
Romary, Laurent (1), Author
Affiliations:
(1) Max Planck Digital Library, Max Planck Society

Content

Free keywords: Cognitive science, Computer science, Multimedia, Human-Computer Interaction
Abstract: The gesture input modality considered in multimodal dialogue systems is mainly reduced to pointing or manipulating actions. With an approach based on the spontaneous character of communication, the treatment of such actions involves many processes. Without any constraints, the user may use gesture in association with speech and may exploit the peculiarities of the visual context, which guide the articulation of gesture trajectories and the choice of words. The semantic interpretation of multimodal utterances thus becomes a complex problem that must take into account the variety of referring expressions, the variety of gestural trajectories, structural parameters from the visual context, and directives from a specific task. Following this spontaneous approach, we propose to give dialogue systems maximal understanding capabilities, so that various interaction modes are taken into account. Considering the development of haptic devices (such as the PHANToM), which increase the capabilities of sensation, particularly tactile and kinesthetic sensation, we propose to explore a new research domain: the integration of haptic gesture into multimodal dialogue systems, in terms of its possible associations with speech for object reference and manipulation. In this paper we focus on the compatibility between haptic gesture and multimodal reference models, and on the consequences of processing this new modality for intelligent system architectures, which has not yet been sufficiently studied from a semantic point of view.

Details

Dates: 2002
Publication Status: Issued
Identifiers: eDoc: 324368
Other: hal.archives-ouvertes.fr:hal-00112772_v1

Event

Title: International Conference on Multimodal Interfaces
Place of Event: Pittsburgh, Pennsylvania
Start-/End Date: 2002-10-14 - 2002-10-16


Source 1

Title: Fourth IEEE International Conference on Multimodal Interfaces
Source Genre: Proceedings
Publ. Info: Los Alamitos, Calif. : IEEE Computer Society
Pages: XIII, 543 pp.
Start / End Page: 99 - 104
Identifier: ISBN: 0-7695-1834-6