Item Details

  Perceptual relevance of kinematic components of facial movements extracted by unsupervised learning

Giese, M., Chiovetto, E., & Curio, C. (2012). Perceptual relevance of kinematic components of facial movements extracted by unsupervised learning. Poster presented at 35th European Conference on Visual Perception, Alghero, Italy.

Basic Information

Resource type: Poster

Files

Files: -

Related URLs

-

Creators

Creators:
Giese, MA, Author
Chiovetto, E, Author
Curio, C 1, 2, 3, Author
Affiliations:
1 Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497794
2 Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497797
3 Project group: Cognitive Engineering, Max Planck Institute for Biological Cybernetics, Max Planck Society, Spemannstrasse 38, 72076 Tübingen, DE, ou_2528702

Content Description

Keywords: -
Abstract: The idea that complex facial or body movements are composed of simpler components (usually referred to as 'movement primitives' or 'action units') is common in motor control (Chiovetto, 2011, Journal of Neurophysiology, 105(4), 1429-1431) as well as in the study of facial expressions (Ekman and Friesen, 1978). However, such components have rarely been extracted from real facial movement data. Methods: Combining a novel algorithm for anechoic demixing derived from Omlor and Giese (2011, Journal of Machine Learning Research, 12, 1111-1148) with a motion retargeting system for 3D facial animation (Curio et al., 2010, MIT Press, 47-65), we estimated spatially and temporally localized components that capture the major part of the variance of dynamic facial expressions. The estimated components were used to generate stimuli for a psychophysical experiment assessing classification rates and emotional expressiveness ratings for stimuli containing combinations of the extracted components. Results: We investigated how the information carried by the different extracted dynamic facial movement components is integrated in facial expression perception. In addition, we tried to apply different cue fusion models to account quantitatively for the obtained experimental results.
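The anechoic demixing referenced in the abstract models each observed movement trajectory as a sum of source primitives that are individually scaled and time-shifted per channel, i.e. x_i(t) = Σ_j a_ij s_j(t − τ_ij). The sketch below illustrates only this forward mixing model on toy data; it is not the authors' implementation, and all names (anechoic_mix, the toy sinusoid primitives, the weight and delay values) are illustrative assumptions.

```python
import numpy as np

def anechoic_mix(sources, weights, delays):
    """Forward anechoic mixing: x_i(t) = sum_j a_ij * s_j(t - tau_ij).

    sources: (n_sources, n_frames) source time courses s_j(t)
    weights: (n_channels, n_sources) mixing weights a_ij
    delays:  (n_channels, n_sources) integer frame delays tau_ij
    returns: (n_channels, n_frames) mixed signals x_i(t)
    """
    n_channels, n_sources = weights.shape
    n_frames = sources.shape[1]
    x = np.zeros((n_channels, n_frames))
    for i in range(n_channels):
        for j in range(n_sources):
            # np.roll applies a circular time shift of tau_ij frames
            x[i] += weights[i, j] * np.roll(sources[j], delays[i, j])
    return x

# Two toy "primitives": a slow and a fast sinusoid (illustrative only)
t = np.linspace(0.0, 1.0, 200, endpoint=False)
sources = np.vstack([np.sin(2 * np.pi * 1 * t),
                     np.sin(2 * np.pi * 3 * t)])
weights = np.array([[1.0, 0.5],
                    [0.3, 1.0],
                    [0.8, 0.2]])
delays = np.array([[0, 10],
                   [5, 0],
                   [20, 15]])

x = anechoic_mix(sources, weights, delays)
print(x.shape)  # → (3, 200)
```

The demixing problem solved in the cited JMLR paper is the inverse: given only x, recover the weights, delays, and source primitives; the per-channel delays τ_ij are what distinguish the anechoic model from ordinary instantaneous blind source separation.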

Details

Language: -
Date: 2012-09
Publication status: Published
Pages: -
Publishing info: -
Table of contents: -
Review method: -
Identifiers (DOI, ISBN, etc.): BibTex citekey: GieseCC2012
DOI: 10.1177/03010066120410S101
Degree: -

Related Event

Event name: 35th European Conference on Visual Perception
Venue: Alghero, Italy
Start/End date: -

Legal Case

-

Project information

-
Source 1

Source title: Perception
Genre: Journal
Authors/Editors: -
Affiliations: -
Publisher, Place: London : Pion Ltd.
Pages: -
Volume / Issue: 41 (ECVP Abstract Supplement)
Sequence number: -
Start / End page: 150
Identifiers (ISBN, ISSN, DOI, etc.): ISSN: 0301-0066
CoNE: https://pure.mpg.de/cone/journals/resource/954925509369