
Released

Poster

It's All Me: Varying Viewpoints and Motor Learning in a Virtual Reality Environment

MPG Authors
/persons/resource/persons84194

Schomaker, J
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

/persons/resource/persons84254

Tesch, J
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

/persons/resource/persons83839

Bülthoff, HH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

/persons/resource/persons83831

Bresciani, J-P
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Full texts (restricted access)
No full texts are currently released for your IP range.
Full texts (freely accessible)
No freely accessible full texts are available in PuRe.
Supplementary material (freely accessible)
No freely accessible supplementary materials are available.
Citation

Schomaker, J., Tesch, J., Bülthoff, H., & Bresciani, J.-P. (2010). It's All Me: Varying Viewpoints and Motor Learning in a Virtual Reality Environment. Poster presented at 11th International Multisensory Research Forum (IMRF 2010), Liverpool, UK.


Citation link: https://hdl.handle.net/11858/00-001M-0000-0013-BFD0-7
Abstract
In the present study, healthy subjects performed a visuo-vestibular motor adaptation task in virtual reality. The task consisted of keeping the extended arm and hand stable in space during a whole-body rotation induced by a robotic wheelchair. Performance was first quantified in a pretest in which no visual feedback was available during the rotation. During the subsequent learning phase, optical flow resulting from body rotation was provided. This visual feedback was manipulated to create the illusion of a smaller rotational movement than actually occurred, thereby altering the visuo-vestibular mapping. The adaptation effects of the learning phase were measured during a posttest identical to the pretest. Three different groups of subjects were exposed to different perspectives on the visual scene, i.e., first-person, top, or mirror view. Interestingly, sensorimotor adaptation occurred for all three viewpoint conditions (p < 0.05). Furthermore, in the mirror-view condition, participants showed significantly less variability in performance. These results suggest that the visually richer mirror view enhanced motor learning relative to the other viewpoints. Therefore, using virtual reality to provide rich multimodal stimulation, including mirror views, could add to traditional neurorehabilitation techniques by facilitating motor learning.