  Fast and Robust Hand Tracking Using Detection-Guided Optimization

Sridhar, S., Mueller, F., Oulasvirta, A., & Theobalt, C. (2016). Fast and Robust Hand Tracking Using Detection-Guided Optimization. Retrieved from http://arxiv.org/abs/1602.04124.


Files

arXiv:1602.04124.pdf (Preprint), 4MB
Description: File downloaded from arXiv at 2016-10-13 10:05; accepted version of the paper published at CVPR 2015
Visibility: Public
MIME-Type / Checksum: application/pdf / [MD5]
Copyright Date: -
Copyright Info: -


Creators

Creators:
Sridhar, Srinath (1), Author
Mueller, Franziska (1), Author
Oulasvirta, Antti (2), Author
Theobalt, Christian (1), Author
Affiliations:
(1) Computer Graphics, MPI for Informatics, Max Planck Society, ou_40047
(2) External Organizations, ou_persistent22

Content

Free keywords: Computer Science, Computer Vision and Pattern Recognition, cs.CV
Abstract: Markerless tracking of hands and fingers is a promising enabler for human-computer interaction. However, adoption has been limited because of tracking inaccuracies, incomplete coverage of motions, low framerates, complex camera setups, and high computational requirements. In this paper, we present a fast method for accurately tracking rapid and complex articulations of the hand using a single depth camera. Our algorithm uses a novel detection-guided optimization strategy that increases the robustness and speed of pose estimation. In the detection step, a randomized decision forest classifies pixels into parts of the hand. In the optimization step, a novel objective function combines the detected part labels with a Gaussian mixture representation of the depth to estimate the pose that best fits the depth data. Our approach requires comparatively few computational resources, which makes it extremely fast (50 fps without GPU support). It also supports varied camera-to-scene arrangements, both static and moving. We demonstrate the benefits of our method by evaluating it on public datasets and comparing against previous work.
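The two-stage idea in the abstract — a per-pixel part detector whose labels are folded into the pose-fitting objective — can be caricatured in a toy sketch. Everything below is a hypothetical simplification for illustration: the one-dimensional scalar "pose", the threshold-based `part_label` stand-in for the randomized decision forest, and the grid-search `fit` are assumptions, not the paper's actual formulation or optimizer.

```python
# Toy sketch of a detection-guided objective E(pose) = E_depth + lambda * E_part.
# All quantities are illustrative; the paper's real objective operates on
# Gaussian mixture representations of depth images.

TRUE_POSE = 2.0
# (pixel coordinate, observed depth) pairs generated from the true pose.
observations = [(x, TRUE_POSE + 0.5 * x) for x in range(5)]

def part_label(depth):
    # Stand-in detector: label a pixel by its depth range
    # (the paper uses a randomized decision forest here).
    return 0 if depth < 3.0 else 1

def energy(pose, lam=0.5):
    # Depth term: squared error between observed and predicted depth.
    e_depth = sum((d - (pose + 0.5 * x)) ** 2 for x, d in observations)
    # Part term: count pixels whose predicted label disagrees with the detection.
    e_part = sum(part_label(d) != part_label(pose + 0.5 * x)
                 for x, d in observations)
    return e_depth + lam * e_part

def fit(lo=-5.0, hi=5.0, steps=1000):
    # Exhaustive search over candidate poses; the detected labels steer the
    # combined objective toward detection-consistent minima.
    candidates = [lo + i * (hi - lo) / steps for i in range(steps + 1)]
    return min(candidates, key=energy)

best = fit()  # recovers a pose close to TRUE_POSE
```

The point of the sketch is only the structure of the objective: a data term that fits the depth plus a detection term that penalizes part-label disagreement, with the two balanced by a weight.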

Details

Language(s): eng - English
Dates: 2016-02-12, 2016
 Publication Status: Published online
 Pages: 9 p.
 Publishing info: -
 Table of Contents: -
 Rev. Type: -
 Identifiers: arXiv: 1602.04124
URI: http://arxiv.org/abs/1602.04124
BibTex Citekey: SridhararXiv1602.04124
 Degree: -
