Authors: Gabyong Park, Taejin Ha, Woontack Woo
DOI: 10.1007/978-3-319-07458-0_37
Keywords: Augmented reality, Artificial intelligence, Tracking, Gesture, Focus (computing), Computer graphics (images), Wearable computer, Computer science, Computer vision, Rotation (mathematics), Virtual image, Object (computer science)
Abstract: This paper proposes methods for tracking a bare hand with a near-range depth camera attached to a video see-through head-mounted display (HMD), for virtual object manipulation in an Augmented Reality (AR) environment. The particular focus herein is on gestures that are frequently used in daily life. First, we segment the hand easily by considering both skin color and depth information within arm's reach. Then, fingertip and finger-base positions are extracted through primitive models of the finger and palm. From these positions, the rotation parameters of the finger joints are estimated with an inverse-kinematics algorithm. Finally, the user's hands are localized in physical space by camera tracking, enabling 3D manipulation. Our method is applicable to various interaction scenarios such as digital access/control, creative CG modeling, virtual-hand-guiding, or game UIs.
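The first step of the pipeline, segmenting the hand by combining a skin-color test with a depth threshold limited to arm's reach, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the RGB skin heuristic, the `max_reach_m` threshold, and the function name `segment_hand` are all assumptions for illustration.

```python
import numpy as np

def segment_hand(rgb, depth, max_reach_m=0.8):
    """Hypothetical sketch: combine a skin-color mask with a depth mask.

    rgb   -- HxWx3 uint8 color image from the HMD camera
    depth -- HxW float depth map in meters (0 = invalid pixel)
    Returns an HxW boolean mask of candidate hand pixels.
    """
    r = rgb[..., 0].astype(np.float32)
    g = rgb[..., 1].astype(np.float32)
    b = rgb[..., 2].astype(np.float32)
    # Simple RGB skin-color heuristic (illustrative only; the paper's
    # actual skin model is not specified in the abstract).
    skin = (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b) & (np.abs(r - g) > 15)
    # Keep only valid pixels closer than arm's reach from the depth camera.
    near = (depth > 0) & (depth < max_reach_m)
    return skin & near
```

In a full pipeline, the resulting mask would feed the next stage: fitting primitive finger/palm models to extract fingertip and finger-base positions before inverse-kinematics estimation of the joint rotations.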