Integrating Gesture Recognition and Direct Manipulation.

Author: Dean Rubine

Abstract: A gesture-based interface is one in which the user specifies commands by simple drawings, typically made with a mouse or stylus. A single intuitive gesture can simultaneously specify objects, an operation, and additional parameters, making gestures more powerful than the "clicks" and "drags" of traditional direct-manipulation interfaces. However, a problem with most gesture-based systems is that the entire gesture must be entered and the interaction completed before the system responds. This makes gestures awkward to use for operations that require continuous feedback. GRANDMA, a tool for building gesture-based applications, overcomes this shortcoming through two methods for integrating gesturing and direct manipulation. First, GRANDMA allows views that respond to gestures and views that respond to clicks and drags (e.g. widgets) to coexist in the same interface. More interestingly, GRANDMA supports a new two-phase interaction technique, in which a gesture collection phase is immediately followed by a manipulation phase. In its simplest form, the transition between phases is indicated by keeping the mouse still while continuing to hold the button down. Alternatively, the transition can occur once enough of the gesture has been seen to recognize it unambiguously. The latter method, called eager recognition, results in smooth and natural interaction. In addition to describing how GRANDMA integrates gestures and direct manipulation, this paper presents the algorithm used to create gesture recognizers from example gestures.
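
To make the two-phase technique concrete, the sketch below (in Python, not the paper's implementation language) outlines one plausible structure for the interaction: points are collected while the button is held down, and the interaction hands off to direct manipulation either when the mouse dwells past a timeout or as soon as the recognizer reports an unambiguous classification (eager recognition). The Recognizer interface, the dwell threshold, and the method names are illustrative assumptions, not GRANDMA's actual API.

```python
import time

DWELL_SECONDS = 0.25  # assumed threshold for the "mouse kept still" transition


class TwoPhaseInteraction:
    """One button press: a gesture-collection phase, then a manipulation phase."""

    def __init__(self, recognizer):
        # recognizer.classify(points) is assumed to return
        # (gesture_class, unambiguous) for the partial gesture seen so far.
        self.recognizer = recognizer
        self.points = []
        self.phase = None  # "collect" or "manipulate"
        self.last_motion = 0.0

    def mouse_down(self, x, y):
        self.points = [(x, y)]
        self.phase = "collect"
        self.last_motion = time.monotonic()

    def mouse_move(self, x, y):
        if self.phase == "collect":
            self.points.append((x, y))
            self.last_motion = time.monotonic()
            gesture, unambiguous = self.recognizer.classify(self.points)
            if unambiguous:  # eager recognition: transition as soon as possible
                self._enter_manipulation(gesture)
        elif self.phase == "manipulate":
            self.manipulate(x, y)  # continuous feedback during manipulation

    def timer_tick(self):
        # Called periodically by the toolkit: a still mouse with the button
        # held down forces the phase transition even without eager recognition.
        if self.phase == "collect" and time.monotonic() - self.last_motion > DWELL_SECONDS:
            gesture, _ = self.recognizer.classify(self.points)
            self._enter_manipulation(gesture)

    def mouse_up(self, x, y):
        if self.phase == "collect":  # gesture ended without a manipulation phase
            gesture, _ = self.recognizer.classify(self.points)
            self.execute(gesture)
        self.phase = None

    def _enter_manipulation(self, gesture):
        self.phase = "manipulate"
        self.execute(gesture)  # apply the recognized command, then track the mouse

    # The application supplies these; stubs keep the sketch self-contained.
    def execute(self, gesture): ...
    def manipulate(self, x, y): ...
```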
