Wednesday, May 5, 2021

$1 Unistroke Recognizer

Download

$1 source code: JavaScript, C#
Dynamic time warping: C#
Rubine classifier: C#
Pseudocode: $1, Protractor
Unistroke gesture logs: XML
Paper: PDF

This software is distributed under the New BSD License agreement.

About

The $1 Unistroke Recognizer is a 2-D single-stroke recognizer designed for rapid prototyping of gesture-based user interfaces. In machine learning terms, $1 is an instance-based nearest-neighbor classifier with a 2-D Euclidean distance function, i.e., a geometric template matcher. $1 is a significant extension of the proportional shape matching approach used in SHARK2, which itself is an adaptation of Tappert's elastic matching approach with zero look-ahead. Despite its simplicity, $1 requires very few templates to perform well and is only about 100 lines of code, making it easy to deploy. An optional enhancement called Protractor improves $1's speed.
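
To make the pipeline concrete, the sketch below is a minimal, self-contained JavaScript illustration of the four preprocessing steps (resample, rotate by the indicative angle, scale, translate) followed by the nearest-neighbor comparison. It is a sketch only, not the downloadable dollar.js: the published code additionally refines rotation with a golden-section search, which the Protractor variant replaces with a closed-form angular alignment, and the function names here are illustrative rather than the official API.

    // Minimal sketch of the $1 pipeline, for illustration only. The
    // downloadable dollar.js also refines rotation with a golden-section
    // search (Protractor replaces that search with a closed-form angular
    // alignment); both are omitted here. Names are illustrative.

    var N = 64, SIZE = 250;                      // resampling count, reference square

    function dist(a, b) { return Math.hypot(b.x - a.x, b.y - a.y); }

    function centroid(pts) {
      var x = 0, y = 0;
      pts.forEach(function (p) { x += p.x; y += p.y; });
      return { x: x / pts.length, y: y / pts.length };
    }

    // Step 1: resample the stroke into N equidistantly spaced points.
    function resample(points, n) {
      var pts = points.map(function (p) { return { x: p.x, y: p.y }; });
      var total = 0;
      for (var i = 1; i < pts.length; i++) total += dist(pts[i - 1], pts[i]);
      var interval = total / (n - 1), D = 0, out = [pts[0]];
      for (var i = 1; i < pts.length; i++) {
        var d = dist(pts[i - 1], pts[i]);
        if (D + d >= interval) {
          var t = (interval - D) / d;
          var q = { x: pts[i - 1].x + t * (pts[i].x - pts[i - 1].x),
                    y: pts[i - 1].y + t * (pts[i].y - pts[i - 1].y) };
          out.push(q);
          pts.splice(i, 0, q);                   // q becomes the next source point
          D = 0;
        } else {
          D += d;
        }
      }
      if (out.length === n - 1) out.push(pts[pts.length - 1]);  // rounding safeguard
      return out;
    }

    function rotateBy(pts, angle) {
      var c = centroid(pts), cos = Math.cos(angle), sin = Math.sin(angle);
      return pts.map(function (p) {
        return { x: (p.x - c.x) * cos - (p.y - c.y) * sin + c.x,
                 y: (p.x - c.x) * sin + (p.y - c.y) * cos + c.y };
      });
    }

    // Steps 2-4: rotate so the indicative angle (centroid to first point) is
    // zero, scale non-uniformly to a reference square, translate the centroid
    // to the origin.
    function normalize(points) {
      var pts = resample(points, N);
      var c = centroid(pts);
      pts = rotateBy(pts, -Math.atan2(c.y - pts[0].y, c.x - pts[0].x));
      var xs = pts.map(function (p) { return p.x; });
      var ys = pts.map(function (p) { return p.y; });
      var w = Math.max.apply(null, xs) - Math.min.apply(null, xs);
      var h = Math.max.apply(null, ys) - Math.min.apply(null, ys);
      pts = pts.map(function (p) { return { x: p.x * SIZE / w, y: p.y * SIZE / h }; });
      c = centroid(pts);
      return pts.map(function (p) { return { x: p.x - c.x, y: p.y - c.y }; });
    }

    // Average point-to-point Euclidean distance between two normalized strokes.
    function pathDistance(a, b) {
      var d = 0;
      for (var i = 0; i < a.length; i++) d += dist(a[i], b[i]);
      return d / a.length;
    }

    // Nearest-neighbor classification: the template with the smallest path
    // distance wins. Templates are stored pre-normalized: { name, points }.
    function recognize(candidate, templates) {
      var pts = normalize(candidate), best = { name: null, distance: Infinity };
      templates.forEach(function (t) {
        var d = pathDistance(pts, t.points);
        if (d < best.distance) best = { name: t.name, distance: d };
      });
      best.score = 1 - best.distance / (0.5 * Math.sqrt(2 * SIZE * SIZE));  // 1 = perfect match
      return best;
    }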

The $N Multistroke Recognizer extends $1 to gestures with multiple strokes. The $P Point-Cloud Recognizer performs unistroke and multistroke recognition without the combinatoric overhead of $N, as it ignores stroke number, order, and direction. The $Q Super-Quick Recognizer extends $P for use on low-powered mobiles and wearables, as it is a whopping 142× faster and slightly more accurate.
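
To make the point-cloud idea concrete, the sketch below is a simplified JavaScript illustration, not the published $P or $Q code: all strokes are flattened into a single cloud, which is why stroke number, order, and direction no longer matter, and two normalized clouds are compared with a greedy closest-point matching. The real $P weights each match by how early it is made and tries several starting points; $Q further accelerates the matching by rejecting poor candidates early.

    // Simplified illustration of the point-cloud idea behind $P (not the
    // published $P or $Q code).

    function asCloud(strokes) {                  // strokes: array of arrays of {x, y}
      return [].concat.apply([], strokes);       // one flat point cloud
    }

    // a and b are equally sized clouds, already resampled, scaled, and
    // translated to a common origin (as in the $1 sketch above, minus rotation).
    function cloudDistance(a, b) {
      var matched = new Array(b.length).fill(false), sum = 0;
      for (var i = 0; i < a.length; i++) {
        var best = -1, bestD = Infinity;
        for (var j = 0; j < b.length; j++) {
          if (matched[j]) continue;
          var d = Math.hypot(a[i].x - b[j].x, a[i].y - b[j].y);
          if (d < bestD) { bestD = d; best = j; }
        }
        matched[best] = true;                    // greedily pair with the closest unmatched point
        sum += bestD;
      }
      return sum;
    }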

The $-family recognizers have been built into numerous projects and even industry prototypes, and have had many follow-ons by others. Read about the $-family's impact.

Demo

In the demo below, only one unistroke template is loaded for each of the 16 gesture types. You can add additional unistrokes as you wish, and even define your own custom unistrokes.


Make strokes on this canvas. If a misrecognition occurs, add the misrecognized unistroke as an example of the intended gesture.
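
This add-a-template workflow maps directly onto the downloadable JavaScript source. The sketch below assumes dollar.js is loaded and uses its DollarRecognizer, Point, Recognize, and AddGesture names; exact signatures may differ slightly between releases, so treat it as a hedged usage sketch rather than a specification.

    // Assumes dollar.js from the download above is loaded.
    var recognizer = new DollarRecognizer();     // preloaded with the 16 unistroke types

    // Points captured from mouse or touch events on the canvas.
    var stroke = [ new Point(120, 210), new Point(123, 205), /* ... */ new Point(310, 198) ];

    var result = recognizer.Recognize(stroke);   // carries the matched name and a [0..1] score
    console.log(result.Name, result.Score);

    // If the stroke was misrecognized, store it as a new example of the
    // intended gesture so that similar strokes match correctly next time.
    if (result.Name !== "triangle") {
      recognizer.AddGesture("triangle", stroke);
    }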


Our Gesture Software Projects

  • $Q: Super-quick multistroke recognizer - optimized for low-power mobiles and wearables
  • $P+: Point-cloud multistroke recognizer - optimized for people with low vision
  • $P: Point-cloud multistroke recognizer - for recognizing multistroke gestures as point-clouds
  • $N: Multistroke recognizer - for recognizing simple multistroke gestures
  • $1: Unistroke recognizer - for recognizing unistroke gestures
  • AGATe: AGreement Analysis Toolkit - for calculating agreement in gesture-elicitation studies
  • GHoST: Gesture HeatmapS Toolkit - for visualizing variation in gesture articulation
  • GREAT: Gesture RElative Accuracy Toolkit - for measuring variation in gesture articulation
  • GECKo: GEsture Clustering toolKit - for clustering gestures and calculating agreement

Our Gesture Publications

  1. Vatavu, R.-D., Anthony, L. and Wobbrock, J.O. (2018). $Q: A super-quick, articulation-invariant stroke-gesture recognizer for low-resource devices. Proceedings of the ACM Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI '18). Barcelona, Spain (September 3-6, 2018). New York: ACM Press. Article No. 23.
  2. Vatavu, R.-D. (2017). Improving gesture recognition accuracy on touch screens for users with low vision. Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI '17). Denver, Colorado (May 6-11, 2017). New York: ACM Press, pp. 4667-4679.
  3. Vatavu, R.-D. and Wobbrock, J.O. (2016). Between-subjects elicitation studies: Formalization and tool support. Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI '16). San Jose, California (May 7-12, 2016). New York: ACM Press, pp. 3390-3402.
  4. Vatavu, R.-D. and Wobbrock, J.O. (2015). Formalizing agreement analysis for elicitation studies: New measures, significance test, and toolkit. Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI '15). Seoul, Korea (April 18-23, 2015). New York: ACM Press, pp. 1325-1334.
  5. Vatavu, R.-D., Anthony, L. and Wobbrock, J.O. (2014). Gesture heatmaps: Understanding gesture performance with colorful visualizations. Proceedings of the ACM International Conference on Multimodal Interfaces (ICMI '14). Istanbul, Turkey (November 12-16, 2014). New York: ACM Press, pp. 172-179.
  6. Vatavu, R.-D., Anthony, L. and Wobbrock, J.O. (2013). Relative accuracy measures for stroke gestures. Proceedings of the ACM International Conference on Multimodal Interfaces (ICMI '13). Sydney, Australia (December 9-13, 2013). New York: ACM Press, pp. 279-286.
  7. Anthony, L., Vatavu, R.-D. and Wobbrock, J.O. (2013). Understanding the consistency of users' pen and finger stroke gesture articulation. Proceedings of Graphics Interface (GI '13). Regina, Saskatchewan (May 29-31, 2013). Toronto, Ontario: Canadian Information Processing Society, pp. 87-94.
  8. Vatavu, R.-D., Anthony, L. and Wobbrock, J.O. (2012). Gestures as point clouds: A $P recognizer for user interface prototypes. Proceedings of the ACM International Conference on Multimodal Interfaces (ICMI '12). Santa Monica, California (October 22-26, 2012). New York: ACM Press, pp. 273-280.
  9. Anthony, L. and Wobbrock, J.O. (2012). $N-Protractor: A fast and accurate multistroke recognizer. Proceedings of Graphics Interface (GI '12). Toronto, Ontario (May 28-30, 2012). Toronto, Ontario: Canadian Information Processing Society, pp. 117-120.
  10. Anthony, L. and Wobbrock, J.O. (2010). A lightweight multistroke recognizer for user interface prototypes. Proceedings of Graphics Interface (GI '10). Ottawa, Ontario (May 31-June 2, 2010). Toronto, Ontario: Canadian Information Processing Society, pp. 245-252.
  11. Wobbrock, J.O., Wilson, A.D. and Li, Y. (2007). Gestures without libraries, toolkits or training: A $1 recognizer for user interface prototypes. Proceedings of the ACM Symposium on User Interface Software and Technology (UIST '07). Newport, Rhode Island (October 7-10, 2007). New York: ACM Press, pp. 159-168.

Background Publications by Others

  1. Li, Y. (2010). Protractor: A fast and accurate gesture recognizer. Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI '10). Atlanta, Georgia (April 10-15, 2010). New York: ACM Press, pp. 2169-2172.
  2. Kristensson, P. and Zhai, S. (2004). SHARK2: A large vocabulary shorthand writing system for pen-based computers. Proceedings of the ACM Symposium on User Interface Software and Technology (UIST '04). Santa Fe, New Mexico (October 24-27, 2004). New York: ACM Press, pp. 43-52.
  3. Rubine, D. (1991). Specifying gestures by example. Proceedings of the ACM Conference on Computer Graphics and Interactive Techniques (SIGGRAPH '91). Las Vegas, Nevada (July 28 - August 2, 1991). New York: ACM Press, pp. 329-337.
  4. Tappert, C.C. (1982). Cursive script recognition by elastic matching. IBM Journal of Research and Development 26 (6), pp. 765-771.

    Copyright © 2007-2019 Jacob O. Wobbrock. All rights reserved.
    Last updated June 30, 2019.




from Hacker News https://ift.tt/3h1XsjX
