ShoeSense: A New Perspective on Gestural Interaction and Wearable Applications. Bailly, G., Müller, J., Rohs, M., Wigdor, D., & Kratz, S. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '12, pages 1239–1248, 2012. ACM.
When the user is engaged with a real-world task it can be inappropriate or difficult to use a smartphone. To address this concern, we developed ShoeSense, a wearable system consisting in part of a shoe-mounted depth sensor pointing upward at the wearer. ShoeSense recognizes relaxed and discreet as well as large and demonstrative hand gestures. In particular, we designed three gesture sets (Triangle, Radial, and Finger-Count) for this setup, which can be performed without visual attention. The advantages of ShoeSense are illustrated in five scenarios: (1) quickly performing frequent operations without reaching for the phone, (2) discreetly performing operations without disturbing others, (3) enhancing operations on mobile devices, (4) supporting accessibility, and (5) artistic performances. We present a proof-of-concept, wearable implementation based on a depth camera and report on a lab study comparing social acceptability, physical and mental demand, and user preference. A second study demonstrates a 94-99% recognition rate of our recognizers.
@inproceedings{Bailly2012ShoeSense,
 title = {ShoeSense: A New Perspective on Gestural Interaction and Wearable Applications},
 year = {2012},
 doi = {10.1145/2207676.2208576},
 pages = {1239--1248},
 url = {http://dx.doi.org/10.1145/2207676.2208576},
 publisher = {ACM},
 series = {CHI '12},
 abstract = {When the user is engaged with a real-world task it can be inappropriate or difficult to use a smartphone. To address this concern, we developed ShoeSense, a wearable system consisting in part of a shoe-mounted depth sensor pointing upward at the wearer. ShoeSense recognizes relaxed and discreet as well as large and demonstrative hand gestures. In particular, we designed three gesture sets (Triangle, Radial, and Finger-Count) for this setup, which can be performed without visual attention. The advantages of ShoeSense are illustrated in five scenarios: (1) quickly performing frequent operations without reaching for the phone, (2) discreetly performing operations without disturbing others, (3) enhancing operations on mobile devices, (4) supporting accessibility, and (5) artistic performances. We present a proof-of-concept, wearable implementation based on a depth camera and report on a lab study comparing social acceptability, physical and mental demand, and user preference. A second study demonstrates a 94-99% recognition rate of our recognizers.},
 author = {Bailly, Gilles and Müller, Jörg and Rohs, Michael and Wigdor, Daniel and Kratz, Sven},
 booktitle = {Proceedings of the SIGCHI Conference on Human Factors in Computing Systems}
}
