Closed-Eye Gaze Gestures: Detection and Recognition of Closed-Eye Movements with Cameras in Smart Glasses. Findling, R. D., Nguyen, L. N., & Sigg, S. In International Work-Conference on Artificial Neural Networks, 2019. doi: 10.1007/978-3-030-20521-8_27. Paper: http://ambientintelligence.aalto.fi/findling/pdfs/publications/Findling_19_ClosedEyeGaze.pdf

Abstract: Gaze gestures bear potential for user input with mobile devices, especially smart glasses, due to being always available and hands-free. So far, gaze gesture recognition approaches have utilized open-eye movements only and disregarded closed-eye movements. This paper is a first investigation of the feasibility of detecting and recognizing closed-eye gaze gestures from close-up optical sources, e.g. eye-facing cameras embedded in smart glasses. We propose four different closed-eye gaze gesture protocols, which extend the alphabet of existing open-eye gaze gesture approaches. We further propose a methodology for detecting and extracting the corresponding closed-eye movements with full optical flow, time series processing, and machine learning. In the evaluation of the four protocols we find closed-eye gaze gestures to be detected 82.8%-91.6% of the time, and extracted gestures to be recognized correctly with an accuracy of 92.9%-99.2%.
@InProceedings{Rainhard_2019_iwann,
  author    = {Rainhard Dieter Findling and Le Ngu Nguyen and Stephan Sigg},
  title     = {Closed-Eye Gaze Gestures: Detection and Recognition of Closed-Eye Movements with Cameras in Smart Glasses},
  booktitle = {International Work-Conference on Artificial Neural Networks},
  year      = {2019},
  doi       = {10.1007/978-3-030-20521-8_27},
  abstract  = {Gaze gestures bear potential for user input with mobile devices, especially smart glasses, due to being always available and hands-free. So far, gaze gesture recognition approaches have utilized open-eye movements only and disregarded closed-eye movements. This paper is a first investigation of the feasibility of detecting and recognizing closed-eye gaze gestures from close-up optical sources, e.g. eye-facing cameras embedded in smart glasses. We propose four different closed-eye gaze gesture protocols, which extend the alphabet of existing open-eye gaze gesture approaches. We further propose a methodology for detecting and extracting the corresponding closed-eye movements with full optical flow, time series processing, and machine learning. In the evaluation of the four protocols we find closed-eye gaze gestures to be detected 82.8%-91.6% of the time, and extracted gestures to be recognized correctly with an accuracy of 92.9%-99.2%.},
  url_Paper = {http://ambientintelligence.aalto.fi/findling/pdfs/publications/Findling_19_ClosedEyeGaze.pdf},
  project   = {hidemygaze},
  group     = {ambience}
}
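
The abstract describes a pipeline of full (dense) optical flow over eye-facing camera frames, time series processing, and machine learning. The Python sketch below illustrates only the first two of those stages under stated assumptions: it uses OpenCV's Farneback dense optical flow and a simple magnitude threshold for movement segmentation. It is not the authors' implementation; the function names, thresholds, and video-file interface introduced here are illustrative assumptions.

# Minimal sketch of a dense-optical-flow -> motion-time-series -> segmentation
# pipeline, assuming eye-facing camera footage is available as a video file.
# NOT the method from the paper; thresholds and parameters are hypothetical.
import cv2
import numpy as np

def flow_time_series(video_path):
    """Reduce each consecutive frame pair of an eye-facing video to one
    mean optical-flow vector, yielding a (T, 2) motion time series."""
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    if not ok:
        raise IOError("cannot read " + video_path)
    prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    series = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Dense (Farneback) optical flow over the whole eye region
        flow = cv2.calcOpticalFlowFarneback(prev, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        series.append(flow.reshape(-1, 2).mean(axis=0))
        prev = gray
    cap.release()
    return np.array(series)

def detect_movement_segments(series, mag_thresh=0.5, min_len=3):
    """Threshold per-frame flow magnitude to find spans where the closed
    eye is moving; returns a list of (start, end) frame indices."""
    mag = np.linalg.norm(series, axis=1)
    active = mag > mag_thresh
    segments, start = [], None
    for i, a in enumerate(active):
        if a and start is None:
            start = i
        elif not a and start is not None:
            if i - start >= min_len:
                segments.append((start, i))
            start = None
    if start is not None and len(active) - start >= min_len:
        segments.append((start, len(active)))
    return segments

In the paper the extracted movements are then recognized with machine learning; any particular classifier choice here would be a further assumption, so the sketch stops at segmentation.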