Inferring Visualization Task Properties, User Performance, and User Cognitive Abilities from Eye Gaze Data. Steichen, B., Conati, C., & Carenini, G. ACM Trans. Interact. Intell. Syst., 4(2):11:1–11:29, July 2014.
Information visualization systems have traditionally followed a one-size-fits-all model, typically ignoring an individual user's needs, abilities, and preferences. However, recent research has indicated that visualization performance could be improved by adapting aspects of the visualization to the individual user. To this end, this article presents research aimed at supporting the design of novel user-adaptive visualization systems. In particular, we discuss results on using information on user eye gaze patterns while interacting with a given visualization to predict properties of the user's visualization task; the user's performance (in terms of predicted task completion time); and the user's individual cognitive abilities, such as perceptual speed, visual working memory, and verbal working memory. We provide a detailed analysis of different eye gaze feature sets, as well as over-time accuracies. We show that these predictions are significantly better than a baseline classifier even during the early stages of visualization usage. These findings are then discussed with a view to designing visualization systems that can adapt to the individual user in real time.
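The abstract describes predicting task and user properties from eye-gaze features and comparing the learned classifiers against a baseline. Below is a minimal, illustrative sketch of that kind of pipeline, not the authors' implementation: the feature names (mean fixation duration, fixation rate, saccade length, saccade rate), the synthetic data, and the choice of a logistic-regression classifier with a majority-class baseline are all assumptions made for the example.

```python
# Illustrative sketch: classify a user/task property from summary gaze
# features and compare against a majority-class baseline.
# Feature names, data, and model choice are assumptions, not the paper's setup.
import numpy as np
from sklearn.dummy import DummyClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical per-trial gaze summary features:
# [mean fixation duration, fixation rate, mean saccade length, saccade rate]
n_trials = 200
X = rng.normal(size=(n_trials, 4))
# Hypothetical binary label, e.g. low vs. high perceptual speed.
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=n_trials) > 0).astype(int)

baseline = DummyClassifier(strategy="most_frequent")
model = make_pipeline(StandardScaler(), LogisticRegression())

print("baseline accuracy:", cross_val_score(baseline, X, y, cv=5).mean())
print("gaze-feature model accuracy:", cross_val_score(model, X, y, cv=5).mean())
```

The over-time accuracies analyzed in the paper would correspond to recomputing such features from only the gaze data observed up to a given point in the interaction and re-evaluating the classifier at each cutoff.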
@article{steichen_inferring_2014,
	title = {Inferring {Visualization} {Task} {Properties}, {User} {Performance}, and {User} {Cognitive} {Abilities} from {Eye} {Gaze} {Data}},
	volume = {4},
	issn = {2160-6455},
	url = {http://doi.acm.org/10.1145/2633043},
	doi = {10.1145/2633043},
	abstract = {Information visualization systems have traditionally followed a one-size-fits-all model, typically ignoring an individual user's needs, abilities, and preferences. However, recent research has indicated that visualization performance could be improved by adapting aspects of the visualization to the individual user. To this end, this article presents research aimed at supporting the design of novel user-adaptive visualization systems. In particular, we discuss results on using information on user eye gaze patterns while interacting with a given visualization to predict properties of the user's visualization task; the user's performance (in terms of predicted task completion time); and the user's individual cognitive abilities, such as perceptual speed, visual working memory, and verbal working memory. We provide a detailed analysis of different eye gaze feature sets, as well as over-time accuracies. We show that these predictions are significantly better than a baseline classifier even during the early stages of visualization usage. These findings are then discussed with a view to designing visualization systems that can adapt to the individual user in real time.},
	number = {2},
	urldate = {2019-12-13},
	journal = {ACM Trans. Interact. Intell. Syst.},
	author = {Steichen, Ben and Conati, Cristina and Carenini, Giuseppe},
	month = jul,
	year = {2014},
	keywords = {HOW - Pattern Analysis, WHY - User Behaviour, User Characteristics, User Modelling, WHEN - Hybrid Approaches, WHY - Adaptive Systems / Guidance, HOW - Classification Models, Type of Work: Application \& Design Study},
	pages = {11:1--11:29},
}
