Designing a human-centered, multimodal GIS interface to support emergency management. Rauschert, I., Agrawal, P., Fuhrmann, S., Brewer, I., Wang, H., Sharma, R., Cai, G., & MacEachren, A. M. 2002.
Geospatial information is critical to effective, collaborative decision-making during emergency management situations; however, conventional GIS are not suited for multi-user access and high-level abstract queries. Currently, decision makers do not always have the real-time information they need: GIS analysts produce maps at the request of individual decision makers, often leading to overlapping requests with slow delivery times. To overcome these limitations, a paradigm shift in interface design for GIS is needed. The research reported here attempts to move beyond analyst-driven, menu-controlled, keyboard-and-mouse-operated GIS by designing a multimodal, multi-user GIS interface that puts geospatial data directly in the hands of decision makers. A large-screen display is used for data visualization, and collaborative, multi-user interactions in emergency management are supported through voice and gesture recognition. Speech and gesture recognition is coupled with a knowledge-based dialogue management system for storing and retrieving geospatial data. This paper describes the first prototype and the insights gained for human-centered multimodal GIS interface design.
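The core idea of coupling speech and gesture recognition can be illustrated with a minimal fusion sketch. This is not the paper's implementation; the event types, field names, and time-window heuristic below are assumptions chosen for illustration. It shows one common approach: resolving a deictic reference ("here") in a spoken command by pairing it with a pointing gesture that occurred close in time.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical event types standing in for the outputs of the speech
# and gesture recognizers; the paper's prototype couples recognizers
# with a knowledge-based dialogue manager rather than this simple rule.
@dataclass
class SpeechEvent:
    text: str          # recognized utterance, e.g. "show hospitals here"
    timestamp: float   # seconds since session start

@dataclass
class GestureEvent:
    lat: float         # map coordinate the user pointed at
    lon: float
    timestamp: float

def fuse(speech: SpeechEvent, gesture: GestureEvent,
         window: float = 1.5) -> Optional[dict]:
    """Pair a deictic utterance with a pointing gesture if the two
    events fall within `window` seconds of each other."""
    if "here" in speech.text and abs(speech.timestamp - gesture.timestamp) <= window:
        return {
            "action": speech.text.replace("here", "").strip(),
            "location": (gesture.lat, gesture.lon),
        }
    return None  # no temporally aligned gesture: dialogue manager would ask back

# Example: user says "show hospitals here" while pointing at the map.
query = fuse(SpeechEvent("show hospitals here", 10.0),
             GestureEvent(40.79, -77.86, 10.4))
```

A real system would replace the time-window rule with the dialogue manager described in the abstract, which can also carry context across turns (e.g. a follow-up "and shelters too" reusing the last gestured location).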
