Dragos Datcu, & Leon J.M. Rothkrantz. (2007). The use of active appearance model for facial expression recognition in crisis environments. In B. Van de Walle, P. Burghardt, & K. Nieuwenhuis (Eds.), Intelligent Human Computer Systems for Crisis Response and Management, ISCRAM 2007 Academic Proceedings Papers (pp. 515–524). Delft: Information Systems for Crisis Response and Management, ISCRAM.
Abstract: In the past, a crisis event was reported by local witnesses who made phone calls to the emergency services, describing by speech what they had observed at the crisis site. Recent improvements in the area of human computer interfaces make possible the development of context-aware systems for crisis management that support people in escaping a crisis even before external help is available on site. Apart from collecting people's reports on the crisis, these systems are expected to automatically extract useful clues during typical human computer interaction sessions. The novelty of the current research resides in the attempt to involve computer vision techniques for performing an automatic evaluation of facial expressions during human-computer interaction sessions with a crisis management system. The current paper details an approach for an automatic facial expression recognition module that may be included in crisis-oriented applications. The algorithm uses an Active Appearance Model for facial shape extraction and an SVM classifier for Action Unit detection and facial expression recognition.
Keywords: Face recognition; Gesture recognition; Active appearance models; Automatic evaluation; Automatic facial expression recognition; Computer vision techniques; Context-aware systems; Crisis management systems; Facial expression recognition; Human computer interfaces; Human computer interaction
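The abstract's pipeline (shape features extracted by an Active Appearance Model, fed to an SVM classifier) can be approximated in a minimal sketch. The AAM fitting itself is not shown; synthetic landmark vectors stand in for its output, and the linear SVM trained by hinge-loss subgradient descent is an illustrative substitute for whatever SVM implementation the paper used.

```python
import random

random.seed(7)
N_POINTS = 4          # landmark points per toy face (real AAMs use dozens)
DIM = 2 * N_POINTS    # flattened (x, y) landmark coordinates

def synthetic_shape(offset):
    # Stand-in for AAM shape extraction: landmark coordinates shifted by a
    # class-dependent offset plus noise (real AAM fitting is not shown).
    return [offset + random.gauss(0, 0.1) for _ in range(DIM)]

# Two toy "expressions": -1 = neutral, +1 = smile (synthetic labels only).
data = [(synthetic_shape(0.0), -1) for _ in range(40)] + \
       [(synthetic_shape(1.0), +1) for _ in range(40)]

# Minimal linear SVM: subgradient descent on the regularized hinge loss.
w = [0.0] * DIM
b = 0.0
lr, lam = 0.05, 0.01
for epoch in range(50):
    random.shuffle(data)
    for x, y in data:
        margin = y * (sum(wi * xi for wi, xi in zip(w, x)) + b)
        if margin < 1:   # point violates the margin: hinge term is active
            w = [wi + lr * (y * xi - lam * wi) for wi, xi in zip(w, x)]
            b += lr * y
        else:            # correctly classified: only the regularizer acts
            w = [wi - lr * lam * wi for wi in w]

def predict(x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1

accuracy = sum(predict(x) == y for x, y in data) / len(data)
```

On this cleanly separable toy data the classifier converges quickly; the real system would face far noisier shape vectors and a multi-class Action Unit labeling task.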
Siska Fitrianie, & Leon J.M. Rothkrantz. (2009). Computed ontology-based situation awareness of multi-user observations. In J. Landgren & S. Jul (Eds.), ISCRAM 2009 – 6th International Conference on Information Systems for Crisis Response and Management: Boundary Spanning Initiatives and New Perspectives. Gothenburg: Information Systems for Crisis Response and Management, ISCRAM.
Abstract: In recent years, we have developed a framework of human-computer interaction that offers recognition of various communication modalities including speech, lip movement, facial expression, handwriting/drawing, gesture, text and visual symbols. The framework allows the rapid construction of a multimodal, multi-device, and multi-user communication system for crisis management. This paper reports the approaches used in the multi-user information integration (input fusion) and multimodal presentation (output fission) modules, which can be used in isolation but also as part of the framework. The latter is able to specify and produce context-sensitive and user-tailored output combining language, speech, visual language and graphics. These modules provide a communication channel between the system and users with different communication devices. Through the use of an ontology, the system's view of the world is constructed from multi-user observations, and appropriate multimodal responses are generated.
Keywords: Character recognition; Face recognition; Gesture recognition; Information systems; Speech recognition; Communication device; Communication modalities; Information integration; Multi-modal; Multi-user; Multiuser communication; Rapid construction; Situation awareness; Visual languages
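The input-fusion idea the abstract describes, building one world model from many users' observations keyed to ontology concepts, can be sketched minimally. The concept names, the observation format, and the majority-vote conflict rule below are all assumptions for illustration, not the paper's actual ontology or fusion algorithm.

```python
from collections import Counter, defaultdict

# Each observation: (user, concept, value) — e.g. reported via speech,
# drawing, or text in a multimodal session. All values are hypothetical.
observations = [
    ("user1", "fire_location", "building_A"),
    ("user2", "fire_location", "building_A"),
    ("user3", "fire_location", "building_B"),
    ("user1", "casualty_count", "3"),
]

def fuse(observations):
    # Group reported values under their ontology concept.
    by_concept = defaultdict(list)
    for _user, concept, value in observations:
        by_concept[concept].append(value)
    # Resolve conflicting multi-user reports by simple majority vote.
    return {c: Counter(vs).most_common(1)[0][0] for c, vs in by_concept.items()}

world_model = fuse(observations)
```

A real fusion module would also weigh source reliability and temporal ordering; majority vote is only the simplest possible resolution policy.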
Track: Open Track
Konstantinos Konstantoudakis, Georgios Albanis, Emmanouil Christakis, Nikolaos Zioulis, Anastasios Dimou, Dimitrios Zarpalas, et al. (2020). Single-Handed Gesture UAV Control for First Responders – A Usability and Performance User Study. In Amanda Hughes, Fiona McNeill, & Christopher W. Zobel (Eds.), ISCRAM 2020 Conference Proceedings – 17th International Conference on Information Systems for Crisis Response and Management (pp. 937–951). Blacksburg, VA (USA): Virginia Tech.
Abstract: Unmanned aerial vehicles (UAVs) have increased in popularity in recent years and are now involved in many activities, professional and otherwise. First responders, those teams and individuals who are the first to respond in crisis situations, have been using UAVs to assist them in locating victims and identifying hazards without needlessly endangering human personnel. However, professional UAV controllers tend to be heavy and cumbersome, requiring both hands to operate. First responders, on the other hand, often need to carry other important equipment and to keep their hands free during a mission. This work considers enabling first responders to control UAVs with single-handed gestures, freeing their other hand and reducing their encumbrance. Two sets of gesture UAV controls are presented and implemented in a simulated environment, and a two-part user study is conducted: the first part assesses the comfort of each gesture and its intuitive association with basic flight control concepts; the second evaluates two different modes of gesture control in a user population including both genders, and both first responders and members of the general public. The results, consisting of both objective and subjective measurements, are discussed, hindrances and problems are identified, and directions for future work and research are mapped out.
Keywords: First Responders; UAV; Gesture Recognition; User Study
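The control scheme the abstract evaluates, single-hand gestures mapped to basic flight commands, can be sketched as a lookup with a confidence gate. The gesture names, commands, and threshold below are illustrative assumptions, not the study's actual gesture set.

```python
# Hypothetical gesture-to-command table; the study's real gesture sets
# and flight vocabulary differ.
GESTURE_TO_COMMAND = {
    "palm_up": "ascend",
    "palm_down": "descend",
    "fist": "hover",
    "point_forward": "move_forward",
    "thumb_left": "yaw_left",
    "thumb_right": "yaw_right",
}

def to_command(gesture, confidence, threshold=0.8):
    # Gate on recognizer confidence so a misread gesture cannot issue an
    # accidental command; unknown or uncertain input fails safe to hover.
    if confidence < threshold:
        return "hover"
    return GESTURE_TO_COMMAND.get(gesture, "hover")
```

Failing safe to a hover command on low confidence reflects the safety concerns a usability study of first-responder UAV control would need to address.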