Ferda Ofli, Firoj Alam, & Muhammad Imran. (2020). Analysis of Social Media Data using Multimodal Deep Learning for Disaster Response. In Amanda Hughes, Fiona McNeill, & Christopher W. Zobel (Eds.), ISCRAM 2020 Conference Proceedings – 17th International Conference on Information Systems for Crisis Response and Management (pp. 802–811). Blacksburg, VA (USA): Virginia Tech.
Abstract: Multimedia content in social media platforms provides significant information during disaster events. The types of information shared include reports of injured or deceased people, infrastructure damage, and missing or found people, among others. Although many studies have shown the usefulness of both text and image content for disaster response purposes, the research has been mostly focused on analyzing only the text modality in the past. In this paper, we propose to use both text and image modalities of social media data to learn a joint representation using state-of-the-art deep learning techniques. Specifically, we utilize convolutional neural networks to define a multimodal deep learning architecture with a modality-agnostic shared representation. Extensive experiments on real-world disaster datasets show that the proposed multimodal architecture yields better performance than models trained using a single modality (e.g., either text or image).
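The fusion scheme the abstract describes (modality-specific encoders feeding a modality-agnostic shared representation) can be sketched as follows. This is a minimal numpy illustration, not the authors' implementation: the feature dimensions, layer sizes, and random weights are illustrative assumptions, and the paper's actual encoders are convolutional networks over raw text and images rather than precomputed feature vectors.

```python
import numpy as np

# Hypothetical dimensions -- illustrative only, not from the paper.
TEXT_DIM, IMAGE_DIM, SHARED_DIM, NUM_CLASSES = 300, 512, 128, 2

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Modality-specific projections, then a shared layer over the
# concatenation: the "modality-agnostic shared representation".
W_text = rng.normal(0.0, 0.01, (TEXT_DIM, SHARED_DIM))
W_image = rng.normal(0.0, 0.01, (IMAGE_DIM, SHARED_DIM))
W_shared = rng.normal(0.0, 0.01, (2 * SHARED_DIM, SHARED_DIM))
W_out = rng.normal(0.0, 0.01, (SHARED_DIM, NUM_CLASSES))

def multimodal_forward(text_feats, image_feats):
    """Fuse text and image features and classify from the shared layer."""
    h_text = relu(text_feats @ W_text)        # text branch
    h_image = relu(image_feats @ W_image)     # image branch
    fused = np.concatenate([h_text, h_image], axis=-1)
    shared = relu(fused @ W_shared)           # joint representation
    return softmax(shared @ W_out)            # class probabilities

# A batch of 4 (text, image) feature pairs.
probs = multimodal_forward(rng.normal(size=(4, TEXT_DIM)),
                           rng.normal(size=(4, IMAGE_DIM)))
print(probs.shape)  # (4, 2)
```

In the single-modality baselines the abstract compares against, one branch would be dropped and the classifier trained on the remaining representation alone.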
Hristo Tanev, Vanni Zavarella, & Josef Steinberger. (2017). Monitoring disaster impact: detecting micro-events and eyewitness reports in mainstream and social media. In Tina Comes, Frédérick Bénaben, Chihab Hanachi, Matthieu Lauras, & Aurélie Montarnal (Eds.), Proceedings of the 14th International Conference on Information Systems for Crisis Response and Management (pp. 592–602). Albi, France: ISCRAM.
Abstract: This paper approaches the problem of monitoring the impact of disasters by mining web sources for the events caused by these disasters. We refer to these disaster effects as “micro-events”. Micro-events that typically follow a large disaster include casualties, damage to infrastructure, vehicles, services, and resource supply, as well as relief operations. We present natural language grammar learning algorithms which form the basis for building micro-event detection systems from data with little or no human intervention, and we show how they can be applied to mainstream news and social media for monitoring disaster impact. We also experimented with applying statistical classifiers to distill, from social media situational updates on disasters, eyewitness reports from directly affected people. Finally, we describe a Twitter mining robot which integrates some of these monitoring techniques and is intended to serve as a multilingual content hub for enhancing situational awareness.