Xukun Li, Doina Caragea, Cornelia Caragea, Muhammad Imran, & Ferda Ofli. (2019). Identifying Disaster Damage Images Using a Domain Adaptation Approach. In Z. Franco, J. J. González, & J. H. Canós (Eds.), Proceedings of the 16th International Conference on Information Systems for Crisis Response And Management. Valencia, Spain: ISCRAM.
Abstract: Approaches for effectively filtering useful situational awareness information posted by eyewitnesses of disasters,
in real time, are greatly needed. While many studies have focused on filtering textual information, the research
on filtering disaster images is more limited. In particular, there are no studies on the applicability of domain
adaptation to filter images from an emergent target disaster, when no labeled data is available for the target disaster.
To fill this gap, we propose to apply a domain adaptation approach, called domain adversarial neural networks
(DANN), to the task of identifying images that show damage. The DANN approach has VGG-19 as its backbone,
and uses adversarial training to find a transformation that makes the source and target data indistinguishable.
Experimental results on several pairs of disasters suggest that the DANN model generally performs on par with or better than the VGG-19 model fine-tuned on the source labeled data.
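The core mechanism of DANN is gradient reversal: a domain classifier is trained to tell source images from target images, while the feature extractor receives the negated domain gradient, so its features become domain-indistinguishable. The following is a minimal sketch of that adversarial update, using a toy linear feature extractor and logistic domain classifier rather than the paper's VGG-19 backbone; all names and hyperparameters here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy stand-ins: feature extractor f = W x, domain classifier p = sigmoid(v . f).
W = rng.normal(size=(4, 8)) * 0.1   # feature-extractor weights (VGG-19 in the paper)
v = rng.normal(size=4) * 0.1        # domain-classifier weights
lam = 1.0                           # gradient-reversal strength (illustrative)

def dann_step(x, d, lr=0.01):
    """One adversarial update on input x with domain label d (0=source, 1=target).

    The domain classifier descends the domain loss; the feature extractor
    receives the REVERSED gradient, ascending the same loss so that the
    two domains become indistinguishable in feature space.
    """
    global W, v
    f = W @ x
    p = sigmoid(v @ f)
    g = p - d                            # dLoss/dlogit for binary cross-entropy
    grad_v = g * f                       # classifier gradient (normal sign)
    grad_f = g * v                       # gradient arriving at the features
    grad_W = np.outer(-lam * grad_f, x)  # gradient reversed before the extractor
    v -= lr * grad_v                     # classifier gets better at separating domains
    W -= lr * grad_W                     # extractor moves to confuse the classifier
    return p
```

With a small learning rate, one call to `dann_step` moves the classifier weights to decrease the domain loss while the extractor weights move to increase it, which is exactly the minimax dynamic the abstract refers to as adversarial training.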