Nils Bourgon, Farah Benamara, Alda Mari, Véronique Moriceau, Gaetan Chevalier, Laurent Leygue, et al. (2022). Are Sudden Crises Making me Collapse? Measuring Transfer Learning Performances on Urgency Detection. In Rob Grace, & Hossein Baharmand (Eds.), ISCRAM 2022 Conference Proceedings – 19th International Conference on Information Systems for Crisis Response and Management (pp. 701–709). Tarbes, France.
Abstract: This paper aims at measuring transfer learning performance across different types of crises related to sudden or unexpected events (such as earthquakes, terror attacks, explosions, and technological incidents) that cannot be foreseen by emergency services and over whose occurrence they have virtually no control. Although sudden crises are present in most existing crisis datasets, as far as we are aware, no one has studied their impact on classifier performance when evaluated in an out-of-type scenario, in which models are tested on a particular type of crisis unseen during training. Our contribution is threefold: (1) a new dataset of about 3,800 French tweets related to four sudden events that occurred in France, annotated for both relatedness (i.e., useful vs. not useful for emergency responders) and urgency (i.e., not useful vs. urgent vs. not urgent); (2) a set of monotask and multitask zero-shot learning experiments to transfer knowledge across events and types; and (3) few-shot learning experiments to measure the number of sudden-event instances needed during training to guarantee good performance. When compared to a cross-event setting, our preliminary results are encouraging and show that transfer from predictable ecological crises to sudden events is feasible and constitutes a first step towards real-time crisis management systems based on social media content.
Shivam Sharma, & Cody Buntain. (2021). An Evaluation of Twitter Datasets from Non-Pandemic Crises Applied to Regional COVID-19 Contexts. In Anouck Adrot, Rob Grace, Kathleen Moore, & Christopher W. Zobel (Eds.), ISCRAM 2021 Conference Proceedings – 18th International Conference on Information Systems for Crisis Response and Management (pp. 808–815). Blacksburg, VA (USA): Virginia Tech.
Abstract: In 2020, we witnessed an unprecedented crisis event, the COVID-19 pandemic. Various questions arise regarding the nature of this crisis data and its impact on existing tools. In this paper, we study whether pandemic-type crisis events can be combined with general non-pandemic events, and we hypothesize that including labeled crisis data from a variety of non-pandemic events will improve classification performance over models trained solely on pandemic events. To test this hypothesis, we evaluate several models by performing cross-validation on pandemic-only held-out sets for two types of training sets, one containing only pandemic data and the other a combination of pandemic and non-pandemic crisis data, and comparing the results of the two. Our results support our hypothesis and provide evidence that crucial information is transferred when non-pandemic crisis data are added to pandemic data.