How does a Pre-Trained Transformer Integrate Contextual Keywords? Application to Humanitarian Computing
Valentin Barriere
author
Guillaume Jacquet
author
2021
Virginia Tech
Blacksburg, VA (USA)
English
In a classification task, dealing with text snippets and metadata usually requires multimodal approaches. When the metadata are textual, it is tempting to use them directly with a pre-trained transformer in order to leverage the semantic information encoded inside the model. This paper describes how to improve a humanitarian classification task by adding the crisis event type to each tweet to be classified. Based on additional analyses of the model weights and behavior, it identifies how the proposed neural network approach partially over-fits the particularities of the Crisis Benchmark, while highlighting that the model is still undoubtedly learning to use and take advantage of the metadata's textual semantics.
Transformers
Contextual keywords
Humanitarian Computing
Tweet analysis
valbarrierepro@gmail.com
exported from refbase (http://idl.iscram.org/show.php?record=2371), last updated on Tue, 13 Jul 2021 18:40:40 +0200
text
http://idl.iscram.org/files/valentinbarriere/2021/2371_ValentinBarriere+GuillaumeJacquet2021.pdf
ValentinBarriere+GuillaumeJacquet2021
ISCRAM 2021 Conference Proceedings – 18th International Conference on Information Systems for Crisis Response and Management
ISCRAM 2021
Anouck Adrot
editor
Rob Grace
editor
Kathleen Moore
editor
Christopher W. Zobel
editor
18th International Conference on Information Systems for Crisis Response and Management
2021
Virginia Tech
Blacksburg, VA (USA)
conference publication
766
771
978-1-949373-61-5