Author: Liu, Junhua; Singhal, Trisha; Blessing, Lucienne T.M.; Wood, Kristin L.; Lim, Kwan Hui
Title: CrisisBERT: a Robust Transformer for Crisis Classification and Contextual Crisis Embedding
Cord-id: li476eqy
Document date: 2020-05-11
ID: li476eqy
Document: Classification of crisis events, such as natural disasters, terrorist attacks, and pandemics, is a crucial task for creating early signals and informing relevant parties to take prompt action and reduce overall damage. Although crises such as natural disasters can be predicted by professional institutions, certain events are first signaled by civilians, such as the recent COVID-19 pandemic. Social media platforms such as Twitter often expose firsthand signals of such crises through high-volume information exchange, with over half a billion tweets posted daily. Prior works proposed various crisis embeddings and classifiers using conventional Machine Learning and Neural Network models. However, none of these works performs crisis embedding and classification using state-of-the-art attention-based deep neural network models, such as Transformers, or document-level contextual embeddings. This work proposes CrisisBERT, an end-to-end transformer-based model for two crisis classification tasks, namely crisis detection and crisis recognition, which shows promising results in both accuracy and F1 score. The proposed model also demonstrates superior robustness over benchmarks, showing only marginal performance compromise when extending from 6 to 36 events with only 51.4% additional data points. We also propose Crisis2Vec, an attention-based, document-level contextual embedding architecture for crisis embedding, which achieves better performance than conventional crisis embedding methods such as Word2Vec and GloVe. To the best of our knowledge, our work is the first to propose transformer-based crisis classification and document-level contextual crisis embedding in the literature.