Author: Park, Jong Won
Title: Continual BERT: Continual Learning for Adaptive Extractive Summarization of COVID-19 Literature
Cord-id: sot2y5y6
Document date: 2020-07-07
ID: sot2y5y6
Snippet: The scientific community continues to publish an overwhelming amount of new research related to COVID-19 on a daily basis, leaving much literature with little to no attention. To aid the community in understanding the rapidly flowing array of COVID-19 literature, we propose a novel BERT architecture that provides a brief yet original summarization of lengthy papers. The model continually learns on new data in an online fashion while minimizing catastrophic forgetting, thus fitting to the need
Document: The scientific community continues to publish an overwhelming amount of new research related to COVID-19 on a daily basis, leaving much literature with little to no attention. To aid the community in understanding the rapidly flowing array of COVID-19 literature, we propose a novel BERT architecture that provides a brief yet original summarization of lengthy papers. The model continually learns on new data in an online fashion while minimizing catastrophic forgetting, thus fitting to the need of the community. Benchmarks and manual examination of its performance show that the model provides a sound summary of new scientific literature.