Author: Xu, Tianyang; Zhang, Chunyun
                    Title: Reinforced Generative Adversarial Network for Abstractive Text Summarization  Cord-id: klkpm5n3  Document date: 2021-05-31
                    ID: klkpm5n3
                    
                    Snippet: Sequence-to-sequence models provide a viable new approach to generative summarization, freeing models from simply selecting and recombining sentences from the original text. However, these models have three drawbacks: they often reproduce details of the source text inaccurately, their generated text frequently contains repetitions, and they struggle with out-of-vocabulary words. In this paper, we propose a new architecture that combines reinforcement learning and generative adversarial networks to enhance the sequence-to-sequence attention model.
                                Document: Sequence-to-sequence models provide a viable new approach to generative summarization, freeing models from simply selecting and recombining sentences from the original text. However, these models have three drawbacks: they often reproduce details of the source text inaccurately, their generated text frequently contains repetitions, and they struggle with out-of-vocabulary words. In this paper, we propose a new architecture that combines reinforcement learning and generative adversarial networks to enhance the sequence-to-sequence attention model. First, we use a hybrid pointer-generator network that copies words directly from the source text, which aids accurate reproduction of information without sacrificing the generator's ability to produce new words. Second, we use both intra-temporal and intra-decoder attention to penalize already-summarized content and thus discourage repetition. We apply our model to our own proposed COVID-19 paper title summarization task and achieve results close to those of current models on ROUGE, while offering better readability.
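As an illustration of the copy mechanism the abstract describes (this is not the authors' code; all function names, shapes, and variable names are assumptions), a pointer-generator network forms its final output distribution over an extended vocabulary by mixing the generator's softmax over the fixed vocabulary with a copy distribution obtained from the attention weights over source tokens, so out-of-vocabulary source words remain reachable:

```python
import numpy as np

def pointer_generator_distribution(p_vocab, attention, src_ids, p_gen,
                                   extended_vocab_size):
    """Sketch of a pointer-generator output distribution.

    p_vocab:  (V,) softmax over the fixed vocabulary.
    attention: (T,) attention weights over the T source positions.
    src_ids:  (T,) ids of the source tokens in the extended vocabulary
              (out-of-vocabulary words get ids >= V).
    p_gen:    scalar in [0, 1], probability of generating vs. copying.
    """
    final = np.zeros(extended_vocab_size)
    # Generation path: scale the vocabulary distribution by p_gen.
    final[: len(p_vocab)] = p_gen * p_vocab
    # Copy path: scatter-add (1 - p_gen) * attention onto the ids of the
    # source tokens; np.add.at accumulates repeated ids correctly.
    np.add.at(final, src_ids, (1.0 - p_gen) * attention)
    return final

# Toy usage: vocabulary of 3 words, a 2-token source whose second token
# is out-of-vocabulary (id 3 in the extended vocabulary).
p_vocab = np.array([0.5, 0.3, 0.2])
attention = np.array([0.6, 0.4])
src_ids = np.array([1, 3])
dist = pointer_generator_distribution(p_vocab, attention, src_ids,
                                      p_gen=0.8, extended_vocab_size=4)
```

Because both paths are convex-weighted by `p_gen`, the result is still a valid probability distribution, and the out-of-vocabulary token (id 3) receives nonzero mass purely from the copy path.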