Author: Song, Jaeyun; Shim, Hajin; Yang, Eunho
Title: Mutually-Constrained Monotonic Multihead Attention for Online ASR
Cord-id: resog1g1
Document date: 2021-03-26
Document: Despite decoding in real time, Monotonic Multihead Attention (MMA) achieves performance comparable to state-of-the-art offline methods on machine translation and automatic speech recognition (ASR) tasks. However, the latency of MMA remains a major issue in ASR, so it must be combined with a technique that reduces latency at inference time, such as head-synchronous beam search decoding, which forces all non-activated heads to activate after a small fixed delay from the first head activation. In this paper, we remove the discrepancy between the training and test phases by considering, during MMA training, the interactions across multiple heads that will occur at test time. Specifically, we derive the expected alignments from monotonic attention while accounting for the boundaries of the other heads, and reflect them in the learning process. We validate the proposed method on two standard ASR benchmark datasets and show that MMA with mutually-constrained heads from the training stage outperforms the baselines.
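The head-synchronous constraint described in the abstract can be illustrated with a minimal sketch: once the earliest monotonic head activates at some source position, every other head is forced to activate within a small fixed delay of that position. The function name `synchronize_heads` and the parameters `head_positions` and `max_delay` are illustrative assumptions, not identifiers from the paper.

```python
def synchronize_heads(head_positions, max_delay):
    """Force all heads to activate within `max_delay` source steps of
    the earliest activated head (a sketch of head-synchronous decoding).

    head_positions: list of source indices where each head activated,
                    or None for heads that have not yet activated.
    max_delay:      fixed delay allowed after the first head activation.
    Returns the forced activation positions for all heads.
    """
    activated = [p for p in head_positions if p is not None]
    if not activated:
        # No head has activated yet, so there is nothing to synchronize.
        return head_positions
    deadline = min(activated) + max_delay
    # Non-activated heads are forced to the deadline; activated heads
    # that lag too far behind are clamped to it as well.
    return [min(p, deadline) if p is not None else deadline
            for p in head_positions]
```

For example, with heads activated at positions 3 and 10 and one head not yet activated, a delay of 2 forces the lagging heads to position 5. The paper's contribution is to account for this cross-head interaction already during training, rather than only applying it at inference time.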