Selected article for: "low resource and machine translation"

Author: Zhan, Runzhe; Liu, Xuebo; Wong, Derek F.; Chao, Lidia S.
Title: Meta-Curriculum Learning for Domain Adaptation in Neural Machine Translation
  • Cord-id: 07fws9e8
  • Document date: 2021-03-03
    Snippet: Meta-learning has been sufficiently validated to be beneficial for low-resource neural machine translation (NMT). However, we find that meta-trained NMT fails to improve the translation performance of the domain unseen at the meta-training stage. In this paper, we aim to alleviate this issue by proposing a novel meta-curriculum learning for domain adaptation in NMT. During meta-training, the NMT first learns the similar curricula from each domain to avoid falling into a bad local optimum early,
    Document: Meta-learning has been sufficiently validated to be beneficial for low-resource neural machine translation (NMT). However, we find that meta-trained NMT fails to improve the translation performance of the domain unseen at the meta-training stage. In this paper, we aim to alleviate this issue by proposing a novel meta-curriculum learning for domain adaptation in NMT. During meta-training, the NMT first learns the similar curricula from each domain to avoid falling into a bad local optimum early, and finally learns the curricula of individualities to improve the model robustness for learning domain-specific knowledge. Experimental results on 10 different low-resource domains show that meta-curriculum learning can improve the translation performance of both familiar and unfamiliar domains. All the codes and data are freely available at https://github.com/NLP2CT/Meta-Curriculum.
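    Illustration: as a rough sketch of the training scheme described in the abstract (curricula ordered from domain-general to domain-specific, consumed progressively during meta-training), the Python snippet below assumes a Reptile-style first-order meta-update, a placeholder score_fn that ranks each domain's sentence pairs, and an NMT model that returns a scalar loss when called on a (src, tgt) pair. These are assumptions for illustration only; the authors' actual implementation is at https://github.com/NLP2CT/Meta-Curriculum.

    import copy
    import random
    import torch

    def curriculum_order(examples, score_fn):
        # Domain-general (low-score) pairs first, domain-specific (high-score) pairs last.
        return sorted(examples, key=score_fn)

    def meta_train(model, domains, score_fn, inner_steps=3, inner_lr=1e-4,
                   meta_lr=1e-3, meta_epochs=10):
        # domains: list of lists of (src, tgt) tensors; model: an nn.Module NMT model
        # assumed to return a scalar loss when called as model(src, tgt).
        curricula = [curriculum_order(d, score_fn) for d in domains]
        for epoch in range(meta_epochs):
            # Expose a growing prefix of each curriculum: shared knowledge early,
            # individual (domain-specific) knowledge late.
            frac = (epoch + 1) / meta_epochs
            for curriculum in random.sample(curricula, len(curricula)):
                visible = curriculum[: max(1, int(frac * len(curriculum)))]
                task_model = copy.deepcopy(model)
                opt = torch.optim.SGD(task_model.parameters(), lr=inner_lr)
                for _ in range(inner_steps):
                    src, tgt = random.choice(visible)
                    loss = task_model(src, tgt)
                    opt.zero_grad()
                    loss.backward()
                    opt.step()
                # Reptile-style outer update: move meta-parameters toward task parameters.
                with torch.no_grad():
                    for p, q in zip(model.parameters(), task_model.parameters()):
                        p.add_(meta_lr * (q - p))
        return model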
