Selected article for: "deep learning and lstm cnn"

Authors: Seki, Yohei; Zhao, Kangkang; Oguni, Masaki; Sugiyama, Kazunari
Title: CNN-based framework for classifying temporal relations with question encoder
  • Cord-id: xp2gvayz
  • Document date: 2021-10-13
  • ID: xp2gvayz
    Snippet: Temporal-relation classification plays an important role in the field of natural language processing. Various deep learning-based classifiers, which can generate better models using sentence embedding, have been proposed to address this challenging task. However, these approaches do not perform well because they lack task-related information. To overcome this problem, we propose a novel framework that incorporates prior information by employing awareness of events and time expressions (time–event entities)…
    Document: Temporal-relation classification plays an important role in the field of natural language processing. Various deep learning-based classifiers, which can generate better models using sentence embedding, have been proposed to address this challenging task. However, these approaches do not perform well because they lack task-related information. To overcome this problem, we propose a novel framework that incorporates prior information by employing awareness of events and time expressions (time–event entities) with various window sizes, which acts as a filter focusing on the context words around these entities. We refer to this module as the “question encoder.” With this prior information, our approach can extract task-related information from simple sentence embeddings. Our experimental results on the publicly available TimeBank-Dense corpus demonstrate that our approach outperforms several state-of-the-art techniques, including CNN-, LSTM-, and BERT-based temporal-relation classifiers.
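    Below is a minimal PyTorch sketch of the idea the abstract describes: a "question encoder" gates plain sentence embeddings using the context words within several window sizes around the event/time-expression positions, and a CNN classifies the gated sequence into temporal relations. The class names, window sizes, gating mechanism, and the six-label output (the TimeBank-Dense relation set) are illustrative assumptions, not the paper's actual code.

    import torch
    import torch.nn as nn

    class QuestionEncoder(nn.Module):
        """Builds a per-token gate from words near the time-event entities.
        (Hypothetical reading of the module; the paper gives no code.)"""

        def __init__(self, emb_dim, window_sizes=(1, 2, 3)):
            super().__init__()
            self.window_sizes = window_sizes
            # One gate projection per window size; their outputs are averaged.
            self.gates = nn.ModuleList(nn.Linear(emb_dim, 1) for _ in window_sizes)

        def forward(self, emb, entity_pos):
            # emb: (batch, seq_len, emb_dim); entity_pos: (batch, n_entities)
            batch, seq_len, _ = emb.shape
            idx = torch.arange(seq_len, device=emb.device).view(1, 1, seq_len)
            # Distance from each token to its nearest entity.
            dist = (idx - entity_pos.unsqueeze(-1)).abs().min(dim=1).values
            gate = emb.new_zeros(batch, seq_len)
            for w, proj in zip(self.window_sizes, self.gates):
                inside = (dist <= w).float()  # tokens within window w of an entity
                gate = gate + inside * torch.sigmoid(proj(emb)).squeeze(-1)
            gate = gate / len(self.window_sizes)
            return emb * gate.unsqueeze(-1)  # filtered sentence embedding

    class TemporalRelationCNN(nn.Module):
        def __init__(self, vocab_size, emb_dim=100, n_filters=64,
                     kernel_sizes=(2, 3, 4), n_relations=6):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            self.question_encoder = QuestionEncoder(emb_dim)
            self.convs = nn.ModuleList(
                nn.Conv1d(emb_dim, n_filters, k) for k in kernel_sizes)
            self.classify = nn.Linear(n_filters * len(kernel_sizes), n_relations)

        def forward(self, tokens, entity_pos):
            emb = self.question_encoder(self.embed(tokens), entity_pos)
            x = emb.transpose(1, 2)  # (batch, emb_dim, seq_len) for Conv1d
            pooled = [torch.relu(c(x)).max(dim=2).values for c in self.convs]
            return self.classify(torch.cat(pooled, dim=1))  # relation logits

    # Toy usage: a 12-token context with an event at position 3 and a time
    # expression at position 8 (positions are hypothetical inputs).
    model = TemporalRelationCNN(vocab_size=1000)
    tokens = torch.randint(0, 1000, (1, 12))
    entity_pos = torch.tensor([[3, 8]])
    print(model(tokens, entity_pos).shape)  # torch.Size([1, 6])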

    Search for related documents:
    Co-phrase search for related documents:
    • Adam optimizer and logistic regression: 1
    • Adam optimizer and long short-term memory (LSTM): 1, 2
    • Adam optimizer and LSTM approach: 1
    • logistic regression and long dependency: 1
    • logistic regression and long short-term memory (LSTM): 1, 2, 3, 4, 5, 6, 7, 8, 9, 10
    • logistic regression and longitudinal effect: 1, 2, 3
    • long short-term memory (LSTM) and LSTM approach: 1, 2, 3, 4, 5, 6, 7, 8