Selected article for: "art state and knowledge acquisition"

Authors: Pu, Shi; Yudelson, Michael; Ou, Lu; Huang, Yuchi
Title: Deep Knowledge Tracing with Transformers
  • Cord-id: ef97jzc4
  • Document date: 2020-06-10
  • ID: ef97jzc4
    Document: In this work, we propose a Transformer-based model to trace students’ knowledge acquisition. We modified the Transformer structure to utilize 1) the association between questions and skills and 2) the elapsed time between question steps. The use of question-skill associations allows the model to learn specific representations for frequently encountered questions while representing rare questions with their underlying skill representations. The inclusion of elapsed time opens the opportunity to address forgetting. Our approach outperforms state-of-the-art methods in the literature by roughly 10% in AUC on frequently used public datasets.
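
    A minimal code sketch of the two ideas described above, under stated assumptions and not the authors' implementation: question and skill IDs are embedded separately and summed, so rarely seen questions fall back on the shared skill representation; a log-scaled elapsed-time feature is added to each interaction step as one plausible way to let the model account for forgetting; and a causally masked Transformer encoder predicts the probability of a correct response at each step. All layer sizes, the time-feature form, and names such as KnowledgeTracingTransformer are illustrative assumptions.

    # Illustrative sketch of Transformer-based knowledge tracing with
    # question-skill associations and elapsed time (not the paper's code).
    import torch
    import torch.nn as nn

    class KnowledgeTracingTransformer(nn.Module):
        def __init__(self, num_questions, num_skills, d_model=64, nhead=4, num_layers=2):
            super().__init__()
            self.question_emb = nn.Embedding(num_questions, d_model)
            self.skill_emb = nn.Embedding(num_skills, d_model)
            self.response_emb = nn.Embedding(2, d_model)   # previous answer: wrong / right
            self.time_proj = nn.Linear(1, d_model)         # elapsed-time feature
            layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead, batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
            self.out = nn.Linear(d_model, 1)               # per-step P(correct)

        def forward(self, question_ids, skill_ids, prev_responses, elapsed_time):
            # Question-skill association: rare questions lean on the skill embedding.
            x = self.question_emb(question_ids) + self.skill_emb(skill_ids)
            # Previous responses (shifted by one step to avoid label leakage)
            # and log-scaled elapsed time are added to the step representation.
            x = x + self.response_emb(prev_responses)
            x = x + self.time_proj(torch.log1p(elapsed_time).unsqueeze(-1))
            T = question_ids.size(1)
            causal = torch.ones(T, T).triu(diagonal=1).bool()  # block attention to future steps
            h = self.encoder(x, mask=causal)
            return torch.sigmoid(self.out(h)).squeeze(-1)

    # Toy usage: 2 students, 5 interaction steps each.
    B, T = 2, 5
    model = KnowledgeTracingTransformer(num_questions=100, num_skills=10)
    q = torch.randint(0, 100, (B, T))
    s = torch.randint(0, 10, (B, T))
    r = torch.randint(0, 2, (B, T))
    r_prev = torch.cat([torch.zeros(B, 1, dtype=torch.long), r[:, :-1]], dim=1)
    dt = torch.rand(B, T) * 300.0                          # seconds since the last step
    p_correct = model(q, s, r_prev, dt)                    # shape (B, T)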
