Author: Pandey, Shalini; Srivastava, Jaideep
Title: RKT: Relation-Aware Self-Attention for Knowledge Tracing
Cord-id: 0k6cfyox
Document date: 2020-08-28
Document: The world has transitioned into a new phase of online learning in response to the recent COVID-19 pandemic. Now more than ever, it has become paramount to push the limits of online learning in every way to keep the education system flourishing. One crucial component of online learning is Knowledge Tracing (KT). The aim of KT is to model a student's knowledge level based on their answers to a sequence of exercises, referred to as interactions. Students acquire skills while solving exercises, and each such interaction has a distinct impact on the student's ability to solve a future exercise. This impact is characterized by 1) the relation between the exercises involved in the interactions and 2) the student's forget behavior. Traditional studies on knowledge tracing do not explicitly model both components jointly to estimate the impact of these interactions. In this paper, we propose a novel Relation-aware self-attention model for Knowledge Tracing (RKT). We introduce a relation-aware self-attention layer that incorporates contextual information. This contextual information integrates both the exercise relation information, through their textual content as well as student performance data, and the forget behavior information, through modeling an exponentially decaying kernel function. Extensive experiments on three real-world datasets, among which two new collections are released to the public, show that our model outperforms state-of-the-art knowledge tracing methods. Furthermore, the interpretable attention weights help visualize the relation between interactions and temporal patterns in the human learning process.
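The core mechanism the abstract describes — scaled dot-product self-attention whose weights are modulated by exercise-relation coefficients and an exponentially decaying forget kernel — can be illustrated with a minimal sketch. This is not the paper's implementation: the function name `relation_aware_attention`, the decay rate `theta`, and the way the relation matrix is combined with the attention weights (elementwise product followed by renormalization) are simplifying assumptions for illustration only.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along an axis."""
    x = x - np.max(x, axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def relation_aware_attention(Q, K, V, relation, timestamps, theta=0.1):
    """Sketch of relation-aware self-attention with a forget kernel.

    Q, K, V     : (seq_len, d) query/key/value matrices for one
                  student's interaction sequence.
    relation    : (seq_len, seq_len) non-negative exercise-relation
                  coefficients (hypothetical precomputed input).
    timestamps  : (seq_len,) interaction times; older interactions are
                  down-weighted by exp(-theta * time_gap).
    theta       : assumed decay rate of the exponential kernel.
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)  # scaled dot-product attention

    # Causal mask: a student's state at step i may attend only to
    # interactions at steps j <= i.
    mask = np.tril(np.ones_like(scores, dtype=bool))
    scores = np.where(mask, scores, -np.inf)
    attn = softmax(scores, axis=-1)

    # Exponentially decaying kernel modeling forget behavior.
    dt = np.abs(timestamps[:, None] - timestamps[None, :])
    decay = np.exp(-theta * dt)

    # Modulate attention by relation strength and forgetting, then
    # renormalize so each row is again a distribution over the past.
    combined = attn * relation * decay
    combined = combined / (combined.sum(axis=-1, keepdims=True) + 1e-9)
    return combined @ V
```

In this sketch the relation matrix sharpens attention toward related past exercises, while the decay kernel suppresses interactions that happened long ago, matching the two components of "impact" the abstract identifies.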