Journal of Frontiers of Computer Science and Technology ›› 2022, Vol. 16 ›› Issue (8): 1742-1763. DOI: 10.3778/j.issn.1673-9418.2111054
Corresponding author: XU Luqian, E-mail: xuluqian1002@163.com
ZENG Fanzhi, XU Luqian, ZHOU Yan, ZHOU Yuexia, LIAO Junwei
Received:
2021-11-10
Revised:
2022-03-24
Online:
2022-08-01
Published:
2022-08-19
About author:
ZENG Fanzhi, born in 1965, Ph.D., professor, M.S. supervisor, member of CCF. His research interests include computer vision, image processing and data mining.
Abstract:
Knowledge tracing (KT) is one of the key research directions in intelligent education. It uses the large volume of learning-trajectory data provided by intelligent tutoring systems (ITS) to model students, automatically assess their knowledge level, and supply personalized learning plans, thereby realizing AI-assisted education. This paper comprehensively reviews the research progress of knowledge tracing models for intelligent education, organized into three representative families: Bayesian knowledge tracing, Logistic-model-based knowledge tracing, and deep learning based knowledge tracing, which has developed rapidly in recent years and shows stronger performance. Bayesian approaches comprise the original Bayesian knowledge tracing (BKT) and BKT extensions that incorporate individualization, knowledge dependency, node states, and real-world factors. Logistic-model-based approaches fall into item response theory (IRT) and factor analysis models. Deep learning approaches comprise deep knowledge tracing (DKT) and its improvements, as well as models with newly designed network structures or attention mechanisms. This paper also introduces the international public educational datasets currently available to researchers and the commonly used evaluation metrics, compares and analyzes the performance, characteristics, and application scenarios of the different types of methods, and discusses open problems and future research directions.
ZENG Fanzhi, XU Luqian, ZHOU Yan, ZHOU Yuexia, LIAO Junwei. Review of Knowledge Tracing Model for Intelligent Education[J]. Journal of Frontiers of Computer Science and Technology, 2022, 16(8): 1742-1763.
Table 1 Overview of BKT model extension methods
Study | Year | Extension category | Method summary | Limitations |
---|---|---|---|---|
Pardos et al. [12] | 2010 | Individualization | Designs a distinct prior knowledge state for each student | Relies on simplifying assumptions, e.g., each item involves only one KC and no forgetting occurs during learning |
Lee et al. [13] | 2012 | Individualization | Designs a student-oriented model to better capture individual differences | |
Yudelson et al. [14] | 2013 | Individualization | Splits model parameters into a knowledge part and a student part to improve performance | |
Wang et al. [15] | 2020 | Individualization | Performs Bayesian knowledge tracing on clustered students | |
Käser et al. [16] | 2014 | Knowledge dependency | Uses a DBN to represent the topology of KCs | Requires thresholds, and different types of KCs need different threshold ranges and selection criteria |
Hawkins et al. [17] | 2014 | Knowledge dependency | BKT-ST model that accounts for similarity between items | |
Wang et al. [18] | 2016 | Knowledge dependency | Models the hierarchical and temporal structure of knowledge states | |
Käser et al. [19] | 2017 | Knowledge dependency | Uses a DBN to jointly consider different KCs in a single model | |
Wang et al. [21] | 2013 | Node state | Refines students' knowledge states with a continuous representation from 0 to 1 | Many parameters, heavy computation, high complexity |
Falakmasir et al. [20] | 2015 | Node state | Spectral BKT model that replaces binary node states with 3-grams | |
Zhang et al. [22] | 2018 | Node state | Improves binary node states with the idea of three-way decisions | |
Pardos et al. [24] | 2011 | Real-world factors | Introduces item difficulty into BKT | Prior probabilities are set subjectively, and a simple model can hardly capture the complexity of real situations |
Xu et al. [25] | 2014 | Real-world factors | Uses EEG to probe students' mental state while answering | |
Spaulding et al. [26] | 2015 | Real-world factors | Incorporates students' affective states | |
Agarwal et al. [28] | 2020 | Real-world factors | MS-BKT model that weights students' recent responses with data-driven recency weights, refining their knowledge states | |
Huang et al. [27] | 2021 | Real-world factors | Integrates students' learning behavior and forgetting factors |
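To make the BKT mechanism underlying Table 1 concrete, the following minimal Python sketch implements the standard update of Corbett and Anderson [10]: the mastery estimate is revised from each observed response via Bayes' rule using the guess and slip probabilities, then advanced by the learning probability. The parameter values are illustrative, not fitted.

```python
# Minimal sketch of the standard BKT update (Corbett & Anderson [10]).
# Parameters follow the usual convention: P(L0) prior mastery, P(T) learning,
# P(G) guess, P(S) slip. Values below are illustrative, not fitted.

def bkt_predict(p_know, p_guess, p_slip):
    """Probability of a correct response given the current mastery estimate."""
    return p_know * (1 - p_slip) + (1 - p_know) * p_guess

def bkt_update(p_know, correct, p_learn, p_guess, p_slip):
    """Posterior over mastery after one observed response, then the learning transition."""
    if correct:
        posterior = p_know * (1 - p_slip) / (p_know * (1 - p_slip) + (1 - p_know) * p_guess)
    else:
        posterior = p_know * p_slip / (p_know * p_slip + (1 - p_know) * (1 - p_guess))
    return posterior + (1 - posterior) * p_learn

# Trace one student on one KC: responses 1 = correct, 0 = incorrect.
p_know = 0.3  # P(L0)
params = dict(p_learn=0.2, p_guess=0.25, p_slip=0.1)
for t, correct in enumerate([0, 1, 1, 1]):
    p_correct = bkt_predict(p_know, params["p_guess"], params["p_slip"])
    print(f"t={t} P(correct)={p_correct:.3f}")
    p_know = bkt_update(p_know, correct, **params)
print(f"final mastery estimate: {p_know:.3f}")
```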
Table 2 Comparison of Logistic-model-based methods
Model | Year | Method summary | Limitations |
---|---|---|---|
LFA [31] | 2006 | A semi-automated method derived from learning curves | Sensitive to practice time |
PFA [32] | 2009 | Improves LFA by accumulating counts of correct and incorrect responses to items | Cannot handle intrinsic dependencies among knowledge components |
IRT [30] | 2018 | Builds a parametric model of student ability and item difficulty based on IRT theory | Assumes student ability is fixed during learning |
KTM [33] | 2019 | Uses FMs to generalize traditional Logistic models to higher dimensions | Cold-start problem; cannot accurately represent the preceding learning sequence |
RKTM [35] | 2021 | Introduces students' knowledge states, which interact with the current learning context | Difficult feature extraction; the many parameters increase learning complexity |
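As an illustration of the factor analysis family in Table 2, the sketch below evaluates the PFA scoring rule [32]: for each KC tagged to an item, an easiness parameter is combined with weighted counts of the student's prior successes and failures on that KC, and the sum is passed through a sigmoid. All parameter values and KC names here are hypothetical.

```python
import math

# Hypothetical illustration of the PFA scoring rule [32]: each KC j has an
# easiness beta_j and weights gamma_j (prior successes) and rho_j (prior
# failures); per-KC contributions are summed and squashed by a sigmoid.

def pfa_probability(kcs, successes, failures, beta, gamma, rho):
    """kcs: KC ids tagged to the item; successes/failures: per-KC prior counts."""
    m = sum(beta[j] + gamma[j] * successes[j] + rho[j] * failures[j] for j in kcs)
    return 1.0 / (1.0 + math.exp(-m))

# Illustrative (not fitted) parameters for two KCs.
beta  = {"fractions": -0.5, "decimals": -0.2}
gamma = {"fractions":  0.3, "decimals":  0.2}
rho   = {"fractions":  0.1, "decimals":  0.05}
p = pfa_probability(["fractions"], {"fractions": 3}, {"fractions": 1}, beta, gamma, rho)
print(f"P(correct) = {p:.3f}")
```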
Table 3 Comparison of deep learning methods
Model | Year | Category | Method summary | Limitations |
---|---|---|---|---|
DKT [41] | 2015 | — | Models student sequences with an RNN or LSTM | — |
Deep-IRT [64] | 2019 | Interpretability | Combines the psychometric theory IRT to enhance interpretability | Many parameters and high complexity; individual parameters also lack the semantic meaning of traditional methods |
KQN [73] | 2019 | Interpretability | Uses vectorized representations to improve intuitiveness and interpretability | |
AKT [88] | 2020 | Interpretability | Combined with a series of interpretable psychometric model components | |
TC-MIRT [49] | 2021 | Interpretability | Builds enhanced network components with MIRT to generate interpretable parameters | |
DKT+ [50] | 2018 | Learning-feature fusion | Adds reconstruction of the DKT model plus three regularization terms to strengthen prediction consistency | Integrating heterogeneous features into the data is difficult; the large number of parameters increases computation, slowing training and lengthening training cycles |
DKT+HFE [52] | 2018 | Learning-feature fusion | Applies tree-based classifiers to embed heterogeneous features into DKT implicitly and effectively | |
DKT-DSC [53] | 2018 | Learning-feature fusion | Periodically regroups students by ability, implicitly embedding student features | |
PDKT-C [54] | 2018 | Learning-feature fusion | Models prerequisite relations from item information as constraints | |
DHKT [55] | 2019 | Learning-feature fusion | Models the hierarchy between items and KCs | |
DKT+FB [56] | 2019 | Learning-feature fusion | Fuses three kinds of forgetting-related side information | |
DynEmb [58] | 2020 | Learning-feature fusion | Combines matrix factorization with an RNN | |
DKT LSTM [57] | 2021 | Learning-feature fusion | Fuses heterogeneous student features with forgetting-related information, modeled via SAEN | |
LANA [59] | 2021 | Learning-feature fusion | Extracts students' intrinsic attributes from their own interaction sequences via a novel SRFE | |
DKVMN [61] | 2017 | Dynamic key-value memory network | Draws on MANN, using a static and a dynamic external matrix to read and write students' knowledge states | Large network; limited in computing knowledge growth and in capturing dependencies between sequences |
DKVMN-CA [63] | 2019 | Dynamic key-value memory network | Improves DKVMN to support manually annotated concept trees | |
LPKT [65] | 2020 | Dynamic key-value memory network | Adopts DKVMN and refines the forgetting mechanism based on students' current knowledge | |
DKVMN-LA [66] | 2021 | Dynamic key-value memory network | Versatile knowledge tracing algorithm that incorporates students' learning abilities and behavioral features | |
CKT [72] | 2020 | 3D convolutional network | Uses 3D ConvNets to reinforce students' recent learning states and short-term exercise features | Relies on feature engineering; hard to produce accurate predictions for high-dimensional data |
GKT [77] | 2019 | Graph neural network | Builds a KC relation graph with a GNN | Computation-intensive and limited by dataset size; inconsistent KC granularity may directly affect the assessment of students' knowledge states |
GIKT [79] | 2020 | Graph neural network | Uses a GCN to extract the high-order relational information contained in the exercise-KC graph | |
HGKT [81] | 2020 | Graph neural network | Models learning dependencies among exercises using their hierarchical relations | |
JKT [80] | 2021 | Graph neural network | Joint graph convolutional network that extracts deep implicit information hidden in the "exercise-KC" graph | |
DGMN [82] | 2021 | Graph neural network | Dynamically builds latent KCs and their relation graph from knowledge states in an external memory, while modeling forgetting | |
HMN [74] | 2021 | Hierarchical memory network | Models students' memory mechanism in KT based on ASMM and DNC | Inconsistent partitioning of memory in the external memory matrix may directly affect prediction accuracy |
EERNNA [83] | 2018 | Attention mechanism | Embeds exercise text features to model students' learning process | Suppresses other information around the attended positions and ignores exercise order |
SAKT [86] | 2019 | Attention mechanism | Models students' interaction history with self-attention, reducing the influence of irrelevant exercises on the target exercise | |
SAINT [87] | 2020 | Attention mechanism | Improves SAKT, modeling the relation between exercises and student responses with deep self-attention layers | |
AKT [88] | 2020 | Attention mechanism | Context-aware representations of items and responses, extracting students' guess and slip features | |
RKT [89] | 2020 | Attention mechanism | Enhances self-attention with contextual information and models forgetting with an exponentially decaying kernel | |
EKTA [85] | 2021 | Attention mechanism | Improves EERNNA to track students' mastery of specific KCs | |
MF-DAKT [90] | 2021 | Attention mechanism | Dual attention that captures the information contained in factors and factor interactions from multiple perspectives | |
ATKT [91] | 2021 | Attention mechanism | Uses an efficient attention-LSTM to adaptively aggregate information from previous knowledge hidden states, and adversarial training to strengthen generalization |
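The following minimal PyTorch sketch shows the basic DKT formulation [41] that most models in Table 3 extend: each interaction (skill, correctness) is one-hot encoded, an LSTM summarizes the interaction history, and a linear-plus-sigmoid head outputs the predicted probability of answering each skill correctly at the next step. The layer sizes are illustrative, and the training loop (binary cross-entropy against next-step responses) is omitted.

```python
import torch
import torch.nn as nn

# Minimal DKT sketch following the RNN/LSTM formulation of [41].
# Each interaction (skill, correct) is one-hot encoded into a 2*num_skills
# vector; the LSTM hidden state summarizes the history.

class DKT(nn.Module):
    def __init__(self, num_skills, hidden_size=64):
        super().__init__()
        self.num_skills = num_skills
        self.lstm = nn.LSTM(2 * num_skills, hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, num_skills)

    def forward(self, skills, correct):
        # skills, correct: (batch, seq_len) integer tensors
        x = nn.functional.one_hot(
            skills + self.num_skills * correct, 2 * self.num_skills
        ).float()
        h, _ = self.lstm(x)
        return torch.sigmoid(self.out(h))  # (batch, seq_len, num_skills)

# Toy forward pass: 1 student, 5 interactions over 10 skills.
model = DKT(num_skills=10)
skills = torch.randint(0, 10, (1, 5))
correct = torch.randint(0, 2, (1, 5))
probs = model(skills, correct)
# Predicted probability that this student answers skill 3 correctly next:
print(probs[0, -1, 3].item())
```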
Table 4 Public dataset statistics and download links
Datasets | Students | Concepts | Records | Website |
---|---|---|---|---|
ASSISTments2009 | 4 417 | 124 | 325 637 | https://sites.google.com/site/assistmentsdata/home/assistment-2009-2010data/skill-builder-data-2009-2010 |
ASSISTments2012 | 27 405 | 265 | 2 541 201 | https://sites.google.com/site/assistmentsdata/home/2012-13-school-data-with-affect |
ASSISTments2015 | 19 917 | 100 | 683 801 | https://sites.google.com/site/assistmentsdata/home/2015-assistments-skill-builderdata |
ASSISTments2017 | 1 709 | 102 | 942 816 | https://sites.google.com/view/assistmentsdatamining/dataset?authuser=0 |
KDD Cup2010 | 574 | 436 | 607 026 | https://pslcdatashop.web.cmu.edu/KDDCup/downloads.jsp |
Statics2011 | 335 | 1 362 | 361 092 | https://pslcdatashop.web.cmu.edu/DatasetInfo?datasetId=507 |
Synthetic-5 | 4 000 | 50 | 200 000 | https://github.com/chrispiech/DeepKnowledgeTracing/tree/master/data/synthetic |
slepemapy.cz | 91 331 | 1 459 | 10 087 306 | https://www.fi.muni.cz/adaptivelearning/?a=data |
EdNet | 784 309 | 293 | 131 441 538 | https://github.com/riiid/ednet |
Junyi Academy | 238 120 | 722 | 26 666 117 | https://pslcdatashop.web.cmu.edu/DatasetInfo?datasetId=1198 |
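A typical preprocessing step for the datasets in Table 4 is to turn the raw interaction log into one ordered (skill, correctness) sequence per student. The sketch below illustrates this for an ASSISTments-style CSV; the column names user_id, order_id, skill_id, and correct are assumptions, since field names differ between dataset releases.

```python
import pandas as pd

# Hypothetical loader for an ASSISTments-style interaction log (Table 4).
# Column names vary between releases; adapt "user_id", "order_id",
# "skill_id" and "correct" to the actual CSV schema.

def load_sequences(path):
    df = pd.read_csv(path, encoding="latin-1", low_memory=False)
    # Drop interactions without a tagged KC, then order each student's log.
    df = df.dropna(subset=["skill_id"]).sort_values(["user_id", "order_id"])
    # One (skills, responses) pair of aligned sequences per student.
    return {
        uid: (g["skill_id"].astype(int).tolist(), g["correct"].astype(int).tolist())
        for uid, g in df.groupby("user_id")
    }

sequences = load_sequences("skill_builder_data_2009_2010.csv")
print(f"{len(sequences)} students loaded")
```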
Table 5 Commonly used evaluation metrics
Specific metric | Formula |
---|---|
AUC | — |
ACC | $\mathrm{ACC}=\dfrac{TP+TN}{TP+TN+FP+FN}$ |
RMSE | $\mathrm{RMSE}=\sqrt{\dfrac{1}{N}\sum_{i=1}^{N}(y_i-\hat{y}_i)^2}$ |
MAE | $\mathrm{MAE}=\dfrac{1}{N}\sum_{i=1}^{N}\lvert y_i-\hat{y}_i\rvert$ |
LL | $\mathrm{LL}=\sum_{i=1}^{N}\bigl[y_i\ln\hat{y}_i+(1-y_i)\ln(1-\hat{y}_i)\bigr]$ |

Note: the formulas are the standard definitions, with $y_i$ the observed response, $\hat{y}_i$ the predicted probability of a correct response, and $N$ the number of predictions; AUC is the area under the ROC curve and has no single closed-form expression here.
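For reference, the metrics in Table 5 can be computed as follows; the sketch assumes y_true holds the observed 0/1 responses and y_prob the model's predicted probabilities of a correct response.

```python
import numpy as np
from sklearn import metrics

# Sketch computing the Table 5 metrics for a batch of predictions.
y_true = np.array([1, 0, 1, 1, 0])            # observed 0/1 responses
y_prob = np.array([0.9, 0.3, 0.6, 0.8, 0.4])  # predicted probabilities
y_pred = (y_prob >= 0.5).astype(int)          # thresholded labels for ACC

auc  = metrics.roc_auc_score(y_true, y_prob)
acc  = metrics.accuracy_score(y_true, y_pred)
rmse = np.sqrt(np.mean((y_true - y_prob) ** 2))
mae  = np.mean(np.abs(y_true - y_prob))
ll   = np.sum(y_true * np.log(y_prob) + (1 - y_true) * np.log(1 - y_prob))
print(f"AUC={auc:.3f} ACC={acc:.3f} RMSE={rmse:.3f} MAE={mae:.3f} LL={ll:.3f}")
```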
Table 6 Model comparison
Model family | Proposed | Principle | Strengths | Limitations | Applicable scenarios |
---|---|---|---|---|---|
BKT | 1994 | HMM | Simple model with reliable pedagogical interpretability | Relies on expert-annotated matrices and simplifying assumptions | Automatically recommending items to each student based on prior knowledge states; a prior distribution must be obtained first |
FAM | 2006 | Logistic model | Simple model; adds an exercise-KC Q-matrix; reliable pedagogical interpretability | Relies on expert-annotated matrices and hand-crafted input features | Learning general parameters from historical data to model students and predict response performance; requires an expert-annotated Q-matrix |
DKT | 2015 | RNN/LSTM | Strong performance; learns input features automatically; needs no explicit expert encoding of KCs | Complex model, large training scale, no pedagogical interpretability | Scenarios that only need students' learning outcomes without explaining their knowledge states, such as automated test assembly |
DKVMN | 2017 | MANN | Relatively simple network; improves the model's memory capacity | Many parameters, large training scale | Interaction logs of students' daily practice; quickly establishing mastery of knowledge states under a single exercise-KC mapping |
GKT | 2019 | GNN | Models the relation graph among underlying KCs, with some interpretability | Inconsistent KC granularity in real teaching directly affects knowledge-state assessment | Scenarios with multiple complex exercise-KC relations and high demands on the detail of students' knowledge states, with some interpretability |
Table 7 Overview of model performance (unit: %)
Method | ASSIST2009 | ASSIST2012 | ASSIST2015 | ASSIST2017 | KDD Cup 2010 | Statics2011 | Synthetic-5 | slepemapy.cz | EdNet | Junyi |
---|---|---|---|---|---|---|---|---|---|---|
BKT | 67.00 | — | — | — | — | — | — | — | — | — |
BKT+ | — | — | — | — | — | — | 80.00 | — | — | — |
PFA | 70.00 | — | — | — | — | — | — | — | — | — |
IRT | 73.00 | 73.17 | — | — | — | 68.00 | — | — | — | — |
KTM | 81.86 | — | — | — | — | — | — | — | — | — |
RKTM | 76.50 | 76.90 | 72.62 | — | — | 83.39 | 81.86 | — | — | — |
DKT | 86.00 | — | 72.52 | — | — | 80.20 | 75.00 | — | — | — |
DKT+ | 82.27 | — | 73.71 | 73.43 | — | 83.49 | 82.64 | — | — | — |
DKT+HFE | 74.70 | — | — | — | — | — | — | — | — | 73.30 |
DKT-DSC | 91.00 | 87.00 | 87.00 | — | 81.00 | — | — | — | — | — |
PDKT-C | 78.00 | — | — | — | — | — | — | — | — | — |
DHKT | 78.66 | 77.47 | — | — | — | 83.33 | — | — | — | — |
DKT+FB | — | 73.09 | — | — | — | — | — | 80.46 | — | — |
DKT LSTM | 73.68 | — | — | — | — | 68.15 | — | — | — | — |
DynEmb | 73.90 | 73.60 | — | — | 86.80 | — | — | — | — | — |
LANA | — | — | — | — | — | — | — | — | 80.59 | — |
DKVMN | 81.57 | — | 72.68 | — | — | 82.84 | 82.73 | — | — | — |
Deep-IRT | 81.65 | — | 72.88 | — | — | 83.09 | 82.98 | — | — | — |
LPKT | 82.35 | — | 73.83 | — | — | — | — | — | — | — |
DKVMN-LA | 91.90 | — | — | — | — | — | — | — | — | — |
CKT | 82.54 | — | 72.91 | — | — | 82.41 | 82.45 | — | — | — |
KQN | 82.32 | — | 73.40 | — | — | 83.20 | 82.81 | — | — | — |
HMN | 82.73 | — | 73.01 | 73.10 | — | 83.51 | — | — | — | — |
GKT | 72.30 | — | — | — | 76.90 | — | — | — | — | — |
GIKT | 78.96 | 77.54 | — | — | — | — | — | — | 75.23 | — |
JKT | 79.80 | — | 76.50 | 74.50 | 83.40 | 85.60 | 85.90 | — | — | — |
DGMN | 86.10 | — | — | — | — | 86.40 | — | — | — | — |
SAKT | 84.80 | — | 85.40 | 73.40 | — | 85.30 | 83.20 | — | — | — |
SAINT | — | — | — | — | — | — | — | — | 78.11 | — |
AKT | 83.46 | — | 78.28 | 77.02 | — | 82.68 | — | — | — | — |
RKT | — | 79.30 | — | — | — | — | — | — | — | 86.00 |
MF-DAKT | 85.10 | — | — | — | — | — | — | — | 77.60 | — |
ATKT | 82.44 | — | 80.45 | 72.97 | — | 83.25 | — | — | — | — |
[1] BAKHSHINATEGH B, ZAIANE O R, ELATIA S, et al. Educational data mining applications and tasks: a survey of the last 10 years[J]. Education and Information Technologies, 2018, 23: 537-553.
[2] HA H, HWANG U, HONG Y, et al. Deep trustworthy knowledge tracing[J]. arXiv:1805.10768v3, 2018.
[3] LIU H Y, ZHANG T C, WU P W, et al. A review of knowledge tracking[J]. Journal of East China Normal University (Natural Science), 2019(5): 9-23.
[4] CASALINO G, GRILLI L, LIMONE P, et al. Deep learning for knowledge tracing in learning analytics: an overview[C]// Proceedings of the 1st Workshop on Technology Enhanced Learning Environments for Blended Education - The Italian e-Learning Conference 2021, Foggia, Jan 21-22, 2021: 1-10.
[5] HU X G, LIU F, BU C Y. Research advances on knowledge tracing models in educational big data[J]. Journal of Computer Research and Development, 2020, 57(12): 2523-2546.
[6] GERVET T, KOEDINGER K, SCHNEIDER J, et al. When is deep learning the best approach to knowledge tracing?[J]. Journal of Educational Data Mining, 2020, 12(3): 31-54.
[7] LIANG K, REN Y M, SHANG Y H, et al. Review of knowledge tracing preprocessing based on deep learning[J]. Computer Engineering and Applications, 2021, 57(21): 41-58.
[8] PANDEY S, KARYPIS G, SRIVASTAVA J. An empirical comparison of deep learning models for knowledge tracing on large-scale dataset[J]. arXiv:2101.06373, 2021.
[9] SAPOUNTZI A, BHULAI S, CORNELISZ I, et al. Dynamic knowledge tracing models for large-scale adaptive learning environments[J]. International Journal on Advances in Intelligent Systems, 2019, 12: 93-110.
[10] CORBETT A T, ANDERSON J R. Knowledge tracing: modeling the acquisition of procedural knowledge[J]. User Modeling and User-Adapted Interaction, 1994, 4(4): 253-278.
[11] BAUM L E, PETRIE T. Statistical inference for probabilistic functions of finite state Markov chains[J]. Annals of Mathematical Statistics, 1966, 37(6): 1554-1563.
[12] PARDOS Z A, HEFFERNAN N T. Modeling individualization in a Bayesian networks implementation of knowledge tracing[C]// LNCS 6075: Proceedings of the 18th International Conference on User Modeling, Adaptation, and Personalization, Big Island, Jun 20-24, 2010. Berlin, Heidelberg: Springer, 2010: 255-266.
[13] LEE J I, BRUNSKILL E. The impact on individualizing student models on necessary practice opportunities[C]// Proceedings of the 5th International Conference on Educational Data Mining, Chania, Jun 19-21, 2012: 118-125.
[14] YUDELSON M V, KOEDINGER K R, GORDON G J. Individualized Bayesian knowledge tracing models[C]// LNCS 7926: Proceedings of the 16th International Conference on Artificial Intelligence in Education, Memphis, Jul 9-13, 2013. Berlin, Heidelberg: Springer, 2013: 171-180.
[15] WANG D, ZHANG Z, SONG J, et al. Traditional knowledge tracing models for clustered students[J]. The Educational Review, USA, 2020, 4(12): 244-251.
[16] KÄSER T, KLINGLER S, SCHWING A G, et al. Beyond knowledge tracing: modeling skill topologies with Bayesian networks[C]// LNCS 8474: Proceedings of the 12th International Conference on Intelligent Tutoring Systems, Honolulu, Jun 5-9, 2014. Cham: Springer, 2014: 188-198.
[17] HAWKINS W J, HEFFERNAN N T. Using similarity to the previous problem to improve Bayesian knowledge tracing[C]// Proceedings of the Workshops held at Educational Data Mining 2014, co-located with the 7th International Conference on Educational Data Mining, London, Jul 4-7, 2014: 1-5.
[18] WANG Z, ZHU J L, LI X, et al. Structured knowledge tracing models for student assessment on Coursera[C]// Proceedings of the 3rd ACM Conference on Learning @ Scale, Edinburgh, Apr 25-26, 2016. New York: ACM, 2016: 209-212.
[19] KÄSER T, KLINGLER S, SCHWING A G, et al. Dynamic Bayesian networks for student modeling[J]. IEEE Transactions on Learning Technologies, 2017, 10(4): 450-462.
[20] FALAKMASIR M H, YUDELSON M, RITTER S, et al. Spectral Bayesian knowledge tracing[C]// Proceedings of the 8th International Conference on Educational Data Mining, Madrid, Jun 26-29, 2015: 360-363.
[21] WANG Y T, HEFFERNAN N T. Extending knowledge tracing to allow partial credit: using continuous versus binary nodes[C]// LNCS 7926: Proceedings of the 2013 International Conference on Artificial Intelligence in Education, Memphis, Jul 9-13, 2013. Berlin, Heidelberg: Springer, 2013: 181-188.
[22] ZHANG K, YAO Y Y. A three learning states Bayesian knowledge tracing model[J]. Knowledge-Based Systems, 2018, 148: 189-201.
[23] DE BAKER R S J, CORBETT A T, ALEVEN V. More accurate student modeling through contextual estimation of slip and guess probabilities in Bayesian knowledge tracing[C]// LNCS 5091: Proceedings of the 9th International Conference on Intelligent Tutoring Systems, Montreal, Jun 23-27, 2008. Berlin, Heidelberg: Springer, 2008: 406-415.
[24] PARDOS Z A, HEFFERNAN N T. KT-IDEM: introducing item difficulty to the knowledge tracing model[C]// LNCS 6787: Proceedings of the 19th International Conference on User Modeling, Adaption, and Personalization, Girona, Jul 11-15, 2011. Berlin, Heidelberg: Springer, 2011: 243-254.
[25] XU Y B, CHANG K M, YUAN Y R, et al. Using EEG in knowledge tracing[C]// Proceedings of the 7th International Conference on Educational Data Mining, London, Jul 4-7, 2014: 361-362.
[26] SPAULDING S, BREAZEAL C. Affect and inference in Bayesian knowledge tracing with a robot tutor[C]// Proceedings of the 10th Annual ACM/IEEE International Conference on Human-Robot Interaction, Portland, Mar 2-5, 2015. New York: ACM, 2015: 219-220.
[27] HUANG S W, LIU C H, LUO L Y, et al. Research on Bayesian knowledge tracking model integrating behavior and forgetting factors[J]. Application Research of Computers, 2021, 38(7): 1993-1997.
[28] AGARWAL D, BAKER R, MURALEEDHARAN A. Dynamic knowledge tracing through data driven recency weights[C]// Proceedings of the 13th International Conference on Educational Data Mining, Jul 10-13, 2020: 725-729.
[29] HAMBLETON R K, SWAMINATHAN H. Item response theory: a basic concept[J]. Educational Research and Reviews, 2017, 12(5): 258.
[30] DEONOVIC B, YUDELSON M, BOLSINOVA M, et al. Learning meets assessment: on the relation between item response theory and Bayesian knowledge tracing[J]. Behaviormetrika, 2018, 45(2): 457-474.
[31] CEN H, KOEDINGER K R, JUNKER B. Learning factors analysis—a general method for cognitive model evaluation and improvement[C]// LNCS 4053: Proceedings of the 8th International Conference on Intelligent Tutoring Systems, Taiwan, China, Jun 26-30, 2006. Berlin, Heidelberg: Springer, 2006: 164-175.
[32] PAVLIK P I, CEN H, KOEDINGER K R. Performance factors analysis—a new alternative to knowledge tracing[C]// Proceedings of the 14th International Conference on Artificial Intelligence in Education, Brighton, Jul 6-10, 2009. Amsterdam: IOS Press, 2009: 531-538.
[33] VIE J J, KASHIMA H. Knowledge tracing machines: factorization machines for knowledge tracing[C]// Proceedings of the 33rd AAAI Conference on Artificial Intelligence, the 31st Innovative Applications of Artificial Intelligence Conference, the 9th AAAI Symposium on Educational Advances in Artificial Intelligence, Honolulu, Jan 27-Feb 1, 2019. Menlo Park: AAAI, 2019: 750-757.
[34] RENDLE S. Factorization machines[C]// Proceedings of the 10th IEEE International Conference on Data Mining, Sydney, Dec 14-17, 2010. Washington: IEEE Computer Society, 2010: 995-1000.
[35] LAI Z F, WANG L, LING Q. Recurrent knowledge tracing machine based on the knowledge state of students[J]. Expert Systems, 2021, 38(8): e12782.
[36] VAN HOUDT G, MOSQUERA C, NÁPOLES G. A review on the long short-term memory model[J]. Artificial Intelligence Review, 2020, 53(8): 5929-5955.
[37] CHUNG J, GÜLÇEHRE C, CHO K, et al. Gated feedback recurrent neural networks[C]// Proceedings of the 32nd International Conference on Machine Learning, Lille, Jul 6-11, 2015: 2067-2075.
[38] LEE M, CHANG J H. Augmented latent features of deep neural network-based automatic speech recognition for motor-driven robots[J]. Applied Sciences, 2020, 10(13): 4602-4611.
[39] ZHANG L, XIANG X. Video event classification based on two-stage neural network[J]. Multimedia Tools and Applications, 2020, 79(29): 21471-21486.
[40] PANDEY S, KARYPIS G, SRIVASTAVA J. An empirical comparison of deep learning models for knowledge tracing on large-scale dataset[J]. arXiv:2101.06373, 2021.
[41] PIECH C, SPENCER J, HUANG J, et al. Deep knowledge tracing[J]. arXiv:1506.05908, 2015.
[42] SCHMIDHUBER J. Deep learning in neural networks: an overview[J]. Neural Networks, 2015, 61: 85-117.
[43] KASHIN B S, TEMLYAKOV V N. A remark on compressed sensing[J]. Mathematical Notes, 2007, 82(5): 748-755.
[44] KHAJAH M, LINDSEY R V, MOZER M C. How deep is knowledge tracing?[J]. arXiv:1604.02416, 2016.
[45] XIONG X L, ZHAO S Y, VAN INWEGEN E, et al. Going deeper with deep knowledge tracing[C]// Proceedings of the 9th International Conference on Educational Data Mining, Raleigh, Jun 29-Jul 2, 2016: 545-550.
[46] WIETING J, KIELA D. No training required: exploring random encoders for sentence classification[J]. arXiv:1901.10444, 2019.
[47] DING X, LARSON E C. On the interpretability of deep learning based models for knowledge tracing[J]. arXiv:2101.11335, 2021.
[48] CHOI Y, LEE Y, SHIN D, et al. EdNet: a large-scale hierarchical dataset in education[C]// LNCS 12164: Proceedings of the 21st International Conference on Artificial Intelligence in Education, Ifrane, Jul 6-10, 2020. Cham: Springer, 2020: 69-73.
[49] SU Y, CHENG Z Y, LUO P F, et al. Time-and-concept enhanced deep multidimensional item response theory for interpretable knowledge tracing[J]. Knowledge-Based Systems, 2021, 218: 106819.
[50] YEUNG C K, YEUNG D Y. Addressing two problems in deep knowledge tracing via prediction-consistent regularization[C]// Proceedings of the 5th Annual ACM Conference on Learning @ Scale, London, Jun 26-28, 2018. New York: ACM, 2018: 1-10.
[51] ZHANG L, XIONG X L, ZHAO S Y, et al. Incorporating rich features into deep knowledge tracing[C]// Proceedings of the 4th ACM Conference on Learning @ Scale, Cambridge, Apr 20-21, 2017. New York: ACM, 2017: 169-172.
[52] YANG H Q, CHEUNG L P. Implicit heterogeneous features embedding in deep knowledge tracing[J]. Cognitive Computation, 2018, 10(1): 3-14.
[53] MINN S, YU Y, DESMARAIS M C, et al. Deep knowledge tracing and dynamic student classification for knowledge tracing[C]// Proceedings of the 2018 IEEE International Conference on Data Mining, Singapore, Nov 17-20, 2018. Washington: IEEE Computer Society, 2018: 1182-1187.
[54] CHEN P H, LU Y, ZHENG V W, et al. Prerequisite-driven deep knowledge tracing[C]// Proceedings of the 2018 IEEE International Conference on Data Mining, Singapore, Nov 17-20, 2018. Washington: IEEE Computer Society, 2018: 39-48.
[55] WANG T Q, MA F L, GAO J. Deep hierarchical knowledge tracing[C]// Proceedings of the 12th International Conference on Educational Data Mining, Montréal, Jul 2-5, 2019: 671-674.
[56] NAGATANI K, ZHANG Q, SATO M, et al. Augmenting knowledge tracing by considering forgetting behavior[C]// Proceedings of the 2019 World Wide Web Conference, San Francisco, May 13-17, 2019. New York: ACM, 2019: 3101-3107.
[57] YU L S, ZHENG X P. Research on deep knowledge tracking incorporating rich features and forgetting behaviors[J/OL]. Journal of Harbin Institute of Technology (New Series) (2021-01-12) [2021-10-21]. http://kns.cnki.net/kcms/detail/23.1378.T.20210112.1403.004.html.
[58] XU L, DAVENPORT M A. Dynamic knowledge embedding and tracing[J]. arXiv:2005.09109, 2020.
[59] ZHOU Y, LI X, CAO Y, et al. LANA: towards personalized deep knowledge tracing through distinguishable interactive sequences[J]. arXiv:2105.06266, 2021.
[60] WANG L, SY A, LIU L, et al. Deep knowledge tracing on programming exercises[C]// Proceedings of the 4th ACM Conference on Learning @ Scale, Cambridge, Apr 20-21, 2017. New York: ACM, 2017: 201-204.
[61] ZHANG J N, SHI X J, KING I, et al. Dynamic key-value memory networks for knowledge tracing[C]// Proceedings of the 26th International Conference on World Wide Web, Perth, Apr 3-7, 2017. New York: ACM, 2017: 765-774.
[62] SANTORO A, BARTUNOV S, BOTVINICK M M, et al. Meta-learning with memory-augmented neural networks[C]// Proceedings of the 33rd International Conference on Machine Learning, New York, Jun 19-24, 2016: 1842-1850.
[63] AI F Z, CHEN Y S, GUO Y C, et al. Concept-aware deep knowledge tracing and exercise recommendation in an online learning system[C]// Proceedings of the 12th International Conference on Educational Data Mining, Montréal, Jul 2-5, 2019: 240-245.
[64] YEUNG C K. Deep-IRT: make deep learning based knowledge tracing explainable using item response theory[J]. arXiv:1904.11738, 2019.
[65] ZOU Y, YAN X, LI W. Knowledge tracking model based on learning process[J]. Journal of Computer and Communications, 2020, 8(10): 7-17.
[66] SUN X, ZHAO X, LI B, et al. Dynamic key-value memory networks with rich features for knowledge tracing[J]. IEEE Transactions on Cybernetics, 2021. DOI: 10.1109/TCYB.2021.3051028.
[67] WEBER M, WALD T, ZÖLLNER J M. Temporal feature networks for CNN based object detection[J]. arXiv:2103.12213, 2021.
[68] ALAM M, WANG J F, GUANGPEI C, et al. Convolutional neural network for the semantic segmentation of remote sensing images[J]. Mobile Networks and Applications, 2021, 26(1): 200-215.
[69] HARIZI R, WALHA R, DRIRA F, et al. Convolutional neural network with joint stepwise character/word modeling based system for scene text recognition[J]. Multimedia Tools and Applications, 2021, 81(3): 3091-3106.
[70] YAN M J, MENG J J, ZHOU C L, et al. Detecting spatiotemporal irregularities in videos via a 3D convolutional autoencoder[J]. Journal of Visual Communication and Image Representation, 2020, 67: 102747.
[71] MAQSOOD R, BAJWA U I, SALEEM G, et al. Anomaly recognition from surveillance videos using 3D convolution neural network[J]. Multimedia Tools and Applications, 2021, 80(12): 18693-18716.
[72] YANG S H, ZHU M X, HOU J Y, et al. Deep knowledge tracing with convolutions[J]. arXiv:2008.01169, 2020.
[73] LEE J, YEUNG D Y. Knowledge query network for knowledge tracing: how knowledge interacts with skills[C]// Proceedings of the 9th International Conference on Learning Analytics & Knowledge, Tempe, Mar 4-8, 2019. New York: ACM, 2019: 491-500.
[74] LIU S Y, ZOU R, SUN J W, et al. A hierarchical memory network for knowledge tracing[J]. Expert Systems with Applications, 2021, 177: 114935.
[75] ATKINSON R C, SHIFFRIN R M. Human memory: a proposed system and its control processes[M]// SPENCE K W, SPENCE J T. Psychology of Learning and Motivation. New York: Elsevier Science Inc., 1968.
[76] GRAVES A, WAYNE G, REYNOLDS M, et al. Hybrid computing using a neural network with dynamic external memory[J]. Nature, 2016, 538(7626): 471-476.
[77] NAKAGAWA H, IWASAWA Y, MATSUO Y. Graph-based knowledge tracing: modeling student proficiency using graph neural network[C]// Proceedings of the 2019 IEEE/WIC/ACM International Conference on Web Intelligence, Thessaloniki, Oct 14-17, 2019. New York: ACM, 2019: 156-163.
[78] VELIČKOVIĆ P, CUCURULL G, CASANOVA A, et al. Graph attention networks[J]. arXiv:1710.10903, 2017.
[79] YANG Y, SHEN J, QU Y R, et al. GIKT: a graph-based interaction model for knowledge tracing[J].
[80] SONG X, LI J, TANG Y, et al. JKT: a joint graph convolutional network based deep knowledge tracing[J]. Information Sciences, 2021, 580: 510-523.
[81] TONG H, ZHOU Y, WANG Z. HGKT: introducing problem schema with hierarchical exercise graph for knowledge tracing[J]. arXiv:2006.16915, 2020.
[82] ABDELRAHMAN G, WANG Q. Deep graph memory networks for forgetting-robust knowledge tracing[J].
[83] SU Y, LIU Q W, LIU Q, et al. Exercise-enhanced sequential modeling for student performance prediction[C]// Proceedings of the 32nd AAAI Conference on Artificial Intelligence, the 30th Innovative Applications of Artificial Intelligence, and the 8th AAAI Symposium on Educational Advances in Artificial Intelligence, New Orleans, Feb 2-7, 2018. Menlo Park: AAAI, 2018: 2435-2443.
[84] MIKOLOV T, SUTSKEVER I, CHEN K, et al. Distributed representations of words and phrases and their compositionality[C]// Proceedings of the 27th Annual Conference on Neural Information Processing Systems, Lake Tahoe, Dec 5-8, 2013. Red Hook: Curran Associates, 2013: 3111-3119.
[85] LIU Q, HUANG Z Y, YIN Y, et al. EKT: exercise-aware knowledge tracing for student performance prediction[J]. IEEE Transactions on Knowledge and Data Engineering, 2021, 33(1): 100-115.
[86] PANDEY S, KARYPIS G. A self-attentive model for knowledge tracing[J]. arXiv:1907.06837, 2019.
[87] CHOI Y, LEE Y, CHO J, et al. Towards an appropriate query, key, and value computation for knowledge tracing[C]// Proceedings of the 7th ACM Conference on Learning @ Scale. New York: ACM, 2020: 341-344.
[88] GHOSH A, HEFFERNAN N T, LAN A S. Context-aware attentive knowledge tracing[C]// Proceedings of the 26th ACM SIGKDD Conference on Knowledge Discovery and Data Mining. New York: ACM, 2020: 2330-2339.
[89] PANDEY S, SRIVASTAVA J. RKT: relation-aware self-attention for knowledge tracing[C]// Proceedings of the 29th ACM International Conference on Information & Knowledge Management. New York: ACM, 2020: 1205-1214.
[90] ZHANG M Y, ZHU X N, ZHANG C H, et al. Multi-factors aware dual-attentional knowledge tracing[C]// Proceedings of the 30th ACM International Conference on Information & Knowledge Management. New York: ACM, 2021: 2588-2597.
[91] GUO X P, HUANG Z J, GAO J, et al. Enhancing knowledge tracing via adversarial training[C]// Proceedings of the ACM Multimedia Conference. New York: ACM, 2021: 367-375.
[92] LIU Q, WU R Z, CHEN E H, et al. Fuzzy cognitive diagnosis for modelling examinee performance[J]. ACM Transactions on Intelligent Systems and Technology, 2018, 9(4): 1-26.