Journal of Frontiers of Computer Science and Technology, 2016, Vol. 10, Issue (1): 130-141. DOI: 10.3778/j.issn.1673-9418.1505007

• Artificial Intelligence and Pattern Recognition •

Self-Organizing Incremental Associative Memory Model under Capacity Constraint

SUN Tao, XIE Zhenping+, WANG Shitong, LIU Yuan   

  1. School of Digital Media, Jiangnan University, Wuxi, Jiangsu 214122, China
  • Online: 2016-01-01    Published: 2016-01-07


Abstract: Self-organizing associative memory neural networks have been widely used owing to their parallelism, fault tolerance, and self-learning ability. However, in existing mainstream models, the number of network nodes may grow without bound as larger and larger sample sets are learned incrementally, which inevitably leads to unaffordable memory and computation overhead in practical applications. To address this problem, this paper proposes a self-organizing incremental associative memory model under capacity constraint. By taking the number of network nodes as a primary control parameter and designing a new self-competition learning strategy between nodes, the new model can incrementally learn large-scale samples and achieve high associative memory performance at a low computational capacity. Theoretical analysis proves the correctness and effectiveness of the new model, and experimental results further demonstrate that it can effectively control computational capacity, improve the efficiency of incrementally learning new samples, and obtain high associative memory performance, thus better satisfying the demands of practical applications.
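The mechanism outlined in the abstract, a hard cap on the node count combined with inter-node self-competition, can be illustrated with a minimal sketch. The code below is a hypothetical illustration, not the paper's actual algorithm: the class name CapacityConstrainedMemory, the parameters max_nodes, insert_threshold, and lr, and the merge-the-two-closest-prototypes competition rule are all assumptions introduced here.

```python
import numpy as np

class CapacityConstrainedMemory:
    """Minimal sketch of a capacity-constrained self-organizing
    incremental associative memory. All names, thresholds, and the
    competition rule are illustrative assumptions."""

    def __init__(self, max_nodes=100, insert_threshold=0.5, lr=0.1):
        self.max_nodes = max_nodes            # hard capacity constraint
        self.insert_threshold = insert_threshold
        self.lr = lr
        self.nodes = []                       # prototype vectors
        self.wins = []                        # win counts per node

    def learn(self, x):
        x = np.asarray(x, dtype=float)
        if not self.nodes:
            self.nodes.append(x.copy())
            self.wins.append(1)
            return
        # find the nearest existing node (best-matching unit)
        dists = [np.linalg.norm(x - n) for n in self.nodes]
        i = int(np.argmin(dists))
        if dists[i] <= self.insert_threshold:
            # familiar sample: adapt the winner toward it
            self.nodes[i] += self.lr * (x - self.nodes[i])
            self.wins[i] += 1
        else:
            # novel sample: insert a new node
            self.nodes.append(x.copy())
            self.wins.append(1)
            if len(self.nodes) > self.max_nodes:
                self._compete()

    def _compete(self):
        # self-competition: merge the two closest prototypes so the
        # node count never exceeds max_nodes (one possible rule)
        a, b, best = 0, 1, np.inf
        for p in range(len(self.nodes)):
            for q in range(p + 1, len(self.nodes)):
                d = np.linalg.norm(self.nodes[p] - self.nodes[q])
                if d < best:
                    a, b, best = p, q, d
        wa, wb = self.wins[a], self.wins[b]
        # weighted merge keeps the centroid of the two prototypes
        self.nodes[a] = (wa * self.nodes[a] + wb * self.nodes[b]) / (wa + wb)
        self.wins[a] = wa + wb
        del self.nodes[b], self.wins[b]

    def recall(self, x):
        # associative recall: return the nearest stored prototype
        x = np.asarray(x, dtype=float)
        i = int(np.argmin([np.linalg.norm(x - n) for n in self.nodes]))
        return self.nodes[i]

# Usage sketch: node count stays within the capacity budget
mem = CapacityConstrainedMemory(max_nodes=50)
for sample in np.random.rand(1000, 8):
    mem.learn(sample)
print(len(mem.nodes))   # <= 50
```

Merging the two closest prototypes is only one simple competition rule; the paper's strategy may differ, but any rule that fuses or removes redundant nodes keeps the memory within its capacity budget while incremental learning continues.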

Key words: associative memory, capacity constraint, incremental learning, self-organizing, neural network