Journal of Frontiers of Computer Science and Technology ›› 2023, Vol. 17 ›› Issue (3): 678-686. DOI: 10.3778/j.issn.1673-9418.2105039

• Artificial Intelligence·Pattern Recognition •

Online Graph Regularized Non-negative Matrix Factorization Cross-Modal Hashing

LUO Xuemei, ZHENG Haihong, AN Yaqiang, WANG Di   

  1. College of Computer Science and Technology, Xidian University, Xi'an 710071, China
  • Online: 2023-03-01   Published: 2023-03-01

Abstract: Owing to its low storage cost and fast query speed, cross-modal hashing is an effective cross-media retrieval technique that has been widely studied in recent years. However, most existing cross-modal hashing methods learn hash functions in a batch mode, which cannot handle large-scale datasets effectively, consumes a large amount of memory, and is inefficient when training on streaming data. Online learning can be applied to cross-modal hashing to address these problems, but most online cross-modal hashing methods focus mainly on mapping the data of different modalities into a common low-dimensional space, so as to eliminate the heterogeneity between modalities and enable cross-modal retrieval, while the correlations among data within the same modality are not fully exploited, which limits the retrieval accuracy of the learned hash codes. This paper proposes an online graph regularized non-negative matrix factorization cross-modal hashing (OGNMFH) retrieval method. By fully exploiting the local manifold structure within each modality and the category labels of the data, the method preserves both inter-modal and intra-modal similarity and obtains more discriminative hash codes. Extensive experiments on three classical datasets demonstrate that OGNMFH improves the retrieval accuracy of online hash learning.

Key words: cross-modal retrieval, online hashing, matrix factorization, discrete optimization
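
The abstract names graph-regularized non-negative matrix factorization as the building block that preserves intra-modal manifold structure. The NumPy sketch below is not the paper's OGNMFH algorithm; it only illustrates, under standard assumptions, how a graph regularizer Tr(V^T L V) built from a sample affinity matrix W is folded into the usual multiplicative NMF updates, and how the learned latent codes might be naively binarized into hash bits. All names (graph_regularized_nmf, lam, etc.) are hypothetical.

import numpy as np

def graph_regularized_nmf(X, W, k, lam=1.0, n_iter=200, eps=1e-9, seed=0):
    """Graph-regularized NMF sketch: factorize non-negative X (m x n) as U @ V.T,
    with a Tr(V.T @ L @ V) penalty that keeps latent codes V smooth over the
    sample affinity graph W (n x n)."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    U = rng.random((m, k))
    V = rng.random((n, k))
    D = np.diag(W.sum(axis=1))            # degree matrix; graph Laplacian L = D - W
    for _ in range(n_iter):
        # multiplicative update for the basis matrix U
        U *= (X @ V) / (U @ (V.T @ V) + eps)
        # multiplicative update for V; lam pulls V toward graph-smooth solutions
        V *= (X.T @ U + lam * (W @ V)) / (V @ (U.T @ U) + lam * (D @ V) + eps)
    return U, V

# toy usage (illustration only, not the paper's experimental setup)
rng = np.random.default_rng(1)
X = rng.random((50, 100))                 # 50 features, 100 samples of one modality
S = X.T @ X                               # crude sample affinity via inner products
W = (S > np.percentile(S, 95)).astype(float)
np.fill_diagonal(W, 0)
U, V = graph_regularized_nmf(X, W, k=16, lam=0.5)
B = (V > V.mean(axis=0)).astype(int)      # naive thresholding of latent codes into hash bits

In a cross-modal setting, one would typically factorize each modality with a shared (or aligned) latent representation and binarize that shared code; the single-modality loop above is only meant to show where the graph term enters the updates.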
