计算机科学与探索 ›› 2022, Vol. 16 ›› Issue (2): 296-304. DOI: 10.3778/j.issn.1673-9418.2107031

• Survey · Exploration •

Survey of Chinese Named Entity Recognition (中文命名实体识别综述)

ZHAO Shan (赵山), LUO Rui (罗睿), CAI Zhiping (蔡志平)+

  1. College of Computer, National University of Defense Technology, Changsha 410073, China
  • Received: 2021-07-08  Revised: 2021-09-22  Online: 2022-02-01  Published: 2021-09-28
  • Corresponding author: + E-mail: zpcai@nudt.edu.cn
  • About author: ZHAO Shan, born in 1990 in Lu'an, Anhui, Ph.D. candidate. His research interest is natural language processing.
    LUO Rui, born in 1998 in Gao'an, Jiangxi, M.S. candidate. His research interest is natural language processing.
    CAI Zhiping, born in 1975 in Yiyang, Hunan, professor, Ph.D. supervisor, distinguished member of CCF and secretary-general of the Technical Committee on Theoretical Computer Science. His research interests include network security, big data and dispersed computing.
  • Supported by:
    National Key Research and Development Program of China (2020YFC2003400)

Abstract:

Chinese named entity recognition (NER) is a sub-task of information extraction: given a piece of unstructured text, the goal is to find, identify, and classify the relevant entities in each sentence, such as names of people, places, and organizations. Chinese NER is a fundamental task in natural language processing (NLP) and plays an important role in many downstream NLP tasks, including information retrieval, relation extraction, and question answering. This paper provides a comprehensive review of existing neural network-based word-character lattice models for Chinese NER. It first explains why Chinese NER is more difficult than English NER, highlighting challenges such as the difficulty of determining entity boundaries in Chinese text and the complexity of Chinese grammatical structure. It then surveys the most representative lattice-structured Chinese NER models built on different neural network architectures: RNN (recurrent neural network), CNN (convolutional neural network), GNN (graph neural network), and Transformer. Because word sequences provide additional boundary information for character-based sequence learning, prior work integrates word information into the character sequence through a word-character lattice, so that the lexical information associated with each character is exploited explicitly. On the Chinese NER task, these neural word-character lattice models perform significantly better than purely word-based or character-based approaches. Finally, the paper introduces the datasets and evaluation criteria for Chinese NER.
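As a concrete illustration of the word-character lattice idea summarized above, the minimal sketch below (not taken from any of the surveyed models) matches a toy lexicon against a Chinese character sequence and attaches every matched word to the characters it spans. The lexicon, example sentence, and the helper name build_lattice are illustrative assumptions.

```python
# Minimal sketch (illustrative, not from the paper): build a word-character
# lattice by matching a lexicon against a Chinese character sequence, so that
# every character gains explicit access to the words that cover it.
from collections import defaultdict

def build_lattice(sentence, lexicon, max_word_len=4):
    """Return the character list and, for each character position,
    the lexicon words (with spans) that cover that position."""
    chars = list(sentence)
    lattice = defaultdict(list)  # char index -> list of (start, end, word)
    for start in range(len(chars)):
        upper = min(start + max_word_len, len(chars))
        for end in range(start + 2, upper + 1):  # candidate words of length >= 2
            word = sentence[start:end]
            if word in lexicon:
                for pos in range(start, end):
                    lattice[pos].append((start, end, word))
    return chars, lattice

if __name__ == "__main__":
    toy_lexicon = {"南京", "南京市", "市长", "长江", "大桥", "长江大桥"}
    sentence = "南京市长江大桥"  # classic boundary-ambiguous example
    chars, lattice = build_lattice(sentence, toy_lexicon)
    for i, ch in enumerate(chars):
        print(i, ch, [w for _, _, w in lattice[i]])
```

Running the sketch shows each character paired with candidate words such as 南京市 and 长江大桥; this per-character lexical evidence about plausible entity boundaries is what lattice-structured models feed into their RNN, CNN, GNN, or Transformer encoders.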

Key words: named entity recognition (NER), lattice structure, neural network

CLC number: