[1] LAMPLE G, BALLESTEROS M, SUBRAMANIAN S, et al. Neural architectures for named entity recognition[C]//Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, San Diego, Jun 12-17, 2016. Stroudsburg: ACL, 2016: 260-270.
[2] MA X Z, HOVY E H. End-to-end sequence labeling via bi-directional LSTM-CNNs-CRF[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics, Berlin, Aug 7-12, 2016. Stroudsburg: ACL, 2016: 1064-1074.
[3] ZHANG B L, WHITEHEAD S, HUANG L F, et al. Global attention for name tagging[C]//Proceedings of the 22nd Conference on Computational Natural Language Learning, Brussels, Oct 31-Nov 1, 2018. Stroudsburg: ACL, 2018: 86-96.
[4] LIU L Y, SHANG J B, REN X, et al. Empower sequence labeling with task-aware neural language model[C]//Proceedings of the 32nd AAAI Conference on Artificial Intelligence, the 30th Innovative Applications of Artificial Intelligence, and the 8th AAAI Symposium on Educational Advances in Artificial Intelligence, New Orleans, Feb 2-7, 2018. Menlo Park: AAAI, 2018: 5253-5260.
[5] PETERS M E, NEUMANN M, IYYER M, et al. Deep contextualized word representations[C]//Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, New Orleans, Jun 1-6, 2018. Stroudsburg: ACL, 2018: 2227-2237.
[6] DEVLIN J, CHANG M W, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Minneapolis, Jun 2-7, 2019. Stroudsburg: ACL, 2019: 4171-4186.
[7] LIN B Y, LU W. Neural adaptation layers for cross-domain named entity recognition[C]//Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, Brussels, Oct 31-Nov 4, 2018. Stroudsburg: ACL, 2018: 2012-2022.
[8] DIAO S Z, XU R J, SU H J, et al. Taming pre-trained language models with N-gram representations for low-resource domain adaptation[C]//Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, Aug 1-6, 2021. Stroudsburg: ACL, 2021: 3336-3349.
[9] LI X N, YAN H, QIU X P, et al. FLAT: Chinese NER using flat-lattice transformer[C]//Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, Jul 5-10, 2020. Stroudsburg: ACL, 2020: 6836-6842.
[10] LAI Y X, LIU Y J, FENG Y S, et al. Lattice-BERT: leveraging multi-granularity representations in Chinese pre-trained language models[C]//Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Jun 6-11, 2021. Stroudsburg: ACL, 2021: 1716-1731.
[11] FENG X C, FENG X C, QIN B, et al. Improving low resource named entity recognition using cross-lingual knowledge transfer[C]//Proceedings of the 27th International Joint Conference on Artificial Intelligence, Stockholm, Jul 13-19, 2018: 4071-4077.
[12] HUANG L F, JI H, MAY J. Cross-lingual multi-level adversarial transfer to enhance low-resource name tagging[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Minneapolis, Jun 2-7, 2019. Stroudsburg: ACL, 2019: 3823-3833.
[13] KRUENGKRAI C, NGUYEN T H, ALJUNIED S M, et al. Improving low-resource named entity recognition using joint sentence and token labeling[C]//Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, Jul 5-10, 2020. Stroudsburg: ACL, 2020: 5898-5905.
[14] LIN Y, YANG S Q, STOYANOV V, et al. A multi-lingual multi-task architecture for low-resource sequence labeling[C]//Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics, Melbourne, Jul 15-20, 2018. Stroudsburg: ACL, 2018: 799-809.
[15] ZHANG S, LI S F, JIANG F G, et al. Recognizing small-sample biomedical named entity based on contextual domain relevance[C]//Proceedings of the 2019 IEEE 3rd Information Technology, Networking, Electronic and Automation Control Conference (ITNEC), Chengdu, Mar 15-17, 2019. Piscataway: IEEE, 2019: 1509-1516.
[16] JU M Z, MIWA M, ANANIADOU S. A neural layered model for nested named entity recognition[C]//Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, New Orleans, Jun 1-6, 2018. Stroudsburg: ACL, 2018: 1446-1459.
[17] SOHRAB M G, MIWA M. Deep exhaustive model for nested named entity recognition[C]//Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, Brussels, Oct 31-Nov 4, 2018. Stroudsburg: ACL, 2018: 2843-2849.
[18] LUO Y, ZHAO H. Bipartite flat-graph network for nested named entity recognition[J]. arXiv:2005.00436, 2020.
[19] HARRIS Z S. Distributional structure[J]. Word, 1954, 10(2/3): 146-162.
[20] CHURCH K W. Word2Vec[J]. Natural Language Engineering, 2017, 23(1): 155-162.
[21] PENNINGTON J, SOCHER R, MANNING C D. GloVe: global vectors for word representation[C]//Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing, Doha, Oct 25-29, 2014. Stroudsburg: ACL, 2014: 1532-1543.
[22] XU L, TONG Y, DONG Q Q, et al. CLUENER2020: fine-grained named entity recognition dataset and benchmark for Chinese[J]. arXiv:2001.04351, 2020.
[23] LEVOW G A. The third international Chinese language processing bakeoff: word segmentation and named entity recognition[C]//Proceedings of the 5th SIGHAN Workshop on Chinese Language Processing, Sydney, Jul 22-23, 2006. Stroudsburg: ACL, 2006: 108-117.
[24] ZHANG Y, YANG J. Chinese NER using lattice LSTM[C]//Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics, Melbourne, Jul 15-20, 2018. Stroudsburg: ACL, 2018: 1554-1564.