[1] YADAV V, BETHARD S. A survey on recent advances in named entity recognition from deep learning models[C]//Proceedings of the 27th International Conference on Computational Linguistics, Santa Fe, Aug 20-26, 2018. Stroudsburg: ACL, 2018: 2145-2158.
[2] COLLOBERT R, WESTON J, BOTTOU L, et al. Natural language processing (almost) from scratch[J]. Journal of Machine Learning Research, 2011, 12: 2493-2537.
[3] HUANG Z, XU W, YU K. Bidirectional LSTM-CRF models for sequence tagging[J]. arXiv:1508.01991, 2015.
[4] LAMPLE G, BALLESTEROS M, SUBRAMANIAN S, et al. Neural architectures for named entity recognition[C]//Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, San Diego, Jun 12-17, 2016. Stroudsburg: ACL, 2016: 260-270.
[5] CHIU J P C, NICHOLS E. Named entity recognition with bidirectional LSTM-CNNs[J]. Transactions of the Association for Computational Linguistics, 2016, 4: 357-370.
[6] MA X Z, HOVY E. End-to-end sequence labeling via bi-directional LSTM-CNNs-CRF[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics, Berlin, Aug 7-12, 2016. Stroudsburg: ACL, 2016: 1064-1074.
[7] ZHANG B L, PAN X M, WANG T L, et al. Name tagging for low-resource incident languages based on expectation-driven learning[C]//Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, San Diego, Jun 12-17, 2016. Stroudsburg: ACL, 2016: 249-259.
[8] ZHUANG F Z, QI Z, DUAN K, et al. A comprehensive survey on transfer learning[J]. arXiv:1911.02685, 2019.
[9] GOODFELLOW I J, POUGET-ABADIE J, MIRZA M, et al. Generative adversarial nets[C]//Proceedings of the Annual Conference on Neural Information Processing Systems, Montreal, Dec 8-13, 2014. Red Hook: Curran Associates, 2014: 2672-2680.
[10] LIU L, WANG D B. A review on named entity recognition[J]. Journal of the China Society for Scientific and Technical Information, 2018, 37(3): 329-340.
刘浏, 王东波. 命名实体识别研究综述[J]. 情报学报, 2018, 37(3): 329-340.
[11] LI J, SUN A, HAN J, et al. A survey on deep learning for named entity recognition[J]. arXiv:1812.09449, 2018.
[12] PAN S J, YANG Q. A survey on transfer learning[J]. IEEE Transactions on Knowledge and Data Engineering, 2010, 22(10): 1345-1359.
[13] WEISS K, KHOSHGOFTAAR T M, WANG D D. A survey of transfer learning[J]. Journal of Big Data, 2016, 3(1): 9.
[14] NADEAU D, SEKINE S. A survey of named entity recognition and classification[J]. Lingvisticae Investigationes, 2007, 30(1): 3-26.
[15] LI J, SUN A X, JOTY S R. SegBot: a generic neural text segmentation model with pointer network[C]//Proceedings of the 27th International Joint Conference on Artificial Intelligence, Stockholm, Jul 13-19, 2018: 4166-4172.
[16] HUMPHREYS K, GAIZAUSKAS R, AZZAM S, et al. University of Sheffield: description of the LaSIE-II system as used for MUC-7[C]//Proceedings of the 7th Message Understanding Conference, Fairfax, Apr 29-May 1, 1998. Stroudsburg: ACL, 1998: 1-20.
[17] KRUPKA G R, HAUSMAN K. IsoQuest Inc.: description of the NetOwl extractor system as used for MUC-7[C]//Proceedings of the 7th Message Understanding Conference, Fairfax, Apr 29-May 1, 1998. Stroudsburg: ACL, 1998: 21-28.
[18] BLACK W J, RINALDI F, MOWATT D. FACILE: description of the NE system used for MUC-7[C]//Proceedings of the 7th Message Understanding Conference, Fairfax, Apr 29-May 1, 1998. Stroudsburg: ACL, 1998: 1-10.
[19] EDDY S R. Hidden Markov models[J]. Current Opinion in Structural Biology, 1996, 6(3): 361-365.
[20] QUINLAN J R. Induction of decision trees[J]. Machine Learning, 1986, 1(1): 81-106.
[21] KAPUR J N. Maximum-entropy models in science and engineering[M]. New York: John Wiley & Sons, Inc., 1989.
[22] SHAWE-TAYLOR J, CRISTIANINI N. Support vector machines: an introduction to support vector machines and other kernel-based learning methods[M]. New York: Cambridge University Press, 2000.
[23] LAFFERTY J, MCCALLUM A, PEREIRA F C N. Conditional random fields: probabilistic models for segmenting and labeling sequence data[C]//Proceedings of the 18th International Conference on Machine Learning, Williamstown, Jun 28-Jul 1, 2001. San Francisco: Morgan Kaufmann Publishers Inc., 2001: 282-289.
[24] STRUBELL E, VERGA P, BELANGER D, et al. Fast and accurate entity recognition with iterated dilated convolutions[C]//Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, Copenhagen, Sep 9-11, 2017. Stroudsburg: ACL, 2017: 2670-2680.
[25] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]//Proceedings of the Annual Conference on Neural Information Processing Systems, Long Beach, Dec 4-9, 2017. Red Hook: Curran Associates, 2017: 5998-6008.
[26] SHEN Y Y, YUN H, LIPTON Z C, et al. Deep active learning for named entity recognition[C]//Proceedings of the 2nd Workshop on Representation Learning for NLP, Vancouver, Aug 3, 2017. Stroudsburg: ACL, 2017: 252-256.
[27] YANG Z, SALAKHUTDINOV R, COHEN W. Multi-task cross-lingual sequence tagging from scratch[J]. arXiv:1603.06270, 2016.
[28] CHEN S D, OUYANG X Y. Overview of named entity recog-nition technology[J]. Radio Communications Technology, 2020, 46(3): 251-260.
陈曙东, 欧阳小叶. 命名实体识别技术综述[J]. 无线电通信技术, 2020, 46(3): 251-260.
[29] LI G, HUANG Y F. An approach to named entity recognition towards micro-blog[J]. Application of Electronic Technique, 2018, 44(1): 118-120.
李刚, 黄永峰. 一种面向微博文本的命名实体识别方法[J]. 电子技术应用, 2018, 44(1): 118-120.
[30] SHENG J. Transfer learning in named entity recognition[D]. Harbin: Harbin Institute of Technology, 2019.
[31] XU H L, LI Y Q, HE Y Q, et al. Research on Chinese nested named entity relation extraction[J]. Acta Scientiarum Naturalium Universitatis Pekinensis, 2019, 55(1): 8-14.
许浩亮, 李雁群, 何云琪, 等. 中文嵌套命名实体关系抽取研究[J]. 北京大学学报(自然科学版), 2019, 55(1): 8-14.
[32] XIA C Y, ZHANG C W, YANG T, et al. Multi-grained named entity recognition[C]//Proceedings of the 57th Conference of the Association for Computational Linguistics, Florence, Jul 28-Aug 2, 2019. Stroudsburg: ACL, 2019: 1430-1440.
[33] ZHUANG F Z, LUO P, HE Q, et al. Survey on transfer learning research[J]. Journal of Software, 2015, 26(1): 26-39.
庄福振, 罗平, 何清, 等. 迁移学习研究进展[J]. 软件学报, 2015, 26(1): 26-39.
[34] DAI W Y, YANG Q, XUE G R, et al. Boosting for transfer learning[C]//Proceedings of the 24th International Conference on Machine Learning, Corvallis, Jun 20-24, 2007. New York: ACM, 2007: 193-200.
[35] HUANG J Y, SMOLA A J, GRETTON A, et al. Correcting sample selection bias by unlabeled data[C]//Proceedings of the 20th Annual Conference on Neural Information Processing Systems, Vancouver, Dec 4-7, 2006. Cambridge: MIT Press, 2006: 601-608.
[36] PAN S J, TSANG I W, KWOK J T, et al. Domain adaptation via transfer component analysis[C]//Proceedings of the 21st International Joint Conference on Artificial Intelligence, Pasadena, Jul 11-17, 2009: 1187-1192.
[37] BORGWARDT K M, GRETTON A, RASCH M J, et al. Integrating structured biological data by kernel maximum mean discrepancy[J]. Bioinformatics, 2006, 22(14): e49-e57.
[38] DUAN L, XU D, TSANG I W, et al. Domain adaptation from multiple sources: a domain-dependent regularization approach[J]. IEEE Transactions on Neural Networks, 2012, 23(3): 504-518.
[39] TOMMASI T, CAPUTO B. The more you know, the less you learn: from knowledge transfer to one-shot learning of object categories[C]//Proceedings of the British Machine Vision Conference, London, Sep 7-10, 2009. British Machine Vision Association, 2009: 1-11.
[40] YAO Y, DORETTO G. Boosting for transfer learning with multiple sources[C]//Proceedings of the 23rd IEEE Conference on Computer Vision and Pattern Recognition, San Francisco, Jun 13-18, 2010. Washington: IEEE Computer Society, 2010: 1855-1862.
[41] TZENG E, HOFFMAN J, ZHANG N, et al. Deep domain confusion: maximizing for domain invariance[J]. arXiv:1412.3474, 2014.
[42] LONG M S, CAO Y, WANG J M, et al. Learning transferable features with deep adaptation networks[C]//Proceedings of the 32nd International Conference on Machine Learning, Lille, Jul 6-11, 2015: 97-105.
[43] GRETTON A, BORGWARDT K M, RASCH M J, et al. A kernel two-sample test[J]. Journal of Machine Learning Research, 2012, 13: 723-773.
[44] LONG M S, ZHU H, WANG J M, et al. Deep transfer learning with joint adaptation networks[C]//Proceedings of the 34th International Conference on Machine Learning, Sydney, Aug 6-11, 2017: 2208-2217.
[45] GANIN Y, USTINOVA E, AJAKAN H, et al. Domain-adversarial training of neural networks[M]//CSURKA G. Domain Adaptation in Computer Vision Applications. Berlin, Heidelberg: Springer, 2017.
[46] TZENG E, HOFFMAN J, SAENKO K, et al. Adversarial discriminative domain adaptation[C]//Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, Jul 21-26, 2017. Washington: IEEE Computer Society, 2017: 2962-2971.
[47] ZHANG J, DING Z, LI W, et al. Importance weighted adversarial nets for partial domain adaptation[C]//Proceedings of the 2018 IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, Jun 18-22, 2018. Washington: IEEE Computer Society, 2018: 8156-8164.
[48] ZIRIKLY A, HAGIWARA M. Cross-lingual transfer of named entity recognizers without parallel corpora[C]//Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing of the Asian Federation of Natural Language Processing, Beijing, Jul 26-31, 2015. Stroudsburg: ACL, 2015: 390-396.
[49] NI J, DINU G, FLORIAN R, et al. Weakly supervised cross-lingual named entity recognition via effective annotation and representation projection[C]//Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics, Vancouver, Jul 30-Aug 4, 2017. Stroudsburg: ACL, 2017: 1470-1480.
[50] YAROWSKY D, NGAI G, WICENTOWSKI R. Inducing multilingual text analysis tools via robust projection across aligned corpora[C]//Proceedings of the 1st International Conference on Human Language Technology Research, San Diego, Mar 18-21, 2001. San Francisco: Morgan Kaufmann Publishers Inc., 2001: 1-8.
[51] FENG X C, FENG X C, QIN B, et al. Improving low resource named entity recognition using cross-lingual knowledge transfer[C]//Proceedings of the 27th International Joint Conference on Artificial Intelligence, Stockholm, Jul 13-19, 2018: 4071-4077.
[52] YANG Z L, SALAKHUTDINOV R, COHEN W W, et al. Transfer learning for sequence tagging with hierarchical recurrent networks[C]//Proceedings of the 5th International Conference on Learning Representations, Toulon, Apr 24-26, 2017: 1-10.
[53] ANDO R K, ZHANG T. A framework for learning predictive structures from multiple tasks and unlabeled data[J]. Journal of Machine Learning Research, 2005, 6: 1817-1853.
[54] WANG Z H, QU Y R, SHEN L H, et al. Label-aware double transfer learning for cross specialty medical named entity recognition[C]//Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, New Orleans, Jun 1-6, 2018. Stroudsburg: ACL, 2018: 1-15.
[55] LIN B Y, LU W. Neural adaptation layers for cross-domain named entity recognition[C]//Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, Brussels, Oct 31-Nov 4, 2018. Stroudsburg: ACL, 2018: 2012-2022.
[56] YANG H Y, HUANG S J, DAI X Y, et al. Fine-grained knowledge fusion for sequence labeling domain adaptation[C]//Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing, Hong Kong, China, Nov 3-7, 2019. Stroudsburg: ACL, 2019: 4195-4204.
[57] CHEN L, MOSCHITTI A. Transfer learning for sequence labeling using source model and target data[J]. arXiv:1902.05309, 2019.
[58] ZHOU J T, ZHANG H, JIN D, et al. Dual adversarial neural transfer for low-resource named entity recognition[C]//Proceedings of the 57th Conference of the Association for Computational Linguistics, Florence, Jul 28-Aug 2, 2019. Stroudsburg: ACL, 2019: 3461-3471.
[59] CAO P F, CHEN Y B, LIU K, et al. Adversarial transfer learning for Chinese named entity recognition with self-attention mechanism[C]//Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, Brussels, Oct 31-Nov 4, 2018. Stroudsburg: ACL, 2018: 182-192.
[60] PENG N Y, DREDZE M. Named entity recognition for Chinese social media with jointly trained embeddings[C]//Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing, Lisbon, Sep 17-21, 2015. Stroudsburg: ACL, 2015: 548-554.
[61] LEVOW G A. The third international Chinese language processing bakeoff: word segmentation and named entity recognition[C]//Proceedings of the 5th Workshop on Chinese Language Processing, Sydney, Jul 22-23, 2006. Stroudsburg: ACL, 2006: 108-117.
[62] GOODFELLOW I J, SHLENS J, SZEGEDY C. Explaining and harnessing adversarial examples[C]//Proceedings of the 3rd International Conference on Learning Representations, San Diego, May 7-9, 2015.
[63] YANG P Y, LIU W, YANG J Y H. Positive unlabeled learning via wrapper-based adaptive sampling[C]//Proceedings of the 26th International Joint Conference on Artificial Intelligence, Melbourne, Aug 19-25, 2017: 3273-3279.
[64] SANG E F, DE MEULDER F. Introduction to the CoNLL-2003 shared task: language-independent named entity recognition[C]//Proceedings of the 7th Conference on Natural Language Learning, Edmonton, May 31-Jun 1, 2003. Stroudsburg: ACL, 2003: 142-147.
[65] ZEMAN D, POPEL M, STRAKA M, et al. CoNLL 2017 shared task: multilingual parsing from raw text to universal dependencies[C]//Proceedings of the CoNLL 2017 Shared Task: Multilingual Parsing from Raw Text to Universal Dependencies, Vancouver, Aug 3-4, 2017. Stroudsburg: ACL, 2017: 1-19.
[66] LIN Y, YANG S Q, STOYANOV V, et al. A multi-lingual multi-task architecture for low-resource sequence labeling[C]//Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics, Melbourne, Jul 15-20, 2018. Stroudsburg: ACL, 2018: 799-809.