[1] Tang D Y, Qin B, Feng X C, et al. Effective LSTMs for target-dependent sentiment classification[J]. arXiv:1512.01100, 2015.
[2] Lin Z, Feng M, Santos C N, et al. A structured self-attentive sentence embedding[J]. arXiv:1703.03130, 2017.
[3] Kim Y. Convolutional neural networks for sentence classification[J]. arXiv:1408.5882, 2014.
[4] Xue W, Li T. Aspect based sentiment analysis with gated convolutional networks[J]. arXiv:1805.07043, 2018.
[5] Yang S, Hu X G, Zhang Y H. Multi-marginalized denoising autoencoders for domain adaptation[J]. Journal of Frontiers of Computer Science and Technology, 2019, 13(2): 322-329.
[6] Yu T, Luo K. Commentary sentiment classification model combining product features[J]. Computer Engineering and Applications, 2019, 55(16): 108-114.
[7] Zheng C, Qian G L, Zhang J P. Research on sentiment classification of title and TextRank extracting key sentences[J]. Computer Engineering and Applications, 2019, 55(20): 95-100.
[8] Yu J, Jiang J. Learning sentence embeddings with auxiliary tasks for cross-domain sentiment classification[C]//Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing. Stroudsburg: ACL, 2016: 236-246.
[9] Mikolov T, Sutskever I, Chen K, et al. Distributed representations of words and phrases and their compositionality[C]//Proceedings of the 26th International Conference on Neural Information Processing Systems. New York: Curran Associates, 2013: 3111-3119.
[10] Pennington J, Socher R, Manning C. GloVe: global vectors for word representation[C]//Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. Stroudsburg: ACL, 2014: 1532-1543.
[11] Deng J, Dong W, Socher R, et al. ImageNet: a large-scale hierarchical image database[C]//Proceedings of the 2009 IEEE Conference on Computer Vision and Pattern Recognition. Washington: IEEE Computer Society, 2009: 248-255.
[12] Howard J, Ruder S. Universal language model fine-tuning for text classification[J]. arXiv:1801.06146, 2018.
[13] Radford A, Narasimhan K, Salimans T, et al. Improving language understanding by generative pre-training[EB/OL]. [2019-07-16]. https://s3-us-west-2.amazonaws.com/openai-assets/research-covers/language-unsupervised/language_understanding_paper.pdf.
[14] Devlin J, Chang M W, Lee K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[J]. arXiv:1810.04805, 2018.
[15] Liu X, He P, Chen W, et al. Multi-task deep neural networks for natural language understanding[J]. arXiv:1901.11504, 2019.
[16] Sun C, Huang L, Qiu X. Utilizing BERT for aspect-based sentiment analysis via constructing auxiliary sentence[J]. arXiv:1903.09588, 2019.
[17] Sun C, Qiu X, Xu Y, et al. How to fine-tune BERT for text classification?[J]. arXiv:1905.05583, 2019.
[18] McCloskey M, Cohen N J. Catastrophic interference in connectionist networks: the sequential learning problem[M]//Psychology of Learning and Motivation. New York: Academic Press, 1989: 109-165.
[19] Vaswani A, Shazeer N, Parmar N, et al. Attention is all you need[C]//Proceedings of the 31st International Conference on Neural Information Processing Systems. New York: Curran Associates, 2017: 6000-6010.
[20] Yu T, Luo K. Sentiment analysis with dynamic multi-pooling convolution neural network[J]. Journal of Frontiers of Computer Science and Technology, 2018, 12(7): 1182-1190.