Citation: Zhao Liang, Zhang Zhaoyue, Liao Ziyi, Wang Ling. Relationship extraction in the field of food safety based on BERT and improved PCNN model[J]. Transactions of the Chinese Society of Agricultural Engineering (Transactions of the CSAE), 2022, 38(8): 263-270. DOI: 10.11975/j.issn.1002-6819.2022.08.030

Relationship extraction in the field of food safety based on BERT and improved PCNN model

More Information
  • Received Date: October 19, 2021
  • Revised Date: March 28, 2022
  • Published Date: April 29, 2022
  • Abstract: A knowledge graph (semantic network) organizes real-world entities and the relationships between them in a graph database, and relationship extraction is one of the most important steps in the automatic construction of knowledge graphs. However, there is currently no public knowledge-graph dataset for the food safety field, and most existing relationship extraction models are confined to open standard datasets and cannot handle data from this specific domain. In this study, a domain-specific dataset was constructed for relationship extraction in the food safety field, and a relationship extraction model was built using Bidirectional Encoder Representations from Transformers (BERT) and an improved Piecewise Convolutional Neural Network (PCNN). A corpus was first collected, and the corresponding entities and relation categories were annotated. On this basis, a BERT-PCNN-Attention (ATT)-Jieba relationship extraction model was proposed for the food safety field. The BERT pre-trained model was used to generate the input word vectors, and the segmented (piecewise) maximum pooling layer of the PCNN model was used to capture the local information of sentences. An attention mechanism was added between the segmented maximum pooling layer and the classification layer to further extract high-level semantics. In addition, Jieba word segmentation was applied to the Chinese corpus before the random masking of the BERT model, so that whole word units, rather than individual characters, were masked when executing the Masked Language Model (MLM). As a result, the semantic loss of the sentences fed into the training model was reduced, enabling more effective relationship extraction. The BERT-PCNN-ATT-Jieba model was compared with the classical CNN and PCNN models, as well as with the BERT-combined CNN, PCNN, PCNN-ATT, and PCNN-Jieba models, on the same dataset with consistent experimental parameters. Compared with PCNN, the precision, recall, and F1 score of BERT-PCNN were slightly higher, indicating that the vectors generated by the BERT model better capture the semantic features of the data. Compared with BERT-PCNN, BERT-PCNN-ATT assigned higher weights to the pooled high-level semantic features by adding the attention mechanism between the pooling layer and the classification layer, indicating that the attention mechanism improves model performance. The F1 score of BERT-PCNN-Jieba was higher than that of BERT-PCNN, because sentence preprocessing of the food safety training set weakened the influence of word length, and the added word segmentation step better preserved the positional and logical information between words. Consequently, the BERT-PCNN-ATT-Jieba model achieved the highest precision (84.72%), recall (81.78%), and F1 score (83.22%), showing the best performance on the food safety relationship extraction dataset. These findings provide a strong reference for knowledge extraction in the cost-effective, automatic construction of knowledge graphs in the food safety field, and lay a foundation for applications of knowledge graphs such as knowledge question answering, knowledge retrieval, data sharing, and intelligent supervision of food safety.
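
The pipeline described in the abstract (Jieba word-level preprocessing, BERT encoding, piecewise max pooling around the two entity positions, and an attention layer before classification) can be sketched as follows. This is a minimal, illustrative PyTorch sketch only: the checkpoint name (bert-base-chinese), the filter count, the attention form (a softmax weighting over the three pooled segment vectors), and the toy sentence with its entity positions are assumptions for illustration, not the authors' released implementation.

# Minimal sketch of a BERT-PCNN-ATT style relation classifier (PyTorch).
# Hyperparameters, the checkpoint name, and the attention formulation below
# are illustrative assumptions, not the paper's exact implementation.
import jieba
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer


class BertPCNNAtt(nn.Module):
    def __init__(self, num_relations: int, n_filters: int = 230, kernel: int = 3):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-chinese")  # assumed checkpoint
        hidden = self.bert.config.hidden_size
        self.conv = nn.Conv1d(hidden, n_filters, kernel, padding=kernel // 2)
        self.att = nn.Linear(n_filters, 1)                 # attention over segment vectors (assumed form)
        self.classifier = nn.Linear(3 * n_filters, num_relations)

    def forward(self, input_ids, attention_mask, e1_pos, e2_pos):
        # Contextual token vectors from BERT: (batch, seq_len, hidden).
        h = self.bert(input_ids=input_ids, attention_mask=attention_mask).last_hidden_state
        c = torch.relu(self.conv(h.transpose(1, 2)))       # (batch, n_filters, seq_len)
        pooled = []
        for b in range(c.size(0)):
            p1, p2 = sorted((int(e1_pos[b]), int(e2_pos[b])))
            # Piecewise (segmented) max pooling: split the sentence at the two entity positions.
            segs = [c[b, :, : p1 + 1], c[b, :, p1 + 1 : p2 + 1], c[b, :, p2 + 1 :]]
            vecs = [s.max(dim=-1).values if s.size(-1) > 0 else c.new_zeros(c.size(1))
                    for s in segs]
            seg = torch.stack(vecs)                        # (3, n_filters)
            w = torch.softmax(self.att(seg), dim=0)        # weight each pooled segment
            pooled.append((w * seg).reshape(-1))           # (3 * n_filters,)
        return self.classifier(torch.stack(pooled))        # (batch, num_relations)


# Toy usage: Jieba segments the sentence into words before encoding, mirroring the
# word-level preprocessing step; the sentence and entity positions are hypothetical.
tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
words = jieba.lcut("三聚氰胺被检出于某品牌奶粉")
enc = tokenizer(" ".join(words), return_tensors="pt")
model = BertPCNNAtt(num_relations=5)
logits = model(enc["input_ids"], enc["attention_mask"],
               e1_pos=torch.tensor([1]), e2_pos=torch.tensor([6]))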
