BERTCA Based Semantic Relevance Model for News Entity and Text
Cite this article: XIANG Junyi, HU Huijun, LIU Maofu, MAO Ruibin. BERTCA Based Semantic Relevance Model for News Entity and Text[J]. Journal of Chinese Information Processing, 2022, 36(3): 109-119.
Authors: XIANG Junyi  HU Huijun  LIU Maofu  MAO Ruibin
Affiliations: 1. School of Computer Science and Technology, Wuhan University of Science and Technology, Wuhan, Hubei 430065, China;
2. Hubei Province Key Laboratory of Intelligent Information Processing and Real-time Industrial System, Wuhan, Hubei 430065, China;
3. Center for Studies of Information Resources, Wuhan University, Wuhan, Hubei 430072, China
Funding: Joint Research Program of Shenzhen Securities Information Co., Ltd. (2018002); Military Common Information System Equipment Pre-research Project (31502030502)
Abstract: Current search engines still emphasize surface form over semantics and cannot achieve a deep semantic understanding of search keywords and texts, so semantic retrieval has become an urgent problem for search engines. To improve the semantic understanding ability of search engines, this paper proposes a method for computing semantic relevance. First, a corpus of 10,000 instances is annotated with the semantic relevance between entities in financial news headlines and the news body text. A BERTCA (Bidirectional Encoder Representation from Transformers Co-Attention) model is then built to compute the semantic relevance between a news entity and the body text. Using the BERT pre-trained model, it jointly considers the semantic information of the fine-grained entity and the coarse-grained body text, and then applies co-attention to achieve semantic matching between the entity and the text. The model can not only compute the relevance between a financial news entity and the news body, but also determine the relevance category according to a relevance threshold. Experiments show that the model achieves an accuracy above 95% on the 10,000 annotated instances, outperforming current mainstream models. Finally, concrete search examples demonstrate the model's strong performance.
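The paper itself does not include code; the following is a minimal PyTorch sketch of a BERTCA-style scorer under stated assumptions. The bert-base-chinese checkpoint, the mean pooling of the co-attended representations, the single linear scoring head, and the 0.5 threshold are illustrative choices, not details taken from the paper.

```python
# Minimal sketch of a BERTCA-style relevance scorer (not the authors' released code).
# BERT encodes the headline entity and the news body; a co-attention layer aligns
# entity tokens with body tokens; a linear head outputs a relevance score in [0, 1],
# which is thresholded into a relevance category.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class BertCoAttentionScorer(nn.Module):
    def __init__(self, pretrained="bert-base-chinese", threshold=0.5):
        super().__init__()
        self.bert = BertModel.from_pretrained(pretrained)
        hidden = self.bert.config.hidden_size
        self.scorer = nn.Linear(2 * hidden, 1)  # fused entity + text features -> score
        self.threshold = threshold              # illustrative cut-off, not from the paper

    def forward(self, entity_inputs, text_inputs):
        # Token-level representations of the fine-grained entity and coarse-grained body.
        e = self.bert(**entity_inputs).last_hidden_state  # (B, Le, H)
        t = self.bert(**text_inputs).last_hidden_state    # (B, Lt, H)

        # Co-attention: affinity between every entity token and every body token.
        affinity = torch.bmm(e, t.transpose(1, 2))             # (B, Le, Lt)
        e2t = torch.softmax(affinity, dim=-1)                   # entity attends to text
        t2e = torch.softmax(affinity.transpose(1, 2), dim=-1)   # text attends to entity

        entity_ctx = torch.bmm(e2t, t).mean(dim=1)  # (B, H)
        text_ctx = torch.bmm(t2e, e).mean(dim=1)    # (B, H)

        score = torch.sigmoid(self.scorer(torch.cat([entity_ctx, text_ctx], dim=-1)))
        return score.squeeze(-1)                    # relevance in [0, 1]

# Usage: score one (entity, body) pair and map the score to a category by threshold.
tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
model = BertCoAttentionScorer()
entity = tokenizer("贵州茅台", return_tensors="pt")
body = tokenizer("公司发布2021年度财务报告，营收与净利润稳步增长。",
                 return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    relevance = model(entity, body)
label = "relevant" if relevance.item() >= model.threshold else "irrelevant"
```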

Keywords: semantic relevance computing; BERT model; co-attention mechanism

BERTCA Based Semantic Relevance Model for News Entity and Text
XIANG Junyi, HU Huijun, LIU Maofu, MAO Ruibin. BERTCA Based Semantic Relevance Model for News Entity and Text[J]. Journal of Chinese Information Processing, 2022, 36(3): 109-119.
Authors:XIANG Junyi  HU Huijun  LIU Maofu  MAO Ruibin
Affiliation:1.School of Computer Science and Technology, Wuhan University of Science and Technology, Wuhan, Hubei 430065, China;
2.Hubei Province Key Laboratory of Intelligent Information Processing and Real-time Industrial System, Wuhan, Hubei 430065, China;
3.Center for Studies of Information Resources, Wuhan University, Wuhan, Hubei 430072, China
Abstract: In order to improve semantic retrieval in search engines, this paper proposes a semantic relevance model for news entities and body text. A corpus of 10,000 financial news articles has been manually annotated with the semantic relevance between entities in headlines and the body text. The BERTCA (Bidirectional Encoder Representation from Transformers Co-Attention) model is then built on this corpus. Through the co-attention mechanism, the model obtains the semantic matching between the entity and the text; it can not only calculate the degree of relevance between entity and text, but also determine the relevance category according to a relevance threshold. The experimental results show that the accuracy of the proposed model exceeds 95%, outperforming state-of-the-art models.
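The threshold-based category decision and the accuracy evaluation mentioned above can be sketched as follows; the 0.5 cut-off and the toy scores and labels are purely illustrative, since the paper's actual threshold and annotated data are not reproduced here.

```python
# Illustrative sketch: turn predicted relevance scores into categories by a threshold
# and measure accuracy against gold labels (1 = relevant, 0 = not relevant).
def accuracy(scores, gold_labels, threshold=0.5):
    predictions = [1 if s >= threshold else 0 for s in scores]
    correct = sum(p == g for p, g in zip(predictions, gold_labels))
    return correct / len(gold_labels)

print(accuracy([0.91, 0.12, 0.77, 0.40], [1, 0, 1, 1]))  # -> 0.75 on this toy example
```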
Keywords: semantic relevance computing; BERT model; co-attention mechanism