Journal Articles
2 articles found
1. A new focused crawler using an improved tabu search algorithm incorporating ontology and host information
Authors: Jingfa LIU, Zhen WANG, Guo ZHONG, Zhihe YANG. Frontiers of Information Technology & Electronic Engineering (SCIE, EI, CSCD), 2023, Issue 6, pp. 859-875 (17 pages)
To solve the problems of incomplete topic description and repetitive crawling of visited hyperlinks in traditional focused crawling methods, in this paper, we propose a novel focused crawler using an improved tabu search algorithm with domain ontology and host information (FCITS_OH), where a domain ontology is constructed by formal concept analysis to describe topics at the semantic and knowledge levels. To avoid crawling visited hyperlinks and expand the search range, we present an improved tabu search (ITS) algorithm and the strategy of host information memory. In addition, a comprehensive priority evaluation method based on Web text and link structure is designed to improve the assessment of topic relevance for unvisited hyperlinks. Experimental results on both tourism and rainstorm disaster domains show that the proposed focused crawlers outperform the traditional focused crawlers on different performance metrics.
Keywords: Focused crawler; Tabu search algorithm; Ontology; Host information; Priority evaluation
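The abstract above only outlines how FCITS_OH combines a visited-hyperlink memory, host information, and priority evaluation. As a rough Python illustration of that general idea, assuming nothing beyond the abstract, the sketch below shows one way a crawl frontier could combine a tabu list of visited URLs, a per-host visit counter, and a priority score mixing text relevance with link-structure relevance; the class name, weights, and scoring inputs are hypothetical and are not the authors' implementation.

# Illustrative sketch only: a crawl frontier with a tabu list of visited
# URLs, a per-host memory, and a mixed text/link priority score. Names and
# weights are hypothetical, not the FCITS_OH implementation.
import heapq
from collections import defaultdict
from urllib.parse import urlparse


class TabuFrontier:
    def __init__(self, text_weight=0.7, link_weight=0.3):
        self.text_weight = text_weight
        self.link_weight = link_weight
        self.tabu = set()                    # visited URLs are never re-crawled
        self.host_visits = defaultdict(int)  # host information memory
        self.heap = []                       # max-heap via negated priority

    def push(self, url, text_relevance, link_relevance):
        if url in self.tabu:
            return  # tabu: skip hyperlinks that were already crawled
        host = urlparse(url).netloc
        # Penalize hosts that have already been visited many times so the
        # crawler spreads its search instead of looping inside one site.
        host_penalty = 1.0 / (1 + self.host_visits[host])
        priority = (self.text_weight * text_relevance
                    + self.link_weight * link_relevance) * host_penalty
        heapq.heappush(self.heap, (-priority, url))

    def pop(self):
        # Return the most promising unvisited URL, or None if none remain.
        while self.heap:
            _, url = heapq.heappop(self.heap)
            if url not in self.tabu:
                self.tabu.add(url)
                self.host_visits[urlparse(url).netloc] += 1
                return url
        return None

In this sketch the tabu set plays the role of the visited-hyperlink memory, while the host counter discourages the crawler from dwelling on a single site, which is the intuition behind the host information strategy described in the abstract.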
2. Focused crawling strategies based on ontologies and simulated annealing methods for rainstorm disaster domain knowledge
Authors: Jingfa LIU, Fan LI, Ruoyao DING, Zi'ang LIU. Frontiers of Information Technology & Electronic Engineering (SCIE, EI, CSCD), 2022, Issue 8, pp. 1189-1204 (16 pages)
At present, the focused crawler is a crucial method for obtaining effective domain knowledge from massive heterogeneous networks. For most current focused crawling technologies, there are some difficulties in obtaining high-quality crawling results. The main difficulties are the establishment of topic benchmark models, the assessment of topic relevance of hyperlinks, and the design of crawling strategies. In this paper, we use domain ontology to build a topic benchmark model for a specific topic, and propose a novel multiple-filtering strategy based on local ontology and global ontology (MFSLG). A comprehensive priority evaluation method (CPEM) based on the web text and link structure is introduced to improve the computation precision of topic relevance for unvisited hyperlinks, and a simulated annealing (SA) method is used to prevent the focused crawler from falling into local optima of the search. By incorporating SA into the focused crawler with MFSLG and CPEM for the first time, two novel focused crawler strategies based on ontology and SA (FCOSA), including FCOSA with only global ontology (FCOSA_G) and FCOSA with both local ontology and global ontology (FCOSA_LG), are proposed to obtain topic-relevant webpages about rainstorm disasters from the network. Experimental results show that the proposed crawlers outperform the other focused crawling strategies on different performance metrics.
Keywords: Focused crawler; Ontology; Priority evaluation; Simulated annealing; Rainstorm disaster
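This second abstract describes using simulated annealing so the crawler does not get stuck in local optima of the search. The snippet below is a minimal sketch of a simulated-annealing acceptance rule for hyperlink selection, assuming a generic geometric cooling schedule; the function names, score inputs, and schedule parameters are illustrative assumptions, not the FCOSA_G or FCOSA_LG implementation.

# Illustrative sketch: a simulated-annealing style acceptance rule for a
# focused crawler. A candidate hyperlink scored lower than the current one
# may still be followed with probability exp(delta / T), which shrinks as
# the temperature T cools. Names and schedule are hypothetical, not the
# FCOSA implementation.
import math
import random


def accept_link(candidate_score, current_score, temperature):
    """Return True if the candidate hyperlink should be crawled next."""
    if candidate_score >= current_score:
        return True  # better or equal links are always accepted
    # Worse links are accepted with a probability that decays with the score
    # gap and with the temperature; this is what lets the crawler escape
    # local optima early in the search yet behave greedily later on.
    delta = candidate_score - current_score  # negative here
    return random.random() < math.exp(delta / temperature)


def anneal_schedule(t0=1.0, cooling=0.95, steps=50):
    """Yield a geometrically cooling temperature sequence."""
    t = t0
    for _ in range(steps):
        yield t
        t *= cooling

At high temperature this rule accepts almost any candidate, so the crawler explores broadly; as the temperature cools it becomes greedy and follows only the most topic-relevant hyperlinks, matching the intuition behind the SA component described in the abstract.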