Treatment plan selection is a complex process because it requires sufficient experience and clinical information. Nowadays it is even harder for doctors to select an appropriate treatment plan for certain patients, since doctors may encounter difficulties in obtaining the right information and analyzing diverse clinical data. To improve the effectiveness of clinical decision making in complicated information system environments, we first propose a linked data-based approach for treatment plan selection. The approach integrates patients' clinical records in hospitals with open linked data sources outside hospitals. Then, based on the linked data network, treatment plan selection is carried out with the aid of similar historical therapy cases. Finally, we reorganize the electronic medical records of 97 colon cancer patients using the linked data model and compute the similarity of these records to support treatment selection. The experiment shows the usability of our method in supporting clinical decisions.
Abundant sensor data are now available online from a wealth of sources, which greatly enhances research efforts on the Digital Earth. The combination of distributed sensor networks and expanding citizen-sensing capabilities provides a more synchronized image of Earth's social and physical landscapes. However, it remains difficult for researchers to use such heterogeneous Sensor Webs for scientific applications, since data are published following different standards and protocols and in arbitrary formats. In this paper, we investigate the core challenges faced when consuming multiple sources for environmental applications using the Linked Data approach. We design and implement a system to achieve better data interoperability and integration by republishing real-world data as linked geo-sensor data. Our contributions include: (1) best practices for reusing and matching the W3C Semantic Sensor Network (SSN) ontology and other popular ontologies for heterogeneous data modeling in the water resources application domain, (2) a newly developed spatial analysis tool for creating links, and (3) a set of RESTful, OGC Sensor Observation Service (SOS)-like Linked Data APIs. Our results show how a Linked Sensor Web can be built and used within the integrated water resource decision support application domain.
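The republishing step described in this abstract can be illustrated with a minimal sketch (pure Python, no RDF library): a raw observation record is turned into N-Triples statements using SOSA/SSN-style predicates. The base URI and the sensor/observation identifiers below are illustrative assumptions, not the paper's actual URIs.

```python
# Minimal sketch: republish one raw sensor reading as N-Triples using
# SOSA/SSN-style terms. The base URI and identifiers are hypothetical.
SOSA = "http://www.w3.org/ns/sosa/"
BASE = "http://example.org/sensorweb/"  # hypothetical base URI

def to_ntriples(obs):
    """Convert one observation dict into a list of N-Triples lines."""
    s = f"<{BASE}observation/{obs['id']}>"
    return [
        f"{s} <{SOSA}madeBySensor> <{BASE}sensor/{obs['sensor']}> .",
        f"{s} <{SOSA}observedProperty> <{BASE}property/{obs['property']}> .",
        f'{s} <{SOSA}hasSimpleResult> "{obs["value"]}" .',
    ]

reading = {"id": "42", "sensor": "gauge-7", "property": "waterLevel", "value": 3.2}
print("\n".join(to_ntriples(reading)))
```

In a real pipeline these statements would be loaded into a triple store and exposed through the Linked Data APIs the paper describes.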
How to query Linked Data effectively is a challenge due to its heterogeneous datasets. There are three types of heterogeneity: different structures representing entities, different predicates with the same meaning, and different literal formats used in objects. Approaches based on ontology mapping or Information Retrieval (IR) cannot deal with all three types. Facing these limitations, we propose a hierarchical multi-hop language model (HMPM). It discriminates among three types of predicates (descriptive predicates, out-associated predicates, and in-associated predicates) and generates multi-hop models for each respectively. All predicate similarities between the query and an entity are organized into a hierarchy, with predicate types on the first level and the predicates of each type on the second level. All candidates are ranked in ascending order. We evaluated HMPM on three datasets: DBpedia, Linked MDB, and Yago. The experimental results show that HMPM outperforms the existing approaches in effectiveness and generality.
With the rise of linked data and knowledge graphs, the need becomes compelling to find suitable solutions to increase the coverage and correctness of data sets, to add missing knowledge, and to identify and remove errors. Several approaches, mostly relying on machine learning and natural language processing techniques, have been proposed to address this refinement goal; they usually need a partial gold standard, i.e., some "ground truth" to train automatic models. Gold standards are manually constructed, either by involving domain experts or by adopting crowdsourcing and human computation solutions. In this paper, we present an open source software framework to build Games with a Purpose (GWAP) for linked data refinement, i.e., Web applications that crowdsource partial ground truth by motivating user participation through fun incentives. We detail the impact of this new resource by explaining the specific data linking "purposes" supported by the framework (creation, ranking, and validation of links) and by defining the respective crowdsourcing tasks to achieve those goals. We also introduce our approach for incremental truth inference over the contributions provided by players of Games with a Purpose: we motivate the need for such a method with the specificity of GWAP vs. traditional crowdsourcing; we explain and formalize the proposed process, explain its positive consequences, and illustrate the results of an experimental comparison with state-of-the-art approaches. To show this resource's versatility, we describe a set of diverse applications that we built on top of it; to demonstrate its reusability and extensibility potential, we provide references to detailed documentation, including a tutorial which, in a few hours, guides new adopters in customizing and adapting the framework to a new use case.
Purpose: To develop a set of metrics and identify criteria for assessing the functionality of LOD KOS products, while providing common guiding principles that can be used by LOD KOS producers and users to maximize the functions and usages of LOD KOS products. Design/methodology/approach: Data collection and analysis were conducted at three time periods: 2015-16, 2017, and 2019. The sample data used in the comprehensive data analysis comprises all datasets tagged as types of KOS in the Datahub and extracted through their respective SPARQL endpoints. A comparative study of the LOD KOS collected from the terminology services Linked Open Vocabularies (LOV) and BioPortal was also performed. Findings: The study proposes a set of Functional, Impactful, and Transformable (FIT) metrics for LOD KOS as value vocabularies. The FAIR principles, with additional recommendations, are presented for LOD KOS as open data. Research limitations: The metrics need to be further tested and aligned with the best practices and international standards of both open data and various types of KOS. Practical implications: Assessments performed with the FAIR and FIT metrics support the creation and delivery of user-friendly, discoverable, and interoperable LOD KOS datasets, which can be used for innovative applications, act as a knowledge base, become a foundation for semantic analysis and entity extraction, and enhance research in science and the humanities. Originality/value: Our research provides best practice guidelines for LOD KOS as value vocabularies.
In light of the escalating demand and intricacy of services in contemporary terrestrial, maritime, and aerial combat operations, there is a compelling need for enhanced service quality and efficiency in airborne cluster communication networks. Software-Defined Networking (SDN) offers a viable solution for the multifaceted task of cooperative communication transmission and management across different operational domains within complex combat contexts, due to its intrinsic ability to flexibly allocate and centrally administer network resources. This study focuses on the optimization of SDN controller deployment within airborne data link clusters. A collaborative multi-controller architecture based on airborne data link clusters is proposed. Within this architectural framework, the controller deployment issue is reframed as a two-fold problem: subdomain partitioning and central interaction node selection. We advocate a subdomain segmentation approach grounded in node value ranking (NDVR) and a central interaction node selection methodology based on an improved Artificial Fish Swarm Algorithm (AFSA). The resulting NDVR-IAFSA algorithm uses a chaos algorithm for population initialization, boosting population diversity and circumventing premature convergence. By integrating adaptive strategies and incorporating the genetic algorithm's crossover and mutation operations, the algorithm's search range adaptability is enhanced, increasing the possibility of obtaining globally optimal solutions while concurrently augmenting cluster reliability. The simulation results verify the advantages of the NDVR-IAFSA algorithm: it achieves a better load balancing effect, improves the reliability of the aviation data link cluster, and significantly reduces the average propagation delay and disconnection rate by 12.8% and 11.7%, respectively. This shows that the optimization scheme has important practical significance and can meet the high requirements that modern sea, land, and air operations place on aviation airborne communication networks.
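The chaos-based population initialization this abstract mentions is commonly realized with a logistic map, whose iterates fill the unit interval densely for most seeds. A minimal sketch follows; the map form, seed, population size, and search bounds are illustrative assumptions, not the paper's actual settings.

```python
# Sketch of chaos-based population initialization using the logistic map
# x_{n+1} = 4 * x_n * (1 - x_n). For most seeds in (0, 1) the sequence is
# chaotic, giving a well-spread initial population without random numbers.
def chaotic_population(size, dim, lower, upper, seed=0.7):
    x = seed
    population = []
    for _ in range(size):
        individual = []
        for _ in range(dim):
            x = 4.0 * x * (1.0 - x)                         # logistic map step
            individual.append(lower + x * (upper - lower))  # rescale to bounds
        population.append(individual)
    return population

pop = chaotic_population(size=5, dim=3, lower=-10.0, upper=10.0)
```

Seeds such as 0.25, 0.5, or 0.75 collapse to fixed points and must be avoided; any other value in (0, 1) works.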
Purpose: The interdisciplinary nature and rapid development of the Semantic Web have led to the mass publication of RDF data in a large number of widely accepted serialization formats, creating the need for RDF data processing for specific purposes. This paper reports on an assessment of the chief challenges of RDF data endpoints and introduces RDFAdaptor, a set of plugins for RDF data processing that covers the whole life-cycle with high efficiency. Design/methodology/approach: RDFAdaptor is designed on top of the prominent ETL tool Pentaho Data Integration, which provides a user-friendly and intuitive interface and allows connection to various data sources and formats, and reuses the Java framework RDF4J as middleware to access data repositories, SPARQL endpoints, and all leading RDF database solutions with SPARQL 1.1 support. It supports effortless services with various configuration templates in multi-scenario applications, and helps extend data processing tasks in other services or tools to complement missing functions. Findings: The proposed comprehensive RDF ETL solution, RDFAdaptor, provides an easy-to-use and intuitive interface, supports data integration and federation over multi-source heterogeneous repositories or endpoints, and manages linked data in hybrid storage mode. Research limitations: The plugin set supports several RDF data processing scenarios, but error detection/checking and interaction with other graph repositories remain to be improved. Practical implications: The plugin set provides a user interface and configuration templates which enable its use in various applications of RDF data generation, multi-format data conversion, remote RDF data migration, and RDF graph update in the semantic query process. Originality/value: This is the first attempt to develop components, instead of systems, that can extract, consolidate, and store RDF data on the basis of an ecologically mature data warehousing environment.
Tactical Data Link (TDL) is a communication system that utilizes a particular message format and protocol to transmit data via wireless channels in an instant, automatic, and secure way. So far, TDL has shown its excellence in military applications. Current TDL adopts a distributed architecture to enhance anti-destruction capacity. However, it still faces a problem of data inconsistency and thus cannot well support cooperation across multiple military domains. To tackle this problem, we propose to leverage blockchain to build an automatic and adaptive data transmission control scheme for TDL. It achieves automatic data transmission and realizes information consistency among different TDL entities. Besides, applying blockchain-based smart contracts further enables adjusting data transmission policies automatically. Security analysis and experimental results based on simulations illustrate the effectiveness and efficiency of our proposed scheme.
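The consistency property that blockchain lends to such a scheme rests on hash chaining: each block commits to its predecessor's hash, so tampering with any earlier record invalidates every later block. A toy sketch of that mechanism (not the authors' actual scheme, which adds consensus and smart contracts):

```python
import hashlib
import json

def make_block(prev_hash, payload):
    """Create a block whose hash commits to both payload and predecessor."""
    body = json.dumps({"prev": prev_hash, "data": payload}, sort_keys=True)
    return {"prev": prev_hash, "data": payload,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def chain_is_consistent(chain):
    """Recompute every hash; any tampering with earlier data shows up."""
    for i, block in enumerate(chain):
        body = json.dumps({"prev": block["prev"], "data": block["data"]},
                          sort_keys=True)
        if hashlib.sha256(body.encode()).hexdigest() != block["hash"]:
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

genesis = make_block("0" * 64, {"msg": "track update 1"})
chain = [genesis, make_block(genesis["hash"], {"msg": "track update 2"})]
assert chain_is_consistent(chain)
chain[0]["data"]["msg"] = "forged"      # tamper with an early record
assert not chain_is_consistent(chain)
```

Because every entity can re-run this verification locally, all TDL nodes can agree on one tamper-evident transmission history.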
Based on an analysis of the very high frequency (VHF) self-organized time division multiple access (STDMA) aviation data link, a new dynamic slot assignment scheme is proposed in this paper, which adopts a variable data frame structure and can eliminate the effect of the idle slot on message delay. Using queueing theory, analysis models of the new scheme and the previous scheme are presented, and the message delay and system throughput of the two schemes are analyzed. The simulation results show that the new scheme outperforms the previous one in message delay and system throughput.
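As a reminder of the kind of quantity such a queueing analysis produces, the mean sojourn time of a plain M/M/1 queue is W = 1/(mu - lambda). The paper's delay model for slotted access is necessarily more detailed, so the sketch below is only a baseline illustration with assumed rates.

```python
def mm1_mean_delay(arrival_rate, service_rate):
    """Mean time in system for an M/M/1 queue: W = 1 / (mu - lambda).
    A baseline only; slotted TDMA links need a more detailed model."""
    if arrival_rate >= service_rate:
        raise ValueError("queue is unstable: lambda must be < mu")
    return 1.0 / (service_rate - arrival_rate)

# Delay grows sharply as offered load approaches channel capacity:
low_load = mm1_mean_delay(50.0, 100.0)    # 0.02 time units
high_load = mm1_mean_delay(90.0, 100.0)   # 0.10 time units
```

The same qualitative behavior, delay exploding near saturation, is what slot assignment schemes like the one above aim to mitigate by removing idle-slot overhead.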
Discrete global grid systems have become an important component of Digital Earth systems. However, until now there has not been an easy way to map between named places (toponyms) and the cells of a discrete global grid system. The lack of such a tool has limited the opportunities to synthesize social place-based data with the more standard Earth and environmental science data currently being analyzed in Digital Earth applications. This paper introduces Wāhi, the first gazetteer to map entities from the GeoNames database to multiple discrete global grid systems. A gazetteer service is presented that exposes the grid system and the associated gazetteer data as Linked Data. A set of use cases for the discrete global grid gazetteer is discussed.
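The core mapping a gazetteer like this performs, from a toponym's coordinates to a grid cell identifier, can be sketched for a simple equal-angle grid. Real discrete global grid systems use more sophisticated (often equal-area, hierarchical) cell geometries, and the resolution scheme and place record below are illustrative assumptions.

```python
def cell_id(lat, lon, resolution):
    """Map a lat/lon point to an equal-angle grid cell at the given
    resolution (square cells of 180 / 2**resolution degrees).
    Illustrative only: real DGGSs do not use a plain lat/lon raster."""
    cell_size = 180.0 / (2 ** resolution)
    row = int((90.0 - lat) // cell_size)   # row 0 touches the north pole
    col = int((lon + 180.0) // cell_size)  # col 0 starts at -180 degrees
    return f"r{resolution}/{row}/{col}"

# A hypothetical GeoNames-style record:
wellington = {"name": "Wellington", "lat": -41.29, "lon": 174.78}
print(cell_id(wellington["lat"], wellington["lon"], resolution=3))
```

A gazetteer service would then publish the pair (toponym URI, cell URI) as a Linked Data statement, which is exactly the join that lets place-based social data meet gridded environmental data.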
Linked Data is known as one of the best solutions for multi-source and heterogeneous web data integration and discovery in this era of Big Data. However, data interlinking, which is the most valuable contribution of Linked Data, remains incomplete and inaccurate. This study proposes a multidimensional and quantitative interlinking approach for Linked Data in the geospatial domain. According to the characteristics and roles of geospatial data in data discovery, eight elementary data characteristics are adopted as data interlinking types. These elementary characteristics are further combined to form compound and overall data interlinking types. Each data interlinking type possesses one specific predicate to indicate the actual relationship of Linked Data and uses data similarity to represent the correlation degree quantitatively. Therefore, geospatial data interlinking can be expressed by a directed edge associated with a relation predicate and a similarity value. The approach transforms existing simple and qualitative geospatial data interlinking into complete and quantitative interlinking and promotes the establishment of high-quality and trusted Linked Geospatial Data. The approach is applied to build intra-links within the Chinese National Earth System Scientific Data Sharing Network (NSTI-GEO), and links between NSTI-GEO and the Chinese Meteorological Data Network and the National Population and Health Scientific Data Sharing Platform.
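A directed, weighted link of the kind described above can be sketched by scoring one elementary characteristic, here shared keywords, with Jaccard similarity. The predicate name, dataset identifiers, and keyword sets below are illustrative assumptions, not the paper's actual interlinking types.

```python
def jaccard(a, b):
    """Jaccard similarity of two keyword sets: |A & B| / |A | B|."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

def interlink(source, target, predicate, similarity):
    """One quantitative interlink: a directed edge carrying a relation
    predicate and a similarity value, as in the approach above."""
    return {"from": source, "to": target,
            "predicate": predicate, "similarity": round(similarity, 3)}

soil = {"keywords": {"soil", "moisture", "china", "grid"}}
rain = {"keywords": {"precipitation", "china", "grid", "daily"}}
sim = jaccard(soil["keywords"], rain["keywords"])
edge = interlink("nsti-geo:soil-moisture", "cma:precipitation",
                 "ex:sharesThemeWith", sim)   # hypothetical predicate
```

In the full approach, several such elementary similarities would be combined into compound and overall interlinking types before the edge is published.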
The vision of a Digital Earth calls for more dynamic information systems, new sources of information, and stronger capabilities for their integration. Sensor networks have been identified as a major information source for the Digital Earth, while Semantic Web technologies have been proposed to facilitate integration. So far, sensor data are stored and published using the Observations & Measurements standard of the Open Geospatial Consortium (OGC) as the data model. With the advent of Volunteered Geographic Information and the Semantic Sensor Web, work on an ontological model gained importance within Sensor Web Enablement (SWE). In contrast to data models, an ontological approach abstracts from implementation details by focusing on modeling the physical world from the perspective of a particular domain. Ontologies restrict the interpretation of vocabularies toward their intended meaning. The ongoing paradigm shift to Linked Sensor Data complements this attempt. Two questions have to be addressed: (1) how to refer to changing and frequently updated data sets using Uniform Resource Identifiers, and (2) how to establish meaningful links between those data sets, that is, observations, sensors, features of interest, and observed properties? In this paper, we present a Linked Data model and a RESTful proxy for OGC's Sensor Observation Service to improve integration and inter-linkage of observation data for the Digital Earth.
The Semantic Web (SW) provides new opportunities for the study and application of big data: massive ranges of data sets in varied formats from multiple sources. Related studies focus on potential SW technologies for resolving big data problems, such as the structural and semantic heterogeneity that results from the variety of data formats (structured, semi-structured, numeric, unstructured text, email, video, audio, stock ticker). SW offers information semantically, for both people and machines, to retain the vast volume of data and provide meaningful output from unstructured data. In the current research, we implement a new semantic Extract-Transform-Load (ETL) model that uses SW technologies for aggregating, integrating, and representing data as linked data. First, geospatial data resources are aggregated from the internet; then the semantic ETL model stores the aggregated data in a semantic model after converting it to Resource Description Framework (RDF) format for successful integration and representation. The principal contribution of this research is the synthesis, aggregation, and semantic representation of geospatial data. A case study of city data is used to illustrate the semantic ETL model's functionality. The results show that the proposed model solves the structural and semantic heterogeneity problems in diverse data sources, enabling successful data aggregation, integration, and representation.
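The extract-transform-load flow this abstract describes can be condensed into a few lines: extract rows from a tabular source, transform each into RDF statements, and load them into one N-Triples document. The CSV columns, city values, and URIs below are illustrative assumptions, not the paper's case-study data.

```python
import csv
import io

CITY_CSV = """name,population
Cairo,9500000
Giza,4300000
"""  # stands in for an aggregated web source

def etl_to_ntriples(csv_text):
    """Extract city rows, transform each into a triple, load as one doc."""
    out = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        subject = f"<http://example.org/city/{row['name']}>"  # hypothetical URI
        out.append(f'{subject} <http://example.org/vocab/population> '
                   f'"{row["population"]}" .')
    return "\n".join(out)

doc = etl_to_ntriples(CITY_CSV)
print(doc)
```

Once in RDF, rows from structurally different sources share one graph model, which is what dissolves the heterogeneity problem the abstract targets.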
Metadata, data about other digital objects, play an important role in FAIR, with a direct relation to all FAIR principles. In this paper we present and discuss the FAIR Data Point (FDP), a software architecture aiming to define a common approach to publishing semantically rich and machine-actionable metadata according to the FAIR principles. We present the core components and features of the FDP, its approach to metadata provision, the criteria to evaluate whether an application adheres to the FDP specifications, and the service to register, index, and allow users to search the metadata content of available FDPs.
While the FAIR Principles do not specify a technical solution for 'FAIRness', it was clear from the outset of the FAIR initiative that it would be useful to have commodity software and tooling that would simplify the creation of FAIR-compliant resources. The FAIR Data Point is a metadata repository that follows the DCAT(2) schema and utilizes the Linked Data Platform to manage the hierarchical metadata layers as LDP Containers. There has been a recent flurry of development activity around the FAIR Data Point that has significantly improved its power and ease of use. Here we describe five specific tools (an installer, a loader, two Web-based interfaces, and an indexer) aimed at maximizing the uptake and utility of the FAIR Data Point.
Standards to describe soil properties are well established, with many ISO specifications and a few international thesauri available for specific applications. In addition, in recent years, the European directive on "Infrastructure for Spatial Information in the European Community (INSPIRE)" has brought together most of the existing standards into a well-defined model. However, the adoption of these standards so far has not reached the level of semantic interoperability, defined in the paper, which would facilitate the building of data services that reuse and combine data from different sources. This paper reviews standards for describing soil data and reports on the work done within the EC-funded agINFRA project to apply Linked Data technologies to existing standards and data in order to improve the interoperability of soil datasets. The main result of this work is twofold. First, an RDF vocabulary for soil concepts based on the UML INSPIRE model was published. Second, a KOS (Knowledge Organization System) for soil data was published and mapped to existing relevant KOS, based on the analysis of the SISI database of the CREA of Italy. This work also has methodological value, in that it proposes and applies a methodology for standardizing metadata used in local scientific databases, a very common situation in the scientific domain. Finally, this work aims to contribute toward a wider adoption of the INSPIRE directive by providing an RDF version of it.
Network security has become more of a concern with the rapid growth and expansion of the Internet. While there are several ways to provide security in the application, transport, or network layers of a network, data link layer (Layer 2) security has not yet been adequately addressed. Data link layer protocols used in local area networks (LANs) are not designed with security features. Dynamic Host Configuration Protocol (DHCP) is one of the most used network protocols for host configuration that works in the data link layer. DHCP is vulnerable to a number of attacks, such as the DHCP rogue server attack, the DHCP starvation attack, and the malicious DHCP client attack. This work introduces a new scheme called Secure DHCP (S-DHCP) to secure the DHCP protocol. The proposed solution consists of two techniques. The first is the authentication and key management technique, used for entity authentication and security key management. It is based on the Diffie-Hellman key exchange algorithm, supported by the difficulty of the Elliptic Curve Discrete Logarithm Problem (ECDLP) and a strong cryptographic one-way hash function. The second is the message authentication technique, which uses digital signatures to authenticate the DHCP messages exchanged between the clients and the server.
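The key-agreement idea behind the first technique can be illustrated with textbook finite-field Diffie-Hellman; S-DHCP relies on the elliptic-curve form (hence ECDLP) for the same property. The prime, generator, and private values below are tiny illustrative choices, far too small for real use.

```python
# Textbook Diffie-Hellman over a small prime field. S-DHCP uses the
# elliptic-curve analogue; this sketch only shows the shared-secret idea.
P = 0xFFFFFFFB  # small illustrative prime -- NOT cryptographically safe
G = 5           # illustrative generator

def dh_public(private):
    """Public value g^x mod p; safe to send over the open LAN."""
    return pow(G, private, P)

def dh_shared(private, other_public):
    """Shared secret (g^y)^x mod p, derivable by both ends only."""
    return pow(other_public, private, P)

server_priv, client_priv = 123_456, 654_321          # toy secrets
server_pub = dh_public(server_priv)
client_pub = dh_public(client_priv)
# Both sides derive the same value without transmitting any secret:
assert dh_shared(server_priv, client_pub) == dh_shared(client_priv, server_pub)
```

The resulting shared secret can then seed the keyed one-way hash used to authenticate subsequent DHCP messages.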
Purpose: This research project aims to organize the archival information of traditional Korean performing arts in a semantic web environment. Key requirements, which the archival records manager should consider for publishing and distributing gugak performing arts archival information in a semantic web environment, are presented from the perspective of linked data. Design/methodology/approach: This study analyzes the metadata provided by the National Gugak Center's Gugak Archive, the search and browse menus of the Gugak Archive's website, and K-PAAN, the performing arts portal site. Findings: The importance of consistency, continuity, and systematicity, crucial qualities in traditional record management practices, is undiminished in a semantic web environment. However, a semantic web environment also requires new tools such as web identifiers (URIs), data models (RDF), and link information (interlinking). Research limitations: The scope of this study does not include practical implementation strategies for the archival records management system and website services. The suggestions also do not discuss issues related to copyright or policy coordination between related organizations. Practical implications: The findings of this study can assist records managers in converting a traditional performing arts information archive into a semantic web environment-based online archival service and system. This can also be useful for collaboration with record managers who are unfamiliar with relational or triple database systems. Originality/value: This study analyzed the metadata of the Gugak Archive and its online services to present practical requirements for managing and disseminating gugak performing arts information in a semantic web environment. In applying the principles and methods of semantic web services to a Gugak Archive, this study can contribute to the improvement of information organization and services in the field of Korean traditional music.
In the satellite-to-ground high-speed data transmission link, there are signal self-interference problems between symbols in the co-channel, as well as between orthogonal and polarized channels. A multichannel adaptive filter is designed by constructing a multichannel Wiener-Hopf equation, and the influence of five non-ideal channel factors is suppressed to improve the BER performance. Experiments show that this method is effective in suppressing the signal self-interference, and the BER floor is improved from 1E-3 to 1E-7.
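Designing such a filter amounts to solving the Wiener-Hopf normal equations R w = p for the tap weights, where R is the input autocorrelation matrix and p the cross-correlation vector. Below is a minimal single-channel, 2-tap sketch with hypothetical correlation values; the paper's multichannel version stacks several such blocks into one larger system.

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]  # augmented matrix
    for k in range(n):
        piv = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[piv] = M[piv], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    w = [0.0] * n
    for k in range(n - 1, -1, -1):
        w[k] = (M[k][n] - sum(M[k][c] * w[c] for c in range(k + 1, n))) / M[k][k]
    return w

# Hypothetical 2-tap example: autocorrelation matrix R and cross-correlation
# vector p give the Wiener tap weights w = R^{-1} p.
R = [[1.0, 0.5],
     [0.5, 1.0]]
p = [0.9, 0.3]
w = solve(R, p)   # -> [1.0, -0.2]
```

An adaptive implementation (e.g., LMS) would converge toward this same solution without forming R and p explicitly.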
Funding: This work was supported by the National Natural Science Foundation of China [grant numbers 71171132 and 61373030].
Funding: This work was partially supported by the STARS4ALL project (H2020-688135), co-funded by the European Commission.
Funding: College of Communication and Information (CCI) Research and Creative Activity Fund, Kent State University.
Abstract: Purpose: To develop a set of metrics and identify criteria for assessing the functionality of LOD KOS products, while providing common guiding principles that LOD KOS producers and users can apply to maximize the functions and usages of LOD KOS products. Design/methodology/approach: Data collection and analysis were conducted at three time periods: 2015-16, 2017 and 2019. The sample used in the comprehensive data analysis comprises all datasets tagged as types of KOS in the Datahub, extracted through their respective SPARQL endpoints. A comparative study of the LOD KOS collected from the terminology services Linked Open Vocabularies (LOV) and BioPortal was also performed. Findings: The study proposes a set of Functional, Impactful and Transformable (FIT) metrics for LOD KOS as value vocabularies. The FAIR principles, with additional recommendations, are presented for LOD KOS as open data. Research limitations: The metrics need to be further tested and aligned with the best practices and international standards of both open data and various types of KOS. Practical implications: Assessment performed with the FAIR and FIT metrics supports the creation and delivery of user-friendly, discoverable and interoperable LOD KOS datasets, which can be used for innovative applications, act as a knowledge base, serve as a foundation for semantic analysis and entity extraction, and enhance research in the sciences and the humanities. Originality/value: Our research provides best-practice guidelines for LOD KOS as value vocabularies.
Funding: Supported by the following funds: Defense Industrial Technology Development Program (Grant G20210513); Shaanxi Provincial Department of Science and Technology (Grant 2021KW-07); Shaanxi Provincial Department of Science and Technology (Grant 2022QFY01-14).
Abstract: In light of the escalating demand and intricacy of services in contemporary terrestrial, maritime, and aerial combat operations, there is a compelling need for enhanced service quality and efficiency in airborne cluster communication networks. Software-Defined Networking (SDN) offers a viable solution for the multifaceted task of cooperative communication transmission and management across different operational domains in complex combat contexts, owing to its intrinsic ability to flexibly allocate and centrally administer network resources. This study focuses on optimizing SDN controller deployment within airborne data link clusters. A collaborative multi-controller architecture based on airborne data link clusters is proposed. Within this architectural framework, the controller deployment issue is reframed as a twofold problem: subdomain partitioning and central interaction node selection. We propose a subdomain segmentation approach grounded in node value ranking (NDVR) and a central interaction node selection method based on an Improved Artificial Fish Swarm Algorithm (IAFSA). The resulting NDVR-IAFSA algorithm uses a chaos algorithm for population initialization, boosting population diversity and preventing premature convergence. By integrating adaptive strategies and incorporating the genetic algorithm's crossover and mutation operations, the algorithm's search range adaptability is enhanced, increasing the likelihood of obtaining globally optimal solutions while also improving cluster reliability. The simulation results verify the advantages of the NDVR-IAFSA algorithm: it achieves better load balancing, improves the reliability of the aviation data link cluster, and reduces the average propagation delay and disconnection rate by 12.8% and 11.7%, respectively. This shows that the optimization scheme has practical significance and can meet the high requirements that modern sea, land, and air operations place on airborne aviation communication networks.
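The subdomain-partitioning half of the problem can be sketched as follows. This is an assumption-laden toy: "node value" here is simply degree (the paper's actual ranking metric is not specified in the abstract), the top-k ranked nodes act as subdomain centres, and every other node joins its nearest centre by BFS hop count.

```python
from collections import deque

def bfs_dist(adj, src):
    """Hop distances from src over an adjacency-list graph."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def partition(adj, k):
    """Rank nodes by value (degree, as a stand-in), take the top-k as
    subdomain centres, and assign each node to its nearest centre."""
    centres = sorted(adj, key=lambda n: len(adj[n]), reverse=True)[:k]
    dists = {c: bfs_dist(adj, c) for c in centres}
    return {n: min(centres, key=lambda c: dists[c].get(n, float("inf")))
            for n in adj}

adj = {1: [2, 3, 4], 2: [1, 3], 3: [1, 2], 4: [1, 5], 5: [4, 6], 6: [5]}
print(partition(adj, 2))
```

The IAFSA part of the scheme would then search within each subdomain for the central interaction node; that metaheuristic is omitted here.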
Funding: This work is supported by the National Social Science Foundation of China project (19BTQ061) and the "Integration and Development of a Next-Generation Open Knowledge Services System and Key Technologies" project (2020XM05).
Abstract: Purpose: The interdisciplinary nature and rapid development of the Semantic Web have led to the mass publication of RDF data in a large number of widely accepted serialization formats, creating the need for purpose-specific RDF data processing. The paper reports on an assessment of the chief challenges around RDF data endpoints and introduces the RDFAdaptor, a set of plugins for RDF data processing that covers the whole life-cycle with high efficiency. Design/methodology/approach: The RDFAdaptor is built on the prominent ETL tool Pentaho Data Integration, which provides a user-friendly, intuitive interface and allows connections to various data sources and formats, and it reuses the Java framework RDF4J as middleware to access data repositories, SPARQL endpoints and all leading RDF database solutions with SPARQL 1.1 support. It supports effortless services with various configuration templates in multi-scenario applications and helps extend data-processing tasks in other services or tools to complement missing functions. Findings: The proposed comprehensive RDF ETL solution, RDFAdaptor, provides an easy-to-use and intuitive interface, supports data integration and federation over multi-source heterogeneous repositories or endpoints, and manages linked data in hybrid storage mode. Research limitations: The plugin set can support several application scenarios of RDF data processing, but error detection/checking and interaction with other graph repositories remain to be improved. Practical implications: The plugin set provides a user interface and configuration templates that enable its use in various applications of RDF data generation, multi-format data conversion, remote RDF data migration, and RDF graph updates during semantic query processing. Originality/value: This is the first attempt to develop components, rather than complete systems, that can extract, consolidate, and store RDF data on the basis of an ecologically mature data-warehousing environment.
Funding: This work is sponsored by the open grant of the Tactical Data Link Lab of the 20th Research Institute of China Electronics Technology Group Corporation, P.R. China (Grant CLDL-20182119); the National Natural Science Foundation of China under Grants 61672410 and 61802293; the Key Lab of Information Network Security, Ministry of Public Security (Grant C18614); the Academy of Finland (Grants 308087, 314203, and 335262); the Shaanxi Innovation Team project under Grant 2018TD-007; and the 111 project under Grant B16037.
Abstract: Tactical Data Link (TDL) is a communication system that utilizes a particular message format and protocol to transmit data over wireless channels in an instant, automatic, and secure way. So far, TDL has shown its excellence in military applications. Current TDL adopts a distributed architecture to enhance anti-destruction capacity. However, it still faces a problem of data inconsistency and thus cannot well support cooperation across multiple military domains. To tackle this problem, we propose to leverage blockchain to build an automatic and adaptive data transmission control scheme for TDL. It achieves automatic data transmission and realizes information consistency among different TDL entities. In addition, applying blockchain-based smart contracts further enables data transmission policies to be adjusted automatically. Security analysis and simulation-based experimental results illustrate the effectiveness and efficiency of the proposed scheme.
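The consistency property that blockchain contributes rests on hash chaining, which can be shown in a few lines. This is a generic sketch, not the paper's protocol: each TDL message is appended as a block whose hash covers the previous block's hash, so any later tampering is detectable by every entity holding a copy of the chain.

```python
import hashlib
import json

def make_block(prev_hash, payload):
    """Build a block whose hash commits to both payload and predecessor."""
    body = json.dumps({"prev": prev_hash, "payload": payload}, sort_keys=True)
    return {"prev": prev_hash, "payload": payload,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def verify(chain):
    """Recompute every hash and check each block links to its predecessor."""
    for i, blk in enumerate(chain):
        body = json.dumps({"prev": blk["prev"], "payload": blk["payload"]},
                          sort_keys=True)
        if blk["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        if i and blk["prev"] != chain[i - 1]["hash"]:
            return False
    return True

chain = [make_block("0" * 64, "track update A")]
chain.append(make_block(chain[-1]["hash"], "track update B"))
print(verify(chain))              # untampered chain verifies
chain[0]["payload"] = "forged"
print(verify(chain))              # tampering breaks verification
```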
Funding: Aeronautical Science Foundation of China (No. 98E51116).
Abstract: Based on an analysis of the very high frequency (VHF) self-organized time division multiple access (S-TDMA) aviation data link, a new dynamic slot assignment scheme is proposed in this paper; it adopts a variable data frame structure and can eliminate the effect of idle slots on message delay. Using queueing theory, analysis models of the new scheme and the previous scheme are presented, and message delay and system throughput are analyzed under both schemes. The simulation results show that the new scheme outperforms the previous one in both message delay and system throughput.
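The effect of slot spacing on message delay can be illustrated with a toy model; this is an assumption, not the paper's queueing analysis. A terminal owns transmission opportunities spaced `owned_period` slots apart; queued messages depart at the next free owned slot, so a shorter effective spacing (as when idle slots are eliminated by a variable frame) directly lowers mean delay.

```python
def mean_delay(arrivals, owned_period):
    """Mean wait when each message departs at the next owned slot
    (multiples of owned_period) not already taken by an earlier message."""
    next_free = 0
    total = 0
    for t in sorted(arrivals):
        slot = max(next_free,
                   ((t + owned_period - 1) // owned_period) * owned_period)
        total += slot - t
        next_free = slot + owned_period
    return total / len(arrivals)

# Same traffic; halving the effective slot spacing cuts mean delay.
print(mean_delay([0, 1, 2], owned_period=4))   # -> 3.0
print(mean_delay([0, 1, 2], owned_period=2))   # -> 1.0
```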
Abstract: Discrete global grid systems have become an important component of Digital Earth systems. However, until now there has been no easy way to map between named places (toponyms) and the cells of a discrete global grid system. The lack of such a tool has limited the opportunities to synthesize social place-based data with the more standard Earth and environmental science data currently being analyzed in Digital Earth applications. This paper introduces Wāhi, the first gazetteer to map entities from the GeoNames database to multiple discrete global grid systems. A gazetteer service is presented that exposes the grid system and the associated gazetteer data as Linked Data. A set of use cases for the discrete global grid gazetteer is discussed.
Funding: This work was supported by the National Natural Science Foundation of China [grant numbers 41371381 and 41431177]; the Natural Science Research Program of Jiangsu [grant number 14KJA170001]; the National Special Program on Basic Works for Science and Technology of China [grant number 2013FY110900]; the National Key Technology Innovation Project for Water Pollution Control and Remediation [grant number 2013ZX07103006]; the National Basic Research Program of China [grant number 2015CB954102]; the GuiZhou Welfare and Basic Geological Research Program of China [grant number 201423]; and the China Scholarship Council [grant number 201504910358].
Abstract: Linked Data is known as one of the best solutions for multi-source, heterogeneous web data integration and discovery in this era of Big Data. However, data interlinking, the most valuable contribution of Linked Data, remains incomplete and inaccurate. This study proposes a multidimensional, quantitative interlinking approach for Linked Data in the geospatial domain. According to the characteristics and roles of geospatial data in data discovery, eight elementary data characteristics are adopted as data interlinking types. These elementary characteristics are further combined to form compound and overall data interlinking types. Each data interlinking type possesses one specific predicate to indicate the actual relationship of the Linked Data and uses data similarity to represent the correlation degree quantitatively. Geospatial data interlinking can therefore be expressed as a directed edge associated with a relation predicate and a similarity value. The approach transforms existing simple, qualitative geospatial data interlinking into complete, quantitative interlinking and promotes the establishment of high-quality, trusted Linked Geospatial Data. The approach is applied to build intra-links within the Chinese National Earth System Scientific Data Sharing Network (NSTI-GEO) and links between NSTI-GEO and the Chinese Meteorological Data Network and the National Population and Health Scientific Data Sharing Platform.
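A hedged sketch of quantitative interlinking along these lines: two dataset records are compared on two of the elementary characteristics the abstract mentions (theme keywords and spatial extent), and each comparison emits a directed link carrying a relation predicate and a similarity value. The predicate names and similarity functions below are illustrative assumptions, not the paper's actual vocabulary.

```python
def jaccard(a, b):
    """Set-overlap similarity for keyword lists."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def bbox_iou(b1, b2):
    """Intersection-over-union of (minx, miny, maxx, maxy) bounding boxes."""
    ix = max(0, min(b1[2], b2[2]) - max(b1[0], b2[0]))
    iy = max(0, min(b1[3], b2[3]) - max(b1[1], b2[1]))
    inter = ix * iy
    area = lambda b: (b[2] - b[0]) * (b[3] - b[1])
    union = area(b1) + area(b2) - inter
    return inter / union if union else 0.0

d1 = {"keywords": ["soil", "moisture"], "bbox": (100, 30, 110, 40)}
d2 = {"keywords": ["soil", "erosion"],  "bbox": (105, 35, 115, 45)}

# Each interlinking type -> (predicate, similarity) on the directed edge.
links = [
    ("geo:similarTheme",  jaccard(d1["keywords"], d2["keywords"])),
    ("geo:overlapsSpace", bbox_iou(d1["bbox"], d2["bbox"])),
]
print(links)
```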
Funding: The presented work was developed within the 52°North semantics community and partly funded by the European projects UncertWeb (FP7-248488), ENVISION (FP7-249170), and the GENESIS project (an Integrated Project, contract number 223996).
Abstract: The vision of a Digital Earth calls for more dynamic information systems, new sources of information, and stronger capabilities for their integration. Sensor networks have been identified as a major information source for the Digital Earth, while Semantic Web technologies have been proposed to facilitate integration. So far, sensor data are stored and published using the Observations & Measurements standard of the Open Geospatial Consortium (OGC) as the data model. With the advent of Volunteered Geographic Information and the Semantic Sensor Web, work on an ontological model gained importance within Sensor Web Enablement (SWE). In contrast to data models, an ontological approach abstracts from implementation details by focusing on modeling the physical world from the perspective of a particular domain. Ontologies restrict the interpretation of vocabularies toward their intended meaning. The ongoing paradigm shift to Linked Sensor Data complements this attempt. Two questions have to be addressed: (1) how to refer to changing and frequently updated data sets using Uniform Resource Identifiers, and (2) how to establish meaningful links between those data sets, that is, observations, sensors, features of interest, and observed properties? In this paper, we present a Linked Data model and a RESTful proxy for OGC's Sensor Observation Service to improve the integration and interlinkage of observation data for the Digital Earth.
Abstract: The Semantic Web (SW) provides new opportunities for the study and application of big data: massive data sets in varied formats from multiple sources. Related studies focus on potential SW technologies for resolving big data problems, such as the structurally and semantically heterogeneous data that result from the variety of data formats (structured, semi-structured, numeric, unstructured text, email, video, audio, stock ticker). SW technologies offer information semantically, both for people and machines, to retain the vast volume of data and provide meaningful output from unstructured data. In the current research, we implement a new semantic Extract-Transform-Load (ETL) model that uses SW technologies for aggregating, integrating, and representing data as linked data. First, geospatial data resources are aggregated from the internet; then the semantic ETL model stores the aggregated data in a semantic model after converting it to Resource Description Framework (RDF) format for successful integration and representation. The principal contribution of this research is the synthesis, aggregation, and semantic representation of geospatial data. A case study of city data is used to illustrate the semantic ETL model's functionality. The results show that the proposed model solves the structural and semantic heterogeneity problems of diverse data sources, enabling successful data aggregation, integration, and representation.
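The transform step of such a semantic ETL pipeline can be sketched with the standard library alone. The paper's actual pipeline and vocabulary are not specified here: the `http://example.org/city/` namespace and the flat record-to-triple mapping below are assumptions for illustration, emitting RDF in N-Triples syntax.

```python
EX = "http://example.org/city/"   # assumed example namespace

def to_ntriples(record):
    """Map one tabular city record to N-Triples lines: the 'name' field
    becomes the subject IRI, every other field a predicate/literal pair."""
    subj = f"<{EX}{record['name'].replace(' ', '_')}>"
    triples = []
    for key, value in record.items():
        if key == "name":
            continue
        triples.append(f'{subj} <{EX}{key}> "{value}" .')
    return triples

city = {"name": "New York", "population": 8804190, "state": "NY"}
for line in to_ntriples(city):
    print(line)
```

A real pipeline would additionally map fields to shared vocabularies (e.g. typed literals and well-known ontologies) so that records from heterogeneous sources integrate, which is the point of the semantic model.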
Abstract: Metadata, data about other digital objects, play an important role in FAIR, relating directly to all of the FAIR principles. In this paper we present and discuss the FAIR Data Point (FDP), a software architecture that aims to define a common approach to publishing semantically rich and machine-actionable metadata according to the FAIR principles. We present the core components and features of the FDP, its approach to metadata provision, the criteria for evaluating whether an application adheres to the FDP specifications, and the service to register, index and allow users to search the metadata content of available FDPs.
基金supported by Czech Technical University in Prague grant No.SGS20/209/OHK3/3T/18.LOBSS,RK and KB are partially funded by funding from the Horizon2020 projects FAIRsFAIR grant No.831558.
Abstract: While the FAIR Principles do not specify a technical solution for "FAIRness", it was clear from the outset of the FAIR initiative that it would be useful to have commodity software and tooling to simplify the creation of FAIR-compliant resources. The FAIR Data Point is a metadata repository that follows the DCAT(2) schema and utilizes the Linked Data Platform to manage the hierarchical metadata layers as LDP Containers. There has been a recent flurry of development activity around the FAIR Data Point that has significantly improved its power and ease of use. Here we describe five specific tools, an installer, a loader, two Web-based interfaces, and an indexer, aimed at maximizing the uptake and utility of the FAIR Data Point.
Funding: The research leading to these results has received funding from the European Union Seventh Framework Programme (FP7/2007-2013) under grant agreement No. 283770.
Abstract: Standards to describe soil properties are well established, with many ISO specifications and a few international thesauri available for specific applications. Moreover, in recent years the European directive on "Infrastructure for Spatial Information in the European Community" (INSPIRE) has brought together most of the existing standards into a well-defined model. However, the adoption of these standards has so far not reached the level of semantic interoperability, defined in the paper, that would facilitate building data services which reuse and combine data from different sources. This paper reviews standards for describing soil data and reports on the work done within the EC-funded agINFRA project to apply Linked Data technologies to existing standards and data in order to improve the interoperability of soil datasets. The main result of this work is twofold. First, an RDF vocabulary for soil concepts based on the UML INSPIRE model was published. Second, a KOS (Knowledge Organization System) for soil data was published and mapped to existing relevant KOS, based on an analysis of the SISI database of CREA in Italy. This work also has methodological value, in that it proposes and applies a methodology for standardizing the metadata used in local scientific databases, a very common situation in the scientific domain. Finally, this work aims to contribute toward a wider adoption of the INSPIRE directive by providing an RDF version of it.
Abstract: Network security has become more of a concern with the rapid growth and expansion of the Internet. While there are several ways to provide security in the application, transport, or network layers of a network, data link layer (Layer 2) security has not yet been adequately addressed. Data link layer protocols used in local area networks (LANs) are not designed with security features. Dynamic Host Configuration Protocol (DHCP) is one of the most widely used network protocols for host configuration that operates at the data link layer. DHCP is vulnerable to a number of attacks, such as the rogue DHCP server attack, the DHCP starvation attack, and the malicious DHCP client attack. This work introduces a new scheme called Secure DHCP (S-DHCP) to secure the DHCP protocol. The proposed solution consists of two techniques. The first is an authentication and key management technique used for entity authentication and security key management. It is based on the Diffie-Hellman key exchange algorithm, supported by the difficulty of the Elliptic Curve Discrete Logarithm Problem (ECDLP) and a strong cryptographic one-way hash function. The second is a message authentication technique, which uses digital signatures to authenticate the DHCP messages exchanged between the clients and the server.
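The key-agreement step can be illustrated with a toy Diffie-Hellman exchange. Note the hedge: S-DHCP uses the elliptic-curve variant (relying on ECDLP), whereas the sketch below uses plain modular DH with a small demo prime for brevity; the parameters are far too small for real use and all values here are illustrative.

```python
import hashlib

# Demo modulus (a 32-bit prime) and generator -- insecure toy parameters.
p, g = 0xFFFFFFFB, 5

a_priv, b_priv = 123457, 654321     # client / server secrets (never sent)
a_pub = pow(g, a_priv, p)           # public values exchanged in the clear
b_pub = pow(g, b_priv, p)

# Each side combines its own secret with the other's public value.
k_client = pow(b_pub, a_priv, p)
k_server = pow(a_pub, b_priv, p)
assert k_client == k_server         # both derive the same shared secret

# Derive a session key via a one-way hash, as the abstract describes.
session_key = hashlib.sha256(str(k_client).encode()).hexdigest()
print(session_key[:16])
```

The resulting session key could then underpin the second technique, signing DHCP messages so clients and server can authenticate each exchange.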
Funding: Supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (NRF-2016S1A5A2A03927725).
Abstract: Purpose: This research project aims to organize the archival information of traditional Korean performing arts in a semantic web environment. Key requirements that archival records managers should consider when publishing and distributing gugak performing arts archival information in a semantic web environment are presented from the perspective of linked data. Design/methodology/approach: This study analyzes the metadata provided by the National Gugak Center's Gugak Archive, the search and browse menus of the Gugak Archive's website, and K-PAAN, the performing arts portal site. Findings: The importance of consistency, continuity, and systematicity, crucial qualities in traditional records management practices, is undiminished in a semantic web environment. However, a semantic web environment also requires new tools such as web identifiers (URIs), data models (RDF), and link information (interlinking). Research limitations: The scope of this study does not include practical implementation strategies for the archival records management system and website services. The suggestions also do not discuss issues related to copyright or policy coordination between related organizations. Practical implications: The findings of this study can assist records managers in converting a traditional performing arts information archive into an online archival service and system based on the semantic web. They can also be useful for collaboration with records managers who are unfamiliar with relational or triple-store database systems. Originality/value: This study analyzed the metadata of the Gugak Archive and its online services to present practical requirements for managing and disseminating gugak performing arts information in a semantic web environment. By applying semantic web principles and methods to the Gugak Archive, this study can contribute to improving information organization and services in the field of traditional Korean music.
Funding: Supported by the Natural Science Foundation for Outstanding Young Scholars of Heilongjiang Province under Grant YQ2020F001; the National Key Research and Development Program of China under Grant 2021YFB2900500; and the Fundamental Research Funds for the Central Universities under Grant FRFCU 9803503821.
Abstract: In the satellite-to-ground high-speed data transmission link, there are signal self-interference problems between symbols in the co-channel, as well as between orthogonal and polarized channels. A multichannel adaptive filter is designed by constructing a multichannel Wiener-Hopf equation, and the influence of five non-ideal channel factors is suppressed to improve BER performance. Experiments show that this method effectively suppresses the signal self-interference, and the BER floor is improved from 1E-3 to 1E-7.
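The adaptive-filtering idea can be sketched generically. The paper derives a multichannel Wiener-Hopf solution; shown below instead is a single-channel least-mean-squares (LMS) stochastic-gradient version, a common simplification, purely to illustrate how filter weights adapt until the interference-bearing error shrinks. The signals and parameters are illustrative assumptions.

```python
def lms(x, d, taps=4, mu=0.05):
    """Adapt FIR weights so the filter output tracks desired signal d;
    returns the final weights and the per-step error sequence."""
    w = [0.0] * taps
    errors = []
    for n in range(taps, len(x)):
        window = x[n - taps:n][::-1]                    # most recent first
        y = sum(wi * xi for wi, xi in zip(w, window))   # filter output
        e = d[n] - y                                    # residual error
        errors.append(e)
        w = [wi + mu * e * xi for wi, xi in zip(w, window)]
    return w, errors

# Desired signal is a 2-sample-delayed copy of the input: the filter
# should converge toward the weight vector [0, 1, 0, 0].
x = [((i * 7) % 5) - 2 for i in range(400)]   # deterministic test signal
d = [0] * 2 + x[:-2]
w, errors = lms(x, d)
print(abs(errors[-1]) < abs(errors[0]))
```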