With the rapid development of information technology, IoT devices play a major role in detecting physiological health data. The exponential growth of medical data requires reasonable allocation of storage space between cloud servers and edge nodes. The storage capacity of edge nodes close to users is limited, so hotspot data should be stored in edge nodes as much as possible to ensure response timeliness and a high access hit rate. However, current schemes cannot guarantee that every sub-message of a complete data item stored on an edge node meets the requirements of hot data. How to detect and delete redundant data in edge nodes while protecting user privacy and dynamic data integrity has therefore become a challenging problem. This paper proposes a redundant data detection method that meets privacy-protection requirements: by scanning the ciphertext, it determines whether each sub-message of the data on an edge node meets the requirements of hot data. The method has an effect comparable to a zero-knowledge proof and does not reveal user privacy. In addition, for redundant sub-data that does not meet the requirements of hot data, this paper proposes a redundant data deletion scheme that preserves dynamic data integrity. We use Content Extraction Signatures (CES) to generate a signature on the remaining hot data after the redundant data is deleted. The feasibility of the scheme is demonstrated through security analysis and efficiency analysis.
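The CES-based deletion step above can be sketched concretely: sign each sub-message bound to its position, so that after redundant sub-messages are deleted, the signatures of the remaining hot data still verify. A minimal sketch in Python, using an HMAC as a stand-in for the scheme's real public-key signature primitive; the `doc_id` binding and helper names are illustrative assumptions, not the paper's construction:

```python
import hashlib
import hmac

def ces_sign(key, doc_id, submessages):
    # Sign each sub-message bound to the document id and its position,
    # so any subset can later be verified independently (CES-style).
    return [
        hmac.new(key, doc_id + i.to_bytes(4, "big") + hashlib.sha256(m).digest(),
                 hashlib.sha256).digest()
        for i, m in enumerate(submessages)
    ]

def ces_extract(signatures, keep):
    # Drop signatures of deleted (redundant) sub-messages; keep the rest.
    return {i: signatures[i] for i in keep}

def ces_verify(key, doc_id, extracted, remaining):
    # The remaining hot data verifies without the deleted sub-messages.
    return all(
        hmac.compare_digest(
            sig,
            hmac.new(key, doc_id + i.to_bytes(4, "big") +
                     hashlib.sha256(remaining[i]).digest(),
                     hashlib.sha256).digest())
        for i, sig in extracted.items()
    )
```

Because each sub-message carries its own signature, deleting sub-message 1 below does not invalidate verification of sub-messages 0 and 2.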
Genome-wide association mapping studies (GWAS) based on Big Data are a potential approach to improve marker-assisted selection in plant breeding. The number of available phenotypic and genomic data sets in which medium-sized populations of several hundred individuals have been studied is rapidly increasing. Combining these data and using them in GWAS could increase both the power of QTL discovery and the accuracy of estimation of underlying genetic effects, but is hindered by data heterogeneity and lack of interoperability. In this study, we used genomic and phenotypic data sets focusing on Central European winter wheat populations evaluated for heading date. We explored strategies for integrating these data and, subsequently, the resulting potential for GWAS. Establishing interoperability between data sets was greatly aided by some overlapping genotypes and a linear relationship between the different phenotyping protocols, resulting in high-quality integrated phenotypic data. In this context, genomic prediction proved to be a suitable tool to study the relevance of interactions between genotypes and experimental series, which was low in our case. Contrary to expectations, fewer associations between markers and traits were found in the larger combined data set than in the individual experimental series. However, the predictive power based on the marker-trait associations of the integrated data set was higher across data sets. The results therefore show that the integration of medium-sized data sets into Big Data is an approach to increase the power to detect QTL in GWAS, and they encourage further efforts to standardize and share data in the plant breeding community.
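The linear relationship between phenotyping protocols mentioned above can be exploited through the overlapping genotypes: fit y = a + b·x on genotypes scored under both protocols, then rescale one series onto the other's scale. A hedged, stdlib-only sketch; the ordinary-least-squares fit is standard, but the idea that a single linear map suffices is an assumption carried over from the abstract, and the variable names are illustrative:

```python
def fit_linear(x, y):
    # Ordinary least squares for y = a + b*x on overlapping genotypes.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b  # (intercept a, slope b)

def calibrate(series, a, b):
    # Map heading-date scores from protocol B onto protocol A's scale.
    return [a + b * v for v in series]
```

After calibration, the two experimental series can be pooled into one integrated phenotypic data set for GWAS.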
Plant morphogenesis relies on precise gene expression programs at the proper time and position, orchestrated by transcription factors (TFs) in intricate regulatory networks in a cell-type-specific manner. Here we introduce a comprehensive single-cell transcriptomic atlas of Arabidopsis seedlings. This atlas is the result of meticulous integration of 63 previously published scRNA-seq datasets, addressing batch effects while conserving biological variance. The integration spans a broad spectrum of tissues, including both below- and above-ground parts. Using a rigorous approach to cell type annotation, we identified 47 distinct cell types or states, largely expanding the current view of plant cell compositions. We systematically constructed cell-type-specific gene regulatory networks and uncovered key regulators that act in a coordinated manner to control cell-type-specific gene expression. Taken together, our study not only offers an extensive plant cell atlas that serves as a valuable resource, but also provides molecular insights into the gene-regulatory programs that vary across cell types.
Cloud computing has emerged as a viable alternative to traditional computing infrastructures, offering various benefits. However, the adoption of cloud storage poses significant risks to data secrecy and integrity. This article presents an effective mechanism to preserve the secrecy and integrity of data stored on the public cloud by leveraging blockchain technology, smart contracts, and cryptographic primitives. The proposed approach utilizes a Solidity-based smart contract as an auditor for maintaining and verifying the integrity of outsourced data. To preserve data secrecy, symmetric encryption is employed to encrypt user data before outsourcing it. An extensive performance analysis is conducted to illustrate the efficiency of the proposed mechanism. Additionally, a rigorous assessment ensures that the developed smart contract is free from vulnerabilities, and its associated running costs are measured. The security analysis confirms that the approach can securely maintain the confidentiality and integrity of cloud storage even in the presence of malicious entities. The proposed mechanism contributes to enhancing data security in cloud computing environments and can serve as a foundation for developing more secure cloud storage systems.
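The auditor's core flow is easy to sketch: at upload time the contract records a digest of the (already encrypted) data, and any later audit recomputes and compares. A minimal Python stand-in for that flow; the paper's actual Solidity contract, its gas model, and the encryption pipeline are not reproduced here, and all names are illustrative:

```python
import hashlib

class AuditorContract:
    # Toy stand-in for the on-chain auditor: stores one digest per file id.
    def __init__(self):
        self._digests = {}

    def register(self, file_id, ciphertext):
        # Called at upload time, after the client has encrypted its data.
        self._digests[file_id] = hashlib.sha256(ciphertext).hexdigest()

    def audit(self, file_id, ciphertext):
        # Integrity holds iff the stored digest matches the current data.
        return self._digests.get(file_id) == hashlib.sha256(ciphertext).hexdigest()
```

Storing only digests on-chain keeps contract state (and thus running cost) small while still letting any party detect tampering with the outsourced ciphertext.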
Currently, there is a growing trend among users to store their data in the cloud. However, the cloud is vulnerable to persistent data corruption risks arising from equipment failures and hacker attacks. Additionally, when users perform file operations, the semantic integrity of the data can be compromised. Ensuring both data integrity and semantic correctness has become a critical issue. We introduce a pioneering solution called Sec-Auditor, the first of its kind with the ability to verify data integrity and semantic correctness simultaneously while maintaining a constant communication cost independent of the audited data volume. Sec-Auditor also supports public auditing, enabling anyone with access to public information to conduct data audits. This feature makes Sec-Auditor highly adaptable to open data environments such as the cloud. In Sec-Auditor, users are assigned specific rules that are used to verify the correctness of data semantics, and users have the flexibility to update their own rules as needed. We conduct in-depth analyses of the correctness and security of Sec-Auditor, and compare several important security attributes with existing schemes, demonstrating its superior properties. Evaluation results show that even for time-consuming file upload operations, our solution is more efficient than the compared scheme.
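The user-assigned rules above can be thought of as predicates over records: a record is semantically correct only if every applicable rule holds, and users may swap rules at will. A hedged illustration of that idea; the rule shape and field names are assumptions, not the paper's actual rule encoding:

```python
def check_semantics(record, rules):
    # Each rule maps a field name to a predicate its value must satisfy;
    # a missing field counts as a semantic violation.
    return all(name in record and pred(record[name]) for name, pred in rules.items())

# Example user-defined rules (illustrative only).
rules = {
    "age": lambda v: 0 <= v <= 150,   # plausible range
    "email": lambda v: "@" in v,      # minimal well-formedness
}
```

Because the rules live outside the checker, updating a user's semantics means replacing entries in `rules` without touching the auditing logic.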
With the popularization of the Internet and the development of technology, cyber threats are increasing day by day. Threats such as malware, hacking, and data breaches have had a serious impact on cybersecurity. The network security environment in the era of big data presents the characteristics of large data volumes, high diversity, and high real-time requirements, and traditional security defense methods and tools can no longer cope with complex and changing network security threats. This paper proposes a machine-learning security defense algorithm based on metadata association features that emphasizes control over unauthorized users through privacy, integrity, and availability. A user model is established, and a mapping between the user model and the metadata of each data source is generated. By analyzing the user model and its corresponding mapping relationships, a query over the user model can be decomposed into queries over the various heterogeneous data sources, realizing the integration of heterogeneous data sources based on metadata association features. Customer information is defined and classified, sensitive data is automatically identified and perceived, and a behavior audit and analysis platform is built to analyze user behavior trajectories, completing the construction of a machine-learning customer information security defense system. The experimental results show that when the data volume is 5×10³ bits, the data storage integrity of the proposed method is 92%, the data accuracy is 98%, and the success rate of data intrusion is only 2.6%. It can be concluded that the proposed data storage method is safe, the data accuracy remains at a high level, and the data disaster recovery performance is good. The method can effectively resist data intrusion and has high air traffic control security: it not only detects all viruses in user data storage but also realizes integrated virus processing, further optimizing the security defense effect for user big data.
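The query-decomposition step described above can be sketched as a lookup through the metadata mapping: each user-model field resolves to a (source, source-side field) pair, and fields are grouped by source to form per-source sub-queries. The mapping contents and names below are illustrative assumptions, not the paper's implementation:

```python
def decompose(query_fields, mapping):
    # mapping: user-model field -> (data source, source-side field name).
    # Group the requested fields by the source that can answer them.
    per_source = {}
    for f in query_fields:
        source, src_field = mapping[f]
        per_source.setdefault(source, []).append(src_field)
    return per_source

# Example metadata mapping between the user model and two sources.
mapping = {
    "name":    ("crm_db",  "customer_name"),
    "balance": ("bank_db", "acct_balance"),
    "email":   ("crm_db",  "contact_email"),
}
```

Each entry of the result is one sub-query against a single heterogeneous source; merging their answers realizes the integrated view of the user model.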
Integrated data and energy transfer (IDET) enables electromagnetic waves to deliver wireless energy simultaneously with data for low-power devices. In this paper, an energy harvesting modulation (EHM) assisted multi-user IDET system is studied, where all the signals received at the users are exploited for energy harvesting without degrading wireless data transfer (WDT) performance. The joint IDET performance is then analysed theoretically by conceiving a practical time-dependent wireless channel. With the aid of an AO-based algorithm, the average effective data rate among users is maximized while ensuring the BER and wireless energy transfer (WET) performance. Simulation results validate and evaluate the IDET performance of the EHM-assisted system, and demonstrate that the number of user clusters and IDET time slots should be optimally allocated in order to improve the WET and WDT performance.
Building model data organization is often programmed to solve a specific problem, resulting in the inability to organize indoor and outdoor 3D scenes in an integrated manner. In this paper, existing building spatial data models are studied, and the characteristics of the Industry Foundation Classes (IFC) building information modeling standard, CityGML, IndoorGML, and other models are compared and analyzed. CityGML and IndoorGML face challenges in satisfying diverse application scenarios and requirements due to limitations in their expressive capabilities. We propose combining the semantic information of model objects to effectively partition and organize indoor and outdoor spatial 3D model data, and construct an indoor-outdoor data organization mechanism of "chunk-layer-subobject-entrances-area-detail object." This method is verified by proposing a 3D data organization method for indoor and outdoor space and building a 3D visualization system on top of it.
To solve the query processing correctness problem for semantic-based relational data integration, the semantics of SPARQL (SPARQL Protocol and RDF Query Language) queries is defined. In the course of query rewriting, all relevant tables are found and decomposed into minimal connectable units. Minimal connectable units are joined according to the semantic queries to produce semantically correct query plans. Algorithms for query rewriting and transforming are presented, and their computational complexity is discussed. In the worst case, the query decomposing algorithm finishes in O(n²) time and the query rewriting algorithm requires O(nm) time. The performance of the algorithms is verified by experiments, and the results show that when the length of a query is less than 8, the query processing algorithms provide satisfactory performance.
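The decomposition idea can be sketched as follows: for each relevant table, collect the queried attributes it can answer together with its key, so the resulting units stay joinable into a full query plan. This is a simplified reading of "minimal connectable units"; the schema layout and helper names are illustrative assumptions, not the paper's algorithm:

```python
def decompose_to_units(query_attrs, table_schemas):
    # A minimal connectable unit here: one table restricted to the queried
    # attributes it can answer, plus its key so units remain joinable.
    units = []
    for table, schema in table_schemas.items():
        hit = [a for a in query_attrs if a in schema["attrs"]]
        if hit:
            units.append({"table": table, "attrs": hit, "key": schema["key"]})
    return units

# Two relations sharing the join key "pid" (illustrative).
schemas = {
    "person":  {"key": "pid", "attrs": {"pid", "name", "age"}},
    "address": {"key": "pid", "attrs": {"pid", "city", "street"}},
}
```

Joining the returned units on their shared key attribute yields a query plan covering all requested attributes.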
The karst mountainous area is an ecologically fragile region with prominent human-land contradictions, and its resource-environment carrying capacity (RECC) needs to be further clarified. The development of remote sensing (RS) and geographic information systems (GIS) provides data sources and a processing platform for RECC monitoring. This study analyzed and established an evaluation index system for RECC considering the particularities of the karst mountainous area of Southwest China; processed multisource RS data (Sentinel-2, ASTER-DEM, and Landsat-8) to extract the spatial distributions of nine key indexes using GIS techniques (information classification, overlay analysis, and raster calculation); proposed GIS-based methods for index integration and fuzzy comprehensive evaluation of the RECC; and took a typical area, Guangnan County in Yunnan Province of China, as an experimental area to explore the effectiveness of the indexes and methods. The results showed that: (1) The important indexes affecting the RECC of the karst mountainous area are water resources, tourism resources, position resources, the geographical environment, and the soil erosion environment. (2) Data on cultivated land, construction land, minerals, transportation, water conservancy, ecosystem services, topography, soil erosion, and rocky desertification can be obtained from RS data, and GIS techniques integrate this information into the RECC results; the data extraction and processing methods are feasible for evaluating RECC. (3) The RECC of Guangnan County was at the mid-carrying level in 2018. The mid-carrying and low-carrying levels were the main types, accounting for more than 80.00% of the total study area. The areas with high carrying capacity were mainly distributed in the regions north of the county's northwest-southeast line, while other areas had comparatively low carrying capacity. Coordination between regional resource-environment status and socioeconomic development is the key to improving RECC. This study explores the RECC evaluation index system for karst mountainous areas and the application of multisource RS data and GIS techniques in comprehensive evaluation. The methods can be applied in related fields to provide suggestions for data/information extraction and integration, and for sustainable development.
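The fuzzy comprehensive evaluation step combines an index weight vector with a membership matrix (one row per index, one column per carrying level) to get a composite membership for each level. A minimal weighted-composition sketch; the weights and membership values are made up for illustration, not the study's calibrated figures:

```python
def fuzzy_evaluate(weights, membership):
    # b_j = sum_i w_i * r_ij : composite membership of each carrying level.
    levels = len(membership[0])
    return [sum(w * row[j] for w, row in zip(weights, membership))
            for j in range(levels)]

# Two indexes (e.g. water resources, soil erosion), three levels (low/mid/high).
weights = [0.6, 0.4]
membership = [
    [0.1, 0.7, 0.2],   # water resources mostly supports "mid"
    [0.5, 0.4, 0.1],   # soil erosion leans "low"
]
```

The carrying level with the largest composite membership is taken as the evaluation result; with the illustrative numbers above, that is the mid-carrying level.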
Cloud storage is one of the main applications of cloud computing. With data services in the cloud, users are able to outsource their data to the cloud and to access and share their outsourced data from the cloud server anywhere and anytime. However, this new paradigm of data outsourcing also introduces new security challenges, among which is how to ensure the integrity of the outsourced data. Although cloud storage providers commit to a reliable and secure environment, the integrity of data can still be damaged by human carelessness, hardware/software failures, or attacks from external adversaries. Therefore, it is of great importance for users to audit the integrity of the data they outsource to the cloud. In this paper, we first design an auditing framework for cloud storage and propose an algebraic-signature-based remote data possession checking protocol, which allows a third party to audit the integrity of the outsourced data on behalf of the users and supports an unlimited number of verifications. We then extend our auditing protocol to support dynamic data operations, including data update, data insertion, and data deletion. The analysis and experimental results demonstrate that our proposed schemes are secure and efficient.
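The key property behind algebraic-signature-based possession checking is linearity: the signature of a linear combination of blocks equals the same combination of the blocks' signatures, so an auditor can verify a challenge response without holding the data. A toy version over integers modulo a prime (real algebraic signatures work in a Galois field; the parameters and names here are illustrative):

```python
P = 2**61 - 1   # a Mersenne prime, standing in for the field
R = 7           # fixed public evaluation point (illustrative)

def alg_sig(block):
    # sig(B) = sum_j B[j] * R^j mod P -- linear in the block contents.
    return sum(b * pow(R, j, P) for j, b in enumerate(block)) % P

def audit(stored_sigs, coeffs, combined_block):
    # Server returns combined_block = sum_i c_i * B_i (element-wise, mod P);
    # the auditor checks sig(combined) against the combination of stored sigs.
    expected = sum(c * s for c, s in zip(coeffs, stored_sigs)) % P
    return alg_sig(combined_block) == expected
```

The auditor stores only the per-block signatures and fresh challenge coefficients, so verification cost stays constant regardless of how much data is audited.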
There are two key issues in distributed intrusion detection systems: maintaining the load balance of the system and protecting data integrity. To address these issues, this paper proposes a new distributed intrusion detection model for big data based on nondestructive partitioning and balanced allocation. A data allocation strategy based on capacity and workload is introduced to achieve local load balance, and a dynamic load adjustment strategy is adopted to maintain the global load balance of the cluster. Moreover, data integrity is protected by using session reassembly and session partitioning. The simulation results show that the new model enjoys favorable advantages such as good load balance and a higher detection rate and detection efficiency.
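A capacity-and-workload allocation strategy of the kind described above can be sketched as: route each new session to the detection node with the lowest utilization (current load divided by capacity), which keeps local load balanced while respecting heterogeneous node capacities. The node names and numbers are illustrative, not the paper's strategy in full:

```python
def pick_node(nodes):
    # nodes: name -> {"capacity": ..., "load": ...}; choose lowest utilization.
    return min(nodes, key=lambda n: nodes[n]["load"] / nodes[n]["capacity"])

def assign(nodes, session_cost):
    # Keep whole sessions on one node (nondestructive partitioning), then
    # account for the added workload.
    target = pick_node(nodes)
    nodes[target]["load"] += session_cost
    return target
```

Assigning whole sessions (rather than raw packets) to a single node is what preserves data integrity for detection, since no session is split across analyzers.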
With the rapid development of the Web, there are more and more Web databases available for users to access. At the same time, job seekers often have difficulty first finding the right sources and then querying over them, so an integrated job search system over Web databases has become a Web application in high demand. Based on this consideration, we build a deep Web data integration system that supports unified access for users to multiple job Web sites as a job meta-search engine. In this paper, the architecture of the system is given first, and then the key components of the system are introduced.
Cloud computing and storage services allow clients to move their data centers and applications to centralized large data centers and thus avoid the burden of local data storage and maintenance. However, this poses new challenges related to creating secure and reliable data storage over unreliable service providers. In this study, we address the problem of ensuring the integrity of data storage in cloud computing. In particular, we consider methods for reducing the burden of generating a constant amount of metadata at the client side. By exploiting some good attributes of the bilinear group, we devise a simple and efficient audit service for public verification of untrusted and outsourced storage, which can be important for achieving widespread deployment of cloud computing. Whereas many prior studies on ensuring remote data integrity did not consider the burden of generating verification metadata at the client side, the objective of this study is to resolve this issue. Moreover, our scheme also supports data dynamics and public verifiability. Extensive security and performance analysis shows that the proposed scheme is highly efficient and provably secure.
Integration between file systems and multidatabase systems is a necessary approach to support data sharing from distributed and heterogeneous data sources. We first analyze the problems of data integration between file systems and multidatabase systems. Then a common, XML-oriented data model named XIDM (XML-based Integrating Data Model) is presented. XIDM is based on a series of XML standards, especially XML Schema, and can describe semistructured data well, which makes it highly practical and versatile.
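The appeal of an XML-based common model for this kind of integration is that sources with different shapes (a file system path vs. a database table) can be represented as records that simply expose different child elements. A toy illustration with the stdlib XML parser; the element names are invented for the example and are not XIDM's actual vocabulary:

```python
import xml.etree.ElementTree as ET

# A toy semistructured document: the two sources disagree on available
# fields, which an XML-based model accommodates naturally.
doc = """
<records>
  <record source="file_system"><path>/data/a.csv</path></record>
  <record source="multidatabase"><table>sales</table><rows>120</rows></record>
</records>
"""

def fields(record):
    # Each record exposes only the fields its source actually has.
    return {child.tag: child.text for child in record}

root = ET.fromstring(doc)
per_source = {r.get("source"): fields(r) for r in root}
```

A schema language such as XML Schema can then constrain each record variant while still allowing the heterogeneity shown here.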
Compaction correction is a key part of paleo-geomorphic recovery methods, yet the influence of lithology on porosity evolution is not usually taken into account. Present methods merely classify the lithologies as sandstone and mudstone and undertake separate porosity-depth compaction modeling. However, using just two lithologies is an oversimplification that cannot represent the compaction history, and in such schemes the precision of the compaction recovery is inadequate. To improve this precision, a depth compaction model is proposed that involves both porosity and clay content. A clastic lithological compaction unit classification method, based on clay content, is designed to identify lithological boundaries and establish sets of compaction units. On the basis of this classification, two compaction recovery methods that integrate well and seismic data are employed to extrapolate well-based compaction information outward along seismic lines and recover the paleo-topography of the clastic strata in the region. The examples presented here show that a better understanding of paleo-geomorphology can be gained by applying the proposed compaction recovery technology.
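A porosity-depth model that involves clay content can be illustrated with an Athy-type exponential whose surface porosity and compaction coefficient are interpolated between sand and mud end members, with decompaction conserving grain (solid) volume. All constants below are illustrative placeholders, not the paper's calibrated values:

```python
import math

def porosity(depth_m, clay_fraction):
    # Interpolate end members: sand (clay=0) vs. mud (clay=1).
    phi0 = 0.45 + 0.20 * clay_fraction         # surface porosity
    c = (0.27 + 0.24 * clay_fraction) / 1000   # compaction coefficient, 1/m
    return phi0 * math.exp(-c * depth_m)

def decompacted_thickness(thickness_m, depth_m, clay_fraction, target_depth_m=0.0):
    # Conserve solid volume: h * (1 - phi(z)) is invariant under burial,
    # so restoring a layer to a shallower depth inflates its thickness.
    solid = thickness_m * (1.0 - porosity(depth_m, clay_fraction))
    return solid / (1.0 - porosity(target_depth_m, clay_fraction))
```

Treating `clay_fraction` as a continuous variable is what replaces the crude two-lithology (sandstone/mudstone) split with per-unit compaction behavior.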
Blast furnace (BF) ironmaking is the most typical "black box" process, and its complexity and uncertainty bring great challenges for furnace condition judgment and BF operation. Rich data resources for BF ironmaking are available, and the rapid development of data science and intelligent technology provides an effective means to address the uncertainty of the BF ironmaking process. This work focuses on the application of artificial intelligence technology in BF ironmaking. Current intelligent BF ironmaking technology is summarized and analyzed from five aspects: BF data management; analyses of time delay and correlation; prediction of key BF variables; evaluation of BF status; and multi-objective intelligent optimization of BF operations. Solutions and suggestions are offered for the problems in current progress, together with outlooks on future prospects and technological breakthroughs. To effectively improve BF data quality, the data problems and the characteristics of the algorithms were comprehensively considered and data processing methods were selected scientifically. For analyzing important BF characteristics, the effect of time delay was eliminated to ensure an accurate logical relationship between BF parameters and economic indicators. For BF parameter prediction and status evaluation, a BF intelligence model integrating data information and process mechanisms was built to achieve accurate prediction of key BF indexes and scientific evaluation of BF status. During the optimization of BF parameters, low risk, low cost, and high return were used as the optimization criteria, and while pursuing the optimization effect, feasibility and site operating costs were considered comprehensively. This work will help increase process operators' overall awareness and understanding of intelligent BF technology. Additionally, combining big data technology with the process will improve the practicality of data models in actual production and promote the application of intelligent technology in BF ironmaking.
How to integrate heterogeneous semi-structured Web records into a relational database is an important and challenging research topic. An improved conditional random field model is presented that combines learning from labeled samples and unlabeled database records in order to reduce the dependence on tediously hand-labeled training data. The proposed model is used to solve the problem of schema matching between the data source schema and the database schema. Experimental results using a large number of Web pages from diverse domains show the novel approach's effectiveness.
Progress in cloud computing makes group data sharing in outsourced storage a reality. People join groups and share data with each other, making teamwork more convenient. This new application scenario also faces data security threats, which are even more complex: when a user quits a group, the signatures of the remaining data blocks must be re-signed to ensure security. Some researchers have noticed this problem and proposed works to relieve the computing overhead on the user side. However, considering the privacy and security needs of group auditing, there is still no comprehensive solution implementing secure group user revocation that supports identity privacy preservation and collusion attack resistance. Aiming at this target, we construct a concrete scheme based on ring signatures and smart contracts. We introduce linkable ring signatures to build a novel kind of metadata for integrity proofs enabling anonymous verification, and this new metadata supports secure revocation. Meanwhile, smart contracts are used to resist possible collusion attacks and malicious re-signing computation. Under the combined effect of the signature method and blockchain smart contracts, our proposal supports reliable user revocation and signature re-signing without revealing any user identity in the whole process. Security and performance analysis compared with previous works proves that the proposed scheme is feasible and efficient.
In order to satisfy the ever-increasing energy appetite of massive battery-powered and batteryless communication devices, radio frequency (RF) signals have been relied upon for transferring wireless power to them. The joint coordination of wireless power transfer (WPT) and wireless information transfer (WIT) yields simultaneous wireless information and power transfer (SWIPT) as well as the data and energy integrated communication network (DEIN). However, as a promising technique, few efforts have been invested in the hardware implementation of DEIN. In order to make DEIN a reality, this paper focuses on the hardware implementation of a DEIN. It first provides a brief tutorial on SWIPT, summarising the latest hardware designs of WPT transceivers and existing commercial solutions. Then a prototype DEIN design with a full protocol stack is elaborated, followed by its performance evaluation.
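A common way to make the SWIPT rate-energy trade-off concrete is the power-splitting receiver model: a fraction ρ of the received power feeds the energy harvester and the remaining (1 − ρ) drives information decoding. This is the generic textbook model, not the paper's prototype measurements, and the default efficiency and noise values are illustrative:

```python
import math

def swipt_power_splitting(p_rx_w, rho, eta=0.5, noise_w=1e-9):
    # rho of the received power is harvested (with conversion efficiency eta);
    # the remaining (1 - rho) supports information decoding.
    harvested_w = eta * rho * p_rx_w
    rate_bps_hz = math.log2(1.0 + (1.0 - rho) * p_rx_w / noise_w)
    return harvested_w, rate_bps_hz
```

Sweeping ρ from 0 to 1 traces out the achievable rate-energy region: ρ = 0 gives pure WIT, ρ = 1 pure WPT, and intermediate values realize IDET.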
Funding: sponsored by the National Natural Science Foundation of China under grants No. 62172353, No. 62302114, No. U20B2046, and No. 62172115; the Innovation Fund Program of the Engineering Research Center for Integration and Application of Digital Learning Technology of the Ministry of Education, No. 1331007 and No. 1311022; the Natural Science Foundation of the Jiangsu Higher Education Institutions, Grant No. 17KJB520044; and the Six Talent Peaks Project in Jiangsu Province, No. XYDXX-108.
Funding: within the Wheat BigData Project (German Federal Ministry of Food and Agriculture, FKZ 2818408B18).
Funding: Supported by the National Natural Science Foundation of China (No. 32070656), the Nanjing University Deng Feng Scholars Program, the Priority Academic Program Development (PAPD) of Jiangsu Higher Education Institutions, the China Postdoctoral Science Foundation (No. 2022M711563), and the Jiangsu Funding Program for Excellent Postdoctoral Talent (No. 2022ZB50).
Abstract: Plant morphogenesis relies on precise gene expression programs at the proper time and position, orchestrated by transcription factors (TFs) acting in intricate regulatory networks in a cell-type-specific manner. Here we introduce a comprehensive single-cell transcriptomic atlas of Arabidopsis seedlings. This atlas results from the meticulous integration of 63 previously published scRNA-seq datasets, addressing batch effects while conserving biological variance. The integration spans a broad spectrum of tissues, including both below- and above-ground parts. Using a rigorous approach for cell type annotation, we identified 47 distinct cell types or states, greatly expanding the current view of plant cell composition. We systematically constructed cell-type-specific gene regulatory networks and uncovered key regulators that act in a coordinated manner to control cell-type-specific gene expression. Taken together, our study not only offers an extensive plant cell atlas that serves as a valuable resource, but also provides molecular insights into the gene regulatory programs that vary across cell types.
Abstract: Cloud computing has emerged as a viable alternative to traditional computing infrastructures, offering various benefits. However, the adoption of cloud storage poses significant risks to data secrecy and integrity. This article presents an effective mechanism for preserving the secrecy and integrity of data stored on a public cloud by leveraging blockchain technology, smart contracts, and cryptographic primitives. The proposed approach uses a Solidity-based smart contract as an auditor for maintaining and verifying the integrity of outsourced data. To preserve data secrecy, symmetric encryption is employed to encrypt user data before outsourcing it. An extensive performance analysis illustrates the efficiency of the proposed mechanism. Additionally, a rigorous assessment ensures that the developed smart contract is free from vulnerabilities and measures its running costs. The security analysis confirms that our approach can securely maintain the confidentiality and integrity of cloud storage even in the presence of malicious entities. The proposed mechanism enhances data security in cloud computing environments and can serve as a foundation for developing more secure cloud storage systems.
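The encrypt-before-outsourcing plus on-chain digest idea can be sketched without any blockchain stack. Everything here is an assumption for illustration: a toy SHA-256 counter-mode keystream stands in for a real symmetric cipher (production systems should use AES-GCM), and a stored digest stands in for what the Solidity auditor contract would hold.

```python
import hashlib

def keystream(key, n):
    # toy keystream: SHA-256 in counter mode; NOT a production cipher
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(4, "big")).digest()
        ctr += 1
    return out[:n]

def encrypt(key, data):
    # XOR stream cipher: applying it twice with the same key decrypts
    return bytes(d ^ k for d, k in zip(data, keystream(key, len(data))))

key = b"demo-secret"
plaintext = b"patient record 42"
ct = encrypt(key, plaintext)

# the digest is what an on-chain auditor contract would store and recheck later
onchain_digest = hashlib.sha256(ct).hexdigest()

assert hashlib.sha256(ct).hexdigest() == onchain_digest  # audit passes
assert encrypt(key, ct) == plaintext                     # secrecy: cloud sees only ct
print("audit ok")
```

The cloud provider only ever holds `ct`; integrity is checked against the immutable digest, secrecy by keeping `key` with the user.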
Funding: This research was supported by the Qinghai Provincial High-End Innovative and Entrepreneurial Talents Project.
Abstract: There is currently a growing trend among users to store their data in the cloud. However, the cloud is vulnerable to persistent data corruption risks arising from equipment failures and hacker attacks, and the semantic integrity of the data can be compromised when users perform file operations. Ensuring both data integrity and semantic correctness has become a critical issue. We introduce Sec-Auditor, the first solution able to verify data integrity and semantic correctness simultaneously while maintaining a constant communication cost independent of the audited data volume. Sec-Auditor also supports public auditing, enabling anyone with access to the public information to conduct data audits, which makes it highly suitable for open data environments such as the cloud. In Sec-Auditor, users are assigned specific rules that are used to verify the semantic correctness of their data, and users may update their own rules as needed. We conduct in-depth analyses of the correctness and security of Sec-Auditor and compare several important security attributes with existing schemes, demonstrating its superior properties. Evaluation results show that, even for time-consuming file upload operations, our solution is more efficient than the comparison scheme.
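The two checks Sec-Auditor combines, integrity and rule-based semantic correctness, can be mocked up side by side. This sketch is not the Sec-Auditor protocol (which achieves constant communication cost with cryptographic proofs); the rule, the record fields, and the helper names are hypothetical.

```python
import hashlib

# hypothetical user-defined semantic rule: a valid record keeps balance non-negative
rules = {"alice": lambda rec: rec["balance"] >= 0}

def store(record):
    """Serialize a record and produce an integrity tag for later auditing."""
    blob = repr(sorted(record.items())).encode()
    return blob, hashlib.sha256(blob).hexdigest()

def audit(user, blob, tag, record):
    """Pass only if the stored bytes are intact AND the user's rule holds."""
    ok_integrity = hashlib.sha256(blob).hexdigest() == tag
    ok_semantics = rules[user](record)
    return ok_integrity and ok_semantics

rec = {"balance": 10, "owner": "alice"}
blob, tag = store(rec)
print(audit("alice", blob, tag, rec))  # True
```

A corrupted `blob` fails the first check; a record violating the user's rule fails the second, even when the bytes are intact.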
Funding: This work was supported by the National Natural Science Foundation of China (U2133208, U20A20161).
Abstract: With the popularization of the Internet and the development of technology, cyber threats are increasing day by day. Threats such as malware, hacking, and data breaches have had a serious impact on cybersecurity. The network security environment in the era of big data is characterized by large data volumes, high diversity, and strict real-time requirements, and traditional security defense methods and tools can no longer cope with complex and changing network security threats. This paper proposes a machine-learning security defense algorithm based on metadata association features that emphasizes control over unauthorized users through privacy, integrity, and availability. A user model is established and a mapping between the user model and the metadata of the data sources is generated. By analyzing the user model and its mapping relationship, a query against the user model can be decomposed into queries against the various heterogeneous data sources, realizing the integration of heterogeneous data sources based on metadata association features. Customer information is defined and classified, sensitive data are automatically identified, a behavior audit and analysis platform is built to analyze user behavior trajectories, and a machine-learning customer information security defense system is thereby constructed. The experimental results show that, at a data volume of 5×10³ bit, the data storage integrity of the proposed method is 92%, the data accuracy is 98%, and the success rate of data intrusion is only 2.6%. It can be concluded that the proposed storage method is safe, keeps data accuracy at a consistently high level, and provides good disaster recovery performance. The method can effectively resist data intrusion and offers high air traffic control security; it can not only detect all viruses in user data storage but also realize integrated virus processing, further optimizing the security defense of user big data.
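The query-decomposition step, splitting one user-model query into per-source sub-queries via a metadata mapping, is simple to illustrate. The field names and source names below are invented; the sketch only shows the dispatch mechanics, not the paper's learning component.

```python
# metadata mapping: user-model field -> (data source, source-specific column)
# all names here are hypothetical
mapping = {
    "name":  ("crm_db",  "customer_name"),
    "email": ("mail_db", "addr"),
    "spend": ("crm_db",  "total_spend"),
}

def decompose(query_fields):
    """Split a user-model query into one sub-query per heterogeneous source."""
    per_source = {}
    for f in query_fields:
        src, col = mapping[f]
        per_source.setdefault(src, []).append(col)
    return per_source

print(decompose(["name", "email", "spend"]))
# {'crm_db': ['customer_name', 'total_spend'], 'mail_db': ['addr']}
```

Each sub-query runs against its own source; the results are then joined back under the user-model schema.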
Funding: Supported in part by the MOST Major Research and Development Project (Grant No. 2021YFB2900204), the National Natural Science Foundation of China (NSFC) (Grant No. 62201123, No. 62132004, No. 61971102), the China Postdoctoral Science Foundation (Grant No. 2022TQ0056), the Sichuan Science and Technology Program (Grant No. 2022YFH0022), the Sichuan Major R&D Project (Grant No. 22QYCX0168), and the Municipal Government of Quzhou (Grant No. 2022D031).
Abstract: Integrated data and energy transfer (IDET) enables electromagnetic waves to deliver wireless energy to low-power devices at the same time as data. In this paper, an energy harvesting modulation (EHM) assisted multi-user IDET system is studied, where all the signals received at the users are exploited for energy harvesting without degrading wireless data transfer (WDT) performance. The joint IDET performance is analysed theoretically under a practical time-dependent wireless channel. With the aid of an AO-based algorithm, the average effective data rate among users is maximized while guaranteeing the BER and wireless energy transfer (WET) performance. Simulation results validate and evaluate the IDET performance of the EHM-assisted system and demonstrate that the number of user clusters and IDET time slots should be allocated optimally in order to improve WET and WDT performance.
Abstract: Building model data organization is often programmed to solve a specific problem, making it impossible to organize indoor and outdoor 3D scenes in an integrated manner. In this paper, existing building spatial data models are studied, and the characteristics of the building information modeling standard (IFC), the city geographic modeling language (CityGML), the indoor modeling language (IndoorGML), and other models are compared and analyzed. CityGML and IndoorGML face challenges in satisfying diverse application scenarios and requirements due to limitations in their expressive capabilities. We propose combining the semantic information of model objects to effectively partition and organize indoor and outdoor spatial 3D model data, and we construct an indoor-outdoor data organization mechanism of "chunk-layer-subobject-entrances-area-detail object." The method is verified by proposing a 3D data organization method for indoor and outdoor space and building a 3D visualization system on top of it.
Funding: Supported by the Weaponry Equipment Pre-Research Foundation of the PLA Equipment Ministry (No. 9140A06050409JB8102) and the Pre-Research Foundation of the PLA University of Science and Technology (No. 2009JSJ11).
Abstract: To solve the query processing correctness problem for semantic-based relational data integration, the semantics of SPARQL (Simple Protocol and RDF Query Language) queries is defined. In the course of query rewriting, all relevant tables are found and decomposed into minimal connectable units, which are then joined according to the semantic query to produce semantically correct query plans. Algorithms for query rewriting and transformation are presented and their computational complexity is discussed: in the worst case, the query decomposition algorithm finishes in O(n²) time and the query rewriting algorithm requires O(nm) time. The performance of the algorithms is verified by experiments, whose results show that when the query length is less than 8, the query processing algorithms provide satisfactory performance.
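The notion of covering a query's attributes with a small joinable set of tables can be sketched greedily. This is a simplification under assumed toy schemas, not the paper's decomposition algorithm: a greedy pass is not guaranteed to find a truly minimal unit, and real units must also respect join semantics.

```python
# toy relational catalog; tables share key attributes (pid, jid) for joins
tables = {
    "person":  {"pid", "name"},
    "job":     {"jid", "title", "pid"},
    "company": {"cid", "cname", "jid"},
}

def minimal_units(query_attrs):
    """Greedily pick tables until the query attributes are covered.
    Returns the table list, or None if the attributes cannot be covered."""
    needed = set(query_attrs)
    unit, covered = [], set()
    for name, cols in tables.items():
        gain = (needed - covered) & cols
        if gain:
            unit.append(name)
            covered |= cols
        if needed <= covered:
            break
    return unit if needed <= covered else None

print(minimal_units({"name", "title"}))  # ['person', 'job'], joinable via pid
```

Joining the unit's tables on their shared attributes (here `pid`) yields one semantically correct fragment of the final query plan.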
Funding: The authors acknowledge the support of the government and officials of Guangnan County. Funded by the National Natural Science Foundation of China (Grant Nos. 41361020, 40961031); the Joint Fund of the Yunnan Provincial Science and Technology Department and Yunnan University (Grant No. 2018FY001(-017)); the Project of Innovative Talents Cultivation for Graduate Students of Yunnan University (Grant No. C176230200); the Project of Internationalization and Cultural Inheritance and Innovation of Yunnan University (Grant No. C176250202); and the Science Research Fund of Yunnan Provincial Education Department in 2020: Postgraduate (Grant No. 2020Y0030).
Abstract: The karst mountainous area is an ecologically fragile region with prominent human-land contradictions, and its resource-environment carrying capacity (RECC) needs to be further clarified. The development of remote sensing (RS) and geographic information systems (GIS) provides data sources and a processing platform for RECC monitoring. This study analyzed and established an evaluation index system for RECC that accounts for the particularities of the karst mountainous area of Southwest China; processed multi-source RS data (Sentinel-2, ASTER-DEM and Landsat-8) to extract the spatial distributions of nine key indexes with GIS techniques (information classification, overlay analysis and raster calculation); proposed GIS-based methods for index integration and fuzzy comprehensive evaluation of the RECC; and took a typical area, Guangnan County in Yunnan Province of China, as an experimental area to explore the effectiveness of the indexes and methods. The results showed that: (1) The important indexes affecting the RECC of the karst mountainous area are water resources, tourism resources, position resources, the geographical environment and the soil erosion environment. (2) Data on cultivated land, construction land, minerals, transportation, water conservancy, ecosystem services, topography, soil erosion and rocky desertification can be obtained from RS data, and GIS techniques integrate this information into the RECC results; the data extraction and processing methods are feasible for evaluating RECC. (3) The RECC of Guangnan County was at the mid-carrying level in 2018. The mid-carrying and low-carrying levels were the main types, accounting for more than 80.00% of the total study area. The areas with high carrying capacity were mainly distributed in the regions north of the northwest-southeast line of the county, while other areas had comparatively low carrying capacity. Coordination between the regional resource-environment status and socioeconomic development is the key to improving RECC. This study explores the evaluation index system of RECC in karst mountainous areas and the application of multi-source RS data and GIS techniques in comprehensive evaluation. The methods can be applied in related fields to provide suggestions for data/information extraction and integration, and for sustainable development.
Funding: The authors would like to thank the reviewers for their detailed reviews and constructive comments, which have helped improve the quality of this paper. This work is supported by the National Natural Science Foundation of China (No. 61379144), the Foundation of the Science and Technology on Information Assurance Laboratory (No. KJ-13-002) and the Graduate Innovation Fund of the National University of Defense Technology.
Abstract: Cloud storage is one of the main applications of cloud computing. With data services in the cloud, users are able to outsource their data and to access and share it from the cloud server anywhere and anytime. However, this new paradigm of data outsourcing also introduces new security challenges, among which is how to ensure the integrity of the outsourced data. Although cloud storage providers commit to a reliable and secure environment, the integrity of data can still be damaged by human carelessness, hardware/software failures, or attacks from external adversaries. It is therefore of great importance for users to audit the integrity of the data they outsource to the cloud. In this paper, we first design an auditing framework for cloud storage and propose an algebraic-signature-based remote data possession checking protocol, which allows a third party to audit the integrity of the outsourced data on behalf of the users and supports an unlimited number of verifications. We then extend the auditing protocol to support dynamic data operations, including update, insertion and deletion. The analysis and experimental results demonstrate that our proposed schemes are secure and efficient.
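The challenge-response shape of remote data possession checking can be sketched with HMAC tags in place of algebraic signatures. This is an assumption-laden simplification: the real protocol compresses the proof so challenged blocks need not be sent back, whereas this naive sketch returns them verbatim and only shows the auditor-side flow.

```python
import hashlib
import hmac
import random

def tag(key, idx, block):
    # per-block verification tag; stands in for the algebraic signature
    return hmac.new(key, idx.to_bytes(4, "big") + block, hashlib.sha256).digest()

key = b"audit-key"                                   # kept by the third-party auditor
blocks = [f"block-{i}".encode() for i in range(8)]   # outsourced to the cloud
tags = [tag(key, i, b) for i, b in enumerate(blocks)]

def prove(challenge):
    """Cloud side: answer a challenge over randomly chosen block indices."""
    return [blocks[i] for i in challenge]

def verify(challenge, proof):
    """Auditor side: recompute tags for the returned blocks and compare."""
    return all(hmac.compare_digest(tags[i], tag(key, i, b))
               for i, b in zip(challenge, proof))

chal = random.sample(range(8), 3)
print(verify(chal, prove(chal)))  # True while the data is intact
```

Spot-checking random indices each round means a corrupting server is caught with high probability over repeated, unlimited verifications.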
Abstract: There are two key issues in a distributed intrusion detection system: maintaining the load balance of the system and protecting data integrity. To address these issues, this paper proposes a new distributed intrusion detection model for big data based on non-destructive partitioning and balanced allocation. A data allocation strategy based on capacity and workload is introduced to achieve local load balance, and a dynamic load adjustment strategy is adopted to maintain the global load balance of the cluster. Moreover, data integrity is protected by session reassembly and session partitioning. The simulation results show that the new model enjoys favorable advantages such as good load balance and higher detection rate and efficiency.
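A capacity-and-workload allocation rule like the one described can be sketched greedily: each session goes to the node whose workload-to-capacity ratio would stay lowest. The node capacities, session costs, and greedy policy are all assumptions for illustration, not the paper's strategy.

```python
def assign(sessions, nodes):
    """Place each (session_id, cost) on the node minimizing the resulting
    workload-to-capacity ratio, keeping the local load balanced."""
    load = {n: 0.0 for n in nodes}
    placement = {}
    for sid, cost in sessions:
        best = min(nodes, key=lambda n: (load[n] + cost) / nodes[n])
        load[best] += cost
        placement[sid] = best
    return placement, load

nodes = {"n1": 100.0, "n2": 50.0}                 # detection capacities
sessions = [("s1", 30), ("s2", 30), ("s3", 30)]   # whole sessions, never split
placement, load = assign(sessions, nodes)
print(placement)  # {'s1': 'n1', 's2': 'n1', 's3': 'n2'}
```

Keeping each session whole on one node mirrors the non-destructive partitioning idea: integrity is preserved because no session is split across detectors.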
Funding: Supported by the Natural Science Foundation of China (60573091, 60273018), the National Basic Research and Development Program of China (2003CB317000) and the Key Project of the Ministry of Education of China (03044).
Abstract: With the rapid development of the Web, more and more Web databases are available for users to access. At the same time, job seekers often have difficulty first finding the right sources and then querying over them, so an integrated job search system over Web databases has become a Web application in high demand. Based on this consideration, we build a deep Web data integration system that serves as a job meta-search engine, supporting unified access for users to multiple job Web sites. In this paper, the architecture of the system is given first, and then the key components of the system are introduced.
Funding: Supported by the National Natural Science Foundation of China, the National Basic Research Program of China ("973" Program) and the National High Technology Research and Development Program of China ("863" Program).
Abstract: Cloud computing and storage services allow clients to move their data and applications to centralized large data centers, thus avoiding the burden of local data storage and maintenance. However, this poses new challenges for creating secure and reliable data storage over unreliable service providers. In this study, we address the problem of ensuring the integrity of data storage in cloud computing; in particular, we consider methods for reducing the burden of generating a constant amount of metadata at the client side. By exploiting some good attributes of the bilinear group, we devise a simple and efficient audit service for public verification of untrusted and outsourced storage, which can be important for achieving widespread deployment of cloud computing. Whereas many prior studies on ensuring remote data integrity did not consider the burden of generating verification metadata at the client side, the objective of this study is to resolve this issue. Moreover, our scheme also supports data dynamics and public verifiability. Extensive security and performance analysis shows that the proposed scheme is highly efficient and provably secure.
Funding: Supported by the Beforehand Research for National Defense of China (94J3.4.2.JW0515).
Abstract: Integration between file systems and multidatabase systems is a necessary approach to support data sharing from distributed and heterogeneous data sources. We first analyze the problems of data integration between file systems and multidatabase systems. Then a common, XML-oriented data model named XIDM (XML-based Integrating Data Model) is presented. XIDM is based on a series of XML standards, especially XML Schema, and can describe semistructured data well, making it highly practical and versatile.
Abstract: Compaction correction is a key part of paleo-geomorphic recovery methods, yet the influence of lithology on porosity evolution is not usually taken into account. Present methods merely classify the lithologies as sandstone and mudstone and undertake separate porosity-depth compaction modeling. However, using just two lithologies is an oversimplification that cannot represent the compaction history, and the precision of the compaction recovery in such schemes is inadequate. To improve this precision, a depth compaction model is proposed that involves both porosity and clay content. A clastic lithological compaction unit classification method based on clay content is designed to identify lithological boundaries and establish sets of compaction units. On the basis of this classification, two compaction recovery methods that integrate well and seismic data are employed to extrapolate well-based compaction information outward along seismic lines and recover the paleo-topography of the clastic strata in the region. The examples presented here show that a better understanding of paleo-geomorphology can be gained by applying the proposed compaction recovery technology.
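A porosity-depth model that blends end-member behaviour by clay content can be sketched with an Athy-style exponential curve. The coefficients below are illustrative placeholders, not the paper's calibrated values; linear blending of the end members is also an assumption.

```python
import math

def porosity(depth_m, clay_fraction, phi0_sand=0.45, phi0_clay=0.60,
             c_sand=0.27e-3, c_clay=0.51e-3):
    """Athy-style curve phi(z) = phi0 * exp(-c z), with the surface porosity
    phi0 and decay constant c (per metre) blended linearly by clay content."""
    phi0 = (1 - clay_fraction) * phi0_sand + clay_fraction * phi0_clay
    c = (1 - clay_fraction) * c_sand + clay_fraction * c_clay
    return phi0 * math.exp(-c * depth_m)

for clay in (0.0, 0.5, 1.0):
    print(clay, round(porosity(2000.0, clay), 3))
```

Treating clay content as continuous, instead of a binary sandstone/mudstone split, is exactly what lets each compaction unit carry its own decompaction curve.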
Funding: Financially supported by the General Program of the National Natural Science Foundation of China (No. 52274326), the Fundamental Research Funds for the Central Universities (Nos. 2125018 and 2225008), and the China Baowu Low Carbon Metallurgy Innovation Foundation (BWLCF202109).
Abstract: Blast furnace (BF) ironmaking is the most typical "black box" process, and its complexity and uncertainty pose great challenges for furnace condition judgment and BF operation. Rich data resources are available for BF ironmaking, and the rapid development of data science and intelligent technology provides an effective means to solve the uncertainty problem in the BF ironmaking process. This work focuses on the application of artificial intelligence technology in BF ironmaking. Current intelligent BF ironmaking technology is summarized and analyzed from five aspects: BF data management, the analysis of time delay and correlation, the prediction of key BF variables, the evaluation of BF status, and the multi-objective intelligent optimization of BF operations. Solutions and suggestions are offered for problems in current progress, together with outlooks on future prospects and technological breakthroughs. To effectively improve BF data quality, the data problems and the characteristics of the algorithms are considered comprehensively and data processing methods are selected scientifically. For analyzing important BF characteristics, the effect of time delay is eliminated to ensure an accurate logical relationship between the BF parameters and economic indicators. For BF parameter prediction and status evaluation, a BF intelligence model that integrates data information and process mechanisms is built to achieve accurate prediction of key BF indexes and scientific evaluation of BF status. During the optimization of BF parameters, low risk, low cost, and high return are used as the optimization criteria, and the feasibility and site operation cost are considered comprehensively while pursuing the optimization effect. This work will help increase process operators' overall awareness and understanding of intelligent BF technology. Additionally, combining big data technology with the process will improve the practicality of data models in actual production and promote the application of intelligent technology in BF ironmaking.
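The time-delay analysis mentioned among the five aspects can be illustrated with plain lagged cross-correlation: slide one series against the other and keep the lag with the strongest overlap. The series below are toy data, not BF measurements, and real pipelines would normalize the correlation.

```python
def best_lag(x, y, max_lag):
    """Estimate the process time delay between an upstream BF series x and a
    downstream index y by maximising the lagged dot product."""
    def score(lag):
        return sum(a * b for a, b in zip(x[:len(x) - lag], y[lag:]))
    return max(range(max_lag + 1), key=score)

# toy series: y echoes x three steps later
# (e.g., a blast parameter vs a hot-metal quality index)
x = [0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0]
y = [0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1]
print(best_lag(x, y, 5))  # 3
```

Shifting the downstream index back by the estimated lag aligns cause with effect, so subsequent correlation and prediction models see an accurate logical relationship.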
Funding: Supported by the National Defense Pre-Research Foundation of China (4110105018).
Abstract: How to integrate heterogeneous semi-structured Web records into a relational database is an important and challenging research topic. An improved conditional random fields model is presented that combines learning from labeled samples and unlabeled database records in order to reduce the dependence on tediously hand-labeled training data. The proposed model is used to solve the schema matching problem between the data source schema and the database schema. Experimental results using a large number of Web pages from diverse domains show the effectiveness of the novel approach.
Funding: This work is supported by the National Key Research and Development Program of China (No. 2018YFC1604002) and the National Natural Science Foundation of China (No. U1836204, No. U1936208, No. U1936216, No. 62002197).
Abstract: Progress in cloud computing makes group data sharing in outsourced storage a reality. People join groups and share data with each other, making teamwork more convenient. This new application scenario also faces data security threats that are even more complex: when a user quits a group, the signatures on the remaining data blocks must be re-signed to ensure security. Some researchers have noticed this problem and proposed works to relieve the computing overhead on the user side. However, considering the privacy and security needs of group auditing, a comprehensive solution is still lacking that implements secure group user revocation while supporting identity privacy preservation and collusion attack resistance. Aiming at this target, we construct a concrete scheme based on ring signatures and smart contracts. We introduce linkable ring signatures to build a novel kind of metadata for integrity proofs that enables anonymous verification, and this new metadata supports secure revocation. Meanwhile, smart contracts are used to resist possible collusion attacks and malicious re-signing computation. Under the combined effect of the signature method and blockchain smart contracts, our proposal supports reliable user revocation and signature re-signing without revealing any user identity in the whole process. Security and performance analysis compared with previous works proves that the proposed scheme is feasible and efficient.
Funding: Supported by the National Natural Science Foundation of China (NSFC) (No. U1705263 and 61971102), the GF Innovative Research Program, and the Sichuan Science and Technology Program (No. 2019YJ0194).
Abstract: In order to satisfy the ever-increasing energy appetite of massive battery-powered and battery-less communication devices, radio frequency (RF) signals have been relied upon for transferring wireless power to them. The joint coordination of wireless power transfer (WPT) and wireless information transfer (WIT) yields simultaneous wireless information and power transfer (SWIPT) as well as the data and energy integrated communication network (DEIN). However, despite being a promising technique, few efforts have been invested in the hardware implementation of DEIN. In order to make DEIN a reality, this paper focuses on the hardware implementation of a DEIN. It first provides a brief tutorial on SWIPT, summarising the latest hardware designs of WPT transceivers and the existing commercial solutions. Then, a prototype DEIN design with a full protocol stack is elaborated, followed by its performance evaluation.