Cloud computing has emerged as a viable alternative to traditional computing infrastructures, offering various benefits. However, the adoption of cloud storage poses significant risks to data secrecy and integrity. This article presents an effective mechanism for preserving the secrecy and integrity of data stored on the public cloud by leveraging blockchain technology, smart contracts, and cryptographic primitives. The proposed approach uses a Solidity-based smart contract as an auditor that maintains and verifies the integrity of outsourced data. To preserve data secrecy, symmetric encryption is employed to encrypt user data before outsourcing. An extensive performance analysis illustrates the efficiency of the proposed mechanism. Additionally, a rigorous assessment verifies that the developed smart contract is free from vulnerabilities and measures its running costs. The security analysis confirms that the approach can securely maintain the confidentiality and integrity of cloud storage even in the presence of malicious entities. The proposed mechanism enhances data security in cloud computing environments and can serve as a foundation for developing more secure cloud storage systems.
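The abstract's workflow — encrypt before outsourcing, then let an on-chain auditor verify integrity — can be sketched in a few lines of Python. The paper's auditor is a Solidity contract and the cipher would be a vetted AEAD scheme such as AES-GCM; the SHA-256-based stream cipher and the `ContractAuditor` class below are illustrative stand-ins, not the authors' implementation.

```python
import hashlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Illustrative SHA-256 counter-mode keystream; a real deployment would use AES-GCM.
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, nonce: bytes, plaintext: bytes) -> bytes:
    ks = keystream(key, nonce, len(plaintext))
    return bytes(p ^ k for p, k in zip(plaintext, ks))

decrypt = encrypt  # an XOR stream cipher is its own inverse

class ContractAuditor:
    """Toy stand-in for the Solidity auditor: records a digest per file,
    then verifies the stored ciphertext against it on challenge."""
    def __init__(self):
        self.records = {}

    def register(self, file_id: str, ciphertext: bytes) -> None:
        self.records[file_id] = hashlib.sha256(ciphertext).hexdigest()

    def audit(self, file_id: str, ciphertext: bytes) -> bool:
        return self.records.get(file_id) == hashlib.sha256(ciphertext).hexdigest()
```

Because only the ciphertext digest goes on chain, the auditor learns nothing about the plaintext, which mirrors how the scheme keeps secrecy and integrity checks separate.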
Currently, there is a growing trend among users to store their data in the cloud. However, the cloud is vulnerable to persistent data-corruption risks arising from equipment failures and hacker attacks. Additionally, when users perform file operations, the semantic integrity of the data can be compromised. Ensuring both data integrity and semantic correctness has therefore become a critical issue. We introduce a pioneering solution called Sec-Auditor, the first of its kind with the ability to verify data integrity and semantic correctness simultaneously while maintaining a communication cost that is constant, independent of the audited data volume. Sec-Auditor also supports public auditing, enabling anyone with access to the public information to conduct data audits. This feature makes Sec-Auditor highly adaptable to open data environments such as the cloud. In Sec-Auditor, users are assigned specific rules that are used to verify the correctness of data semantics, and users have the flexibility to update their own rules as needed. We conduct in-depth analyses of the correctness and security of Sec-Auditor, and we compare several important security attributes with existing schemes, demonstrating the superior properties of Sec-Auditor. Evaluation results show that even for time-consuming file-upload operations, our solution is more efficient than the scheme it is compared against.
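The two checks that Sec-Auditor performs at once — integrity of the stored bytes and semantic correctness under user-supplied rules — can be illustrated with a small toy. The real construction is cryptographic and publicly verifiable; this hash-based sketch, with rules modelled as plain predicates, only shows how the two checks compose and how users can swap rules at any time.

```python
import hashlib
import json

def digest(record: dict) -> str:
    # Canonical JSON serialisation so equal records hash equally.
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

class ToySemanticAuditor:
    """Toy model of the two Sec-Auditor checks: a stored digest for integrity,
    and per-user predicates for semantic correctness."""
    def __init__(self):
        self.rules = {}   # user -> list of predicates over a record
        self.tags = {}    # (user, record id) -> digest

    def set_rules(self, user, rules):
        self.rules[user] = rules          # users may update their rules at any time

    def store(self, user, rec_id, record):
        self.tags[(user, rec_id)] = digest(record)

    def audit(self, user, rec_id, record) -> bool:
        ok_integrity = self.tags.get((user, rec_id)) == digest(record)
        ok_semantics = all(rule(record) for rule in self.rules.get(user, []))
        return ok_integrity and ok_semantics
```

Note that a record can pass the integrity check (it is exactly what was stored) yet still fail the audit because it violates a semantic rule — which is the gap the paper argues integrity-only auditors leave open.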
Plant morphogenesis relies on precise gene expression programs executed at the proper time and position, orchestrated by transcription factors (TFs) in intricate, cell-type-specific regulatory networks. Here we introduce a comprehensive single-cell transcriptomic atlas of Arabidopsis seedlings. The atlas is the result of a meticulous integration of 63 previously published scRNA-seq datasets that addresses batch effects while conserving biological variance, and it spans a broad spectrum of tissues, including both below- and above-ground parts. Using a rigorous approach to cell type annotation, we identified 47 distinct cell types or states, greatly expanding the current view of plant cell compositions. We systematically constructed cell-type-specific gene regulatory networks and uncovered key regulators that act in a coordinated manner to control cell-type-specific gene expression. Taken together, our study not only offers an extensive plant cell atlas that serves as a valuable resource, but also provides molecular insights into gene regulatory programs that vary across cell types.
Integrated data and energy transfer (IDET) enables electromagnetic waves to deliver wireless energy at the same time as data for low-power devices. In this paper, an energy harvesting modulation (EHM) assisted multi-user IDET system is studied, in which all of the signals received at the users are exploited for energy harvesting without degrading wireless data transfer (WDT) performance. The joint IDET performance is then analysed theoretically by conceiving a practical time-dependent wireless channel. With the aid of an AO-based algorithm, the average effective data rate among users is maximized while the BER and wireless energy transfer (WET) performance are guaranteed. Simulation results validate and evaluate the IDET performance of the EHM-assisted system, and demonstrate that the number of user clusters and IDET time slots should be optimally allocated in order to improve the WET and WDT performance.
Microsoft Excel is essential for the End-User Approach (EUA), offering versatility in data organization, analysis, and visualization, as well as widespread accessibility. It fosters collaboration and informed decision-making across diverse domains. Conversely, Python is indispensable for professional programming due to its versatility, readability, extensive libraries, and robust community support. It enables efficient development, advanced data analysis, data mining, and automation, catering to diverse industries and applications. However, one primary issue when using Microsoft Excel with Python libraries is compatibility and interoperability. While Excel is a widely used tool for data storage and analysis, it may not seamlessly integrate with Python libraries, leading to challenges in reading and writing data, especially in complex or large datasets. Additionally, manipulating Excel files with Python may not always preserve formatting or formulas accurately, potentially affecting data integrity. Moreover, dependency on Excel's graphical user interface (GUI) for automation can limit scalability and reproducibility compared to Python's scripting capabilities. This paper covers an integration solution that empowers non-programmers to leverage Python's capabilities within the familiar Excel environment, enabling users to perform advanced data analysis and automation tasks without extensive programming knowledge. Based on feedback solicited from non-programmers who tested the integration solution, the case study shows how the solution is evaluated for ease of implementation, performance, and compatibility across Excel versions.
As one of the major threats to the current DeFi (Decentralized Finance) ecosystem, a reentrant attack induces data inconsistency in the victim smart contract, enabling attackers to steal on-chain assets from DeFi projects and severely undermining the confidence of blockchain investors. However, protecting DeFi projects from reentrant attacks is very difficult, since generating a call loop within the highly automated DeFi ecosystem is quite practicable. Existing research mainly focuses on detecting reentrant vulnerabilities during code testing, and no method can guarantee the absence of reentrant vulnerabilities. In this paper, we introduce a database-style lock mechanism to isolate the correlated smart contract states from other operations in the same contract, so that attackers are prevented from abusing an inconsistent smart contract state. Compared to the existing countermeasures of front-running, code audit, and modifier, our method guarantees protection with better flexibility. We further evaluate our method on a number of real-world reentrant attacks observed on Etherscan. The results prove that our method can efficiently prevent reentrant attacks at low running cost.
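The lock idea — refuse to touch correlated contract state while an earlier operation on it is still in flight — can be demonstrated outside Solidity. The `Vault` below is a hypothetical Python model, not the paper's contract: the external payout callback plays the role of the attacker-controlled call, and a simple flag plays the role of the lock on the correlated state.

```python
class ReentrancyError(RuntimeError):
    pass

class Vault:
    """Toy bank 'contract': a lock flag isolates the balance state, so a call
    loop that re-enters withdraw() while it is mid-update is rejected."""
    def __init__(self):
        self.balances = {}
        self._locked = False

    def deposit(self, who: str, amount: int) -> None:
        self.balances[who] = self.balances.get(who, 0) + amount

    def withdraw(self, who: str, amount: int, payout_callback) -> None:
        if self._locked:                    # reentrant call detected
            raise ReentrancyError("balance state is locked")
        self._locked = True
        try:
            if self.balances.get(who, 0) < amount:
                raise ValueError("insufficient funds")
            payout_callback(amount)         # external call happens under the lock
            self.balances[who] -= amount    # state update completes before unlock
        finally:
            self._locked = False
```

A malicious callback that calls `withdraw` again hits the lock and the whole transaction aborts with the balance untouched, which is the guarantee the paper's mechanism aims for.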
Data protection in databases is critical for any organization, as unauthorized access or manipulation can have severe negative consequences. Intrusion detection systems are essential for keeping databases secure. Advancements in technology will lead to significant changes in the medical field, improving healthcare services through real-time information sharing. However, reliability and consistency issues still need to be addressed. Safeguards against cyber-attacks are necessary due to the risk of unauthorized access to sensitive information and potential data corruption. Disruptions to data items can propagate throughout the database, making it crucial to reverse fraudulent transactions without delay, especially in the healthcare industry, where real-time data access is vital. This research presents a role-based access control architecture for an anomaly detection technique. Additionally, Structured Query Language (SQL) queries are stored in a new data structure called a Pentaplet. These pentaplets maintain the correlation between SQL statements within the same transaction by employing transaction-log entry information, thereby increasing detection accuracy, particularly for individuals within the company exhibiting unusual behavior. To identify anomalous queries, the system employs a supervised machine learning technique called the Support Vector Machine (SVM). According to experimental findings, the proposed model performed well in terms of detection accuracy, achieving 99.92% with SVM combined with One-Hot Encoding and Principal Component Analysis (PCA).
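A pentaplet is essentially a five-field feature record extracted per SQL statement and keyed by transaction. The abstract does not publish the exact five fields, so the ones below (command, tables, projected columns, predicate count, transaction id) are plausible stand-ins chosen for illustration, with a deliberately naive regex extractor in place of a real SQL parser.

```python
import re
from dataclasses import dataclass

@dataclass(frozen=True)
class Pentaplet:
    # Hypothetical field choice; the paper's actual five fields may differ.
    command: str
    tables: tuple
    columns: tuple
    predicates: int
    txn_id: str

def to_pentaplet(sql: str, txn_id: str) -> Pentaplet:
    """Naive feature extraction from one SQL statement (illustrative only)."""
    command = sql.strip().split()[0].upper()
    tables = tuple(re.findall(r"\bFROM\s+(\w+)", sql, re.I))
    m = re.search(r"\bSELECT\s+(.*?)\s+FROM\b", sql, re.I | re.S)
    columns = tuple(c.strip() for c in m.group(1).split(",")) if m else ()
    predicates = len(re.findall(r"\bWHERE\b|\bAND\b|\bOR\b", sql, re.I))
    return Pentaplet(command, tables, columns, predicates, txn_id)
```

Grouping pentaplets that share a `txn_id` is what lets the detector reason about whole transactions rather than isolated statements; the resulting feature vectors would then be one-hot encoded, reduced with PCA, and fed to the SVM.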
Nowadays, numerous applications are associated with the cloud, and user data is collected globally and stored in cloud units. In addition to shared data storage, cloud computing offers multiple advantages to users through different distribution designs: hybrid cloud, public cloud, community cloud, and private cloud. Though cloud-based computing solutions are highly convenient to users, they also bring a challenge, namely the security of the shared data. Hence, in the current research paper, blockchain with a data integrity authentication technique is developed for efficient and secure operation with user authentication. Blockchain technology is utilized in this study to enable efficient and secure operation, which not only strengthens cloud security but also avoids threats and attacks. Additionally, the data integrity authentication technique is utilized to limit unwanted access to data in the cloud storage unit. The major objective of the proposed technique is to strengthen data security and user authentication in cloud computing environments. To improve the proposed authentication process, a cuckoo filter and a Merkle Hash Tree (MHT) are utilized. The proposed methodology was validated using performance metrics such as processing time, uploading time, downloading time, authentication time, consensus time, waiting time, and initialization time, in addition to storage overhead. The proposed method was compared with conventional cloud security techniques, and the outcomes establish its superiority.
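The Merkle Hash Tree mentioned here is a standard construction, and its value for integrity authentication is easy to show: a single root hash commits to all data blocks, and a logarithmic-size proof authenticates any one block. The sketch below is a minimal textbook MHT (duplicating the last node on odd levels), not the paper's implementation.

```python
import hashlib

def _h(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def _pad(level):
    # Duplicate the last node when a level has an odd number of nodes.
    return level + [level[-1]] if len(level) % 2 else level

def merkle_root(leaves):
    level = [_h(leaf) for leaf in leaves]
    while len(level) > 1:
        level = _pad(level)
        level = [_h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    """Sibling hashes from leaf to root; each entry records whether
    our node was the right child of its pair."""
    level = [_h(leaf) for leaf in leaves]
    proof = []
    while len(level) > 1:
        level = _pad(level)
        sibling = index - 1 if index % 2 else index + 1
        proof.append((level[sibling], index % 2 == 1))
        level = [_h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def merkle_verify(leaf, proof, root) -> bool:
    node = _h(leaf)
    for sibling, node_was_right in proof:
        node = _h(sibling + node) if node_was_right else _h(node + sibling)
    return node == root
```

The cloud need only publish the root; a client holding one block and its proof can check integrity without downloading anything else, which is what keeps the authentication overhead small.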
Data integrity is a critical component of data lifecycle management, and its importance increases even more in a complex and dynamic landscape. Actions like unauthorized access, unauthorized modifications, data manipulation, audit tampering, data backdating, data falsification, phishing, and spoofing are no longer restricted to rogue individuals but are in fact also prevalent in systematic organizations and states. Therefore, data security requires strong data integrity measures and associated technical controls to be in place. Without a properly customized framework in place, organizations are exposed to a high risk of financial, reputational, and revenue losses, bankruptcies, and legal penalties, which we discuss further throughout this paper. We also explore some improvised and innovative techniques in product development to better tackle the challenges and requirements of data security and integrity.
Bioinformatic analysis of large and complex omics datasets has become increasingly useful in modern biology by providing great depth of information; its application to neuroscience is termed neuroinformatics. Data mining of omics datasets has enabled the generation of new hypotheses based on differentially regulated biological molecules associated with disease mechanisms, which can be tested experimentally for improved diagnostic and therapeutic targeting of neurodegenerative diseases. Importantly, integrating multi-omics data using a systems bioinformatics approach will advance the understanding of the layered and interactive network of biological regulation and facilitate the development of a comprehensive human brain profile. In this review, we first summarize data mining studies utilizing datasets from individual types of omics analysis, including epigenetics/epigenomics, transcriptomics, proteomics, metabolomics, lipidomics, and spatial omics, pertaining to Alzheimer's disease, Parkinson's disease, and multiple sclerosis. We then discuss multi-omics integration approaches, including independent biological integration and unsupervised integration methods, for more intuitive and informative interpretation of the biological data obtained across different omics layers. We further assess studies that integrate multi-omics in data mining, which provide convoluted biological insights and offer a proof-of-concept proposition for systems bioinformatics in the reconstruction of brain networks. Finally, we recommend combining high-dimensional bioinformatics analysis with experimental validation to achieve translational neuroscience applications, including biomarker discovery, therapeutic development, and the elucidation of disease mechanisms. We conclude by providing future perspectives and opportunities in applying integrative multi-omics and systems bioinformatics to achieve precision phenotyping of neurodegenerative diseases and move towards personalized medicine.
Integrated data and energy transfer (IDET) is capable of simultaneously delivering on-demand data and energy to low-power Internet of Everything (IoE) devices. We propose a multi-carrier IDET transceiver relying on superposition waveforms consisting of multi-sinusoidal signals for wireless energy transfer (WET) and orthogonal frequency-division multiplexing (OFDM) signals for wireless data transfer (WDT). The outdated channel state information (CSI) in aging channels is employed by the transmitter to shape the IDET waveforms. Under transmission-power and WDT constraints, the amplitudes and phases of the IDET waveform at the transmitter and the power splitter at the receiver are jointly optimised to maximise the average direct current (DC) over a limited number of transmission frames in the presence of carrier-frequency offset (CFO). For the amplitude optimisation, the original non-convex problem can be transformed into a reversed geometric programming problem, which can then be solved effectively with existing tools. For the phase optimisation, the artificial bee colony (ABC) algorithm is invoked to deal with the non-convexity. Iterating between the amplitude and phase optimisations yields our joint design. Numerical results demonstrate the advantage of the joint design for IDET waveform shaping in the presence of CFO and outdated CSI.
As the sixth-generation network (6G) emerges, the Internet of remote things (IoRT) has become a critical issue. However, conventional terrestrial networks cannot meet the delay-sensitive data collection needs of IoRT networks, and the space-air-ground integrated network (SAGIN) holds promise. We propose a novel setup that integrates non-orthogonal multiple access (NOMA) and wireless power transfer (WPT) to collect latency-sensitive data from IoRT networks. To extend the lifetime of devices, we aim to minimize the maximum energy consumption among all IoRT devices. Due to the coupling between variables, the resulting problem is non-convex. We first decouple the variables and split the original problem into four subproblems. Then, we propose an iterative algorithm that solves the subproblems based on successive convex approximation (SCA) techniques and slack variables. Finally, simulation results show that the NOMA strategy has a tremendous advantage over the OMA scheme in terms of network lifetime and energy efficiency, providing valuable insights.
Pipeline integrity is a cornerstone of the operation of many industrial systems, and maintaining pipeline integrity is essential for preventing the economic losses and ecological damage caused by oil and gas leaks. Based on integrity-management data published by the US Pipeline and Hazardous Materials Safety Administration, this study applied k-means clustering and data envelopment analysis (DEA) to explore the characteristics of pipeline-integrity management and to evaluate its efficiency. The k-means clustering algorithm was found to be scientifically valid for classifying pipeline companies as low-, medium-, or high-difficulty companies according to their integrity-management requirements. Regardless of a pipeline company's classification, equipment failure was found to be the main cause of pipeline failure. In-line inspection corrosion and dent tools were the two most-used tools for pipeline inspection. Among the types of repair, 180-day condition repairs were a key concern for pipeline companies. The results of the DEA analysis indicate that only three out of 34 companies were deemed DEA-effective. To improve the effectiveness of pipeline integrity management, we propose targeted directions and scales of improvement for the non-DEA-effective companies.
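The three-way low/medium/high classification is a direct use of k-means with k = 3. As a minimal sketch — assuming, purely for illustration, that each company is reduced to a single difficulty score rather than the study's real feature set — a 1-D k-means is a few lines of plain Python:

```python
def kmeans_1d(values, centroids, iters=20):
    """Plain 1-D k-means: assign each value to the nearest centroid,
    then recompute each centroid as the mean of its cluster."""
    clusters = [[] for _ in centroids]
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for v in values:
            nearest = min(range(len(centroids)), key=lambda i: abs(v - centroids[i]))
            clusters[nearest].append(v)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters
```

With k = 3 the returned clusters map naturally onto low-, medium-, and high-difficulty groups; the study's actual clustering runs on multi-dimensional integrity-management features, so this is a simplification of the method, not a reproduction of its results.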
Cloud computing and storage services allow clients to move their data centers and applications to centralized large data centers and thus avoid the burden of local data storage and maintenance. However, this poses new challenges in creating secure and reliable data storage over unreliable service providers. In this study, we address the problem of ensuring the integrity of data storage in cloud computing. In particular, we consider methods for reducing the burden of generating a constant amount of metadata at the client side. By exploiting some good attributes of the bilinear group, we devise a simple and efficient audit service for public verification of untrusted and outsourced storage, which can be important for achieving widespread deployment of cloud computing. Whereas many prior studies on ensuring remote data integrity did not consider the burden of generating verification metadata at the client side, the objective of this study is to resolve this issue. Moreover, our scheme also supports data dynamics and public verifiability. Extensive security and performance analysis shows that the proposed scheme is highly efficient and provably secure.
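The core trick in bilinear-group audit schemes is that per-block tags can be aggregated into a single constant-size proof for a randomly challenged subset of blocks. The toy below keeps that aggregate structure but drops the pairing: tags are `g^(m*x) mod p`, so verification needs the secret `x` (a private audit), whereas the pairing in the actual scheme is what makes verification public. All parameters here are illustrative.

```python
# Toy aggregate audit proof (symmetric-key simplification of a bilinear-group scheme).
P = 2**127 - 1      # a Mersenne prime; fine for a demo, not a vetted group
G = 3               # fixed base
X = 123456789       # auditor's secret key

def tag(block: int) -> int:
    # Per-block authenticator stored alongside the data at the server.
    return pow(G, block * X, P)

def prove(blocks, tags, coeffs):
    """Server side: given random challenge coefficients, return a
    constant-size proof (mu, sigma) regardless of how many blocks are challenged."""
    mu = sum(c * m for c, m in zip(coeffs, blocks))
    sigma = 1
    for t, c in zip(tags, coeffs):
        sigma = sigma * pow(t, c, P) % P
    return mu, sigma

def verify(mu: int, sigma: int) -> bool:
    # sigma should equal G^(X * sum(c_i * m_i)) = G^(X * mu) mod P.
    return sigma == pow(G, mu * X, P)
```

The point the abstract makes — a constant amount of verification metadata exchanged per audit — shows up here as `(mu, sigma)` being two numbers no matter how many blocks the challenge covers.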
In this note we consider some basic, yet unusual, issues pertaining to the accuracy and stability of numerical integration methods used to follow the solution of first-order and second-order initial value problems (IVP). Included are remarks on multiple solutions, multi-step methods, the effect of initial value perturbations, as well as slowing and advancing the computed motion in second-order problems.
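The stability issues the note discusses are easy to exhibit on the standard model problem y' = -λy, whose exact solution decays for any step size. Forward Euler, by contrast, is stable only when |1 - hλ| < 1, i.e. h < 2/λ; the sketch below (a generic illustration, not code from the note) shows both regimes.

```python
def euler(f, y0, h, steps):
    """Forward Euler for the autonomous ODE y' = f(y)."""
    y = y0
    for _ in range(steps):
        y = y + h * f(y)
    return y

# Model problem y' = -lam * y: each Euler step multiplies y by (1 - h*lam),
# so the iteration decays when |1 - h*lam| < 1 and blows up otherwise.
LAM = 10.0
decay = lambda y: -LAM * y
```

With λ = 10 the stability bound is h < 0.2: a step of 0.05 drives the solution to zero, while a step of 0.3 multiplies the error by |1 - 3| = 2 every step and diverges, even though the underlying ODE is as tame as they come.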
Terminal devices deployed in outdoor environments face a thorny power-supply problem. The data and energy integrated network (DEIN) is a promising technology to solve this problem by simultaneously transferring data and energy through radio-frequency signals. State-of-the-art research mostly focuses on theoretical aspects. By contrast, we provide a complete design and implementation of a fully functioning DEIN system supported by an unmanned aerial vehicle (UAV). The UAV can be dispatched to areas of interest to remotely recharge batteryless terminals while collecting essential information from them, and it then uploads the information to remote base stations. Our system verifies the feasibility of DEIN in practical applications.
Wireless Sensor Networks (WSNs) typically use in-network processing to reduce communication overhead. Because data items sourced at different nodes are fused into a single item during in-network processing, the sanctity of the aggregated data needs to be ensured. In particular, the integrity of the aggregated result is critical, since any malicious update to it can jeopardize not one but many sensor readings. In this paper, we analyse three approaches to providing integrity support for secure data aggregation (SDA) in WSNs. The first is the traditional MAC, in which each leaf node and intermediate node shares a (symmetric) key with its parent. The second is the aggregate MAC (AMAC), in which the base station shares a unique key with every other sensor node. The third is the homomorphic MAC (Homo MAC), a purely symmetric-key approach. These approaches exhibit diverse trade-offs between resource consumption and security assumptions. In addition, we propose a probabilistic, improved variant of the homomorphic MAC that improves the security strength of secure data aggregation in WSNs. We carry out simulations in the TinyOS environment to experimentally evaluate the impact of each approach on resource consumption in WSNs.
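What makes a homomorphic MAC attractive for aggregation is that tags combine the same way readings do, so intermediate nodes can add without holding any keys. The toy below shows that additive property with tags of the form k1·m + F_k2(id) over a prime field; it is a didactic construction to illustrate the mechanism, not the paper's scheme and not a security proof.

```python
import hashlib

P = 2**61 - 1   # prime modulus for the toy field

def prf(key: bytes, node_id: str) -> int:
    # Pseudorandom per-node masking term derived from the key.
    return int.from_bytes(hashlib.sha256(key + node_id.encode()).digest(), "big") % P

def homo_mac(k1: int, k2: bytes, node_id: str, reading: int) -> int:
    # tag = k1*m + F_{k2}(id): tags add up exactly as the readings do.
    return (k1 * reading + prf(k2, node_id)) % P

def aggregate(tags) -> int:
    # An intermediate node needs no key to combine tags.
    return sum(tags) % P

def verify(k1: int, k2: bytes, node_ids, agg_reading: int, agg_tag: int) -> bool:
    expected = (k1 * agg_reading + sum(prf(k2, i) for i in node_ids)) % P
    return agg_tag == expected
```

The base station, which knows (k1, k2) and the set of contributing node ids, checks one tag against one aggregate value, which is exactly the communication saving SDA is after.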
With the rapid development of information technology, IoT devices play a huge role in physiological health data detection. The exponential growth of medical data requires us to allocate storage space for cloud servers and edge nodes sensibly. The storage capacity of the edge nodes close to users is limited, so hotspot data should be stored in edge nodes as much as possible to ensure response timeliness and a high access hit rate. However, current schemes cannot guarantee that every sub-message of a complete data item stored by an edge node meets the requirements for hot data, and completing the detection and deletion of redundant data in edge nodes while protecting user privacy and dynamic data integrity has become a challenging problem. Our paper proposes a redundant data detection method that meets privacy-protection requirements: by scanning the ciphertext, it determines whether each sub-message of the data in the edge node meets the hot-data requirements. It has the same effect as a zero-knowledge proof and does not reveal user privacy. In addition, for redundant sub-data that does not meet the hot-data requirements, our paper proposes a redundant data deletion scheme that preserves dynamic data integrity. We use a Content Extraction Signature (CES) to generate the signature of the remaining hot data after the redundant data is deleted. The feasibility of the scheme is demonstrated through security and efficiency analyses.
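The Content Extraction Signature idea is that each sub-message carries its own position-bound tag, so deleting redundant sub-messages leaves the surviving ones still verifiable. The sketch below models that with per-chunk HMAC tags as a symmetric stand-in for the owner's real (public-key) extraction signature; the key name and API are illustrative.

```python
import hashlib
import hmac

KEY = b"owner-signing-key"   # toy symmetric stand-in for the owner's signing key

def sign_submessages(doc_id: str, parts):
    """One tag per sub-message, bound to the document id and the position,
    so tags cannot be reordered or moved between documents."""
    return {i: hmac.new(KEY, f"{doc_id}|{i}|".encode() + part, hashlib.sha256).hexdigest()
            for i, part in enumerate(parts)}

def extract(parts, tags, keep):
    # Delete redundant (non-hot) sub-messages; keep only the surviving tags.
    return {i: parts[i] for i in keep}, {i: tags[i] for i in keep}

def verify(doc_id: str, kept_parts, kept_tags) -> bool:
    return all(hmac.compare_digest(
                   kept_tags[i],
                   hmac.new(KEY, f"{doc_id}|{i}|".encode() + part,
                            hashlib.sha256).hexdigest())
               for i, part in kept_parts.items())
```

After the edge node drops the cold sub-messages, the remaining hot data verifies without any re-signing by the owner, which is the dynamic-integrity property the deletion scheme needs.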
Genome-wide association studies (GWAS) based on Big Data are a potential approach to improving marker-assisted selection in plant breeding. The number of available phenotypic and genomic data sets in which medium-sized populations of several hundred individuals have been studied is rapidly increasing. Combining these data and using them in GWAS could increase both the power of QTL discovery and the accuracy of estimation of the underlying genetic effects, but is hindered by data heterogeneity and a lack of interoperability. In this study, we used genomic and phenotypic data sets focusing on Central European winter wheat populations evaluated for heading date. We explored strategies for integrating these data and, subsequently, the resulting potential for GWAS. Establishing interoperability between data sets was greatly aided by some overlapping genotypes and a linear relationship between the different phenotyping protocols, resulting in high-quality integrated phenotypic data. In this context, genomic prediction proved to be a suitable tool for studying the relevance of interactions between genotypes and experimental series, which was low in our case. Contrary to expectations, fewer marker-trait associations were found in the larger combined data than in the individual experimental series. However, the predictive power based on the marker-trait associations of the integrated data set was higher across data sets. The results therefore show that the integration of medium-sized data sets into Big Data is an approach to increasing the power to detect QTL in GWAS, and they encourage further efforts to standardize and share data in the plant breeding community.
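Exploiting a "linear relationship between the different phenotyping protocols" via the overlapping genotypes amounts to fitting a calibration line on the shared entries and mapping one protocol's scores onto the other's scale. The sketch below does that with ordinary least squares in plain Python; the protocol names and toy numbers are invented for illustration, not taken from the study.

```python
def fit_line(x, y):
    """Ordinary least squares for y = a*x + b on paired observations,
    e.g. heading-date scores of overlapping genotypes under two protocols."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    b = my - a * mx
    return a, b

def calibrate(scores, a, b):
    # Map protocol-A scores onto protocol-B's scale before pooling data sets.
    return [a * s + b for s in scores]
```

Once all series are on a common scale, the pooled phenotypes can be fed to a single GWAS; the overlapping genotypes are what anchor the fit, so the quality of the calibration depends on how many of them the series share.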
Soursop (Annona muricata L.) is a tropical fruit highly valued for its unique flavor, nutritional value, and health-promoting properties. The ripening process of soursop involves complex changes in gene expression and metabolite accumulation, which have been studied using various omics technologies. Transcriptome analysis has provided insights into the regulation of key genes involved in ripening, while metabolic compound analysis has revealed numerous bioactive compounds with potential health benefits. However, the integration of transcriptome and metabolite data has not been extensively explored in soursop. Therefore, in this paper, we present a comprehensive analysis of the transcriptome and phenolic compound profiles of soursop during ripening. The integration analysis showed that the genes and phenolic compounds were mainly involved in the starch and sucrose metabolism pathways during soursop ripening. Furthermore, the phenolic compounds kaempferol 3-O-galactoside, procyanidin C1, procyanidin trimer C1, and m-coumaric acid, as well as the genes Ubiquitin-like protein 5 (UBL5_ARATH), ATP-dependent zinc metalloprotease FTSH8 (FTSH8_ORYSJ), Zinc transporter 4 (ZIP4_ARATH), Thioredoxin-like 3-1 (TRL31_ORYSJ), Mitogen-activated protein kinase YODA (YODA_ARATH), R-mandelonitrile lyase-like (MGL_ARATH), 26S protease regulatory subunit 6A homolog (PRS6_SOLLC), Cytochrome P450 72A13 (C7A13_ARATH), Cytochrome P450 84A1 (C84A1_ARATH), and Homoserine O-trans-acetylase (MET2_ORYSJ), were correlated and differentially accumulated and expressed, respectively. Our study provides new insights into the molecular mechanisms underlying soursop ripening and may contribute to the development of strategies for improving the nutritional quality and shelf life of this important fruit.
Funding: This research was supported by the Qinghai Provincial High-End Innovative and Entrepreneurial Talents Project.
Funding: Supported by the National Natural Science Foundation of China (No. 32070656); the Nanjing University Deng Feng Scholars Program; the Priority Academic Program Development (PAPD) of Jiangsu Higher Education Institutions; the China Postdoctoral Science Foundation funded project (No. 2022M711563); and the Jiangsu Funding Program for Excellent Postdoctoral Talent (No. 2022ZB50).
Abstract: Plant morphogenesis relies on precise gene expression programs at the proper time and position, orchestrated by transcription factors (TFs) acting in intricate regulatory networks in a cell-type-specific manner. Here we introduce a comprehensive single-cell transcriptomic atlas of Arabidopsis seedlings. This atlas is the result of meticulous integration of 63 previously published scRNA-seq datasets, addressing batch effects while conserving biological variance. The integration spans a broad spectrum of tissues, including both below- and above-ground parts. Utilizing a rigorous approach for cell-type annotation, we identified 47 distinct cell types or states, greatly expanding the current view of plant cell compositions. We systematically constructed cell-type-specific gene regulatory networks and uncovered key regulators that act in a coordinated manner to control cell-type-specific gene expression. Taken together, our study not only offers an extensive plant cell atlas that serves as a valuable resource, but also provides molecular insights into the gene-regulatory programs that vary across cell types.
Funding: Supported in part by the MOST Major Research and Development Project (Grant No. 2021YFB2900204); the National Natural Science Foundation of China (NSFC) (Grant Nos. 62201123, 62132004, 61971102); the China Postdoctoral Science Foundation (Grant No. 2022TQ0056); the Sichuan Science and Technology Program (Grant No. 2022YFH0022); the Sichuan Major R&D Project (Grant No. 22QYCX0168); and the Municipal Government of Quzhou (Grant No. 2022D031).
Abstract: Integrated data and energy transfer (IDET) enables electromagnetic waves to transmit wireless energy at the same time as data delivery for low-power devices. In this paper, an energy harvesting modulation (EHM) assisted multi-user IDET system is studied, where all the received signals at the users are exploited for energy harvesting without degrading wireless data transfer (WDT) performance. The joint IDET performance is then analysed theoretically by conceiving a practical time-dependent wireless channel. With the aid of the AO-based algorithm, the average effective data rate among users is maximized while ensuring the BER and the wireless energy transfer (WET) performance. Simulation results validate and evaluate the IDET performance of the EHM-assisted system, and demonstrate that the optimal number of user clusters and IDET time slots should be allocated in order to improve the WET and WDT performance.
Abstract: Microsoft Excel is essential for the End-User Approach (EUA), offering versatility in data organization, analysis, and visualization, as well as widespread accessibility. It fosters collaboration and informed decision-making across diverse domains. Conversely, Python is indispensable for professional programming due to its versatility, readability, extensive libraries, and robust community support. It enables efficient development, advanced data analysis, data mining, and automation, catering to diverse industries and applications. However, one primary issue when using Microsoft Excel with Python libraries is compatibility and interoperability. While Excel is a widely used tool for data storage and analysis, it may not seamlessly integrate with Python libraries, leading to challenges in reading and writing data, especially in complex or large datasets. Additionally, manipulating Excel files with Python may not always preserve formatting or formulas accurately, potentially affecting data integrity. Moreover, dependency on Excel's graphical user interface (GUI) for automation can limit scalability and reproducibility compared to Python's scripting capabilities. This paper covers an integration solution that empowers non-programmers to leverage Python's capabilities within the familiar Excel environment. This enables users to perform advanced data analysis and automation tasks without requiring extensive programming knowledge. Based on feedback solicited from non-programmers who tested the integration solution, the case study evaluates the ease of implementation, performance, and compatibility of Python with various Excel versions.
Funding: Supported by the National Key Research and Development Plan (Grant No. 2018YFB1800701); the Key-Area Research and Development Program of Guangdong Province (2020B0101090003); the CCF-NSFOCUS Kunpeng Scientific Research Fund (CCF-NSFOCUS 2021010); the National Natural Science Foundation of China (Grant Nos. 61902083, 62172115, 61976064); the Guangdong Higher Education Innovation Group (2020KCXTD007) and Guangzhou Higher Education Innovation Group (No. 202032854); and the Guangzhou Fundamental Research Plan of "Municipal-school" Jointly Funded Projects (No. 202102010445).
Abstract: As one of the major threats to the current DeFi (Decentralized Finance) ecosystem, the reentrant attack induces data inconsistency in the victim smart contract, enabling attackers to steal on-chain assets from DeFi projects, which can severely harm the confidence of blockchain investors. However, protecting DeFi projects from reentrant attacks is very difficult, since generating a call loop within the highly automated DeFi ecosystem is very practicable. Existing research mainly focuses on detecting reentrant vulnerabilities during code testing, and no method can guarantee the non-existence of reentrant vulnerabilities. In this paper, we introduce a database lock mechanism to isolate the correlated smart contract states from other operations in the same contract, so that attackers are prevented from abusing inconsistent smart contract state. Compared to the existing resolutions of front-running, code audit, and modifier, our method guarantees protection results with better flexibility. We further evaluate our method on a number of de facto reentrant attacks observed from Etherscan. The results prove that our method can efficiently prevent reentrant attacks with lower running cost.
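A small Python simulation (hypothetical `Vault`/`withdraw` names, not Solidity and not the paper's implementation) illustrates the lock idea: the correlated state is locked for the duration of the external call, so a re-entering invocation is rejected instead of observing stale state:

```python
class ReentrancyError(Exception):
    pass

class Vault:
    """Toy contract: a per-user lock guards the balance across the external call."""
    def __init__(self):
        self.balances = {}
        self._locked = set()

    def withdraw(self, user, amount, external_call):
        if user in self._locked:          # state is locked: reject the re-entering call
            raise ReentrancyError(user)
        self._locked.add(user)
        try:
            if self.balances.get(user, 0) >= amount:
                external_call(amount)     # untrusted callback runs while state is locked
                self.balances[user] -= amount
        finally:
            self._locked.discard(user)

vault = Vault()
vault.balances["attacker"] = 100
calls = []

def malicious_callback(amount):
    calls.append(amount)
    try:
        vault.withdraw("attacker", amount, malicious_callback)  # attempt to re-enter
    except ReentrancyError:
        pass  # the lock blocks the nested withdrawal

vault.withdraw("attacker", 100, malicious_callback)
assert vault.balances["attacker"] == 0  # withdrawn exactly once, not drained repeatedly
assert len(calls) == 1                  # the nested call never reached the balance update
```

Without the lock, the nested call would see the still-unreduced balance and withdraw again, which is exactly the inconsistency a reentrant attack exploits.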
Funding: The authors are thankful to the Dean of Scientific Research at Najran University for funding this work under the Research Groups Funding Program, Grant Code (NU/RG/SERC/12/6).
Abstract: Data protection in databases is critical for any organization, as unauthorized access or manipulation can have severe negative consequences. Intrusion detection systems are essential for keeping databases secure. Advancements in technology will lead to significant changes in the medical field, improving healthcare services through real-time information sharing. However, reliability and consistency still need to be addressed. Safeguards against cyber-attacks are necessary due to the risk of unauthorized access to sensitive information and potential data corruption. Disruptions to data items can propagate throughout the database, making it crucial to reverse fraudulent transactions without delay, especially in the healthcare industry, where real-time data access is vital. This research presents a role-based access control architecture for an anomaly detection technique. Additionally, Structured Query Language (SQL) queries are stored in a new data structure called a Pentaplet. These pentaplets allow us to maintain the correlation between SQL statements within the same transaction by employing transaction-log entry information, thereby increasing detection accuracy, particularly for individuals within the company exhibiting unusual behavior. To identify anomalous queries, the system employs a supervised machine learning technique called the Support Vector Machine (SVM). According to experimental findings, the proposed model performed well in terms of detection accuracy, achieving 99.92% through SVM with One-Hot Encoding and Principal Component Analysis (PCA).
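As a rough illustration of the pentaplet idea (the five fields below are assumptions for the sketch, since the abstract does not enumerate them), SQL statements can be parsed into a fixed five-field record keyed by a transaction-log id, so that statements of the same transaction stay correlated before being fed to a classifier:

```python
import re
from dataclasses import dataclass

@dataclass(frozen=True)
class Pentaplet:
    """Five features per SQL statement; the exact fields here are an assumption."""
    txn_id: int         # transaction-log entry id, linking statements of one transaction
    command: str        # SELECT / UPDATE / INSERT / DELETE
    tables: tuple       # tables the statement touches
    select_list: tuple  # projected columns, if any
    has_where: bool     # whether the statement is filtered

def to_pentaplet(txn_id: int, sql: str) -> Pentaplet:
    text = sql.strip().rstrip(";")
    command = text.split()[0].upper()
    tables = tuple(re.findall(r"(?:FROM|UPDATE|INTO)\s+(\w+)", text, re.IGNORECASE))
    select_list = tuple(re.findall(r"SELECT\s+(.+?)\s+FROM", text, re.IGNORECASE))
    has_where = "WHERE" in text.upper()
    return Pentaplet(txn_id, command, tables, select_list, has_where)

q1 = to_pentaplet(7, "SELECT name, ssn FROM patients WHERE id = 3;")
q2 = to_pentaplet(7, "UPDATE patients SET phone = '555' WHERE id = 3;")
same_txn = [p for p in (q1, q2) if p.txn_id == 7]  # correlation within one transaction

assert q1.command == "SELECT" and q1.tables == ("patients",) and q1.has_where
assert q2.command == "UPDATE" and len(same_txn) == 2
```

Records of this shape can then be one-hot encoded and reduced with PCA before training an SVM, as the abstract describes.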
Abstract: Nowadays, numerous applications are associated with the cloud, and user data gets collected globally and stored in cloud units. In addition to shared data storage, the cloud computing technique offers multiple advantages for the user through different distribution designs like hybrid cloud, public cloud, community cloud, and private cloud. Though cloud-based computing solutions are highly convenient to users, they also bring a challenge, i.e., security of the shared data. Hence, in the current research paper, blockchain with a data integrity authentication technique is developed for efficient and secure operation with a user authentication process. Blockchain technology is utilized in this study to enable efficient and secure operation, which not only empowers cloud security but also avoids threats and attacks. Additionally, the data integrity authentication technique is utilized to limit unwanted access to data in the cloud storage unit. The major objective of the projected technique is to empower data security and user authentication in cloud computing environments. To improve the proposed authentication process, a cuckoo filter and Merkle Hash Tree (MHT) are utilized. The proposed methodology was validated using a few performance metrics such as processing time, uploading time, downloading time, authentication time, consensus time, waiting time, and initialization time, in addition to storage overhead. The proposed method was compared with conventional cloud security techniques, and the outcomes establish the supremacy of the proposed method.
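The MHT component can be illustrated with a minimal Merkle Hash Tree sketch (hypothetical helper names, not the paper's implementation): the verifier keeps only the root, and any block can later be checked against it with a logarithmic-size sibling path:

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def build_levels(blocks):
    """Bottom-up hash levels; an odd last node is paired with a copy of itself."""
    level = [h(b) for b in blocks]
    levels = [level]
    while len(level) > 1:
        if len(level) % 2:
            level = level + [level[-1]]
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        levels.append(level)
    return levels

def proof(levels, index):
    """Sibling path for block `index`: (sibling hash, node-was-right flag) pairs."""
    path = []
    for level in levels[:-1]:
        if len(level) % 2:
            level = level + [level[-1]]
        path.append((level[index ^ 1], index % 2))
        index //= 2
    return path

def verify(root, block, path):
    node = h(block)
    for sibling, node_is_right in path:
        node = h(sibling + node) if node_is_right else h(node + sibling)
    return node == root

blocks = [b"block-%d" % i for i in range(5)]
levels = build_levels(blocks)
root = levels[-1][0]  # the root is all the verifier needs to keep

assert all(verify(root, blocks[i], proof(levels, i)) for i in range(5))
assert not verify(root, b"forged block", proof(levels, 0))
```

Because only the constant-size root is stored by the verifier, updates and audits touch O(log n) hashes rather than the whole data set.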
Abstract: Data integrity is a critical component of data lifecycle management. Its importance increases even more in a complex and dynamic landscape. Actions like unauthorized access, unauthorized modifications, data manipulation, audit tampering, data backdating, data falsification, phishing, and spoofing are no longer restricted to rogue individuals but are in fact also prevalent in systematic organizations and states. Therefore, data security requires strong data integrity measures and associated technical controls in place. Without a proper customized framework in place, organizations are prone to high risk of financial, reputational, and revenue losses, bankruptcies, and legal penalties, which we shall discuss further throughout this paper. We will also explore some of the improvised and innovative techniques in product development to better tackle the challenges and requirements of data security and integrity.
Funding: Supported by a Lee Kong Chian School of Medicine Dean's Postdoctoral Fellowship (021207-00001) from Nanyang Technological University (NTU) Singapore and a Mistletoe Research Fellowship (022522-00001) from the Momental Foundation USA. Jialiu Zeng is supported by a Presidential Postdoctoral Fellowship (021229-00001) from NTU Singapore and an Open Fund Young Investigator Research Grant (OF-YIRG) (MOH-001147) from the National Medical Research Council (NMRC) Singapore. Su Bin Lim is supported by the National Research Foundation (NRF) of Korea (Grant Nos.: 2020R1A6A1A03043539, 2020M3A9D8037604, 2022R1C1C1004756) and a grant of the Korea Health Technology R&D Project through the Korea Health Industry Development Institute (KHIDI), funded by the Ministry of Health & Welfare, Republic of Korea (Grant No.: HR22C1734).
Abstract: Bioinformatic analysis of large and complex omics datasets has become increasingly useful in modern-day biology by providing a great depth of information, with its application to neuroscience termed neuroinformatics. Data mining of omics datasets has enabled the generation of new hypotheses based on differentially regulated biological molecules associated with disease mechanisms, which can be tested experimentally for improved diagnostic and therapeutic targeting of neurodegenerative diseases. Importantly, integrating multi-omics data using a systems bioinformatics approach will advance the understanding of the layered and interactive network of biological regulation that exchanges systemic knowledge to facilitate the development of a comprehensive human brain profile. In this review, we first summarize data mining studies utilizing datasets from individual types of omics analysis, including epigenetics/epigenomics, transcriptomics, proteomics, metabolomics, lipidomics, and spatial omics, pertaining to Alzheimer's disease, Parkinson's disease, and multiple sclerosis. We then discuss multi-omics integration approaches, including independent biological integration and unsupervised integration methods, for more intuitive and informative interpretation of the biological data obtained across different omics layers. We further assess studies that integrate multi-omics in data mining, which provide convoluted biological insights and offer a proof-of-concept proposition towards systems bioinformatics in the reconstruction of brain networks. Finally, we recommend a combination of high-dimensional bioinformatics analysis with experimental validation to achieve translational neuroscience applications, including biomarker discovery, therapeutic development, and elucidation of disease mechanisms. We conclude by providing future perspectives and opportunities in applying integrative multi-omics and systems bioinformatics to achieve precision phenotyping of neurodegenerative diseases and towards personalized medicine.
Funding: Financial support of the Natural Science Foundation of China (Nos. 61971102, 62132004); the MOST Major Research and Development Project (No. 2021YFB2900204); the Sichuan Science and Technology Program (No. 2022YFH0022); and the Key Research and Development Program of Zhejiang Province (No. 2022C01093).
Abstract: Integrated data and energy transfer (IDET) is capable of simultaneously delivering on-demand data and energy to low-power Internet of Everything (IoE) devices. We propose a multi-carrier IDET transceiver relying on superposition waveforms consisting of multi-sinusoidal signals for wireless energy transfer (WET) and orthogonal-frequency-division-multiplexing (OFDM) signals for wireless data transfer (WDT). The outdated channel state information (CSI) in aging channels is employed by the transmitter to shape IDET waveforms. Under the constraints of transmission power and the WDT requirement, the amplitudes and phases of the IDET waveform at the transmitter and the power splitter at the receiver are jointly optimised to maximise the average direct current (DC) over a limited number of transmission frames in the presence of carrier-frequency offset (CFO). For the amplitude optimisation, the original non-convex problem can be transformed into a reversed geometric programming problem, which can then be effectively solved with existing tools. As for the phase optimisation, the artificial bee colony (ABC) algorithm is invoked in order to deal with the non-convexity. Iterating between the amplitude optimisation and the phase optimisation yields our joint design. Numerical results demonstrate the advantage of our joint design for IDET waveform shaping in the presence of CFO and outdated CSI.
Funding: Supported by the National Natural Science Foundation of China (No. 62171158); the project "The Major Key Project of PCL (PCL2021A03-1)" from Peng Cheng Laboratory; and the Research Fund Program of Guangdong Key Laboratory of Aerospace Communication and Networking Technology (2018B030322004).
Abstract: As the sixth-generation network (6G) emerges, the Internet of remote things (IoRT) has become a critical issue. However, conventional terrestrial networks cannot meet the delay-sensitive data collection needs of IoRT networks, and the space-air-ground integrated network (SAGIN) holds promise. We propose a novel setup that integrates non-orthogonal multiple access (NOMA) and wireless power transfer (WPT) to collect latency-sensitive data from IoRT networks. To extend the lifetime of devices, we aim to minimize the maximum energy consumption among all IoRT devices. Due to the coupling between variables, the resulting problem is non-convex. We first decouple the variables and split the original problem into four subproblems. Then, we propose an iterative algorithm to solve the corresponding subproblems based on successive convex approximation (SCA) techniques and slack variables. Finally, simulation results show that the NOMA strategy has a tremendous advantage over the OMA scheme in terms of network lifetime and energy efficiency, providing valuable insights.
Funding: Funded by the National Natural Science Foundation of China (Grant No. 71871018).
Abstract: Pipeline integrity is a cornerstone of the operation of many industrial systems, and maintaining pipeline integrity is essential for preventing economic losses and ecological damage caused by oil and gas leaks. Based on integrity-management data published by the US Pipeline and Hazardous Materials Safety Administration, this study applied the k-means clustering and data envelopment analysis (DEA) methods to explore the characteristics of pipeline-integrity management and evaluate its efficiency. The k-means clustering algorithm was found to be scientifically valid for classifying pipeline companies as low-, medium-, or high-difficulty companies according to their integrity-management requirements. Regardless of a pipeline company's classification, equipment failure was found to be the main cause of pipeline failure. In-line inspection corrosion and dent tools were the two most-used tools for pipeline inspection. Among the types of repair, 180-day condition repairs were a key concern for pipeline companies. The results of the DEA analysis indicate that only three out of 34 companies were deemed DEA-effective. To improve the effectiveness of pipeline integrity management, we propose targeted directions and scales of improvement for non-DEA-effective companies.
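The low/medium/high classification step can be sketched with a tiny one-dimensional k-means; the workload scores and the three initial centers below are hypothetical, and real usage would cluster on multiple integrity-management features rather than a single score:

```python
def kmeans_1d(values, centers, iters=100):
    """Plain 1-D k-means; `centers` is the initial guess, one per difficulty class."""
    centers = list(centers)
    clusters = [[] for _ in centers]
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for v in values:
            nearest = min(range(len(centers)), key=lambda i: abs(v - centers[i]))
            clusters[nearest].append(v)
        new = [sum(c) / len(c) if c else centers[i] for i, c in enumerate(clusters)]
        if new == centers:  # converged: assignments no longer move the centroids
            break
        centers = new
    return centers, clusters

# Hypothetical integrity-management workload scores for nine companies.
scores = [2, 3, 4, 20, 22, 25, 70, 75, 80]
centers, clusters = kmeans_1d(scores, centers=[0, 30, 100])

assert sorted(clusters[0]) == [2, 3, 4]      # low-difficulty group
assert sorted(clusters[1]) == [20, 22, 25]   # medium-difficulty group
assert sorted(clusters[2]) == [70, 75, 80]   # high-difficulty group
```

Each cluster's companies could then be passed to a DEA model to score their relative efficiency within the class.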
Funding: The National Natural Science Foundation of China; the National Basic Research Program of China ("973" Program); and the National High Technology Research and Development Program of China ("863" Program).
Abstract: Cloud computing and storage services allow clients to move their data centers and applications to centralized large data centers and thus avoid the burden of local data storage and maintenance. However, this poses new challenges related to creating secure and reliable data storage over unreliable service providers. In this study, we address the problem of ensuring the integrity of data storage in cloud computing. In particular, we consider methods for reducing the burden of generating a constant amount of metadata at the client side. By exploiting some good attributes of the bilinear group, we devise a simple and efficient audit service for public verification of untrusted and outsourced storage, which can be important for achieving widespread deployment of cloud computing. Whereas many prior studies on ensuring remote data integrity did not consider the burden of generating verification metadata at the client side, the objective of this study is to resolve this issue. Moreover, our scheme also supports data dynamics and public verifiability. Extensive security and performance analysis shows that the proposed scheme is highly efficient and provably secure.
Abstract: In this note we consider some basic, yet unusual, issues pertaining to the accuracy and stability of numerical integration methods used to follow the solution of first-order and second-order initial value problems (IVPs). Included are remarks on multiple solutions, multi-step methods, the effect of initial value perturbations, as well as slowing and advancing the computed motion in second-order problems.
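The accuracy and perturbation themes can be made concrete with a short comparison of forward Euler and the classical Runge-Kutta (RK4) method on the first-order IVP y' = -y, y(0) = 1, whose exact solution at t = 1 is e^{-1}; the perturbation check below exploits the linearity of this particular equation (a sketch, not taken from the note itself):

```python
import math

def euler(f, y0, t0, t1, n):
    """Forward Euler: first-order accurate, one slope evaluation per step."""
    h = (t1 - t0) / n
    y, t = y0, t0
    for _ in range(n):
        y += h * f(t, y)
        t += h
    return y

def rk4(f, y0, t0, t1, n):
    """Classical Runge-Kutta: fourth-order accurate, four slope evaluations per step."""
    h = (t1 - t0) / n
    y, t = y0, t0
    for _ in range(n):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h * k1 / 2)
        k3 = f(t + h / 2, y + h * k2 / 2)
        k4 = f(t + h, y + h * k3)
        y += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        t += h
    return y

f = lambda t, y: -y
exact = math.exp(-1)
err_euler = abs(euler(f, 1.0, 0.0, 1.0, 100) - exact)
err_rk4 = abs(rk4(f, 1.0, 0.0, 1.0, 100) - exact)
assert err_rk4 < err_euler  # the fourth-order method is far more accurate at equal n

# For this linear IVP, Euler multiplies y by (1 - h) each step, so a perturbed
# initial value simply scales the computed solution by the same factor.
assert abs(euler(f, 1.001, 0.0, 1.0, 100) - 1.001 * euler(f, 1.0, 0.0, 1.0, 100)) < 1e-12
```

For nonlinear problems the effect of an initial perturbation is not a simple scaling, which is one of the subtleties such notes examine.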
Funding: Partly funded by the Natural Science Foundation of China (Nos. 61971102 and 62132004); the Sichuan Science and Technology Program (No. 22QYCX0168); and the Municipal Government of Quzhou (Grant No. 2021D003).
Abstract: Terminal devices deployed in outdoor environments face a thorny power supply problem. The data and energy integrated network (DEIN) is a promising technology to solve this problem by simultaneously transferring data and energy through radio frequency signals. State-of-the-art research mostly focuses on theoretical aspects. By contrast, we provide a complete design and implementation of a fully functioning DEIN system with the support of an unmanned aerial vehicle (UAV). The UAV can be dispatched to areas of interest to remotely recharge batteryless terminals while collecting essential information from them. The UAV then uploads the information to remote base stations. Our system verifies the feasibility of the DEIN in practical applications.
Abstract: Wireless sensor networks (WSNs) typically use in-network processing to reduce the communication overhead. Because data items sourced at different nodes are fused into a single one during in-network processing, the sanctity of the aggregated data needs to be ensured. In particular, the data integrity of the aggregated result is critical, as any malicious update to it can jeopardize not one, but many sensor readings. In this paper, we analyse three different approaches to providing integrity support for secure data aggregation (SDA) in WSNs. The first is the traditional MAC, in which each leaf node and intermediate node shares a (symmetric) key with its parent. The second is the aggregate MAC (AMAC), in which the base station shares a unique key with all the other sensor nodes. The third is the homomorphic MAC (Homo MAC), which is a purely symmetric-key approach. These approaches exhibit diverse trade-offs in resource consumption and security assumptions. In addition, we propose a probabilistic and improved variant of the homomorphic MAC that improves the security strength of secure data aggregation in WSNs. We carry out simulations in the TinyOS environment to experimentally evaluate the impact of each of these approaches on resource consumption in WSNs.
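A sketch in the spirit of a homomorphic MAC (an illustrative construction, not the exact scheme analysed in the paper) shows why symmetric tags of the form t_i = a*m_i + F_k(i) mod p can still be verified after in-network summation:

```python
import hashlib
import hmac

P = (1 << 61) - 1  # a Mersenne prime; all arithmetic is mod P

def prf(key: bytes, node_id: int) -> int:
    """Pseudorandom per-node pad derived from the shared symmetric key."""
    digest = hmac.new(key, str(node_id).encode(), hashlib.sha256).digest()
    return int.from_bytes(digest, "big") % P

def make_tag(a: int, key: bytes, node_id: int, reading: int) -> int:
    # t_i = a * m_i + F_k(i)  (mod P): tags of this form add up homomorphically
    return (a * reading + prf(key, node_id)) % P

def verify(a: int, key: bytes, node_ids, agg_reading: int, agg_tag: int) -> bool:
    # sum(t_i) = a * sum(m_i) + sum(F_k(i)), so one check covers all readings
    expected = (a * agg_reading + sum(prf(key, i) for i in node_ids)) % P
    return expected == agg_tag

a, key = 123456789, b"shared-symmetric-key"
readings = {1: 20, 2: 23, 3: 19}  # per-sensor measurements
tags = [make_tag(a, key, i, m) for i, m in readings.items()]

# In-network aggregation: only the sums of readings and of tags travel upstream.
agg_reading = sum(readings.values()) % P
agg_tag = sum(tags) % P

assert verify(a, key, list(readings), agg_reading, agg_tag)
assert not verify(a, key, list(readings), (agg_reading + 1) % P, agg_tag)
```

The aggregator never needs the key: it only adds tags, which is what keeps the per-hop communication and computation overhead low.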
Funding: Sponsored by the National Natural Science Foundation of China (Grant Nos. 62172353, 62302114, U20B2046, and 62172115); the Innovation Fund Program of the Engineering Research Center for Integration and Application of Digital Learning Technology of the Ministry of Education (Nos. 1331007 and 1311022); the Natural Science Foundation of the Jiangsu Higher Education Institutions (Grant No. 17KJB520044); and the Six Talent Peaks Project in Jiangsu Province (No. XYDXX-108).
Abstract: With the rapid development of information technology, IoT devices play a huge role in physiological health data detection. The exponential growth of medical data requires us to reasonably allocate storage space between cloud servers and edge nodes. The storage capacity of edge nodes close to users is limited, so hotspot data should be stored in edge nodes as much as possible to ensure response timeliness and access hit rate. However, current schemes cannot guarantee that every sub-message of a complete data item stored by an edge node meets the requirements of hot data. How to detect and delete redundant data in edge nodes while protecting user privacy and dynamic data integrity has become a challenging problem. Our paper proposes a redundant data detection method that meets privacy protection requirements. By scanning the ciphertext, it is determined whether each sub-message of the data in the edge node meets the requirements of hot data. This has the same effect as a zero-knowledge proof and does not reveal user privacy. In addition, for redundant sub-data that do not meet the requirements of hot data, our paper proposes a redundant data deletion scheme that preserves dynamic data integrity. We use a Content Extraction Signature (CES) to generate the signature of the remaining hot data after the redundant data is deleted. The feasibility of the scheme is proved through security analysis and efficiency analysis.
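The CES-based deletion step can be sketched as follows; this is a simplified model in which an HMAC stands in for the owner's public-key signature and all names are hypothetical. Signing each sub-message individually (bound to its position and document id) lets the edge node drop redundant sub-messages while the remaining hot data still verifies:

```python
import hashlib
import hmac

SIGN_KEY = b"owner-signing-key"  # stand-in: a real CES uses a public-key signature scheme

def sign_block(doc_id: str, pos: int, block: bytes) -> bytes:
    # Binding position and document id prevents reordering and cross-document mixing.
    msg = doc_id.encode() + pos.to_bytes(4, "big") + hashlib.sha256(block).digest()
    return hmac.new(SIGN_KEY, msg, hashlib.sha256).digest()

def ces_sign(doc_id, blocks):
    """Full signature: one per-position signature for every sub-message."""
    return {pos: sign_block(doc_id, pos, b) for pos, b in enumerate(blocks)}

def extract(signature, keep_positions):
    """Deleting cold sub-messages: keep only the signatures of the hot blocks."""
    return {pos: sig for pos, sig in signature.items() if pos in keep_positions}

def ces_verify(doc_id, kept_blocks, extracted_sig):
    return all(
        hmac.compare_digest(extracted_sig[pos], sign_block(doc_id, pos, blk))
        for pos, blk in kept_blocks.items()
    )

blocks = [b"hot-1", b"cold-2", b"hot-3"]
full_sig = ces_sign("doc-7", blocks)
hot = {0: blocks[0], 2: blocks[2]}
sig_after_delete = extract(full_sig, hot.keys())

assert ces_verify("doc-7", hot, sig_after_delete)        # remaining hot data still verifies
assert not ces_verify("doc-7", {0: b"tampered"}, sig_after_delete)
```

The key property is that no re-signing is needed after deletion, which is what allows the edge node to prune redundant sub-data without contacting the data owner.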
Funding: This work received funding within the Wheat BigData Project (German Federal Ministry of Food and Agriculture, FKZ2818408B18).
Abstract: Genome-wide association mapping studies (GWAS) based on Big Data are a potential approach to improve marker-assisted selection in plant breeding. The number of available phenotypic and genomic data sets in which medium-sized populations of several hundred individuals have been studied is rapidly increasing. Combining these data and using them in GWAS could increase both the power of QTL discovery and the accuracy of estimation of the underlying genetic effects, but is hindered by data heterogeneity and lack of interoperability. In this study, we used genomic and phenotypic data sets, focusing on Central European winter wheat populations evaluated for heading date. We explored strategies for integrating these data and, subsequently, the resulting potential for GWAS. Establishing interoperability between data sets was greatly aided by some overlapping genotypes and a linear relationship between the different phenotyping protocols, resulting in high-quality integrated phenotypic data. In this context, genomic prediction proved to be a suitable tool to study the relevance of interactions between genotypes and experimental series, which was low in our case. Contrary to expectations, fewer associations between markers and traits were found in the larger combined data set than in the individual experimental series. However, the predictive power based on the marker-trait associations of the integrated data set was higher across data sets. The results therefore show that the integration of medium-sized data sets into Big Data is an approach to increase the power to detect QTL in GWAS. The results encourage further efforts to standardize and share data in the plant breeding community.
Funding: This work received funding from CONAHCYT through the grant Ciencia Básica y/o Ciencia de Frontera Modalidad Paradigmas y Controversias de la Ciencia, Grant Number 319996: "Análisis integral de datos transcriptómicos y metabolómicos asociados a la calidad de los frutos de guanábana (Annona muricata L.) durante almacenamiento poscosecha".
Abstract: Soursop (Annona muricata L.) is a tropical fruit highly valued for its unique flavor, nutritional value, and health-promoting properties. The ripening process of soursop involves complex changes in gene expression and metabolite accumulation, which have been studied using various omics technologies. Transcriptome analysis has provided insights into the regulation of key genes involved in ripening, while metabolic compound analysis has revealed the presence of numerous bioactive compounds with potential health benefits. However, the integration of transcriptome and metabolite compound data has not been extensively explored in soursop. Therefore, in this paper, we present a comprehensive analysis of the transcriptome and phenolic compound profiles of soursop during ripening. The integration analysis showed that the genes and phenolic compounds were mainly involved in the starch and sucrose metabolism pathways during soursop ripening. Further, the phenolic compounds Kaempferol 3-O-galactoside, Procyanidin C1, Procyanidin trimer C1, and m-Coumaric acid, as well as the genes Ubiquitin-like protein 5 (UBL5_ARATH), ATP-dependent zinc metalloprotease FTSH8 (FTSH8_ORYSJ), Zinc transporter 4 (ZIP4_ARATH), Thioredoxin-like 3-1 (TRL31_ORYSJ), Mitogen-activated protein kinase YODA (YODA_ARATH), R-mandelonitrile lyase-like (MGL_ARATH), 26S protease regulatory subunit 6A homolog (PRS6_SOLLC), Cytochrome P450 72A13 (C7A13_ARATH), Cytochrome P450 84A1 (C84A1_ARATH), and Homoserine O-trans-acetylase (MET2_ORYSJ) were correlated and differentially accumulated and expressed, respectively. Our study provides new insights into the molecular mechanisms underlying soursop ripening and may contribute to the development of strategies for improving the nutritional quality and shelf life of this important fruit.