Cloud storage is one of the main applications of cloud computing. With data services in the cloud, users are able to outsource their data to the cloud and to access and share the outsourced data from the cloud server anywhere and anytime. However, this new paradigm of data outsourcing services also introduces new security challenges, among which is how to ensure the integrity of the outsourced data. Although cloud storage providers promise users a reliable and secure environment, the integrity of data can still be damaged by human carelessness, hardware/software failures, or attacks from external adversaries. Therefore, it is of great importance for users to audit the integrity of the data they outsource to the cloud. In this paper, we first design an auditing framework for cloud storage and propose an algebraic-signature-based remote data possession checking protocol, which allows a third party to audit the integrity of the outsourced data on behalf of the users and supports an unlimited number of verifications. We then extend our auditing protocol to support dynamic data operations, including data update, data insertion, and data deletion. The analysis and experimental results demonstrate that the proposed schemes are secure and efficient.
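As a rough illustration of the homomorphic property that algebraic-signature-based possession checking exploits, the sketch below uses an integer field, a fixed block size, and a simple random-index challenge; all of these are illustrative assumptions rather than the protocol defined in the paper.

```python
import random

P = 2**31 - 1          # prime modulus (illustrative; real schemes work over GF(2^w))
GAMMA = 65537          # fixed field element used to weight block symbols

def alg_sig(block):
    """Algebraic signature: sig(b) = sum_i b[i] * GAMMA^i (mod P)."""
    sig, power = 0, 1
    for symbol in block:
        sig = (sig + symbol * power) % P
        power = (power * GAMMA) % P
    return sig

# Key property: the signature of a (modular) sum of blocks equals the sum of the
# blocks' signatures, so a verifier can check a combined response without the data.
blocks = [[random.randrange(256) for _ in range(64)] for _ in range(10)]
tags = [alg_sig(b) for b in blocks]                 # verifier keeps only these tags

challenge = random.sample(range(10), 3)             # random block indices to challenge
combined = [sum(blocks[i][j] for i in challenge) % P for j in range(64)]  # server's proof

assert alg_sig(combined) == sum(tags[i] for i in challenge) % P
print("possession proof verified")
```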
Cloud computing is becoming an important solution for providing scalable computing resources via the Internet. Because there are tens of thousands of nodes in a data center, the probability of server failures is nontrivial, so guaranteeing service reliability is a critical challenge. Fault-tolerance strategies, such as checkpointing, are commonly employed. If an edge switch fails, however, the checkpoint image may become inaccessible, so current checkpoint-based fault-tolerance methods cannot achieve the best effect. In this paper, we propose an optimal checkpoint method that is aware of edge switch failures. The method includes two algorithms. The first uses the data center topology and communication characteristics to select the checkpoint image storage server. The second uses the checkpoint image storage characteristics as well as the data center topology to select the recovery server. Simulation experiments demonstrate the effectiveness of the proposed method.
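The following sketch illustrates the general idea of edge-switch-failure-aware storage server selection: prefer candidates behind a different edge switch from the VM's host, then minimize communication cost. The three-level topology, cost values, and function names are assumptions for illustration, not the paper's algorithms.

```python
# Hypothetical three-level topology: host -> (edge switch, aggregation switch).
topology = {
    "h1": ("e1", "a1"), "h2": ("e1", "a1"),
    "h3": ("e2", "a1"), "h4": ("e2", "a1"),
    "h5": ("e3", "a2"), "h6": ("e3", "a2"),
}

def hop_cost(src, dst):
    """Rough communication cost: same edge switch < same pod < across pods."""
    (se, sa), (de, da) = topology[src], topology[dst]
    if se == de:
        return 1
    return 2 if sa == da else 4

def select_checkpoint_server(vm_host, candidates):
    """Prefer servers behind a *different* edge switch (so one edge-switch failure
    cannot take out both the VM and its checkpoint image), then minimize the cost
    of writing the checkpoint image over the network."""
    vm_edge = topology[vm_host][0]
    safe = [h for h in candidates if topology[h][0] != vm_edge] or list(candidates)
    return min(safe, key=lambda h: hop_cost(vm_host, h))

print(select_checkpoint_server("h1", ["h2", "h3", "h5"]))  # -> "h3": other edge, same pod
```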
Cloud computing and storage services allow clients to move their data and applications to centralized large data centers and thus avoid the burden of local data storage and maintenance. However, this poses new challenges related to creating secure and reliable data storage over unreliable service providers. In this study, we address the problem of ensuring the integrity of data storage in cloud computing. In particular, we consider methods for reducing the burden of generating a constant amount of metadata at the client side. By exploiting some good attributes of the bilinear group, we can devise a simple and efficient audit service for public verification of untrusted and outsourced storage, which can be important for achieving widespread deployment of cloud computing. Whereas many prior studies on ensuring remote data integrity did not consider the burden of generating verification metadata at the client side, the objective of this study is to resolve this issue. Moreover, our scheme also supports data dynamics and public verifiability. Extensive security and performance analysis shows that the proposed scheme is highly efficient and provably secure.
BACKGROUND As one of the fatal diseases with high incidence, lung cancer has seriously endangered public health and safety. Elderly patients usually have poor self-care and are more likely to show a series of psychological problems. AIM To investigate the effectiveness of the initial check, information exchange, final accuracy check, reaction (IIFAR) information care model on the mental health status of elderly patients with lung cancer. METHODS This is a single-centre study. We randomly recruited 60 elderly patients with lung cancer who attended our hospital from January 2021 to January 2022. These patients were randomly divided into two groups: the control group received conventional health education, and the observation group received the IIFAR information care model in addition to the conventional care protocol. The differences in psychological distress, anxiety and depression, quality of life, fatigue, and psychological locus of control were compared between the two groups, and the causes of psychological distress were analyzed. RESULTS After the intervention, Distress Thermometer, Hospital Anxiety and Depression Scale (HADS) anxiety and depression, Revised Piper's Fatigue Scale, and Chance Health Locus of Control scores in the observation group were lower than before the intervention and were significantly lower than those of the control group (P<0.05). After the intervention, Quality of Life Questionnaire Core 30 (QLQ-C30), Internal Health Locus of Control, and Powerful Others Health Locus of Control scores were significantly higher in both the observation and control groups than before the intervention, and QLQ-C30 scores were significantly higher in the observation group than in the control group (P<0.05). CONCLUSION The IIFAR information care model can help elderly patients with lung cancer by reducing their anxiety, depression, psychological distress, and fatigue, improving their psychological locus of control tendencies, and enhancing their quality of life.
Recently, there have been some attempts to apply Transformers to 3D point cloud classification. To reduce computation, most existing methods focus on local spatial attention, but they ignore point content and fail to establish relationships between distant but relevant points. To overcome the limitation of local spatial attention, we propose a point content-based Transformer architecture, called PointConT for short. It exploits the locality of points in the feature space (content-based), clustering sampled points with similar features into the same class and computing self-attention within each class, thus enabling an effective trade-off between capturing long-range dependencies and computational complexity. We further introduce an inception feature aggregator for point cloud classification, which uses parallel structures to aggregate high-frequency and low-frequency information in each branch separately. Extensive experiments show that our PointConT model achieves remarkable performance on point cloud shape classification. In particular, our method reaches 90.3% Top-1 accuracy on the hardest setting of ScanObjectNN. The source code of this paper is available at https://github.com/yahuiliu99/PointConT.
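A toy numpy sketch of the content-based idea follows: points are grouped by feature similarity and self-attention is computed only within each group. The dimensions, the k-means-style grouping, and single-head attention are simplifications assumed for illustration, not the actual PointConT implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
N, C, K = 256, 32, 8                 # points, feature channels, content clusters

feats = rng.standard_normal((N, C)).astype(np.float32)

# Content-based grouping: a few k-means iterations in *feature* space,
# so distant-but-similar points can land in the same group.
centers = feats[rng.choice(N, K, replace=False)]
for _ in range(5):
    labels = np.argmin(((feats[:, None] - centers[None]) ** 2).sum(-1), axis=1)
    centers = np.stack([feats[labels == k].mean(0) if np.any(labels == k) else centers[k]
                        for k in range(K)])

def self_attention(x):
    """Single-head scaled dot-product attention (queries = keys = values = x)."""
    scores = x @ x.T / np.sqrt(x.shape[1])
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    return w @ x

out = np.zeros_like(feats)
for k in range(K):                    # attention restricted to each content cluster
    idx = np.where(labels == k)[0]
    if idx.size:
        out[idx] = self_attention(feats[idx])

print(out.shape)                      # (256, 32), same shape as the input features
```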
Access control is an effective method to protect user data privacy. Access control schemes based on blockchain and ciphertext-policy attribute-based encryption (CP-ABE) can solve the problems of single point of failure and lack of trust in centralized systems. However, they also bring new problems for health information in the cloud storage environment, such as attribute leakage, low consensus efficiency, and complex permission updates. This paper proposes an access control scheme with fine-grained attribute revocation, keyword search, and traceability of the attribute private key distribution process. Blockchain technology tracks the authorization of attribute private keys, a credit scoring method improves the consensus efficiency of the Raft protocol, and the InterPlanetary File System (IPFS) addresses the capacity deficit of blockchain. Under the premise of policy hiding, we propose a fine-grained access control method based on users, user attributes, and file structure, which optimizes the data-sharing mode. At the same time, proxy re-encryption (PRE) technology is used to update access rights. The proposed scheme is proven secure. Comparative analysis and experimental results show that it offers higher efficiency and more functionality, and can meet the needs of medical institutions.
We introduce Quafu-Qcover, an open-source cloud-based software package developed for solving combinatorial optimization problems using quantum simulators and hardware backends. Quafu-Qcover provides a standardized and comprehensive workflow built around the quantum approximate optimization algorithm (QAOA). It automatically converts the original problem into a quadratic unconstrained binary optimization (QUBO) model and its corresponding Ising model, which can subsequently be transformed into a weight graph. The core of Qcover relies on a graph-decomposition-based classical algorithm, which efficiently derives the optimal parameters for the shallow QAOA circuit. Quafu-Qcover incorporates a dedicated compiler that translates QAOA circuits into physical quantum circuits executable on Quafu cloud quantum computers. Compared with a general-purpose compiler, our compiler generates shorter circuit depths and is also faster. Additionally, the Qcover compiler can dynamically build a library of qubit coupling substructures in real time, using the most recent calibration data from the superconducting quantum devices, which ensures that computational tasks are assigned to connected physical qubits with the highest fidelity. Quafu-Qcover allows us to retrieve quantum computing sampling results using a task ID at any time, enabling asynchronous processing. Moreover, it incorporates modules for result preprocessing and visualization, facilitating an intuitive display of solutions to combinatorial optimization problems. We hope that Quafu-Qcover can serve as an instructive illustration of how to explore application problems on the Quafu cloud quantum computers.
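As a small illustration of the QUBO-to-Ising conversion step that such a workflow automates, the sketch below applies the standard substitution x_i = (1 + s_i)/2 and brute-force-checks that both energies agree on every assignment. The example matrix is arbitrary and the code is not part of Quafu-Qcover.

```python
import itertools
import numpy as np

def qubo_to_ising(Q):
    """Rewrite x^T Q x (x in {0,1}) as sum_ij J_ij s_i s_j + sum_i h_i s_i + offset,
    with s in {-1,+1}, via the substitution x_i = (1 + s_i) / 2."""
    Q = np.asarray(Q, dtype=float)
    n = Q.shape[0]
    J, h, offset = np.zeros((n, n)), np.zeros(n), 0.0
    for i in range(n):
        for j in range(n):
            if i == j:                      # Q_ii x_i^2 = Q_ii x_i = Q_ii (1 + s_i)/2
                h[i] += Q[i, i] / 2
                offset += Q[i, i] / 2
            else:                           # Q_ij x_i x_j = Q_ij (1 + s_i + s_j + s_i s_j)/4
                J[i, j] += Q[i, j] / 4
                h[i] += Q[i, j] / 4
                h[j] += Q[i, j] / 4
                offset += Q[i, j] / 4
    return J, h, offset

# Brute-force check that the QUBO and Ising energies agree on every assignment.
Q = np.array([[1.0, -2.0, 0.5], [0.0, 3.0, -1.0], [0.0, 0.0, -0.5]])
J, h, offset = qubo_to_ising(Q)
for bits in itertools.product([0, 1], repeat=3):
    x = np.array(bits, dtype=float)
    s = 2 * x - 1
    assert np.isclose(x @ Q @ x, s @ J @ s + h @ s + offset)
print("QUBO and Ising energies match on all assignments")
```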
Cloud computing provides a diverse and adaptable resource pool over the internet, allowing users to tap into various resources as needed, and has been seen as a robust solution to relevant challenges. A significant delay can hamper the performance of IoT-enabled cloud platforms, but efficient task scheduling can lower the cloud infrastructure's energy consumption and maximize the service provider's revenue by decreasing user job processing times. The proposed Modified Chimp-Whale Optimization Algorithm (MCWOA) combines elements of the Chimp Optimization Algorithm (COA) and the Whale Optimization Algorithm (WOA). To enhance MCWOA's identification precision, the Sobol sequence is used in the population initialization phase, ensuring an even distribution of the population across the solution space. Moreover, the traditional algorithm's local search capabilities are augmented by incorporating the whale optimization algorithm's bubble-net hunting and random search mechanisms into MCWOA's position-updating process. This study demonstrates the effectiveness of the proposed approach using a two-story rigid frame and a simply supported beam model. Simulated outcomes reveal that the new method outperforms the original algorithm, especially in multi-damage detection scenarios, excelling at avoiding false positives and enhancing computational speed, which makes it a strong choice for structural damage detection. The efficiency of the proposed MCWOA is also assessed against metrics such as energy usage, computational expense, task duration, and delay, and the simulated data indicate that it outpaces other methods across all metrics. The study also references the Whale Optimization Algorithm (WOA), Chimp Algorithm (CA), Ant Lion Optimizer (ALO), Genetic Algorithm (GA), and Grey Wolf Optimizer (GWO).
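A minimal sketch of Sobol-sequence population initialization using SciPy's quasi-Monte Carlo module is shown below; the population size, dimensionality, bounds, and function name are placeholders, not the MCWOA code itself.

```python
import numpy as np
from scipy.stats import qmc

def init_population(pop_size, dim, lower, upper, seed=42):
    """Spread the initial population evenly over the search space with a
    scrambled Sobol sequence instead of plain uniform random sampling."""
    sampler = qmc.Sobol(d=dim, scramble=True, seed=seed)
    unit = sampler.random(pop_size)                  # points in [0, 1)^dim
    return qmc.scale(unit, lower, upper)             # map to the problem's bounds

# Example: 32 candidate solutions (a power of two suits Sobol) in a 10-D space.
population = init_population(pop_size=32, dim=10,
                             lower=np.full(10, -5.0), upper=np.full(10, 5.0))
print(population.shape)   # (32, 10)
```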
We introduce a model for provable data possession (PDP) that allows a client who has stored data at an untrusted server to verify that the server possesses the original data without retrieving it. In a previous work, Ateniese et al. proposed a remote data integrity checking protocol that supports partial data dynamics. In this paper, we present a new remote data possession checking protocol that allows an unlimited number of file integrity verifications and efficiently supports dynamic operations, such as data modification, deletion, insertion, and append. The proposed protocol supports public verifiability and does not leak any private information to third-party verifiers. Through a specific analysis, we show the correctness and security of the protocol, and we then demonstrate that it performs well.
With the rapid advancement of quantum computing, hybrid quantum-classical machine learning has shown numerous potential applications at the current stage, with expectations of being achievable in the noisy intermediate-scale quantum (NISQ) era. Quantum reinforcement learning, as an indispensable line of study, has recently demonstrated its ability to solve standard benchmark environments with formally provable theoretical advantages over classical counterparts. However, despite the progress of quantum processors and the emergence of quantum computing clouds, implementing quantum reinforcement learning algorithms based on parameterized quantum circuits (PQCs) on NISQ devices remains infrequent. In this work, we take the first step towards executing benchmark quantum reinforcement learning problems on real devices equipped with at most 136 qubits on the BAQIS Quafu quantum computing cloud. The experimental results demonstrate that the policy agents can successfully accomplish objectives under modified conditions in both the training and inference phases. Moreover, we design hardware-efficient PQC architectures in the quantum model using a multi-objective evolutionary algorithm and develop a learning algorithm that is adaptable to quantum devices. We hope that Quafu-RL can be a guiding example of how to realize machine learning tasks by taking advantage of quantum computers on the quantum cloud platform.
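As a toy illustration of how a parameterized quantum circuit can act as a policy, the sketch below simulates a single-qubit circuit in numpy and reads out action probabilities from the measurement statistics; it is deliberately minimal and unrelated to the hardware-efficient PQC architectures actually used in Quafu-RL.

```python
import numpy as np

def ry(theta):
    """Single-qubit Y-rotation gate."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def policy_probabilities(thetas):
    """Toy one-qubit PQC 'policy': rotate |0> by each trainable angle in turn,
    then read out P(action=0) = |<0|psi>|^2 and P(action=1) = |<1|psi>|^2."""
    state = np.array([1.0, 0.0])
    for theta in thetas:
        state = ry(theta) @ state
    return np.abs(state) ** 2

print(policy_probabilities([0.3, 0.7]))   # two action probabilities summing to 1
```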
Cultural relic line drawings serve as a crucial form of traditional artifact documentation; compared with 3D models, they are a simple, intuitive, and low-cost means of display. Dimensionality reduction is therefore necessary for line drawings. However, most existing methods for artifact drawing rely on the principles of orthographic projection, which cannot avoid angle occlusion and data overlapping when the surface of a cultural relic is complex. Therefore, conformal mapping was introduced as a dimensionality reduction approach to compensate for the limitations of orthographic projection. Based on given criteria for assessing surface complexity, this paper proposes a three-dimensional feature guideline extraction method for complex cultural relic surfaces. A combined 2D and 3D factor, the vertex weight, was designed to measure the importance of points in describing surface features. The selection threshold for feature guideline extraction was then determined from the differences between the vertex weight and shape index distributions. The feasibility and stability of the method were verified through experiments on real cultural relic surface data. The results demonstrate that the method addresses the challenges associated with automatically generating line drawings for complex surfaces. The extraction method and the obtained results will be useful for drawing, displaying, and publicizing cultural relic line graphics.
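For reference, the shape index mentioned above is a commonly used surface descriptor computed from principal curvatures. The sketch below uses one common convention (sign and ordering conventions vary between implementations) and is only an illustration, not the paper's vertex-weight computation.

```python
import numpy as np

def shape_index(k1, k2):
    """Koenderink-style shape index from principal curvatures (k1 >= k2).
    One common form, mapping local surface type to the range [-1, 1];
    conventions differ between implementations."""
    k1, k2 = np.maximum(k1, k2), np.minimum(k1, k2)
    return (2.0 / np.pi) * np.arctan2(k1 + k2, k1 - k2)

# Cap-like points sit near +1, saddle-like points near 0, cup-like points near -1.
print(shape_index(np.array([1.0, 1.0, -0.2]), np.array([1.0, -1.0, -1.0])))
```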
Security issues in cloud networks and edge computing have become very common. This research focuses on analyzing such issues and developing the best solutions. A detailed literature review has been conducted in this regard. The findings show that many challenges are linked to edge computing, such as privacy concerns, security breaches, high costs, and low efficiency. Therefore, proper security measures need to be implemented to overcome these issues. Emerging trends, such as machine learning, encryption, artificial intelligence, and real-time monitoring, can help mitigate security issues and help build a secure and safe future for cloud computing. It was concluded that the security implications of edge computing can readily be addressed with the help of new technologies and techniques.
Cloud vertical structure (CVS) strongly affects atmospheric circulation and radiative transfer, yet long-term, ground-based observations are scarce over the Tibetan Plateau (TP) despite its vital role in global climate. This study uses ground-based lidar and Ka-band cloud profiling radar (KaCR) measurements at Yangbajain (YBJ), TP, from October 2021 to September 2022 to characterize cloud properties. A novel anomaly detection algorithm (LevelShiftAD), which performs satisfactorily, is proposed for lidar and KaCR profiles to identify cloud boundaries. Cloud base heights (CBH) retrieved from KaCR and lidar observations show good consistency, with a correlation coefficient of 0.78 and a mean difference of -0.06 km. Cloud top heights (CTH) derived from KaCR match the FengYun-4A and Himawari-8 products well. Thus, KaCR measurements serve as the primary dataset for investigating CVSs over the TP. Different diurnal cycles occur in summer and winter: in winter, cloud occurrence frequency increases markedly in the afternoon and decreases in the early morning, while in summer cloud amounts remain high all day, with scattered nocturnal increases. Summer features more frequent clouds with larger geometrical thicknesses, a higher multi-layer ratio, and greater inter-cloud spacing. Around 26% of cloud bases occur below 0.5 km. Winter exhibits a bimodal distribution of cloud base heights, with peaks at 0-0.5 km and 2-2.5 km. Single-layer and geometrically thin clouds prevail at YBJ. This study enriches long-term measurements of CVSs over the TP, and the robust anomaly detection method helps quantify cloud macro-physical properties via synergistic lidar and radar observations.
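The sketch below illustrates level-shift-style cloud boundary detection on a synthetic radar-like profile: each range gate is scored by the difference between the mean signal above and below it, the strongest upward shift is taken as cloud base and the strongest downward shift above it as cloud top. It is a simplified illustration with assumed parameters, not the LevelShiftAD algorithm.

```python
import numpy as np

def cloud_base_top(profile, heights, window=5):
    """Illustrative level-shift boundary detection (not the paper's LevelShiftAD)."""
    idx = np.arange(window, len(profile) - window)
    shift = np.array([profile[i:i + window].mean() - profile[i - window:i].mean()
                      for i in idx])
    base_pos = np.argmax(shift)                       # signal jumps up entering the cloud
    above = np.where(idx > idx[base_pos])[0]
    top_pos = above[np.argmin(shift[above])]          # signal drops leaving the cloud
    return heights[idx[base_pos]], heights[idx[top_pos]]

# Synthetic radar-like profile: noise floor with one elevated cloud layer.
heights = np.arange(0.0, 12.0, 0.1)                   # km above ground
profile = np.random.default_rng(1).normal(-40.0, 1.0, heights.size)
profile[(heights > 4.0) & (heights < 6.5)] += 25.0    # cloud between 4.0 and 6.5 km

print(cloud_base_top(profile, heights))               # roughly (4.0, 6.5)
```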
In a convective scheme featuring a discretized cloud size density, the assumed lateral mixing rate is inversely proportional to the exponential coefficient of plume size. This coefficient typically follows an assumption of -1, but that assumption carries inherent uncertainties, especially for deep-layer clouds. Addressing this knowledge gap, we conducted comprehensive large eddy simulations and comparative analyses focused on terrestrial regions. Our investigation revealed that cloud formation adheres to the tenets of Bernoulli trials, illustrating power-law scaling between cloud size and the number of clouds that remains consistent regardless of the inherent deep-layer cloud attributes. This scaling paradigm encompasses liquid, ice, and mixed phases in deep-layer clouds. The exponent characterizing the interplay between cloud size and number in deep-layer clouds, whether liquid, ice, or mixed-phase, resembles that of shallow convection but converges closely to zero. This convergence signifies a propensity for diminished cloud numbers and sizes within deep-layer clouds. Notably, the infusion of abundant moisture and the release of latent heat by condensation within the lower atmospheric strata make substantial contributions, although their role in ice-phase formation is limited. The emergence of liquid and ice phases in deep-layer clouds is facilitated by latent heat and influenced by the wind shear at middle levels. These interrelationships hold potential applications in formulating parameterizations and post-processing model outcomes.
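A short sketch of how a power-law exponent relating cloud size and cloud number can be estimated from binned counts, via least squares in log-log space, follows; the synthetic sizes, counts, and exponent are placeholders, not results from the simulations described above.

```python
import numpy as np

def power_law_exponent(sizes, counts):
    """Fit N(s) ~ s^alpha by linear least squares in log-log space; return alpha."""
    sizes, counts = np.asarray(sizes, float), np.asarray(counts, float)
    mask = (sizes > 0) & (counts > 0)
    slope, _ = np.polyfit(np.log(sizes[mask]), np.log(counts[mask]), 1)
    return slope

# Synthetic example: counts drawn from N(s) = 1000 * s^(-0.8) with small noise.
sizes = np.array([0.5, 1, 2, 4, 8, 16, 32])          # e.g. cloud size in km
noise = np.exp(np.random.default_rng(0).normal(0, 0.05, sizes.size))
counts = 1000 * sizes ** -0.8 * noise
print(power_law_exponent(sizes, counts))             # close to -0.8
```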
In the cloud environment, a high level of data security is in great demand. Data placement and storage-planning optimization is part of the overall security process in the cloud environment: it supports data security by avoiding the risk of data loss and data overlapping. However, the development of data flow scheduling approaches for the cloud environment that take security parameters into account is insufficient. In this work, we propose a data scheduling model for the cloud environment. The model is made up of three parts that together help dispatch user data flows to the appropriate cloud VMs. The first component is the collector agent, which periodically collects information on the state of the network links. The second is the monitoring agent, which analyzes and classifies this information, makes a decision on the state of each link, and transmits the result to the scheduler. The third is the scheduler, which uses this information to transfer user data while taking into account fair distribution and reliable paths. Each part of the proposed model requires its own algorithms. In this article, we focus on the data transfer algorithms, including fair distribution under a stable link state. These algorithms are based on grouping the transmitted files and on an iterative method, and they provide approximate solutions to the studied problem, which is NP-hard. The experimental results show that the best algorithm is the half-grouped minimum excluding (HME), with a percentage of 91.3%, an average deviation of 0.042, and an execution time of 0.001 s.
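As a simple baseline for the same dispatch problem (not the paper's HME algorithm), the sketch below greedily assigns files, largest first, to the currently least-loaded link; the link weights and file sizes are hypothetical placeholders.

```python
import heapq

def dispatch(files, links):
    """Greedy baseline: sort files by size and repeatedly assign the next file to
    the currently least-loaded link, which keeps the per-link load roughly fair."""
    # links: {name: weight}; a larger weight stands in for a faster / more stable link.
    heap = [(0.0, name) for name in links]            # (normalized load, link)
    heapq.heapify(heap)
    plan = {name: [] for name in links}
    for size in sorted(files, reverse=True):
        load, name = heapq.heappop(heap)
        plan[name].append(size)
        heapq.heappush(heap, (load + size / links[name], name))
    return plan

files = [120, 80, 60, 45, 30, 20, 10]                 # MB to transfer
links = {"vm1": 1.0, "vm2": 1.0, "vm3": 0.5}          # hypothetical link weights
print(dispatch(files, links))
```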
This article explores the evolution of cloud computing, its advantages over traditional on-premises infrastructure, and its impact on information security. The study presents a comprehensive literature review covering various cloud infrastructure offerings and security models. Additionally, it deeply analyzes real-life case studies illustrating successful cloud migrations and highlights common information security threats in current cloud computing. The article concludes by offering recommendations to businesses to protect themselves from cloud data breaches and providing insights into selecting a suitable cloud services provider from an information security perspective.