Funding: National Natural Science Foundation of China (Nos. 42071444, 42101444).
Abstract: The cultural relic line graphic is a crucial form of traditional artifact documentation; compared with 3D models, it is a simple, intuitive, and low-cost display product. Dimensionality reduction is therefore unavoidable for line drawings. However, most existing methods for artifact drawing rely on the principles of orthographic projection, which cannot avoid angle occlusion and data overlap when the surface of a cultural relic is complex. Conformal mapping was therefore introduced as a dimensionality reduction technique to compensate for the limitations of orthographic projection. Based on the given criteria for assessing surface complexity, this paper proposes a three-dimensional feature guideline extraction method for complex cultural relic surfaces. A factor combining 2D and 3D information, the vertex weight, was designed to measure how important each point is in describing surface features. The selection threshold for feature guideline extraction was then determined from the differences between the vertex weight and shape index distributions. Feasibility and stability were verified through experiments on real cultural relic surface data. The results demonstrate that the method can address the challenges of automatically generating line drawings for complex surfaces. The extraction method and the obtained results will be useful for drawing, displaying, and publicizing line graphics of cultural relics.
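For reference, the shape index compared against the vertex weight distribution is, in its standard form (the Koenderink and van Doorn definition is assumed here, since the abstract does not restate it), computed from the principal curvatures κ₁ ≥ κ₂ at each surface point:

```latex
SI = \frac{2}{\pi}\,\arctan\!\left(\frac{\kappa_1 + \kappa_2}{\kappa_1 - \kappa_2}\right),
\qquad \kappa_1 \ge \kappa_2, \qquad SI \in [-1, 1].
```

Values near +1 mark cap-like (convex) regions, values near -1 mark cup-like (concave) regions, and values near 0 mark saddles, which makes the SI distribution a natural complement to the vertex weight when choosing the extraction threshold.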
Abstract: As the extensive use of cloud computing raises questions about the security of any personal data stored there, cryptography is being used more frequently as a security tool to protect data confidentiality and privacy in the cloud environment. A hypervisor is virtualization software used in cloud hosting to divide and allocate resources on various pieces of hardware. The choice of hypervisor can significantly impact the performance of cryptographic operations in the cloud environment. An important issue that must be carefully examined is that no hypervisor is completely superior in terms of performance; each hypervisor should be examined against specific needs. The main objective of this study is to provide accurate results comparing the performance of Hyper-V and the Kernel-based Virtual Machine (KVM) while implementing different cryptographic algorithms, to guide cloud service providers and end users in choosing the most suitable hypervisor for their cryptographic needs. This study evaluated the efficiency of the two hypervisors, Hyper-V and KVM, in implementing six cryptographic algorithms: Rivest-Shamir-Adleman (RSA), the Advanced Encryption Standard (AES), the Triple Data Encryption Standard (TripleDES), Carlisle Adams and Stafford Tavares (CAST-128), Blowfish, and Twofish. The findings show that KVM outperforms Hyper-V, with 12.2% less Central Processing Unit (CPU) use and 12.95% less time overall for encryption and decryption operations across various file sizes. These findings emphasize how crucial it is to pick a hypervisor appropriate for cryptographic needs in a cloud environment, which could assist both cloud service providers and end users. Future research may focus more on how various hypervisors perform while handling cryptographic workloads.
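As a concrete picture of what such a measurement involves, the sketch below times AES encryption and decryption over several payload sizes inside a guest VM. It is a minimal illustration of the methodology, assuming the Python `cryptography` package; the key size, mode (CTR), and file sizes are stand-ins rather than the study's exact configuration.

```python
# Time AES encrypt/decrypt for a given payload size; run the same script on
# each hypervisor's guest VM and compare the wall-clock figures.
import os
import time
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def time_aes(payload: bytes) -> tuple[float, float]:
    key, nonce = os.urandom(32), os.urandom(16)
    cipher = Cipher(algorithms.AES(key), modes.CTR(nonce))

    t0 = time.perf_counter()
    ciphertext = cipher.encryptor().update(payload)
    t_enc = time.perf_counter() - t0

    t0 = time.perf_counter()
    cipher.decryptor().update(ciphertext)
    t_dec = time.perf_counter() - t0
    return t_enc, t_dec

if __name__ == "__main__":
    for size_mb in (1, 16, 64):            # vary file size as in the study
        enc, dec = time_aes(os.urandom(size_mb * 2**20))
        print(f"{size_mb:>3} MB: encrypt {enc:.4f}s, decrypt {dec:.4f}s")
```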
Abstract: Cloud computing has emerged as a viable alternative to traditional computing infrastructures, offering various benefits. However, the adoption of cloud storage poses significant risks to data secrecy and integrity. This article presents an effective mechanism to preserve the secrecy and integrity of data stored on the public cloud by leveraging blockchain technology, smart contracts, and cryptographic primitives. The proposed approach utilizes a Solidity-based smart contract as an auditor for maintaining and verifying the integrity of outsourced data. To preserve data secrecy, symmetric encryption systems are employed to encrypt user data before outsourcing it. An extensive performance analysis is conducted to illustrate the efficiency of the proposed mechanism. Additionally, a rigorous assessment is conducted to ensure that the developed smart contract is free from vulnerabilities and to measure its associated running costs. The security analysis of the proposed system confirms that our approach can securely maintain the confidentiality and integrity of cloud storage, even in the presence of malicious entities. The proposed mechanism contributes to enhancing data security in cloud computing environments and can be used as a foundation for developing more secure cloud storage systems.
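To make the division of labor concrete, here is a simplified, assumed client-side flow in Python: the owner encrypts before outsourcing and registers a digest that an on-chain auditor can later check. The function names are hypothetical, and the Solidity contract itself is not reproduced; `verify` merely mirrors the comparison the auditor contract would perform.

```python
import hashlib
from cryptography.fernet import Fernet

def prepare_for_outsourcing(plaintext: bytes) -> tuple[bytes, bytes, str]:
    key = Fernet.generate_key()              # symmetric key, kept by the owner
    ciphertext = Fernet(key).encrypt(plaintext)
    digest = hashlib.sha256(ciphertext).hexdigest()  # registered with the contract
    return key, ciphertext, digest

def verify(ciphertext_from_cloud: bytes, registered_digest: str) -> bool:
    # The smart-contract auditor performs the equivalent of this comparison.
    return hashlib.sha256(ciphertext_from_cloud).hexdigest() == registered_digest

key, ct, digest = prepare_for_outsourcing(b"user record ...")
assert verify(ct, digest)                    # integrity holds for untampered data
```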
Funding: Supported by the National Natural Science Foundation of China (Grant No. 92365206), the China Postdoctoral Science Foundation (Certificate Number: 2023M740272), the National Natural Science Foundation of China (Grant No. 12247168), and the China Postdoctoral Science Foundation (Certificate Number: 2022TQ0036).
Abstract: We introduce Quafu-Qcover, an open-source, cloud-based software package developed for solving combinatorial optimization problems using quantum simulators and hardware backends. Quafu-Qcover provides a standardized and comprehensive workflow built on the quantum approximate optimization algorithm (QAOA). It automatically converts the original problem into a quadratic unconstrained binary optimization (QUBO) model and its corresponding Ising model, which can subsequently be transformed into a weighted graph. The core of Qcover relies on a graph-decomposition-based classical algorithm that efficiently derives the optimal parameters for the shallow QAOA circuit. Quafu-Qcover incorporates a dedicated compiler that translates QAOA circuits into physical quantum circuits executable on Quafu cloud quantum computers. Compared to a general-purpose compiler, our compiler generates shorter circuit depths while also running faster. Additionally, the Qcover compiler can dynamically create a library of qubit coupling substructures in real time, using the most recent calibration data from the superconducting quantum devices, which ensures that computational tasks are assigned to connected physical qubits with the highest fidelity. Quafu-Qcover allows users to retrieve quantum computing sampling results with a task ID at any time, enabling asynchronous processing. Moreover, it incorporates modules for result preprocessing and visualization, facilitating an intuitive display of solutions to combinatorial optimization problems. We hope that Quafu-Qcover can serve as an instructive illustration of how to explore application problems on the Quafu cloud quantum computers.
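The QUBO conversion the workflow automates can be illustrated by hand on a toy Max-Cut instance. The sketch below builds the standard Max-Cut QUBO in NumPy and brute-forces the optimum to confirm the mapping; this is the textbook construction, not Qcover's internal code, and the graph is made up.

```python
import numpy as np
from itertools import product

# Max-Cut on a 4-cycle with one chord: maximize the number of cut edges,
# i.e. minimize x^T Q x over x in {0,1}^n with the standard QUBO below.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
n = 4

Q = np.zeros((n, n))
for i, j in edges:
    Q[i, i] -= 1.0          # diagonal: -1 per incident edge
    Q[j, j] -= 1.0
    Q[i, j] += 1.0          # off-diagonal: +2 x_i x_j, split symmetrically
    Q[j, i] += 1.0

best = min((np.array(x) for x in product((0, 1), repeat=n)),
           key=lambda x: x @ Q @ x)
print("max cut =", int(-(best @ Q @ best)))   # 4 for this graph
```

The substitution x_i = (1 - s_i)/2 with s_i in {-1, +1} then yields the corresponding Ising model, whose interaction graph is the weighted graph that Qcover decomposes.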
Funding: Funded in part by the Key Project of Nature Science Research for Universities of Anhui Province of China (No. 2022AH051720), in part by the Science and Technology Development Fund, Macao SAR (Grant Nos. 0093/2022/A2, 0076/2022/A2 and 0008/2022/AGJ), and in part by the China University Industry-University-Research Collaborative Innovation Fund (No. 2021FNA04017).
Abstract: This paper focuses on the effective utilization of data augmentation techniques for 3D lidar point clouds to enhance the performance of neural network models. These point clouds, which represent spatial information through collections of 3D coordinates, have found wide-ranging applications. Data augmentation has emerged as a potent solution to the challenges posed by limited labeled data and the need to enhance model generalization. Much of the existing research is devoted to crafting novel augmentation methods specifically for 3D lidar point clouds; however, little attention has been paid to making the most of the many existing techniques. Addressing this gap, this research investigates the possibility of combining two fundamental data augmentation strategies. The paper introduces PolarMix and Mix3D, two commonly employed augmentation techniques, and presents a new approach named RandomFusion. Instead of using a fixed or predetermined combination of augmentation methods, RandomFusion randomly chooses between PolarMix and Mix3D for each sample in the point cloud data set. Experimental results validate the efficacy of RandomFusion in enhancing the performance of neural network models on 3D lidar point cloud semantic segmentation tasks, without compromising computational efficiency. By examining the potential of merging different augmentation techniques, the research contributes to a more comprehensive understanding of how to utilize existing augmentation methods for 3D lidar point clouds. RandomFusion offers a simple yet effective way to leverage the diversity of augmentation techniques and boost the robustness of models. The insights gained can pave the way for future work on more advanced and efficient data augmentation strategies for 3D lidar point cloud analysis.
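Stripped to its core, the strategy is a one-line random choice per sample. The sketch below stubs out the two underlying augmentations; the placeholder bodies only gesture at what PolarMix and Mix3D do and are not the original implementations from their papers.

```python
import random
import numpy as np

def polar_mix(scan_a: np.ndarray, scan_b: np.ndarray) -> np.ndarray:
    # Placeholder: swap the points of A lying in a random azimuth sector for B's.
    theta_a = np.arctan2(scan_a[:, 1], scan_a[:, 0])
    lo = random.uniform(-np.pi, 0.0)
    keep = (theta_a < lo) | (theta_a > lo + np.pi)
    theta_b = np.arctan2(scan_b[:, 1], scan_b[:, 0])
    take = (theta_b >= lo) & (theta_b <= lo + np.pi)
    return np.vstack([scan_a[keep], scan_b[take]])

def mix3d(scan_a: np.ndarray, scan_b: np.ndarray) -> np.ndarray:
    # Placeholder: Mix3D-style "out-of-context" union of two scenes.
    return np.vstack([scan_a, scan_b])

def random_fusion(scan_a: np.ndarray, scan_b: np.ndarray) -> np.ndarray:
    aug = random.choice([polar_mix, mix3d])   # the only new ingredient
    return aug(scan_a, scan_b)
```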
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 42025504 and 41905023), the National Natural Science Youth Science Foundation (Grant No. 41701406), and the Youth Innovation Promotion Association of the Chinese Academy of Sciences (Grant No. 2021122).
Abstract: Cloud top pressure (CTP) is one of the critical cloud properties that significantly affects the radiative effect of clouds. Multi-angle polarized sensors can employ either polarized bands (490 nm) or O₂ A-bands (763 and 765 nm) to retrieve CTP. However, the CTP retrieved by the two methods is inconsistent in certain cases, with large uncertainties for low and thin clouds, which may cause problems in subsequent applications. This study proposes a synergistic algorithm that considers both the O₂ A-bands and the polarized bands using a random forest (RF) model. LiDAR CTP data are used as the true values, and the polarized and non-polarized measurements are concatenated to train the RF model to determine CTP. The analysis also showed that the polarized signal becomes saturated as the cloud optical thickness (COT) increases, so cases with COT < 10 require particular treatment to improve the algorithm's stability. The synergistic method was then applied to Directional Polarimetric Camera (DPC) and Polarization and Directionality of the Earth's Reflectances (POLDER) measurements for evaluation; the resulting retrieval accuracy (RMSE of 205.176 hPa for POLDER and 171.141 hPa for DPC, with R² of 0.636 and 0.663, respectively) was higher than that of the MODIS and POLDER Rayleigh pressure products. The synergistic algorithm also performed well when applied to DPC data. It is expected to provide data support for atmosphere-related fields as an atmospheric remote sensing algorithm within the Cloud Application for Remote Sensing, Atmospheric Radiation, and Updating Energy (CARE) platform.
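In outline, the training step pairs concatenated multi-band features with lidar labels. The sketch below shows that shape of the pipeline with scikit-learn on synthetic stand-in arrays; the feature dimensions, sample counts, and hyperparameters are invented for illustration, not taken from the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
polarized = rng.random((5000, 3))          # e.g., multi-angle 490 nm signals
oxygen_a = rng.random((5000, 2))           # e.g., 763/765 nm band features
ctp_lidar = rng.uniform(100, 1000, 5000)   # hPa, stand-in for lidar labels

X = np.hstack([polarized, oxygen_a])       # the "concatenated" training input
X_tr, X_te, y_tr, y_te = train_test_split(X, ctp_lidar, random_state=0)

rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred = rf.predict(X_te)
rmse = float(np.sqrt(np.mean((pred - y_te) ** 2)))
print(f"RMSE = {rmse:.1f} hPa")            # meaningful only with real data
```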
Abstract: A new era of data access and management has begun with the use of cloud computing in the healthcare industry. Despite the efficiency and scalability that the cloud provides, the security of private patient data is still a major concern. Encryption, network security, and adherence to data protection laws are key to ensuring the confidentiality and integrity of healthcare data in the cloud. However, the computational overhead of encryption technologies can delay data access and processing. To address these challenges, we introduce the Enhanced Parallel Multi-Key Encryption Algorithm (EPM-KEA), aiming to bolster healthcare data security and facilitate the secure storage of critical patient records in the cloud. The data were gathered from two categories: Authorization for Hospital Admission (AIH) and Authorization for High-Complexity Operations. Z-score normalization is used for preprocessing. The primary goal of implementing encryption techniques is to secure and store massive amounts of data in the cloud. Cloud storage alternatives for protecting healthcare data are likely to become more widely available if these security issues can be successfully fixed. In our analysis using specific parameters, including execution time (42%), encryption time (45%), decryption time (40%), security level (97%), and energy consumption (53%), the system demonstrated favorable performance compared to the traditional method. This suggests that addressing these security concerns could broaden access to cloud storage solutions for safeguarding healthcare data.
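The abstract does not spell out EPM-KEA's internals, so the sketch below illustrates only the two stated ingredients in a simplified, assumed form: Z-score normalization for preprocessing, and encrypting record chunks in parallel under multiple keys. The chunk size, one-key-per-chunk policy, and thread pool are illustrative choices, not the algorithm itself.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor
from cryptography.fernet import Fernet

def z_score(x: np.ndarray) -> np.ndarray:
    # Z-score normalization: zero mean, unit variance per column.
    return (x - x.mean(axis=0)) / x.std(axis=0)

def parallel_multi_key_encrypt(chunks: list[bytes]) -> list[tuple[bytes, bytes]]:
    keys = [Fernet.generate_key() for _ in chunks]   # one key per chunk
    with ThreadPoolExecutor() as pool:
        tokens = list(pool.map(lambda kc: Fernet(kc[0]).encrypt(kc[1]),
                               zip(keys, chunks)))
    return list(zip(keys, tokens))                   # keys stay with the owner

records = z_score(np.random.rand(100, 4))            # stand-in patient features
chunks = [records[i:i + 25].tobytes() for i in range(0, 100, 25)]
encrypted = parallel_multi_key_encrypt(chunks)
```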
Abstract: This paper was motivated by existing problems with cloud data storage at Imo State University, Nigeria: outsourced data has caused data loss and the misuse of customer information by unauthorized users or hackers, leaving customer and client data visible and unprotected. This also exposed clients and customers to enormous risk from defective equipment, bugs, faulty servers, and malicious actions. The aim of this paper, therefore, is to present a secure model that uses the Unicode Transformation Format (UTF) and Base64 algorithms to store data in the cloud securely. The Object-Oriented Hypermedia Analysis and Design Methodology (OOHADM) was adopted. Python was used to develop the security model; role-based access control (RBAC) and multi-factor authentication (MFA) were integrated to strengthen security in an information system built with HTML5, JavaScript, Cascading Style Sheets (CSS) version 3, and PHP 7. The paper also discusses related concepts: the development of cloud computing, its characteristics, cloud deployment models, and cloud service models. The results showed that the proposed enhanced security model for the information systems of a corporate platform handled multiple authorization and authentication threats: a single login page directs all login requests from the different modules to one Single Sign-On Server (SSOS), which redirects authenticated users to the resources or modules they requested, leveraging geo-location integration for physical location validation. The newly developed system addresses the shortcomings of the existing systems and reduces the time and resources incurred in using them.
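Since Base64 over UTF-8 text is the storage encoding the model relies on, a minimal round trip looks like the following (standard library only; the record string is hypothetical). Note that Base64 is an encoding rather than encryption, which is presumably why the model layers RBAC, MFA, and single sign-on on top of it.

```python
import base64

record = "client: Adaeze Okoro, balance: 152000"     # hypothetical record
stored = base64.b64encode(record.encode("utf-8"))    # what goes to the cloud
assert base64.b64decode(stored).decode("utf-8") == record
```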
Abstract: To address the current issues of inaccurate segmentation and the limited applicability of segmentation methods for building facades in point clouds, we propose a facade segmentation algorithm based on optimal dual-scale feature descriptors. First, we select the optimal dual-scale descriptors from a range of feature descriptors. Next, we segment the facade according to the threshold value of the chosen optimal dual-scale descriptors. Finally, we use RANSAC (Random Sample Consensus) to fit the segmented surface and optimize the fitting result, as sketched below. Experimental results show that, compared to commonly used facade segmentation algorithms, the proposed method yields more accurate segmentation results, providing a robust data foundation for subsequent 3D model reconstruction of buildings.
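For the final fitting step, a generic RANSAC plane fit in bare NumPy captures the idea; the distance threshold and iteration count below are arbitrary choices, and the paper's descriptor-based pre-segmentation is not shown.

```python
import numpy as np

def ransac_plane(points: np.ndarray, thresh=0.02, iters=500, rng=None):
    rng = rng or np.random.default_rng()
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-12:
            continue                      # degenerate (collinear) sample
        normal /= norm
        dist = np.abs((points - p0) @ normal)   # point-to-plane distances
        inliers = dist < thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return best_inliers                   # refit on these for the final plane
```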
Abstract: The Google Cloud Platform (GCP) is a popular choice for companies seeking a comprehensive cloud computing solution because it provides everything from essential computing resources to powerful data analytics and machine learning capabilities. Saviynt is a cloud-based Identity and Access Management (IAM) system that integrates with GCP and other services for additional functionality. However, the transition brings its own problems, such as the need to integrate Saviynt's IAM correctly into current IT infrastructures and to provide comprehensive user training on the new system. This paper gives a detailed review of the advantages, disadvantages, and best practices related to this transition.
Funding: Supported by the State Grid Corporation Limited Science and Technology Project Funding (Contract No. SGCQSQ00YJJS2200380).
Abstract: There is instability in the grid-side distributed energy storage cloud-group-end region. To avoid large-scale charging and discharging fluctuations in the power grid environment and keep the capacitor components in a continuous and stable charging and discharging state, a hierarchical time-sharing configuration algorithm for the grid-side distributed energy storage cloud-group-end region is proposed, based on a multi-scale, multi-feature convolutional neural network. First, a voltage stability analysis model based on the multi-scale, multi-feature convolutional neural network is constructed, and the network is optimized with the Self-Organizing Maps (SOM) algorithm to analyze the voltage stability of the grid-side distributed energy storage cloud-group-end region under a credibility framework. According to the optimal scheduling objectives and network size, the distributed robust optimal configuration control model is then solved within a framework of coordinated optimal scheduling at multiple time scales. Finally, the time-series characteristics of the regional power grid load and distributed generation are analyzed, and the hierarchical time-sharing configuration algorithm is realized according to the regional hierarchical time-sharing configuration model of the "cloud", "group", and "end" layers. The experimental results show that the algorithm determines the best grid-side distributed energy storage configuration scheme and improves the stability of the hierarchical time-sharing configuration.
Funding: Supported by the Natural Science Foundation of China (Grant No. 51939004), the Fundamental Research Funds for the Central Universities (Grant No. B210204009), and the China Huaneng Group Science and Technology Project (Grant No. HNKJ18-H24).
Abstract: Safety evaluation of toppling rock slopes developing in reservoir areas is crucial. To reduce the uncertainty of safety evaluation, this study developed a composite cloud model that improves the combination weighting of the decision-making trial and evaluation laboratory (DEMATEL) and criteria importance through intercriteria correlation (CRITIC) methods. A safety evaluation system was developed from in situ monitoring data. The backward cloud generator was used to calculate the numerical characteristics of a cloud model of the quantitative indices, and different virtual clouds were used to synthesize several clouds into a generalized one. The synthesized numerical characteristics were calculated to comprehensively evaluate the safety of toppling rock slopes. A case study of a toppling rock slope near the Huangdeng Hydropower Station in China was conducted using monitoring data collected since operation of the hydropower project began. The results indicated that the toppling rock slope was moderately safe with a low safety margin. The composite cloud model considers the fuzziness and randomness of safety evaluation and enables interchange between qualitative and quantitative knowledge. This study provides a new theoretical method for evaluating the safety of toppling rock slopes and can aid in the prediction, control, and even prevention of disasters.
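The backward cloud generator the study relies on has a standard closed form; the sketch below reimplements that textbook estimator (Ex from the sample mean, En from the first absolute moment, He from the variance surplus) on hypothetical monitoring readings.

```python
import numpy as np

def backward_cloud(samples: np.ndarray) -> tuple[float, float, float]:
    ex = samples.mean()                                      # expectation Ex
    en = np.sqrt(np.pi / 2.0) * np.abs(samples - ex).mean()  # entropy En
    he2 = samples.var(ddof=1) - en**2
    he = np.sqrt(max(he2, 0.0))                              # hyper-entropy He
    return ex, en, he

readings = np.random.normal(0.62, 0.05, 200)  # hypothetical normalized index
print(backward_cloud(readings))
```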
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 42175099, 42027804, and 42075073), the Innovative Project of Postgraduates in Jiangsu Province in 2023 (Grant No. KYCX23_1319), the National Natural Science Foundation of China (Grant No. 42205080), the Natural Science Foundation of Sichuan (Grant No. 2023YFS0442), the Research Fund of the Civil Aviation Flight University of China (Grant No. J2022-037), and the National Key Scientific and Technological Infrastructure project "Earth System Science Numerical Simulator Facility" (Earth Lab).
Abstract: The process of entrainment-mixing between cumulus clouds and the ambient air is important for the development of cumulus clouds. Accurately obtaining the entrainment rate (λ) is particularly important for its parameterization within the overall cumulus parameterization scheme. In this study, an improved bulk-plume method is proposed that solves the equations of two conserved variables simultaneously to calculate λ for cumulus clouds in a large-eddy simulation. The results demonstrate that the improved bulk-plume method is more reliable than the traditional one, because λ calculated by the improved method falls within the range of λ values obtained from the traditional method using different conserved variables. The probability density functions of λ for all data, for different times, and for different heights are well fitted by a log-normal distribution, which supports the stochastic entrainment process assumed in previous studies. Further analysis demonstrates that λ relates more closely to the vertical velocity than to other thermodynamic and dynamical properties; the vertical velocity is therefore recommended as the primary influencing factor for parameterizing λ in the future. The results enhance the theoretical understanding of λ and its influencing factors and shed new light on the development of λ parameterizations.
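For orientation, the traditional bulk-plume method estimates λ from a single conserved variable χ (for example, total water content or liquid-water potential temperature), whose in-cloud mean χ̄_c is diluted toward the environmental mean χ̄_e; in its standard form:

```latex
\frac{\partial \bar{\chi}_c}{\partial z} = -\lambda\,\bigl(\bar{\chi}_c - \bar{\chi}_e\bigr)
\quad\Longrightarrow\quad
\lambda = -\frac{1}{\bar{\chi}_c - \bar{\chi}_e}\,
          \frac{\partial \bar{\chi}_c}{\partial z}.
```

Different choices of χ generally give different λ values, which is the inconsistency the improved method removes by treating the equations for two conserved variables as a simultaneous system.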
Funding: Supporting Project number (RSP2023R34), King Saud University, Riyadh, Saudi Arabia.
Abstract: Some of the significant new technologies researched in recent studies include blockchain (BC), Software-Defined Networking (SDN), and the Smart Industrial Internet of Things (IIoT). All three provide data integrity and confidentiality in their respective use cases, especially in industrial fields. Additionally, cloud computing has been in use for several years now: confidential information is exchanged with cloud infrastructure to give clients access to distant resources, such as computing and storage activities in the IIoT. There are also significant security risks, concerns, and difficulties associated with cloud computing. To address these challenges, we propose merging BC and SDN into a cloud computing platform for the IIoT. This paper introduces "DistB-SDCloud", an architecture that enhances cloud security for smart IIoT applications. The proposed architecture uses a distributed BC method to provide security, secrecy, privacy, and integrity while remaining flexible and scalable. Customers in the industrial sector benefit from the dispersed, decentralized, and efficient environment of BC. We also describe an SDN method to improve the durability, stability, and load balancing of the cloud infrastructure. The efficacy of our SDN- and BC-based implementation was tested experimentally using various parameters, including throughput, packet analysis, response time, bandwidth, and latency, as well as the monitoring of several attacks on the system itself.
Funding: Supported in part by the National Natural Science Foundation of China (61876011), the National Key Research and Development Program of China (2022YFB4703700), the Key Research and Development Program 2020 of Guangzhou (202007050002), and the Key-Area Research and Development Program of Guangdong Province (2020B090921003).
Abstract: Recently, there have been some attempts to apply Transformers to 3D point cloud classification. To reduce computation, most existing methods focus on local spatial attention, but they ignore point content and fail to establish relationships between distant but relevant points. To overcome the limitation of local spatial attention, we propose a point content-based Transformer architecture, called PointConT for short. It exploits the locality of points in the feature space (content-based), clustering sampled points with similar features into the same class and computing self-attention within each class, thus enabling an effective trade-off between capturing long-range dependencies and computational complexity. We further introduce an inception feature aggregator for point cloud classification, which uses parallel structures to aggregate high-frequency and low-frequency information in each branch separately. Extensive experiments show that our PointConT model achieves remarkable performance on point cloud shape classification. In particular, our method exhibits 90.3% Top-1 accuracy on the hardest setting of ScanObjectNN. The source code of this paper is available at https://github.com/yahuiliu99/PointConT.
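The sketch below renders the content-based attention idea in PyTorch as a toy: points are ranked by a learned content score, the ranking is chunked into equal-size "classes" of similar features, and self-attention runs only within each chunk. This is an illustrative simplification (the real PointConT clusters in feature space per block), and the module and parameter names here are invented.

```python
import torch
import torch.nn as nn

class ContentAttention(nn.Module):
    def __init__(self, dim=64, n_clusters=8, heads=4):
        super().__init__()
        self.score = nn.Linear(dim, 1)     # proxy for feature-space grouping
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.n_clusters = n_clusters

    def forward(self, feats):              # feats: (B, N, C), N % n_clusters == 0
        B, N, C = feats.shape
        order = self.score(feats).squeeze(-1).argsort(dim=1)           # (B, N)
        gathered = torch.gather(feats, 1, order[..., None].expand(-1, -1, C))
        chunks = gathered.reshape(B * self.n_clusters, N // self.n_clusters, C)
        out, _ = self.attn(chunks, chunks, chunks)   # attention within each class
        out = out.reshape(B, N, C)
        return torch.empty_like(out).scatter_(
            1, order[..., None].expand(-1, -1, C), out)                # undo sort

x = torch.randn(2, 128, 64)
print(ContentAttention()(x).shape)          # torch.Size([2, 128, 64])
```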
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 41275147 and 41875173) and the STS Program of the Inner Mongolia Meteorological Service, Chongqing Institute of Green and Intelligent Technology, Chinese Academy of Sciences, and Institute of Atmospheric Physics, Chinese Academy of Sciences (Grant No. 2021CG0047).
Abstract: The shape parameter of the Gamma size distribution plays a key role in the evolution of the cloud droplet spectrum in bulk parameterization schemes. However, because the shape parameter is specified inaccurately in the commonly used double-moment bulk schemes, the cloud droplet spectra cannot be described reasonably during the condensation process. A newly developed triple-parameter condensation scheme, in which the shape parameter is diagnosed from the number concentration, cloud water content, and reflectivity factor of cloud droplets, can therefore be applied to improve the simulated evolution of the cloud droplet spectrum. The simulation with the new parameterization scheme was compared to those with a high-resolution Lagrangian bin scheme and with double-moment schemes in a parcel model, and to observations in a 1.5D Eulerian model consisting of two cylinders. The new scheme, with the shape parameter varying in time and space, accurately simulates the evolution of the cloud droplet spectrum. Furthermore, the volume-mean radius and cloud water content simulated with the new scheme match the Lagrangian analytical solutions well, and the errors are steady, within approximately 0.2%.
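To see why those three quantities determine the shape parameter, assume the standard gamma form for the droplet size distribution (an assumption here, since the abstract does not restate it). The number concentration, water content, and reflectivity factor are proportional to the 0th, 3rd, and 6th moments, and their dimensionless combination depends on the shape parameter μ alone:

```latex
n(D) = N_0\, D^{\mu} e^{-\Lambda D},
\qquad
M_k = \int_0^{\infty} D^k\, n(D)\,\mathrm{d}D
    = N_0\,\frac{\Gamma(\mu+k+1)}{\Lambda^{\mu+k+1}},
\qquad
\frac{M_6\, M_0}{M_3^{2}}
    = \frac{(\mu+6)(\mu+5)(\mu+4)}{(\mu+1)(\mu+2)(\mu+3)}.
```

The last ratio is independent of N₀ and Λ, so measuring the three moments lets μ be recovered numerically, which is the closure a triple-parameter scheme can exploit.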
Funding: Funded by China Railway No. 21 Bureau Group No. 1 Engineering Co., Ltd., Grant No. 202209140002.
Abstract: Rapid urbanization has led to a surge in the number of towering structures, and overturning construction is widely used because it better accommodates shaped structures such as those with variable sections. The complexity of the construction process gives the construction risk a certain randomness, so this paper proposes a cloud-based coupled matter-element model to address the ambiguity and randomness in the safety risk assessment of the overturning construction of towering structures. In the presented model, the digital eigenvalues of the cloud model replace the eigenvalues in the basic matter-element, and the cloud correlation of the risk assessment indices is calculated through the correlation algorithm of the cloud model to build the computational model. Meanwhile, an improved analytic hierarchy process based on the cloud model is used to determine the index weights. The comprehensive evaluation scores of the assessed event are then obtained through the weighted-average method, and the safety risk level is determined accordingly. The empirical analysis shows that (1) the improved cloud-model-based analytic hierarchy process can incorporate the data of multiple decision-makers into the weighting calculation, which makes the assessment results more credible; (2) the evaluation results of the cloud-based matter-element coupled model are basically consistent with those of the two other commonly used methods, and the confidence factor is less than 0.05, indicating that the method is reasonable and practical for the overturning of towering structures; and (3) the cloud-based matter-element coupled model, which confirms the reliability of the risk level by performing Spearman correlation on the comprehensive assessment scores, provides more comprehensive information than other methods and more completely reflects the fuzzy, uncertain relationships among the assessment indices, making the assessment results more realistic, scientific, and reliable.
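As a concrete sketch of the scoring step, with made-up grade clouds and weights: each index value is scored against every risk grade's normal cloud via the correlation degree, and the per-grade scores are combined by the weighted average. This follows the generic normal-cloud evaluation recipe rather than the paper's exact formulation.

```python
import numpy as np

def cloud_correlation(x, ex, en, he, n_drops=2000, rng=None):
    """Mean correlation degree of value x with the cloud (Ex, En, He)."""
    rng = rng or np.random.default_rng(0)
    en_prime = np.abs(rng.normal(en, he, n_drops)) + 1e-12  # randomized entropy
    return float(np.exp(-(x - ex) ** 2 / (2.0 * en_prime**2)).mean())

# Hypothetical setup: three indices, three risk grades (low/medium/high).
grade_clouds = [(0.3, 0.05, 0.01), (0.5, 0.05, 0.01), (0.7, 0.05, 0.01)]
weights = np.array([0.5, 0.3, 0.2])        # e.g., from the cloud-based AHP
values = np.array([0.42, 0.55, 0.61])      # normalized index values

corr = np.array([[cloud_correlation(v, *g) for g in grade_clouds]
                 for v in values])         # shape: (index, grade)
per_grade = weights @ corr                 # weighted-average score per grade
print("risk grade:", int(per_grade.argmax()))
```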
Abstract: In the present scenario of rapid growth in cloud computing models, several companies and users have started to share their data on cloud servers. However, when the model is not completely trusted, data owners face several security problems during data outsourcing, such as user privacy breaches, data disclosure, and data corruption. Several models have been proposed to address and handle these security issues in the cloud. With that concern, this paper develops a Privacy-Preserved Data Security Approach (PP-DSA) to provide data security and data integrity for outsourced data in the cloud environment. Privacy preservation is ensured with an Efficient Authentication Technique (EAT) based on the group signature method, applied together with a Third-Party Auditor (TPA). The role of the auditor is to secure the data and guarantee the integrity of shared data. Additionally, the Cloud Service Provider (CSP) and Data User (DU) can themselves be attackers, which the EAT also handles. The major objective of the work is to enhance cloud security and thereby increase Quality of Service (QoS). The evaluation, based on model effectiveness, security, and reliability, shows that the proposed model provides better results than existing works.
Funding: Supported by the Major Special Project of the Sichuan Science and Technology Department (2020YFG0460) and Central University Projects of China (ZYGX2020ZB020, ZYGX2020ZB019).
Abstract: To achieve high availability of health data in erasure-coded cloud storage systems, the data update performance of erasure coding should be continuously optimized. However, update performance is often bottlenecked by the constrained cross-rack bandwidth. Various techniques have been proposed in the literature to improve network bandwidth efficiency, including delta transmission, relay, and batch update. These techniques were largely proposed individually; in this work, we seek to use them jointly. To mitigate cross-rack update traffic, we propose DXR-DU, which builds on four valuable techniques: (i) delta transmission, (ii) XOR-based data update, (iii) relay, and (iv) batch update. Meanwhile, we offer two selective update approaches: 1) data-delta-based update and 2) parity-delta-based update. DXR-DU is evaluated via trace-driven local testbed experiments. Comprehensive experiments show that it significantly improves data update throughput while mitigating cross-rack update traffic.
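The parity-delta idea behind (ii) and the second selective approach can be shown in a few lines for a single-parity XOR code: only the delta (old block XOR new block) needs to cross racks, and the parity is patched in place without re-reading the other data blocks. The sketch below is a toy with one parity over three data blocks, not DXR-DU itself.

```python
def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

k = 3
blocks = [bytes([i] * 8) for i in range(k)]   # toy data blocks
parity = bytes(8)
for blk in blocks:
    parity = xor_bytes(parity, blk)           # initial parity = XOR of blocks

new_block = bytes([9] * 8)                    # update of block 1
delta = xor_bytes(blocks[1], new_block)       # delta = old XOR new
parity = xor_bytes(parity, delta)             # parity-delta update
blocks[1] = new_block

check = bytes(8)
for blk in blocks:
    check = xor_bytes(check, blk)
assert check == parity                        # parity stays consistent
```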