Radar bases and ground observation stations on the Tibetan Plateau are sparsely distributed and cannot support large-scale precipitation monitoring. To address this problem, U-Net, an advanced machine learning (ML) method, is used to develop a robust and rapid algorithm for precipitating-cloud detection based on the new-generation geostationary satellite FengYun-4A (FY-4A). First, the algorithm uses real-time multi-band infrared brightness temperatures from FY-4A, combined with Digital Elevation Model (DEM) data, as predictor variables. Second, feature efficiency was improved by replacing the serial connection of U-Net's traditional convolution layers with residual mapping. Third, to resolve the semantic gap that arises when low-level and high-level features are concatenated directly, dense skip pathways were used to reuse feature maps from different layers as inputs when concatenating feature layers from different network depths. Finally, given the characteristics of precipitation clouds, the pooling layers of U-Net were replaced by convolution operations to enable the detection of small precipitation clouds. Experiments show that the Pixel Accuracy (PA) and Mean Intersection over Union (MIoU) of the improved U-Net on the test set reach 0.916 and 0.928, respectively; the detection of precipitation clouds over Tibet is thus well realized.
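As an illustration only (the actual network is a full encoder-decoder trained on FY-4A imagery), the two architectural changes described above can be sketched in plain NumPy: a residual mapping in place of serially connected convolutions, and a strided convolution in place of pooling. The 8x8 single-channel patch and the kernel values are placeholders.

```python
import numpy as np

def conv2d_same(x, k):
    """Naive single-channel 'same' 2D convolution."""
    kh, kw = k.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((ph, ph), (pw, pw)))
    out = np.zeros_like(x, dtype=float)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * k)
    return out

def residual_block(x, k1, k2):
    """Residual mapping: ReLU(F(x) + x) instead of serially connected convs."""
    y = np.maximum(conv2d_same(x, k1), 0.0)   # conv + ReLU
    y = conv2d_same(y, k2)
    return np.maximum(y + x, 0.0)             # identity shortcut

def strided_conv_downsample(x, k, stride=2):
    """A stride-2 convolution replaces pooling, so downsampling stays learnable
    and responses from small precipitation clouds are not simply discarded."""
    return conv2d_same(x, k)[::stride, ::stride]

rng = np.random.default_rng(0)
x = rng.random((8, 8))            # toy brightness-temperature patch
k = np.full((3, 3), 1.0 / 9.0)    # placeholder 3x3 kernels
f = residual_block(x, k, k)
d = strided_conv_downsample(f, k)
print(f.shape, d.shape)           # (8, 8) (4, 4)
```

The shortcut keeps the input resolution, while the strided convolution halves it, mirroring how the modified encoder would downsample without a pooling layer.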
In this study we observed the microphysical properties, including the vertical and horizontal distributions of ice particles, liquid water content, and ice habit, in different regions of a slightly supercooled stratiform cloud behind a cold front on 9 September 2015 in North China. According to aircraft instrument and radar data, the cloud-top temperature was higher than -15℃. During the flight sampling, the high ice number concentration area was located in the supercooled part of a shallow convective cloud embedded in the stratiform cloud, where the ambient temperature was around -3℃. In this area, the maximum number concentrations of particles with diameters greater than 100 μm and 500 μm (N_(100) and N_(500)) exceeded 300 L^(-1) and 30 L^(-1), respectively, and were related to large supercooled water droplets with diameters greater than 24 μm derived from cloud-aerosol spectrometer probe measurements. The ice particle types in this region were predominantly columnar, needle, graupel, and some freezing drops, suggesting that the occurrence of high ice number concentrations was likely related to the Hallett-Mossop mechanism, although many other ice multiplication processes cannot be totally ruled out. The maximum ice number concentration obtained during the first penetration was around two to three orders of magnitude larger than that predicted by the DeMott and Fletcher schemes when assuming a cloud-top temperature of around -15℃. During the second penetration, conducted within the stratiform cloud, N_(100) and N_(500) decreased by a factor of five to ten, and columnar and needle-like crystals became very rare.
We introduce Quafu-Qcover,an open-source cloud-based software package developed for solving combinatorial optimization problems using quantum simulators and hardware backends.Quafu-Qcover provides a standardized and c...We introduce Quafu-Qcover,an open-source cloud-based software package developed for solving combinatorial optimization problems using quantum simulators and hardware backends.Quafu-Qcover provides a standardized and comprehensive workflow that utilizes the quantum approximate optimization algorithm(QAOA).It facilitates the automatic conversion of the original problem into a quadratic unconstrained binary optimization(QUBO)model and its corresponding Ising model,which can be subsequently transformed into a weight graph.The core of Qcover relies on a graph decomposition-based classical algorithm,which efficiently derives the optimal parameters for the shallow QAOA circuit.Quafu-Qcover incorporates a dedicated compiler capable of translating QAOA circuits into physical quantum circuits that can be executed on Quafu cloud quantum computers.Compared to a general-purpose compiler,our compiler demonstrates the ability to generate shorter circuit depths,while also exhibiting superior speed performance.Additionally,the Qcover compiler has the capability to dynamically create a library of qubits coupling substructures in real-time,utilizing the most recent calibration data from the superconducting quantum devices.This ensures that computational tasks can be assigned to connected physical qubits with the highest fidelity.The Quafu-Qcover allows us to retrieve quantum computing sampling results using a task ID at any time,enabling asynchronous processing.Moreover,it incorporates modules for results preprocessing and visualization,facilitating an intuitive display of solutions for combinatorial optimization problems.We hope that Quafu-Qcover can serve as an instructive illustration for how to explore application problems on the Quafu cloud quantum computers.展开更多
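The QUBO-to-Ising conversion mentioned above is a standard textbook mapping; the following sketch (not Quafu-Qcover's implementation) uses the usual substitution x_i = (1 + s_i)/2 and verifies on a toy 3-variable problem that both forms give identical energies.

```python
import itertools
import numpy as np

def qubo_to_ising(Q):
    """Convert E(x) = x^T Q x with binary x_i in {0,1} (Q upper-triangular:
    diagonal = linear terms, off-diagonal = couplings) into an Ising model
    E(s) = sum_i h_i s_i + sum_{i<j} J_ij s_i s_j + offset, s_i in {-1,+1},
    via the standard substitution x_i = (1 + s_i) / 2."""
    n = Q.shape[0]
    h, J, offset = np.zeros(n), np.zeros((n, n)), 0.0
    for i in range(n):
        h[i] += Q[i, i] / 2.0
        offset += Q[i, i] / 2.0
        for j in range(i + 1, n):
            J[i, j] = Q[i, j] / 4.0
            h[i] += Q[i, j] / 4.0
            h[j] += Q[i, j] / 4.0
            offset += Q[i, j] / 4.0
    return h, J, offset

# Toy 3-variable QUBO; check the two energy functions agree everywhere.
Q = np.triu(np.array([[1.0, -2.0, 0.5],
                      [0.0,  1.0, -1.0],
                      [0.0,  0.0,  0.3]]))
h, J, c = qubo_to_ising(Q)
for bits in itertools.product([0, 1], repeat=3):
    x = np.array(bits, dtype=float)
    s = 2 * x - 1
    assert abs(x @ Q @ x - (h @ s + s @ J @ s + c)) < 1e-9
print("QUBO and Ising energies agree on all assignments")
```

The Ising coefficients (h, J) then define the weighted graph on which the QAOA cost Hamiltonian operates.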
Rapid urbanization has led to a surge in the number of towering structures, and the overturning method is widely used because it can better accommodate the construction of shaped structures such as variable cross-sections. The complexity of the construction process makes construction risk somewhat random, so this paper proposes a cloud-based coupled matter-element model to address the ambiguity and randomness in the safety risk assessment of overturning construction of towering structures. In the presented model, the digital eigenvalues of the cloud model replace the eigenvalues in the basic matter-element, and the cloud correlation of the risk assessment metrics is calculated through the correlation algorithm of the cloud model to build the computational model. Meanwhile, an improved analytic hierarchy process based on the cloud model is used to determine the index weights. The comprehensive evaluation scores of the evaluated event are then obtained through the weighted average method, and the safety risk level is determined accordingly. Empirical analysis shows that: (1) the improved analytic hierarchy process based on the cloud model can incorporate the data of multiple decision-makers into the weight calculation, which makes the assessment results more credible; (2) the evaluation results of the cloud-based coupled matter-element model are basically consistent with those of two other commonly used methods, and the confidence factor is less than 0.05, indicating that the method is reasonable and practical for the overturning construction of towering structures; (3) the cloud-based coupled matter-element model, which confirms the reliability of the risk level by performing Spearman correlation on the comprehensive assessment scores, can provide more comprehensive information about instances than other methods and more comprehensively reflects the fuzzy uncertainty relationships between assessment indexes, which makes the assessment results more realistic, scientific, and reliable.
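The cloud correlation above is typically computed from the normal cloud model's certainty degree. A minimal sketch, assuming the standard (Ex, En, He) forward normal cloud formulation and entirely made-up grade parameters, might look like:

```python
import math
import random

def cloud_membership(x, Ex, En, He, n_drops=2000, seed=0):
    """Average certainty degree of value x w.r.t. a normal cloud (Ex, En, He):
    for each drop, perturb the entropy (En' ~ N(En, He^2)) and evaluate the
    Gaussian certainty exp(-(x - Ex)^2 / (2 En'^2)), then average the drops."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_drops):
        En_p = abs(rng.gauss(En, He)) or 1e-12  # guard against En' == 0
        total += math.exp(-((x - Ex) ** 2) / (2.0 * En_p ** 2))
    return total / n_drops

# Hypothetical "medium risk" grade cloud on a 0-100 scoring scale
Ex, En, He = 50.0, 10.0, 1.0
print(cloud_membership(50.0, Ex, En, He))  # exactly 1.0 at the expectation
print(cloud_membership(70.0, Ex, En, He))  # far smaller two entropies away
```

In a full assessment, one such membership is computed per risk grade per index, then combined with the index weights via the weighted average described above.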
Cultural relic line graphics serve as a crucial form of traditional artifact documentation; they are simple, intuitive products with a lower display cost than 3D models. Dimensionality reduction is undoubtedly necessary for line drawings. However, most existing methods for artifact drawing rely on the principles of orthographic projection, which cannot avoid angle occlusion and data overlap when the surface of a cultural relic is complex. Therefore, conformal mapping was introduced as a dimensionality reduction approach to compensate for the limitations of orthographic projection. Based on given criteria for assessing surface complexity, this paper proposes a three-dimensional feature guideline extraction method for complex cultural relic surfaces. A combined 2D-3D factor that measures the importance of points in describing surface features, the vertex weight, was designed. The selection threshold for feature guideline extraction was then determined based on the differences between the vertex weight and shape index distributions. Feasibility and stability were verified through experiments on real cultural relic surface data. The results demonstrate that the method addresses the challenges associated with the automatic generation of line drawings for complex surfaces. The extraction method and the obtained results will be useful for the drawing, display, and promotion of cultural relic line graphics.
Water resources are one of the key factors restricting the development of arid areas, and cloud water resources are an important part of water resources. The arid region of Central Asia is the core region of the current national Green Silk Road construction and is the largest arid region in the world. Based on ECMWF cloud cover data, this study analyzed the temporal and spatial characteristics of cloud properties in the arid regions of Central Asia between 1980 and 2019. Our findings show that: (1) In terms of spatial distribution, total cloud cover in the arid regions of Central Asia was low in the south and high in the north. High and medium cloud frequencies were higher in the south and lower in the north, while low cloud frequency was low in the south and high in the north. (2) In terms of time, the variations of cloud cover and cloud type frequency had obvious seasonal characteristics. From winter to spring, cloud cover increased, and the change in cloud type frequency increased. From spring to summer, cloud cover continued to increase and the change in cloud type frequency increased further. Cloud cover began to decrease from summer to autumn, and the change in cloud type frequency also decreased. (3) Generally, average total cloud cover decreased over most of Central Asia; high and medium cloud cover increased while low cloud cover decreased. This study provides a reference for the rational development of cloud resources in the region.
The cloud type product 2B-CLDCLASS-LIDAR, based on CloudSat and CALIPSO data from June 2006 to May 2017, is used to examine the temporal and spatial distribution characteristics and interannual variability of eight cloud types (high cloud, altostratus, altocumulus, stratus, stratocumulus, cumulus, nimbostratus, and deep convection) and three phases (ice, mixed, and water) in the Arctic. Possible reasons for the observed interannual variability are also discussed. The main conclusions are as follows: (1) More water clouds occur on the Atlantic side, and more ice clouds occur over continents. (2) The average spatial and seasonal distributions of cloud types show three patterns: high clouds and most cumuliform clouds are concentrated at low-latitude locations and peak in summer; altostratus and nimbostratus are concentrated over and around continents and are less abundant in summer; stratocumulus and stratus are concentrated near the inner Arctic and peak during spring and autumn. (3) Regionally averaged interannual frequencies of ice clouds and altostratus clouds decrease significantly, while those of water clouds, altocumulus, and cumulus clouds increase significantly. (4) Significant features of the linear trends of cloud frequencies are mainly located over ocean areas. (5) The monthly water cloud frequency anomalies are positively correlated with air temperature in most of the troposphere, while those for ice clouds are negatively correlated. (6) The decrease in altostratus clouds is associated with the weakening of the Arctic front due to Arctic warming, while increased water vapor transport into the Arctic and higher atmospheric instability lead to more cumulus and altocumulus clouds.
In Cloud Computing (CC), the Cloud Datacenter (DC) is a conglomerate of physical servers whose performance can be hindered by bottlenecks as CC services proliferate. The Cloud Service Broker (CSB), a key determinant of CC performance, orchestrates DC selection. Failure to route user requests to suitable DCs turns the CSB into a bottleneck and endangers service quality. To tackle this, an efficient CSB policy must be deployed, optimizing DC selection to meet stringent Quality-of-Service (QoS) demands. Although numerous CSB policies exist, their implementation grapples with challenges such as cost and availability. This article undertakes a holistic review of diverse CSB policies while surveying the difficulties confronting current policies. The foremost objective is to pinpoint research gaps and remedies to invigorate future policy development. Additionally, it clarifies the various DC selection methodologies employed in CC, benefiting practitioners and researchers alike. Through synthetic analysis, the article systematically assesses and compares a wide range of DC selection techniques, equipping decision-makers with a pragmatic framework for discerning the technique apt for their needs. In summary, this review underscores the paramount importance of adept CSB policies in DC selection and the imperative role of efficient CSB policies in optimizing CC performance. By emphasizing the significance of these policies and their modeling implications, the article contributes to both the general modeling discourse and its practical applications in the CC domain.
In the cloud environment, a high level of data security is in strong demand. Data planning storage optimization is part of the whole security process in the cloud environment; it enables data security by avoiding the risk of data loss and data overlapping. The development of data flow scheduling approaches that take security parameters into account in the cloud environment is insufficient. In our work, we propose a data scheduling model for the cloud environment. The model is made up of three parts that together help dispatch user data flows to the appropriate cloud VMs. The first component is the collector agent, which periodically collects information on the state of the network links. The second is the monitoring agent, which analyzes and classifies the state of each link, makes a decision on it, and transmits this information to the scheduler. The third is the scheduler, which considers the previous information to transfer user data, including fair distribution and reliable paths. Each part of the proposed model requires the development of its own algorithms. In this article, we are interested in the development of data transfer algorithms, including fair distribution with consideration of a stable link state. These algorithms are based on grouping of the transmitted files and an iterative method. The proposed algorithms obtain an approximate solution to the studied problem, which is NP-hard. The experimental results show that the best algorithm is the half-grouped minimum excluding (HME), with a percentage of 91.3%, an average deviation of 0.042, and an execution time of 0.001 s.
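The HME algorithm itself is not reproduced here; as an illustration of the general idea of grouping files and dispatching them fairly over links, the classic LPT greedy heuristic (a stand-in, not the authors' method) for this NP-hard assignment problem can be sketched as:

```python
import heapq

def greedy_fair_dispatch(file_sizes, link_speeds):
    """LPT-style greedy stand-in for grouped file dispatch: sort files
    largest-first and always send the next file over the link that will
    finish earliest, approximating a fair, makespan-minimising plan."""
    heap = [(0.0, i) for i in range(len(link_speeds))]  # (finish time, link)
    heapq.heapify(heap)
    plan = {i: [] for i in range(len(link_speeds))}
    for size in sorted(file_sizes, reverse=True):
        finish, link = heapq.heappop(heap)
        plan[link].append(size)
        heapq.heappush(heap, (finish + size / link_speeds[link], link))
    return plan, max(f for f, _ in heap)

# Six files over two equal-speed links: both links end up with 110 units
plan, makespan = greedy_fair_dispatch([70, 50, 40, 30, 20, 10], [1.0, 1.0])
print(plan, makespan)  # {0: [70, 30, 10], 1: [50, 40, 20]} 110.0
```

A real scheduler would additionally weight each link by the stability class reported by the monitoring agent, e.g. by shrinking the effective speed of unstable links.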
As the extensive use of cloud computing raises questions about the security of personal data stored there, cryptography is being used more frequently as a security tool to protect data confidentiality and privacy in the cloud environment. A hypervisor is virtualization software used in cloud hosting to divide and allocate resources across various pieces of hardware. The choice of hypervisor can significantly impact the performance of cryptographic operations in the cloud environment. An important issue that must be carefully examined is that no hypervisor is completely superior in terms of performance; each hypervisor should be examined against specific needs. The main objective of this study is to provide accurate results comparing the performance of Hyper-V and Kernel-based Virtual Machine (KVM) when implementing different cryptographic algorithms, to guide cloud service providers and end users in choosing the most suitable hypervisor for their cryptographic needs. This study evaluated the efficiency of the two hypervisors in implementing six cryptographic algorithms: Rivest-Shamir-Adleman (RSA), Advanced Encryption Standard (AES), Triple Data Encryption Standard (TripleDES), Carlisle Adams and Stafford Tavares (CAST-128), Blowfish, and Twofish. The findings show that KVM outperforms Hyper-V, with 12.2% less Central Processing Unit (CPU) use and 12.95% less time overall for encryption and decryption operations across various file sizes. The findings emphasize how crucial it is to pick a hypervisor appropriate for cryptographic needs in a cloud environment, which could assist both cloud service providers and end users. Future research may focus further on how various hypervisors perform when handling cryptographic workloads.
Environmental conditions can change markedly over geographical distances along elevation gradients, making them natural laboratories to study the processes that structure communities. This work aimed to assess the influence of elevation on Tropical Montane Cloud Forest plant communities in the Brazilian Atlantic Forest, a historically neglected ecoregion. We evaluated the phylogenetic structure, forest structure (tree basal area and tree density), and species richness along an elevation gradient, as well as the evolutionary fingerprints of elevation success on phylogenetic lineages from the tree communities. To do so, we assessed nine communities along an elevation gradient from 1210 to 2310 m a.s.l. without large elevation gaps. The relationships between elevation and phylogenetic structure, forest structure, and species richness were investigated through linear models. The occurrence of evolutionary fingerprints on phylogenetic lineages was investigated by quantifying the phylogenetic signal of elevation success using a genus-level molecular phylogeny. Our results showed decreased species richness at higher elevations and independence between forest structure, phylogenetic structure, and elevation. We also verified that there is a phylogenetic signal associated with elevation success by lineages. We conclude that elevation is associated with species richness and the occurrence of phylogenetic lineages in the tree communities evaluated in the Mantiqueira Range. On the other hand, elevation is not associated with forest structure or phylogenetic structure. Furthermore, closely related taxa tend to reach their highest ecological success at similar elevations. Finally, we highlight the fragility of the tropical montane cloud forests in the Mantiqueira Range in the face of environmental changes (i.e., global warming) due to the occurrence of exclusive phylogenetic lineages evolutionarily adapted to the environmental conditions (e.g., minimum temperature) associated with each elevation range.
Data security assurance is crucial due to the increasing prevalence of cloud computing and its widespread use across different industries, especially in light of the growing number of cybersecurity threats. A major and ever-present threat is Ransomware-as-a-Service (RaaS) attacks, which enable even individuals with minimal technical knowledge to conduct ransomware operations. This study provides a new approach for RaaS attack detection that uses an ensemble of deep learning models. For this purpose, the network intrusion detection dataset "UNSW-NB15" from the Intelligent Security Group of the University of New South Wales, Australia, is analyzed. In the initial phase, three separate Multi-Layer Perceptron (MLP) models based on the rectified linear unit, scaled exponential linear unit, and exponential linear unit activations are developed. Then, using the combined predictive power of these three MLPs, the RansoDetect Fusion ensemble model is introduced. The proposed ensemble technique outperforms previous studies, with impressive performance metrics including 98.79% accuracy and recall, 98.85% precision, and a 98.80% F1-score. The empirical results validate the ensemble model's ability to improve cybersecurity defenses by showing that it outperforms the individual MLP models. In expanding the field of cybersecurity strategy, this research highlights the significance of combined deep learning models in strengthening intrusion detection systems against sophisticated cyber threats.
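A toy sketch of the fusion idea follows, with untrained placeholder weights standing in for the three trained MLPs; soft voting (averaging class probabilities) is assumed here, and the paper's exact fusion rule may differ.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def elu(z, a=1.0):
    return np.where(z > 0, z, a * (np.exp(z) - 1.0))

def selu(z, lam=1.0507, a=1.67326):
    return lam * np.where(z > 0, z, a * (np.exp(z) - 1.0))

def mlp_forward(x, W1, W2, act):
    """One-hidden-layer MLP returning softmax class probabilities."""
    logits = act(x @ W1) @ W2
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 10))                       # 4 flows, 10 toy features
W1, W2 = rng.normal(size=(10, 16)), rng.normal(size=(16, 2))

# Soft-voting fusion of the ReLU, SELU and ELU variants
probs = [mlp_forward(x, W1, W2, act) for act in (relu, selu, elu)]
fused = np.mean(probs, axis=0)                     # ensemble probabilities
labels = fused.argmax(axis=1)                      # 0 = benign, 1 = RaaS (toy)
print(fused.shape, labels.shape)                   # (4, 2) (4,)
```

The activation functions produce decorrelated decision boundaries from the same architecture, which is what the averaging step exploits.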
Recently, there have been some attempts to apply Transformers to 3D point cloud classification. To reduce computation, most existing methods focus on local spatial attention, but they ignore point content and fail to establish relationships between distant but relevant points. To overcome this limitation of local spatial attention, we propose a point content-based Transformer architecture, called PointConT for short. It exploits the locality of points in the feature space (content-based), clustering sampled points with similar features into the same class and computing self-attention within each class, thus enabling an effective trade-off between capturing long-range dependencies and computational complexity. We further introduce an Inception feature aggregator for point cloud classification, which uses parallel structures to aggregate high-frequency and low-frequency information in separate branches. Extensive experiments show that our PointConT model achieves remarkable performance on point cloud shape classification. In particular, our method exhibits 90.3% Top-1 accuracy on the hardest setting of ScanObjectNN. The source code is available at https://github.com/yahuiliu99/PointConT.
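The content-based attention idea can be sketched as follows. This is a simplification, not the PointConT implementation: the clustering uses a few naive k-means steps, and queries, keys, and values are all the raw per-point features.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def content_based_attention(feats, n_clusters=4, seed=0):
    """Cluster points by feature similarity (content), then run self-attention
    only inside each cluster, so distant-but-similar points can still attend
    to one another while the quadratic cost is paid per cluster, not globally."""
    rng = np.random.default_rng(seed)
    n, d = feats.shape
    centers = feats[rng.choice(n, n_clusters, replace=False)]
    for _ in range(5):  # a few naive k-means steps in feature space
        labels = np.argmin(((feats[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for c in range(n_clusters):
            if np.any(labels == c):
                centers[c] = feats[labels == c].mean(axis=0)
    out = np.zeros_like(feats)
    for c in range(n_clusters):
        idx = np.where(labels == c)[0]
        if idx.size == 0:
            continue
        F = feats[idx]
        out[idx] = softmax(F @ F.T / np.sqrt(d)) @ F  # Q = K = V = F for brevity
    return out

pts = np.random.default_rng(1).normal(size=(64, 8))  # toy per-point features
out = content_based_attention(pts)
print(out.shape)  # (64, 8)
```

With cluster sizes of roughly n/k, the attention cost drops from O(n^2) to about O(n^2/k), which is the trade-off the abstract describes.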
Access control is an effective method to protect user data privacy. Access control schemes based on blockchain and ciphertext-policy attribute-based encryption (CP-ABE) can solve the problems of a single point of failure and lack of trust in centralized systems. However, they also bring new problems for health information in the cloud storage environment, such as attribute leakage, low consensus efficiency, and complex permission updates. This paper proposes an access control scheme with fine-grained attribute revocation, keyword search, and traceability of the attribute private key distribution process. Blockchain technology tracks the authorization of attribute private keys, and a credit scoring method improves the consensus efficiency of the Raft protocol. Besides, the InterPlanetary File System (IPFS) addresses the capacity deficit of blockchain. Under the premise of hiding the policy, the scheme provides fine-grained access control based on users, user attributes, and file structure, optimizing the data-sharing mode. At the same time, proxy re-encryption (PRE) technology is used to update access rights. The proposed scheme is proved secure. Comparative analysis and experimental results show that the proposed scheme offers higher efficiency and more functionality, and can meet the needs of medical institutions.
This paper focuses on the task of few-shot 3D point cloud semantic segmentation. Despite some progress, this task still encounters many issues due to the insufficient samples given, e.g., incomplete object segmentation and inaccurate semantic discrimination. To tackle these issues, we first leverage part-whole relationships in 3D point cloud semantic segmentation to capture semantic integrity, empowered by dynamic capsule routing within a 3D Capsule Network (CapsNet) module in the embedding network. Concretely, the dynamic routing amalgamates geometric information of the 3D point cloud data to construct higher-level feature representations, which capture the relationships between object parts and their wholes. Secondly, we design a multi-prototype enhancement module to enhance prototype discriminability. Specifically, the single-prototype enhancement mechanism is expanded to a multi-prototype version to capture rich semantics, and the shot correlation within each category is calculated via the interaction of different samples to enhance intra-category similarity. Ablation studies prove that the part-whole relations and the proposed multi-prototype enhancement module help to achieve complete object segmentation and improve semantic discrimination. Moreover, with these two modules integrated, quantitative and qualitative experiments on two public benchmarks, S3DIS and ScanNet, indicate the superior performance of the proposed framework on few-shot 3D point cloud semantic segmentation compared with some state-of-the-art methods.
Redundancy elimination techniques are extensively investigated to reduce storage overheads in cloud-assisted health systems. Deduplication eliminates the redundancy of duplicate blocks by storing one physical instance referenced by multiple duplicates. Delta compression is usually regarded as a complementary technique that further removes the redundancy of similar blocks, but our observations indicate that this does not hold when data contain few duplicate blocks. In addition, many overlapped deltas arise in the resemblance detection process of post-deduplication delta compression, which hinders the efficiency of delta compression, and the index phase of resemblance detection queries abundant non-similar blocks, resulting in inefficient system throughput. Therefore, a multi-feature-based redundancy elimination scheme, called MFRE, is proposed to solve these problems. A similarity feature and a temporal locality feature are exploited to assist redundancy elimination, where the similarity feature well expresses the duplicate attribute. Then, similarity-based dynamic post-deduplication delta compression and temporal-locality-based dynamic delta compression discover more similar base blocks to minimise overlapped deltas and improve compression ratios. Moreover, a clustering method based on block relationships and a feature index strategy based on Bloom filters reduce I/O overheads and improve system throughput. Experiments demonstrate that the proposed method, compared with the state-of-the-art method, improves the compression ratio and system throughput by 9.68% and 50%, respectively.
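The two-stage pipeline of deduplication followed by similarity-driven delta compression can be sketched with stdlib tools. The SHA-256 fingerprint, the first-8-bytes "feature", and the XOR delta below are simplifications standing in for MFRE's real fingerprints, similarity features, and delta encoder.

```python
import hashlib
import zlib

def store(blocks):
    """Stage 1: deduplicate exact duplicates by SHA-256 fingerprint.
    Stage 2: delta-compress a non-duplicate block against a similar base
    when one exists (same toy 8-byte feature stands in for resemblance
    detection); otherwise store it self-compressed."""
    layout, seen, bases = {}, {}, {}
    for i, blk in enumerate(blocks):
        fp = hashlib.sha256(blk).hexdigest()
        if fp in seen:                          # exact duplicate: a reference
            layout[i] = ('dup', seen[fp])
            continue
        seen[fp] = i
        feature = blk[:8]                       # toy similarity feature
        if feature in bases:                    # similar block: store a delta
            base = blocks[bases[feature]]
            delta = zlib.compress(bytes(a ^ b for a, b in zip(blk, base)))
            layout[i] = ('delta', bases[feature], delta)
        else:                                   # new base: self-compressed
            bases[feature] = i
            layout[i] = ('self', zlib.compress(blk))
    return layout

data = [b'headerAAApayload1' * 10, b'headerAAApayload1' * 10,
        b'headerAAApayload2' * 10, b'other-content-....' * 10]
layout = store(data)
print([kind for kind, *_ in layout.values()])  # ['self', 'dup', 'delta', 'self']
```

The abstract's observation corresponds to the case where almost every block takes the 'self' or 'delta' path: with few 'dup' hits, the deduplication stage alone saves little, so the similarity stage has to carry the compression ratio.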
Cloud computing has emerged as a viable alternative to traditional computing infrastructures, offering various benefits. However, the adoption of cloud storage poses significant risks to data secrecy and integrity. This article presents an effective mechanism to preserve the secrecy and integrity of data stored on the public cloud by leveraging blockchain technology, smart contracts, and cryptographic primitives. The proposed approach utilizes a Solidity-based smart contract as an auditor for maintaining and verifying the integrity of outsourced data. To preserve data secrecy, symmetric encryption is employed to encrypt user data before outsourcing it. An extensive performance analysis is conducted to illustrate the efficiency of the proposed mechanism. Additionally, a rigorous assessment is conducted to ensure that the developed smart contract is free from vulnerabilities and to measure its associated running costs. The security analysis confirms that our approach can securely maintain the confidentiality and integrity of cloud storage, even in the presence of malicious entities. The proposed mechanism contributes to enhancing data security in cloud computing environments and can serve as a foundation for developing more secure cloud storage systems.
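The auditor pattern described above can be sketched in Python, with a plain dictionary standing in for the Solidity contract's on-chain storage and a toy XOR cipher standing in for the symmetric encryption; both are assumptions for illustration, not the paper's implementation.

```python
import hashlib
import secrets

class AuditorContract:
    """Stand-in for the Solidity auditor: stores one digest per file ID and
    verifies later proofs against it."""
    def __init__(self):
        self._digests = {}

    def register(self, file_id, digest):
        self._digests[file_id] = digest

    def verify(self, file_id, digest):
        return self._digests.get(file_id) == digest

def xor_encrypt(data, key):
    """Toy symmetric cipher (repeating-key XOR) standing in for e.g. AES:
    the point is only that the cloud ever sees ciphertext, not plaintext."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = secrets.token_bytes(32)                     # stays with the data owner
plaintext = b'patient-record: ...'
ciphertext = xor_encrypt(plaintext, key)          # this is what gets outsourced
contract = AuditorContract()
contract.register('file-1', hashlib.sha256(ciphertext).hexdigest())

# Later: retrieve from the cloud and audit the digest before decrypting
retrieved = ciphertext
assert contract.verify('file-1', hashlib.sha256(retrieved).hexdigest())
tampered = ciphertext[:-1] + bytes([ciphertext[-1] ^ 1])
print(contract.verify('file-1', hashlib.sha256(tampered).hexdigest()))  # False
```

Because the digest lives in (simulated) append-only contract state rather than with the cloud provider, a malicious provider cannot both alter the ciphertext and forge a matching proof.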
Cavitation is a prevalent phenomenon in ship and ocean engineering, predominantly occurring in the tail flow fields of high-speed rotating propellers and on the surfaces of high-speed underwater vehicles. The re-entrant jet and the compression wave resulting from the collapse of cavity vapour are pivotal factors contributing to cavity instability, and these phenomena significantly modulate the evolution of cavitating flow. In this paper, numerical investigations of cloud cavitation over a Clark-Y hydrofoil were conducted using the Large Eddy Simulation (LES) turbulence model and the Volume of Fluid (VOF) method within the OpenFOAM framework, and results obtained at different angles of attack are compared. A discernible increase in cavity thickness is observed with increasing angle of attack, alongside a progressive intensification of the pressure at the leading edge of the hydrofoil, which contributes to the suction force. These results can serve as a fundamental reference for a deeper understanding of cloud cavitation dynamics.
Funding: The authors would like to acknowledge the financial support from the National Science Foundation of China (Grant No. 41875027).
Abstract: Aiming at the problem that radar bases and ground observation stations on the Tibetan Plateau are sparsely distributed and cannot achieve large-scale precipitation monitoring, U-Net, an advanced machine learning (ML) method, is used to develop a robust and rapid algorithm for precipitating cloud detection based on the new-generation geostationary satellite FengYun-4A (FY-4A). First, the real-time multi-band infrared brightness temperature from FY-4A combined with Digital Elevation Model (DEM) data is used as the predictor variables of the model. Second, feature efficiency is improved by changing the traditional serial connection of U-Net convolution layers to residual mapping. Then, to resolve the semantic differences that arise when low-level and high-level features are directly concatenated, dense skip pathways are used to reuse feature maps of different layers as inputs when concatenating feature layers from different depths. Finally, according to the characteristics of precipitation clouds, the pooling layers of U-Net are replaced by convolution operations to realize the detection of small precipitation clouds. Experiments show that the Pixel Accuracy (PA) and Mean Intersection over Union (MIoU) of the improved U-Net on the test set reach 0.916 and 0.928, respectively, so the detection of precipitation clouds over Tibet is well realized.
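The two reported scores follow from a class confusion matrix. The sketch below shows the standard definitions of PA and MIoU on a toy binary precipitation mask; the mask values are invented for illustration and are unrelated to the paper's data.

```python
def confusion(y_true, y_pred, n_classes):
    """Build an n_classes x n_classes confusion matrix from flat label lists."""
    m = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        m[t][p] += 1
    return m

def pixel_accuracy(m):
    """PA: correctly labelled pixels over all pixels."""
    correct = sum(m[i][i] for i in range(len(m)))
    total = sum(sum(row) for row in m)
    return correct / total

def mean_iou(m):
    """MIoU: average over classes of TP / (TP + FP + FN)."""
    n = len(m)
    ious = []
    for i in range(n):
        tp = m[i][i]
        fp = sum(m[j][i] for j in range(n)) - tp
        fn = sum(m[i]) - tp
        denom = tp + fp + fn
        if denom:
            ious.append(tp / denom)
    return sum(ious) / len(ious)

# Toy binary mask: 1 = precipitating cloud, 0 = background.
true_mask = [0, 0, 1, 1, 1, 0, 1, 0]
pred_mask = [0, 1, 1, 1, 0, 0, 1, 0]
m = confusion(true_mask, pred_mask, 2)
pa = pixel_accuracy(m)
miou = mean_iou(m)
```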
Funding: Jointly supported by the National Natural Science Foundation of China (Grant Nos. 41475028 and 41405128) and the "Strategic Priority Research Program" of the Chinese Academy of Sciences (Grant No. XDA05100304).
Abstract: In this study we observed the microphysical properties, including the vertical and horizontal distributions of ice particles, liquid water content and ice habit, in different regions of a slightly supercooled stratiform cloud. Using aircraft instrument and radar data, the cloud top temperature was recorded as higher than -15°C, behind a cold front, on 9 September 2015 in North China. During the flight sampling, the high ice number concentration area was located in the supercooled part of a shallow convective cloud embedded in a stratiform cloud, where the ambient temperature was around -3°C. In this area, the maximum number concentrations of particles with diameter greater than 100 μm and 500 μm (N_(100) and N_(500)) exceeded 300 L^(-1) and 30 L^(-1), respectively, and were related to large supercooled water droplets with diameter greater than 24 μm derived from cloud-aerosol spectrometer probe measurements. The ice particle types in this region were predominantly columnar, needle, graupel, and some freezing drops, suggesting that the occurrence of high ice number concentrations was likely related to the Hallett-Mossop mechanism, although many other ice multiplication processes cannot be totally ruled out. The maximum ice number concentration obtained during the first penetration was around two to three orders of magnitude larger than that predicted by the DeMott and Fletcher schemes when assuming the cloud top temperature was around -15°C. During the second penetration conducted within the stratiform cloud, N_(100) and N_(500) decreased by a factor of five to ten, and the presence of columnar and needle-like crystals became very rare.
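Of the two schemes mentioned, the Fletcher (1962) parameterization is commonly written N(T) = A·exp(b·ΔT), with ΔT the supercooling. The constants below (A ≈ 1e-5 L^-1, b ≈ 0.6 per °C) are the widely quoted ones and should be treated as assumptions rather than the exact values used in the paper's comparison.

```python
import math

def fletcher_in_per_litre(temp_c, a=1e-5, b=0.6):
    """Fletcher-type ice nucleus concentration (L^-1) at temperature temp_c (deg C).
    a and b are the commonly quoted constants; treat them as assumptions."""
    supercooling = -temp_c  # degrees below 0 C
    return a * math.exp(b * supercooling)

n_at_cloud_top = fletcher_in_per_litre(-15.0)  # roughly 0.08 L^-1
```

Observed concentrations of hundreds per litre dwarf this prediction, which is the gap the abstract attributes to secondary ice production.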
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 92365206 and 12247168) and the China Postdoctoral Science Foundation (Certificate Numbers 2023M740272 and 2022TQ0036).
Abstract: We introduce Quafu-Qcover, an open-source cloud-based software package developed for solving combinatorial optimization problems using quantum simulators and hardware backends. Quafu-Qcover provides a standardized and comprehensive workflow that utilizes the quantum approximate optimization algorithm (QAOA). It facilitates the automatic conversion of the original problem into a quadratic unconstrained binary optimization (QUBO) model and its corresponding Ising model, which can subsequently be transformed into a weight graph. The core of Qcover relies on a graph-decomposition-based classical algorithm, which efficiently derives the optimal parameters for the shallow QAOA circuit. Quafu-Qcover incorporates a dedicated compiler capable of translating QAOA circuits into physical quantum circuits that can be executed on Quafu cloud quantum computers. Compared to a general-purpose compiler, our compiler generates shorter circuit depths while also exhibiting superior speed. Additionally, the Qcover compiler can dynamically create a library of qubit coupling substructures in real time, utilizing the most recent calibration data from the superconducting quantum devices, ensuring that computational tasks are assigned to connected physical qubits with the highest fidelity. Quafu-Qcover allows us to retrieve quantum computing sampling results using a task ID at any time, enabling asynchronous processing. Moreover, it incorporates modules for results preprocessing and visualization, facilitating an intuitive display of solutions to combinatorial optimization problems. We hope that Quafu-Qcover can serve as an instructive illustration of how to explore application problems on the Quafu cloud quantum computers.
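The QUBO-to-Ising conversion the package automates is a fixed change of variables, x_i = (1 + s_i)/2 with s_i in {-1, +1}. The sketch below derives the Ising fields h, couplings J and constant offset for an arbitrary QUBO matrix and brute-force checks that both energy functions agree; this is the generic textbook mapping, not Quafu-Qcover's API.

```python
from itertools import product

def qubo_to_ising(Q):
    """Map E(x) = sum_ij Q[i][j] x_i x_j (x in {0,1}) to (h, J, offset)
    for spins s in {-1,+1}, using x_i = (1 + s_i) / 2 and x_i^2 = x_i."""
    n = len(Q)
    offset = sum(Q[i][i] for i in range(n)) / 2 \
        + sum(Q[i][j] for i in range(n) for j in range(n) if i != j) / 4
    h = [Q[i][i] / 2 + sum((Q[i][j] + Q[j][i]) / 4 for j in range(n) if j != i)
         for i in range(n)]
    J = {(i, j): (Q[i][j] + Q[j][i]) / 4
         for i in range(n) for j in range(i + 1, n)}
    return h, J, offset

def qubo_energy(Q, x):
    return sum(Q[i][j] * x[i] * x[j] for i in range(len(Q)) for j in range(len(Q)))

def ising_energy(h, J, offset, s):
    return offset + sum(hi * si for hi, si in zip(h, s)) \
        + sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())

# Toy 3-variable QUBO (upper-triangular convention).
Q = [[-1.0, 2.0, 0.0], [0.0, -1.0, 2.0], [0.0, 0.0, -1.0]]
h, J, offset = qubo_to_ising(Q)
# The two energies agree on every assignment under x = (1 + s) / 2.
max_err = max(abs(qubo_energy(Q, [(1 + si) // 2 for si in s])
                  - ising_energy(h, J, offset, s))
              for s in product([-1, 1], repeat=3))
```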
Funding: Funded by China Railway No. 21 Bureau Group No. 1 Engineering Co., Ltd., Grant No. 202209140002.
Abstract: Rapid urbanization has led to a surge in the number of towering structures, and overturning construction is widely used because it can better accommodate the construction of shaped structures such as variable sections. The complexity of the construction process gives the construction risk a certain randomness, so this paper proposes a cloud-based coupled matter-element model to address the ambiguity and randomness in the safety risk assessment of overturning construction of towering structures. In the presented model, the digital eigenvalues of the cloud model replace the eigenvalues in the basic matter-element, and the cloud correlation of the risk assessment metrics is calculated through the correlation algorithm of the cloud model to build the computational model. Meanwhile, an improved analytic hierarchy process based on the cloud model is used to determine the weights of the indexes. The comprehensive evaluation scores of the evaluated event are then obtained through the weighted average method, and the safety risk level is determined accordingly. Through empirical analysis: (1) the improved analytic hierarchy process based on the cloud model can incorporate the data of multiple decision-makers into the weighting formula, which makes the assessment results more credible; (2) the evaluation results of the cloud-based matter-element coupled model are basically consistent with those of the other two commonly used methods, and the confidence factor is less than 0.05, indicating that the method is reasonable and practical for towering structure overturning; (3) the cloud-based coupled matter-element model, which confirms the reliability of the risk level by performing Spearman correlation on the comprehensive assessment scores, can provide more comprehensive information about instances than other methods and more completely reflects the fuzzy uncertainty relationships between assessment indexes, which makes the assessment results more realistic, scientific and reliable.
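The forward normal cloud generator that underlies such cloud-model assessments can be sketched in a few lines. `ex`, `en`, `he` are the standard expectation, entropy and hyper-entropy digital eigenvalues; the concrete values below are illustrative only, not taken from the paper.

```python
import math
import random

def forward_cloud(ex, en, he, n_drops, seed=42):
    """Forward normal cloud generator: each drop is (x, mu) where
    En' ~ N(En, He^2), x ~ N(Ex, En'^2), mu = exp(-(x - Ex)^2 / (2 En'^2))."""
    rng = random.Random(seed)
    drops = []
    for _ in range(n_drops):
        en_prime = rng.gauss(en, he)
        if en_prime == 0:
            continue  # degenerate draw: no spread, skip
        x = rng.gauss(ex, abs(en_prime))
        mu = math.exp(-(x - ex) ** 2 / (2 * en_prime ** 2))
        drops.append((x, mu))
    return drops

# Illustrative eigenvalues for one risk index.
drops = forward_cloud(ex=5.0, en=1.0, he=0.1, n_drops=500)
```

Each membership degree `mu` lies in (0, 1], which is what the correlation algorithm then aggregates per index.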
Funding: National Natural Science Foundation of China (Nos. 42071444, 42101444).
Abstract: The cultural relic line graphic is a crucial form of traditional artifact documentation: compared with 3D models, it is a simple, intuitive and low-cost display product. Dimensionality reduction is undoubtedly necessary for line drawings. However, most existing methods for artifact drawing rely on the principles of orthographic projection, which cannot avoid angle occlusion and data overlapping when the surface of the cultural relic is complex. Therefore, conformal mapping was introduced as a dimensionality-reduction approach to compensate for the limitations of orthographic projection. Based on given criteria for assessing surface complexity, this paper proposes a three-dimensional feature guideline extraction method for complex cultural relic surfaces. A combined 2D-3D factor measuring the importance of points in describing surface features, the vertex weight, was designed. The selection threshold for feature guideline extraction was then determined from the differences between the vertex weight and shape index distributions. Feasibility and stability were verified through experiments on real cultural relic surface data. The results demonstrate the ability of the method to address the challenges associated with the automatic generation of line drawings for complex surfaces. The extraction method and the obtained results will be useful for the drawing, display and promotion of cultural relic line graphics.
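The shape index that the threshold selection compares against is, in most surface-analysis work, the Koenderink measure computed from the two principal curvatures. The sketch below uses that commonly quoted formula as an assumption; the paper's exact definition may differ.

```python
import math

def shape_index(k1, k2):
    """Koenderink-style shape index from principal curvatures:
    S = (2/pi) * atan((k1 + k2) / (k1 - k2)) with k1 >= k2, in [-1, 1]."""
    if k1 < k2:
        k1, k2 = k2, k1
    if k1 == k2:  # umbilical point: spherical cap (or plane when both are 0)
        return 1.0 if k1 > 0 else -1.0 if k1 < 0 else 0.0
    return (2 / math.pi) * math.atan((k1 + k2) / (k1 - k2))

ridge = shape_index(1.0, 0.0)    # cylindrical ridge
saddle = shape_index(1.0, -1.0)  # symmetric saddle
```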
Funding: Financially supported by the National Natural Science Foundation of China (41867030, 41971036) and the innovation research group science foundation of the National Natural Science Foundation of China (41421061).
Abstract: Water resources are one of the key factors restricting the development of arid areas, and cloud water resources are an important part of water resources. The arid region of Central Asia is the core region of the current national Green Silk Road construction and is the largest arid region in the world. Based on ECMWF cloud cover data, the current study analyzed the temporal and spatial characteristics of cloud properties in the arid regions of Central Asia between 1980 and 2019. Our findings show that: (1) In terms of spatial distribution, total cloud cover in the arid regions of Central Asia was low in the south and high in the north. High cloud frequency and medium cloud frequency were higher in the south and lower in the north, while low cloud frequency was low in the south and high in the north. (2) In terms of time, the variations of cloud cover and cloud type frequency had obvious seasonal characteristics. From winter to spring, cloud cover increased and the change in cloud type frequency increased. From spring to summer, cloud cover continued to increase and the change in cloud type frequency increased further. Cloud cover began to decrease from summer to autumn, and the change in cloud type frequency also decreased. (3) Generally, average total cloud cover decreased in most of Central Asia; high and medium cloud cover increased while low cloud cover decreased. This study provides a reference for the rational development of cloud water resources in the region.
Funding: Supported in part by the National Natural Science Foundation of China (Grant No. 42105127), the Special Research Assistant Project of the Chinese Academy of Sciences, and the National Key Research and Development Plans of China (Grant Nos. 2019YFC1510304 and 2016YFE0201900-02).
Abstract: The cloud type product 2B-CLDCLASS-LIDAR based on CloudSat and CALIPSO from June 2006 to May 2017 is used to examine the temporal and spatial distribution characteristics and interannual variability of eight cloud types (high cloud, altostratus, altocumulus, stratus, stratocumulus, cumulus, nimbostratus, and deep convection) and three phases (ice, mixed, and water) in the Arctic. Possible reasons for the observed interannual variability are also discussed. The main conclusions are as follows: (1) More water clouds occur on the Atlantic side, and more ice clouds occur over continents. (2) The average spatial and seasonal distributions of cloud types show three patterns: high clouds and most cumuliform clouds are concentrated in low-latitude locations and peak in summer; altostratus and nimbostratus are concentrated over and around continents and are less abundant in summer; stratocumulus and stratus are concentrated near the inner Arctic and peak during spring and autumn. (3) Regionally averaged interannual frequencies of ice clouds and altostratus clouds significantly decrease, while those of water clouds, altocumulus, and cumulus clouds increase significantly. (4) Significant features of the linear trends of cloud frequencies are mainly located over ocean areas. (5) The monthly water cloud frequency anomalies are positively correlated with air temperature in most of the troposphere, while those for ice clouds are negatively correlated. (6) The decrease in altostratus clouds is associated with the weakening of the Arctic front due to Arctic warming, while increased water vapor transport into the Arctic and higher atmospheric instability lead to more cumulus and altocumulus clouds.
Abstract: Amid the landscape of Cloud Computing (CC), the Cloud Datacenter (DC) stands as a conglomerate of physical servers, whose performance can be hindered by bottlenecks within the realm of proliferating CC services. A linchpin in CC's performance, the Cloud Service Broker (CSB), orchestrates DC selection. Failure to adroitly route user requests to suitable DCs transforms the CSB into a bottleneck, endangering service quality. To tackle this, deploying an efficient CSB policy becomes imperative, optimizing DC selection to meet stringent Quality-of-Service (QoS) demands. Amidst numerous CSB policies, their implementation grapples with challenges like costs and availability. This article undertakes a holistic review of diverse CSB policies, concurrently surveying the predicaments confronted by current policies. The foremost objective is to pinpoint research gaps and remedies to invigorate future policy development. Additionally, it extensively clarifies the various DC selection methodologies employed in CC, enriching practitioners and researchers alike. Employing synthetic analysis, the article systematically assesses and compares myriad DC selection techniques. These analytical insights equip decision-makers with a pragmatic framework to discern the apt technique for their needs. In summation, this discourse underscores the paramount importance of adept CSB policies in DC selection, highlighting the imperative role of efficient CSB policies in optimizing CC performance. By emphasizing the significance of these policies and their modeling implications, the article contributes to both the general modeling discourse and its practical applications in the CC domain.
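As a minimal illustration of what a DC-selection policy computes, the sketch below scores each datacenter with a weighted sum of min-max-normalized metrics and picks the lowest score. This is a generic baseline, not any specific policy surveyed in the article; the metric names and numbers are hypothetical.

```python
def select_datacenter(dcs, weights):
    """Pick the DC with the lowest weighted normalized score.
    dcs: {name: {metric: value}} where every metric is lower-is-better."""
    def normalized(metric):
        vals = [dc[metric] for dc in dcs.values()]
        lo, hi = min(vals), max(vals)
        span = (hi - lo) or 1.0  # avoid divide-by-zero when all values equal
        return {name: (dc[metric] - lo) / span for name, dc in dcs.items()}
    norms = {m: normalized(m) for m in weights}
    score = {name: sum(weights[m] * norms[m][name] for m in weights) for name in dcs}
    return min(score, key=score.get)

dcs = {
    "dc-east":  {"latency_ms": 40, "cost": 0.12, "load": 0.70},
    "dc-west":  {"latency_ms": 90, "cost": 0.08, "load": 0.30},
    "dc-north": {"latency_ms": 60, "cost": 0.10, "load": 0.95},
}
best = select_datacenter(dcs, weights={"latency_ms": 0.5, "cost": 0.3, "load": 0.2})
```

Weight choice encodes the QoS trade-off: the latency-heavy weighting above favors the low-latency datacenter despite its higher cost.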
Funding: The Deputyship for Research & Innovation, Ministry of Education in Saudi Arabia, for funding this research work through Project Number (IFP-2022-34).
Abstract: In the cloud environment, ensuring a high level of data security is in high demand. Data planning storage optimization is part of the whole security process in the cloud environment. It enables data security by avoiding the risk of data loss and data overlapping. The development of data flow scheduling approaches in the cloud environment that take security parameters into account is insufficient. In our work, we propose a data scheduling model for the cloud environment. The model is made up of three parts that together help dispatch user data flows to the appropriate cloud VMs. The first component is the collector agent, which must periodically collect information on the state of the network links. The second is the monitoring agent, which must then analyze and classify the state of each link, make a decision, and finally transmit this information to the scheduler. The third is the scheduler, which must consider the previous information to transfer user data, including fair distribution and reliable paths. Each part of the proposed model requires the development of its own algorithms. In this article, we are interested in the development of data transfer algorithms, including fair distribution with the consideration of a stable link state. These algorithms are based on the grouping of transmitted files and the iterative method. The proposed algorithms show the ability to obtain an approximate solution to the studied problem, which is NP-hard. The experimental results show that the best algorithm is the half-grouped minimum excluding (HME), with a percentage of 91.3%, an average deviation of 0.042, and an execution time of 0.001 s.
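The abstract does not specify the HME algorithm itself, so as a baseline for the same dispatch problem the sketch below uses greedy longest-processing-time assignment over the links the monitoring agent marked stable. All link names and file sizes are hypothetical.

```python
def dispatch(files, links):
    """Greedy LPT dispatch: assign each file (by size, largest first) to the
    currently least-loaded link among those marked stable."""
    stable = [name for name, ok in links.items() if ok]
    load = {name: 0 for name in stable}
    plan = {name: [] for name in stable}
    for size in sorted(files, reverse=True):
        target = min(load, key=load.get)  # least-loaded stable link
        load[target] += size
        plan[target].append(size)
    return plan, load

files = [70, 50, 40, 30, 20, 10]
links = {"link-a": True, "link-b": True, "link-c": False}  # link-c judged unstable
plan, load = dispatch(files, links)
```

On this toy input both stable links end up with equal load, which is the fairness property the scheduler is after.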
Abstract: As the extensive use of cloud computing raises questions about the security of any personal data stored there, cryptography is being used more frequently as a security tool to protect data confidentiality and privacy in the cloud environment. A hypervisor is virtualization software used in cloud hosting to divide and allocate resources on various pieces of hardware. The choice of hypervisor can significantly impact the performance of cryptographic operations in the cloud environment. An important issue that must be carefully examined is that no hypervisor is completely superior in terms of performance; each hypervisor should be examined against specific needs. The main objective of this study is to provide accurate results comparing the performance of Hyper-V and the Kernel-based Virtual Machine (KVM) while implementing different cryptographic algorithms, to guide cloud service providers and end users in choosing the most suitable hypervisor for their cryptographic needs. This study evaluated the efficiency of the two hypervisors in implementing six cryptographic algorithms: Rivest-Shamir-Adleman (RSA), Advanced Encryption Standard (AES), Triple Data Encryption Standard (TripleDES), Carlisle Adams and Stafford Tavares (CAST-128), Blowfish, and Twofish. The study's findings show that KVM outperforms Hyper-V, with 12.2% less Central Processing Unit (CPU) use and 12.95% less time overall for encryption and decryption operations with various file sizes. The findings emphasize how crucial it is to pick a hypervisor that is appropriate for cryptographic needs in a cloud environment, which could assist both cloud service providers and end users. Future research may focus more on how various hypervisors perform while handling cryptographic workloads.
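A comparison like this rests on a timing harness run identically inside each guest. The sketch below shows one such harness; since the Python standard library ships no AES, SHA-256 stands in as the cryptographic workload, so the harness shape, not the algorithm choice, is the point.

```python
import hashlib
import time

def time_workload(fn, payload, repeats=200):
    """Median-of-runs wall-clock timing for a cryptographic workload.
    The median damps scheduler noise better than the mean."""
    samples = []
    for _ in range(repeats):
        t0 = time.perf_counter()
        fn(payload)
        samples.append(time.perf_counter() - t0)
    samples.sort()
    return samples[len(samples) // 2]

payload = b"x" * 65536  # 64 KiB test buffer
median_s = time_workload(lambda b: hashlib.sha256(b).digest(), payload)
```

Running the same script under each hypervisor and comparing medians (plus CPU counters from the host) is the measurement pattern the study describes.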
Funding: This work was supported through doctoral scholarships granted to Ravi Fernandes Mariano, Carolina Njaime Mendes and Cléber Rodrigo de Souza, a master's scholarship to Aloysio Souza de Moura, and a postdoctoral scholarship to Vanessa Leite Rezende. The authors also thank the Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq) for project funding (Edital Universal 2014, Process 459739/2014-0), the Instituto Alto-Montana da Serra Fina, the Fundação de Amparo à Pesquisa do Estado de Minas Gerais (FAPEMIG), the Fundação Grupo Boticário de Proteção à Natureza, and the Fundo de Recuperação, Proteção e Desenvolvimento Sustentável das Bacias Hidrográficas do Estado de Minas Gerais (Fhidro).
Abstract: Environmental conditions can change markedly over geographical distances along elevation gradients, making them natural laboratories to study the processes that structure communities. This work aimed to assess the influences of elevation on Tropical Montane Cloud Forest plant communities in the Brazilian Atlantic Forest, a historically neglected ecoregion. We evaluated the phylogenetic structure, forest structure (tree basal area and tree density) and species richness along an elevation gradient, as well as the evolutionary fingerprints of elevation-success on phylogenetic lineages from the tree communities. To do so, we assessed nine communities along an elevation gradient from 1210 to 2310 m a.s.l. without large elevation gaps. The relationships between elevation and phylogenetic structure, forest structure and species richness were investigated through linear models. The occurrence of evolutionary fingerprints on phylogenetic lineages was investigated by quantifying the extent of the phylogenetic signal of elevation-success using a genus-level molecular phylogeny. Our results showed decreased species richness at higher elevations and independence between forest structure, phylogenetic structure and elevation. We also verified that there is a phylogenetic signal associated with elevation-success by lineages. We conclude that elevation is associated with species richness and the occurrence of phylogenetic lineages in the tree communities evaluated in the Mantiqueira Range, whereas elevation is not associated with forest structure or phylogenetic structure. Furthermore, closely related taxa tend to have their highest ecological success at similar elevations. Finally, we highlight the fragility of the tropical montane cloud forests in the Mantiqueira Range in the face of environmental changes (i.e., global warming) due to the occurrence of exclusive phylogenetic lineages evolutionarily adapted to environmental conditions (i.e., minimum temperature) associated with each elevation range.
Funding: The Deanship of Scientific Research, Najran University, Kingdom of Saudi Arabia, for funding this work under the Research Groups Funding Program, Grant Code Number (NU/RG/SERC/12/43).
Abstract: Data security assurance is crucial due to the increasing prevalence of cloud computing and its widespread use across different industries, especially in light of the growing number of cybersecurity threats. A major and ever-present threat is Ransomware-as-a-Service (RaaS) attacks, which enable even individuals with minimal technical knowledge to conduct ransomware operations. This study provides a new approach for RaaS attack detection which uses an ensemble of deep learning models. For this purpose, the network intrusion detection dataset UNSW-NB15 from the Intelligent Security Group of the University of New South Wales, Australia is analyzed. In the initial phase, three separate Multi-Layer Perceptron (MLP) models, based on the rectified linear unit, the scaled exponential linear unit, and the exponential linear unit, are developed. Later, using the combined predictive power of these three MLPs, the RansoDetect Fusion ensemble model is introduced in the suggested methodology. The proposed ensemble technique outperforms previous studies with impressive performance metrics, including 98.79% accuracy and recall, 98.85% precision, and 98.80% F1-score. The empirical results of this study validate the ensemble model's ability to improve cybersecurity defenses by showing that it outperforms the individual MLP models. In expanding the field of cybersecurity strategy, this research highlights the significance of combined deep learning models in strengthening intrusion detection systems against sophisticated cyber threats.
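A common way to fuse several classifiers is plain soft voting over their class-probability outputs. The sketch below illustrates that mechanism; the abstract does not state how RansoDetect Fusion combines its MLPs, and the probabilities are invented.

```python
def soft_vote(prob_sets):
    """Average per-class probability vectors from several models, then argmax."""
    n_models = len(prob_sets)
    n_classes = len(prob_sets[0])
    avg = [sum(p[c] for p in prob_sets) / n_models for c in range(n_classes)]
    return avg.index(max(avg)), avg

# Hypothetical per-class probabilities (benign, ransomware) from three MLPs.
mlp_relu = [0.30, 0.70]
mlp_selu = [0.45, 0.55]
mlp_elu = [0.20, 0.80]
label, avg = soft_vote([mlp_relu, mlp_selu, mlp_elu])
```

Averaging lets a confident majority override one hesitant model, which is where an ensemble's gain over individual MLPs typically comes from.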
Funding: Supported in part by the National Natural Science Foundation of China (61876011), the National Key Research and Development Program of China (2022YFB4703700), the Key Research and Development Program 2020 of Guangzhou (202007050002), and the Key-Area Research and Development Program of Guangdong Province (2020B090921003).
Abstract: Recently, there have been some attempts to use Transformers in 3D point cloud classification. In order to reduce computations, most existing methods focus on local spatial attention, but ignore point content and fail to establish relationships between distant but relevant points. To overcome the limitation of local spatial attention, we propose a point content-based Transformer architecture, called PointConT for short. It exploits the locality of points in the feature space (content-based), clustering the sampled points with similar features into the same class and computing the self-attention within each class, thus enabling an effective trade-off between capturing long-range dependencies and computational complexity. We further introduce an inception feature aggregator for point cloud classification, which uses parallel structures to aggregate high-frequency and low-frequency information in each branch separately. Extensive experiments show that our PointConT model achieves remarkable performance on point cloud shape classification. In particular, our method exhibits 90.3% Top-1 accuracy on the hardest setting of ScanObjectNN. Source code of this paper is available at https://github.com/yahuiliu99/PointConT.
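The content-based grouping idea can be sketched without a deep-learning framework: cluster feature vectors by nearest centroid, then run self-attention only within each cluster. This is a toy with queries = keys = values, not the PointConT implementation, and every vector below is invented.

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def content_grouped_attention(feats, centroids):
    """Assign each feature to its nearest centroid (content clustering),
    then compute self-attention restricted to each cluster."""
    groups = {}
    for idx, f in enumerate(feats):
        c = min(range(len(centroids)),
                key=lambda j: sum((fi - ci) ** 2
                                  for fi, ci in zip(f, centroids[j])))
        groups.setdefault(c, []).append(idx)
    out = [None] * len(feats)
    for members in groups.values():
        for i in members:
            w = softmax([dot(feats[i], feats[j]) for j in members])
            out[i] = [sum(wk * feats[j][d] for wk, j in zip(w, members))
                      for d in range(len(feats[i]))]
    return out, groups

feats = [[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]]
centroids = [[1.0, 0.0], [0.0, 1.0]]
out, groups = content_grouped_attention(feats, centroids)
```

Restricting attention to each cluster is what turns the quadratic all-pairs cost into the long-range/complexity trade-off the abstract describes: distant points still interact as long as their features land in the same cluster.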
Funding: This research was funded by the National Natural Science Foundation of China, Grant Number 62162039, and the Shaanxi Provincial Key R&D Program, China, Grant Number 2020GY-041.
Abstract: Access control is an effective method to protect user data privacy. Access control schemes based on blockchain and ciphertext-policy attribute-based encryption (CP-ABE) can solve the problems of single point of failure and lack of trust in centralized systems. However, they also bring new problems for health information in the cloud storage environment, such as attribute leakage, low consensus efficiency, and complex permission updates. This paper proposes an access control scheme with fine-grained attribute revocation, keyword search, and traceability of the attribute private key distribution process. Blockchain technology tracks the authorization of attribute private keys. A credit scoring method improves the consensus efficiency of the Raft protocol. Besides, the InterPlanetary File System (IPFS) addresses the capacity deficit of blockchain. Under the premise of hiding the policy, the research proposes a fine-grained access control method based on users, user attributes, and file structure, which optimizes the data-sharing mode. At the same time, Proxy Re-Encryption (PRE) technology is used to update access rights. The proposed scheme is proved to be secure. Comparative analysis and experimental results show that the proposed scheme has higher efficiency and more functions, and can meet the needs of medical institutions.
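At the heart of any CP-ABE-style scheme is evaluating an access-structure (policy tree) against a user's attribute set; decryption succeeds only when the policy is satisfied. A minimal sketch of that check, with hypothetical attribute names:

```python
def satisfies(policy, attrs):
    """Evaluate an AND/OR access-policy tree against a user's attribute set.
    Leaves are attribute names; internal nodes are ("AND", [...]) or ("OR", [...])."""
    if isinstance(policy, str):
        return policy in attrs
    op, children = policy
    results = (satisfies(c, attrs) for c in children)
    return all(results) if op == "AND" else any(results)

# "doctor AND (cardiology OR radiology)"
policy = ("AND", ["doctor", ("OR", ["cardiology", "radiology"])])
ok = satisfies(policy, {"doctor", "radiology"})
denied = satisfies(policy, {"nurse", "cardiology"})
```

Attribute revocation, in these terms, removes an attribute from a user's set (or re-keys it), flipping `satisfies` to false on subsequent access.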
Funding: This work is supported by the National Natural Science Foundation of China under Grant No. 62001341, the Natural Science Foundation of Jiangsu Province under Grant No. BK20221379, and the Jiangsu Engineering Research Center of Digital Twinning Technology for Key Equipment in Petrochemical Process under Grant No. DTEC202104.
Abstract: This paper focuses on the task of few-shot 3D point cloud semantic segmentation. Despite some progress, this task still encounters many issues due to the insufficient samples given, e.g., incomplete object segmentation and inaccurate semantic discrimination. To tackle these issues, we first introduce part-whole relationships into the task of 3D point cloud semantic segmentation to capture semantic integrity, which is empowered by the dynamic capsule routing of the 3D Capsule Networks (CapsNets) module in the embedding network. Concretely, the dynamic routing amalgamates geometric information of the 3D point cloud data to construct higher-level feature representations, which capture the relationships between object parts and their wholes. Secondly, we designed a multi-prototype enhancement module to enhance prototype discriminability. Specifically, the single-prototype enhancement mechanism is expanded to a multi-prototype enhancement version for capturing rich semantics. Besides, the shot-correlation within the category is calculated via the interaction of different samples to enhance intra-category similarity. Ablation studies prove that the involved part-whole relations and the proposed multi-prototype enhancement module help to achieve complete object segmentation and improve semantic discrimination. Moreover, under the integration of these two modules, quantitative and qualitative experiments on two public benchmarks, including S3DIS and ScanNet, indicate the superior performance of the proposed framework on the task of 3D point cloud semantic segmentation, compared to some state-of-the-art methods.
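The prototype mechanism that the multi-prototype module extends is, at its core, classic nearest-prototype classification: each class prototype is the mean of its support features, and queries go to the nearest prototype. The features and class names below are invented for illustration.

```python
def prototype(vectors):
    """Class prototype = element-wise mean of the support feature vectors."""
    n, d = len(vectors), len(vectors[0])
    return [sum(v[k] for v in vectors) / n for k in range(d)]

def classify(query, protos):
    """Assign the query to the class with the nearest (squared-L2) prototype."""
    def sqdist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(protos, key=lambda c: sqdist(query, protos[c]))

support = {
    "chair": [[0.9, 0.1], [0.8, 0.2]],
    "table": [[0.1, 0.9], [0.2, 0.8]],
}
protos = {c: prototype(vs) for c, vs in support.items()}
label = classify([0.7, 0.3], protos)
```

A multi-prototype variant keeps several prototypes per class (e.g., from subsets of the shots) so one mean vector need not summarize a whole class.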
Funding: National Key R&D Program of China, Grant/Award Number: 2018AAA0102100; National Natural Science Foundation of China, Grant/Award Numbers: 62177047, U22A2034; International Science and Technology Innovation Joint Base of Machine Vision and Medical Image Processing in Hunan Province, Grant/Award Number: 2021CB1013; Key Research and Development Program of Hunan Province, Grant/Award Number: 2022SK2054; 111 Project, Grant/Award Number: B18059; Natural Science Foundation of Hunan Province, Grant/Award Number: 2022JJ30762; Fundamental Research Funds for the Central Universities of Central South University, Grant/Award Number: 2020zzts143; Scientific and Technological Innovation Leading Plan of High-tech Industry of Hunan Province, Grant/Award Number: 2020GK2021; Central South University Research Program of Advanced Interdisciplinary Studies, Grant/Award Number: 2023QYJC020.
Abstract: Redundancy elimination techniques are extensively investigated to reduce storage overheads for cloud-assisted health systems. Deduplication eliminates the redundancy of duplicate blocks by storing one physical instance referenced by multiple duplicates. Delta compression is usually regarded as a complementary technique to deduplication that further removes the redundancy of similar blocks, but our observations indicate that this complementarity breaks down when data have sparse duplicate blocks. In addition, there are many overlapped deltas in the resemblance detection process of post-deduplication delta compression, which hinders the efficiency of delta compression, and the index phase of resemblance detection queries abundant non-similar blocks, resulting in inefficient system throughput. Therefore, a multi-feature-based redundancy elimination scheme, called MFRE, is proposed to solve these problems. The similarity feature and the temporal locality feature are excavated to assist redundancy elimination, where the similarity feature well expresses the duplicate attribute. Then, similarity-based dynamic post-deduplication delta compression and temporal locality-based dynamic delta compression discover more similar base blocks to minimise overlapped deltas and improve compression ratios. Moreover, a clustering method based on block relationships and a feature index strategy based on Bloom filters reduce IO overheads and improve system throughput. Experiments demonstrate that the proposed method, compared to the state-of-the-art method, improves the compression ratio and system throughput by 9.68% and 50%, respectively.
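The first stage MFRE builds on, block-level deduplication, can be sketched with content hashing: each unique block is stored once and duplicates become digest references. Block size and data below are toys.

```python
import hashlib

def deduplicate(data, block_size=4):
    """Content-hash block dedup: one physical copy per unique block,
    plus an ordered list of digest references to rebuild the stream."""
    store, refs = {}, []
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # keep first physical instance only
        refs.append(digest)
    return store, refs

def rebuild(store, refs):
    return b"".join(store[d] for d in refs)

data = b"ABCDABCDEFGHABCD"  # blocks: ABCD, ABCD, EFGH, ABCD
store, refs = deduplicate(data)
ratio = len(data) / sum(len(b) for b in store.values())
```

Delta compression then targets what this stage misses: blocks that are merely similar (different digests) rather than identical.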
Abstract: Cloud computing has emerged as a viable alternative to traditional computing infrastructures, offering various benefits. However, the adoption of cloud storage poses significant risks to data secrecy and integrity. This article presents an effective mechanism to preserve the secrecy and integrity of data stored on the public cloud by leveraging blockchain technology, smart contracts, and cryptographic primitives. The proposed approach utilizes a Solidity-based smart contract as an auditor for maintaining and verifying the integrity of outsourced data. To preserve data secrecy, symmetric encryption systems are employed to encrypt user data before outsourcing it. An extensive performance analysis is conducted to illustrate the efficiency of the proposed mechanism. Additionally, a rigorous assessment is conducted to ensure that the developed smart contract is free from vulnerabilities and to measure its associated running costs. The security analysis of the proposed system confirms that our approach can securely maintain the confidentiality and integrity of cloud storage, even in the presence of malicious entities. The proposed mechanism contributes to enhancing data security in cloud computing environments and can be used as a foundation for developing more secure cloud storage systems.
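Conceptually, the auditor role of the smart contract reduces to keeping a digest of each outsourced file and comparing it against a digest recomputed from whatever the cloud returns. A minimal off-chain sketch of that idea (Python rather than Solidity, and not the paper's protocol):

```python
import hashlib

class Auditor:
    """Toy analogue of the on-chain auditor: stores only the digest of each
    outsourced (already-encrypted) file, then checks returned data against it."""
    def __init__(self):
        self.digests = {}

    def register(self, file_id, data):
        self.digests[file_id] = hashlib.sha256(data).hexdigest()

    def verify(self, file_id, returned_data):
        return self.digests[file_id] == hashlib.sha256(returned_data).hexdigest()

auditor = Auditor()
auditor.register("report.bin", b"encrypted-payload")
intact = auditor.verify("report.bin", b"encrypted-payload")
tampered = auditor.verify("report.bin", b"encrypted-payload-modified")
```

Storing only digests is what makes an on-chain auditor affordable: the contract never holds the (encrypted) data itself, just enough to detect tampering.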
Funding: Supported by the National Natural Science Foundation of China (Nos. 12202011, 12332014) and the China Postdoctoral Science Foundation (No. 2022M710190).
Abstract: Cavitation is a prevalent phenomenon within the domain of ship and ocean engineering, predominantly occurring in the tail flow fields of high-speed rotating propellers and on the surfaces of high-speed underwater vehicles. The re-entrant jet and the compression wave resulting from the collapse of cavity vapour are pivotal factors contributing to cavity instability. Concurrently, these phenomena significantly modulate the evolution of cavitation flow. In this paper, numerical investigations into cloud cavitation over a Clark-Y hydrofoil were conducted, utilizing the Large Eddy Simulation (LES) turbulence model and the Volume of Fluid (VOF) method within the OpenFOAM framework. Comparative analysis of results obtained at different angles of attack is undertaken. A discernible augmentation in cavity thickness is observed concomitant with the escalation in attack angle, alongside a progressive intensification in pressure at the leading edge of the hydrofoil, contributing to the suction force. These results can serve as a fundamental point of reference for gaining a deeper comprehension of cloud cavitation dynamics.
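Cavitation studies of this kind are parameterized by the cavitation number, sigma = (p_inf − p_v) / (0.5·rho·U²), with lower sigma meaning a stronger propensity to cavitate. A one-line sketch with illustrative water values (not the simulation conditions of the paper):

```python
def cavitation_number(p_inf, p_vapour, rho, u_inf):
    """Sigma = (p_inf - p_v) / (0.5 * rho * U^2); lower sigma means
    the local pressure more easily drops below vapour pressure."""
    return (p_inf - p_vapour) / (0.5 * rho * u_inf ** 2)

# Illustrative water values: p_inf = 101325 Pa, p_v ~ 2340 Pa at 20 C,
# rho = 998 kg/m^3, free-stream speed 10 m/s.
sigma = cavitation_number(101325.0, 2340.0, 998.0, 10.0)
```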