Funding: This study was jointly supported by the National Science Foundation of China under Grant Nos. 40233031 and 40221503, and the National Key Basic Research Project under Grant No. G200078502.
Abstract: A statistically-based low-level cloud parameterization scheme is introduced, modified, and applied in the Flexible coupled General Circulation Model (FGCM-0). It is found that the low-level cloud scheme improves the simulation of low-level cloud fractions and net surface shortwave radiation fluxes in the subtropical eastern oceans off the western coasts in the model. Accompanying the improvement in the net surface shortwave radiation fluxes, the simulated distribution of SSTs is more reasonably asymmetrical about the equator in the tropical eastern Pacific, which suppresses, to some extent, the development of the double ITCZ in the model. Warm SST biases in the ITCZ north of the equator are also reduced to more realistic values. However, the equatorial cold tongue is strengthened and extends further westward, which reduces the precipitation rate in the western equatorial Pacific but increases it in the ITCZ north of the equator in the far eastern Pacific. It is demonstrated that the low-level cloud-radiation feedback would enhance the cooperative feedback between the equatorial cold tongue and the ITCZ. Based on surface layer heat budget analyses, it is demonstrated that the reduction of SSTs is attributed to both the thermodynamic cooling process modified by the increase of cloud fractions and the oceanic dynamical cooling processes associated with the strengthened surface wind in the eastern equatorial Pacific, but it is mainly attributed to oceanic dynamical cooling processes associated with the strengthening of the surface wind in the central and western equatorial Pacific.
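As a rough illustration of the kind of surface-layer heat budget bookkeeping described above (not the model's actual diagnostics), the sketch below splits a mixed-layer SST tendency into a surface-flux term and an ocean-dynamics term; every variable name and value is illustrative.

```python
import numpy as np

# Illustrative mixed-layer heat budget: rho*cp*h * dT/dt = Q_sw + Q_other + D_ocean,
# where D_ocean lumps advection, upwelling and entrainment (ocean dynamical cooling).
rho, cp, h = 1025.0, 3990.0, 50.0   # seawater density, heat capacity, mixed-layer depth (assumed)

def sst_tendency(q_sw, q_other, d_ocean):
    """SST tendency (K/s) from net shortwave, other surface fluxes, and ocean dynamics (W/m^2)."""
    return (q_sw + q_other + d_ocean) / (rho * cp * h)

# Hypothetical eastern-Pacific column: more low cloud cuts net surface shortwave,
# while stronger trade winds enhance dynamical cooling.
control    = sst_tendency(q_sw=230.0, q_other=-180.0, d_ocean=-40.0)
more_cloud = sst_tendency(q_sw=200.0, q_other=-180.0, d_ocean=-60.0)

seconds_per_month = 30 * 24 * 3600
print(f"control tendency:    {control * seconds_per_month:+.2f} K/month")
print(f"with more low cloud: {more_cloud * seconds_per_month:+.2f} K/month")
```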
Funding: The National Natural Science Foundation of China under Grant Nos. 40023001 and 40233031, the "Innovation Program" under Grant ZKCX2-SW-210, and the National Key Basic Research Project under Grant G200078502.
Abstract: Like many other coupled models, the Flexible coupled General Circulation Model (FGCM-0) suffers from the spurious “Double ITCZ”. In order to understand the “Double ITCZ” in FGCM-0, this study first examines the low-level cloud cover and the bulk stability of the low troposphere over the eastern subtropical Pacific simulated by the National Center for Atmospheric Research (NCAR) Community Climate Model version 3 (CCM3), which is the atmospheric component of FGCM-0. It is found that the bulk stability of the low troposphere simulated by CCM3 is very consistent with that derived from the National Centers for Environmental Prediction (NCEP) reanalysis, but the simulated low-level cloud cover is much less than that derived from the International Satellite Cloud Climatology Project (ISCCP) D2 data. Based on the regression equations between the low-level cloud cover from the ISCCP data and the bulk stability of the low troposphere derived from the NCEP reanalysis, the parameterization scheme of low-level cloud in CCM3 is modified and used in sensitivity experiments to examine the impact of low-level cloud over the eastern subtropical Pacific on the spurious “Double ITCZ” in FGCM-0. Results show that the modified scheme improves the simulated low-level cloud cover locally over the cold oceans. Increasing the low-level cloud cover off Peru not only significantly alleviates the SST warm biases in the southeastern tropical Pacific, but also causes the equatorial cold tongue to be strengthened and to extend further west. Increasing the low-level cloud fraction off California effectively reduces the SST warm biases in the ITCZ north of the equator. In order to examine the feedback between the SST and low-level cloud cover off Peru, one additional sensitivity experiment is performed in which the SST over the cold ocean off Peru is restored. It shows that decreasing the SST results in similar impacts over the wide regions from the southeastern tropical Pacific northwestwards to the western/central equatorial Pacific as increasing the low-level cloud cover does.
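The regression-based modification can be illustrated with a minimal sketch: fit low-cloud cover against bulk lower-tropospheric stability and use the fit as a diagnostic, clipped cloud fraction. The data and coefficients below are synthetic, not the paper's fitted equation.

```python
import numpy as np

# Synthetic monthly samples: bulk lower-tropospheric stability (K) vs. low-cloud cover (0-1).
# In the study, stability comes from NCEP reanalysis and cloud cover from ISCCP D2;
# here both are randomly generated for illustration only.
rng = np.random.default_rng(0)
stability = rng.uniform(10.0, 24.0, size=200)
cloud_obs = np.clip(0.04 * stability - 0.3 + rng.normal(0.0, 0.05, 200), 0.0, 1.0)

# Least-squares fit: cloud = a * stability + b
a, b = np.polyfit(stability, cloud_obs, deg=1)
print(f"fitted regression: cloud = {a:.3f} * stability + {b:.3f}")

def diagnosed_low_cloud(stab):
    """Diagnostic low-cloud fraction from bulk stability, clipped to [0, 1]."""
    return float(np.clip(a * stab + b, 0.0, 1.0))

print("diagnosed cloud fraction at stability = 20 K:", diagnosed_low_cloud(20.0))
```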
Abstract: Cloud computing has created a paradigm shift that affects the way in which business applications are developed. Many business organizations use cloud infrastructures as platforms on which to deploy business applications. Increasing numbers of vendors are supplying the cloud marketplace with a wide range of cloud products. Different vendors offer cloud products in different formats, and the cost structures for consuming cloud products can be complex. Finding a suitable set of cloud products that meets an application's requirements and budget can be a challenging task. In this paper, an ontology-based resource mapping mechanism is proposed. Domain-specific ontologies are used to specify an application's high-level requirements. These are then translated into high-level infrastructure ontologies, which can then be mapped onto low-level descriptions of cloud resources. Cost ontologies are proposed for cloud resources. An exemplar media transcoding and delivery service is studied in order to illustrate how high-level requirements can be modeled and mapped onto cloud resources within a budget constraint. The proposed ontologies provide an application-centric mechanism for specifying cloud requirements, which can then be used to search for suitable resources in a multi-provider cloud environment.
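A toy sketch of the matching idea follows, with plain dictionaries standing in for the domain, infrastructure, and cost ontologies; all providers, SKUs, and prices are invented.

```python
# Toy stand-in for ontology-based matching: requirements and offerings are plain dicts here,
# whereas the paper expresses them as domain, infrastructure, and cost ontologies.
requirements = {"vcpu": 8, "ram_gb": 16, "storage_gb": 500}   # hypothetical transcoding service needs
budget_per_hour = 0.60

offerings = [  # hypothetical multi-provider catalogue
    {"provider": "A", "sku": "a.large",  "vcpu": 8,  "ram_gb": 16, "storage_gb": 500,  "price": 0.55},
    {"provider": "B", "sku": "b.xlarge", "vcpu": 16, "ram_gb": 32, "storage_gb": 1000, "price": 0.90},
    {"provider": "C", "sku": "c.medium", "vcpu": 4,  "ram_gb": 8,  "storage_gb": 250,  "price": 0.30},
]

def satisfies(offer, req):
    """True if the offering meets every stated requirement."""
    return all(offer[k] >= v for k, v in req.items())

# Keep offerings that meet the requirements and the budget, cheapest first.
candidates = sorted(
    (o for o in offerings if satisfies(o, requirements) and o["price"] <= budget_per_hour),
    key=lambda o: o["price"],
)
print(candidates)
```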
Funding: Supported by the Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (RS-2024-00399401, Development of Quantum-Safe Infrastructure Migration and Quantum Security Verification Technologies).
Abstract: With the rise of remote collaboration, the demand for advanced storage and collaboration tools has rapidly increased. However, traditional collaboration tools primarily rely on access control, leaving data stored on cloud servers vulnerable due to insufficient encryption. This paper introduces a novel mechanism that encrypts data in ‘bundle’ units, designed to meet the dual requirements of efficiency and security for frequently updated collaborative data. Each bundle includes updated information, allowing only the updated portions to be re-encrypted when changes occur. The encryption method proposed in this paper addresses the inefficiencies of traditional encryption modes, such as Cipher Block Chaining (CBC) and Counter (CTR), which require decrypting and re-encrypting the entire dataset whenever updates occur. The proposed method leverages update-specific information embedded within data bundles and metadata that maps the relationship between these bundles and the plaintext data. By utilizing this information, the method accurately identifies the modified portions and applies algorithms to selectively re-encrypt only those sections. This approach significantly enhances the efficiency of data updates while maintaining high performance, particularly in large-scale data environments. To validate this approach, we conducted experiments measuring execution time as both the size of the modified data and the total dataset size varied. Results show that the proposed method significantly outperforms CBC and CTR modes in execution speed, with greater performance gains as data size increases. Additionally, our security evaluation confirms that this method provides robust protection against both passive and active attacks.
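A minimal sketch of bundle-wise encryption with selective re-encryption is shown below, assuming AES-GCM per bundle via the cryptography package; the bundle size, metadata layout, and update flow are assumptions for illustration, not the paper's exact mechanism.

```python
# Minimal sketch of bundle-wise encryption with selective re-encryption of modified bundles.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

BUNDLE = 4096  # bytes per bundle (illustrative)
key = AESGCM.generate_key(bit_length=256)
aead = AESGCM(key)

def encrypt_bundles(plaintext: bytes):
    """Split plaintext into bundles and encrypt each independently; metadata maps index -> (nonce, ciphertext)."""
    meta = {}
    for i in range(0, len(plaintext), BUNDLE):
        nonce = os.urandom(12)
        meta[i // BUNDLE] = (nonce, aead.encrypt(nonce, plaintext[i:i + BUNDLE], None))
    return meta

def update(meta, old: bytes, new: bytes):
    """Re-encrypt only bundles whose plaintext changed, instead of the whole dataset."""
    for i in range(0, max(len(old), len(new)), BUNDLE):
        if old[i:i + BUNDLE] != new[i:i + BUNDLE]:
            nonce = os.urandom(12)
            meta[i // BUNDLE] = (nonce, aead.encrypt(nonce, new[i:i + BUNDLE], None))
    return meta

doc_v1 = os.urandom(BUNDLE * 8)
doc_v2 = doc_v1[:BUNDLE * 3] + os.urandom(100) + doc_v1[BUNDLE * 3 + 100:]  # edit inside bundle 3
meta = encrypt_bundles(doc_v1)
meta = update(meta, doc_v1, doc_v2)  # only bundle 3 is re-encrypted
```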
Funding: This study is jointly supported by the Chinese Academy of Sciences "Innovation Program" under Grant ZKCX2-SW-210, the National Natural Science Foundation of China under Grant Nos. 40233031, 40231004, and 40221503, and the National Key Basic Research Project.
Abstract: In this study, a statistical cloud scheme is first introduced and coupled with a first-order turbulence scheme, with second-order turbulence moments parameterized by the timescale of the turbulence dissipation and the vertical turbulent diffusion coefficient. Then the ability of the scheme to simulate cloud fraction at different relative humidities, vertical temperature profiles, and timescales of the turbulent dissipation is examined by numerical simulation. It is found that the simulated cloud fraction is sensitive to the parameter used in the statistical cloud scheme and to the timescale of the turbulent dissipation. Based on these analyses, the introduced statistical cloud scheme is modified. By combining the modified statistical cloud scheme with a boundary layer cumulus scheme, a new statistically-based low-level cloud scheme is proposed and tentatively applied in the NCAR (National Center for Atmospheric Research) CCM3 (Community Climate Model version 3). It is found that the simulation of low-level cloud fraction is markedly improved and that the centers with maximum low-level cloud fractions are well simulated in the cold oceans off the western coasts when the statistically-based low-level cloud scheme is applied in CCM3. This suggests that the new statistically-based low-level cloud scheme has great potential for improving the low-level cloud parameterization in general circulation models.
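For orientation, a generic Gaussian statistical cloud scheme (not necessarily the paper's exact formulation) diagnoses cloud fraction from the normalized saturation deficit, with the subgrid variance supplied by the turbulence scheme. The sketch below uses prescribed, illustrative values.

```python
from math import erf, sqrt

def gaussian_cloud_fraction(q_t, q_s, sigma_s):
    """Generic statistical cloud scheme: cloud fraction for a Gaussian distribution of total water
    about q_t with subgrid standard deviation sigma_s (all in kg/kg)."""
    q1 = (q_t - q_s) / sigma_s          # normalized saturation deficit
    return 0.5 * (1.0 + erf(q1 / sqrt(2.0)))

# The subgrid variance is what the coupled turbulence scheme would supply; here it is prescribed.
for sigma in (2e-4, 5e-4, 1e-3):
    cf = gaussian_cloud_fraction(q_t=9.5e-3, q_s=10.0e-3, sigma_s=sigma)
    print(f"sigma_s = {sigma:.0e} kg/kg -> cloud fraction = {cf:.2f}")
```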
Funding: Supported in part by the National Natural Science Foundation of China (61876011), the National Key Research and Development Program of China (2022YFB4703700), the Key Research and Development Program 2020 of Guangzhou (202007050002), and the Key-Area Research and Development Program of Guangdong Province (2020B090921003).
Abstract: Recently, there have been some attempts to apply Transformers to 3D point cloud classification. In order to reduce computations, most existing methods focus on local spatial attention, but ignore point content and fail to establish relationships between distant but relevant points. To overcome the limitation of local spatial attention, we propose a point content-based Transformer architecture, called PointConT for short. It exploits the locality of points in the feature space (content-based), clustering the sampled points with similar features into the same class and computing the self-attention within each class, thus enabling an effective trade-off between capturing long-range dependencies and computational complexity. We further introduce an inception feature aggregator for point cloud classification, which uses parallel structures to aggregate high-frequency and low-frequency information in each branch separately. Extensive experiments show that our PointConT model achieves remarkable performance on point cloud shape classification. In particular, our method exhibits 90.3% Top-1 accuracy on the hardest setting of ScanObjectNN. Source code for this paper is available at https://github.com/yahuiliu99/PointConT.
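The numpy sketch below illustrates only the core content-based idea (cluster points by feature similarity, then attend within each cluster); it is not the PointConT implementation, which is available at the repository linked above.

```python
import numpy as np

def attention(x):
    """Plain scaled dot-product self-attention over a set of feature vectors of shape (n, d)."""
    scores = x @ x.T / np.sqrt(x.shape[1])
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    return w @ x

def content_based_attention(feats, n_clusters=4, iters=10, seed=0):
    """Cluster points in feature space (k-means style), then apply self-attention per cluster,
    so distant-but-similar points can attend to each other without a full global attention."""
    rng = np.random.default_rng(seed)
    centers = feats[rng.choice(len(feats), n_clusters, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((feats[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for k in range(n_clusters):
            if np.any(labels == k):
                centers[k] = feats[labels == k].mean(axis=0)
    out = feats.copy()
    for k in range(n_clusters):
        idx = np.where(labels == k)[0]
        if len(idx) > 0:
            out[idx] = attention(feats[idx])
    return out

points = np.random.default_rng(1).normal(size=(256, 64)).astype(np.float32)
print(content_based_attention(points).shape)   # (256, 64)
```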
Funding: This research was funded by the National Natural Science Foundation of China, Grant Number 62162039, and the Shaanxi Provincial Key R&D Program, China, Grant Number 2020GY-041.
Abstract: An access control scheme is an effective method to protect user data privacy. Access control schemes based on blockchain and ciphertext-policy attribute-based encryption (CP-ABE) can solve the problems of single point of failure and lack of trust in centralized systems. However, they also bring new problems for health information in the cloud storage environment, such as attribute leakage, low consensus efficiency, complex permission updates, and so on. This paper proposes an access control scheme with fine-grained attribute revocation, keyword search, and traceability of the attribute private key distribution process. Blockchain technology tracks the authorization of attribute private keys. A credit scoring method improves the consensus efficiency of the Raft protocol. Besides, the InterPlanetary File System (IPFS) addresses the capacity deficit of blockchain. Under the premise of hiding the policy, the research proposes a fine-grained access control method based on users, user attributes, and file structure, which optimizes the data-sharing mode. At the same time, Proxy Re-Encryption (PRE) technology is used to update access rights. The proposed scheme is proved to be secure. Comparative analysis and experimental results show that the proposed scheme has higher efficiency and more functions, and can meet the needs of medical institutions.
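Of the several components combined in the scheme, the consensus tweak is the easiest to illustrate: the hedged sketch below weights Raft candidate selection by a credit score. The scoring rule and node fields are invented for illustration and are not the paper's formula.

```python
# Hedged illustration of credit-scored candidate selection for a Raft-style consensus:
# nodes that behaved well (reliable, honest, fast) are preferred as election candidates.
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    uptime: float          # fraction of time online, 0..1
    invalid_msgs: int      # protocol violations observed
    avg_latency_ms: float

def credit_score(n: Node) -> float:
    """Invented scoring rule: reward uptime, penalize violations and latency."""
    return 0.6 * n.uptime - 0.3 * min(n.invalid_msgs, 10) / 10 - 0.1 * min(n.avg_latency_ms, 500) / 500

def pick_candidates(nodes, k=2):
    """Only the top-k credit-scored nodes are allowed to start elections, cutting election churn."""
    return sorted(nodes, key=credit_score, reverse=True)[:k]

cluster = [Node("n1", 0.99, 0, 40), Node("n2", 0.80, 3, 150), Node("n3", 0.97, 1, 60)]
print([n.name for n in pick_candidates(cluster)])
```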
Funding: Supported by the National Natural Science Foundation of China (Grant No. 92365206), the China Postdoctoral Science Foundation (Certificate Number: 2023M740272), the National Natural Science Foundation of China (Grant No. 12247168), and the China Postdoctoral Science Foundation (Certificate Number: 2022TQ0036).
Abstract: We introduce Quafu-Qcover, an open-source cloud-based software package developed for solving combinatorial optimization problems using quantum simulators and hardware backends. Quafu-Qcover provides a standardized and comprehensive workflow that utilizes the quantum approximate optimization algorithm (QAOA). It facilitates the automatic conversion of the original problem into a quadratic unconstrained binary optimization (QUBO) model and its corresponding Ising model, which can subsequently be transformed into a weight graph. The core of Qcover relies on a graph-decomposition-based classical algorithm, which efficiently derives the optimal parameters for the shallow QAOA circuit. Quafu-Qcover incorporates a dedicated compiler capable of translating QAOA circuits into physical quantum circuits that can be executed on Quafu cloud quantum computers. Compared to a general-purpose compiler, our compiler generates shorter circuit depths while also exhibiting superior speed. Additionally, the Qcover compiler can dynamically create a library of qubit coupling substructures in real time, utilizing the most recent calibration data from the superconducting quantum devices. This ensures that computational tasks can be assigned to connected physical qubits with the highest fidelity. Quafu-Qcover allows us to retrieve quantum computing sampling results using a task ID at any time, enabling asynchronous processing. Moreover, it incorporates modules for results preprocessing and visualization, facilitating an intuitive display of solutions for combinatorial optimization problems. We hope that Quafu-Qcover can serve as an instructive illustration of how to explore application problems on the Quafu cloud quantum computers.
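One step in the workflow, converting a QUBO into its Ising form, can be shown generically; the sketch below does not use the Quafu/Qcover APIs, and the matrix is arbitrary.

```python
import numpy as np

def qubo_to_ising(Q):
    """Convert E(x) = x^T Q x with x_i in {0,1} into E(s) = s^T J s + h.s + offset with s_i in {-1,+1},
    using the substitution x_i = (1 + s_i) / 2. J has a zero diagonal; Q is symmetrized first."""
    Q = np.asarray(Q, dtype=float)
    Q = (Q + Q.T) / 2.0
    J = Q / 4.0
    np.fill_diagonal(J, 0.0)
    h = Q.sum(axis=1) / 2.0
    offset = Q.sum() / 4.0 + np.trace(Q) / 4.0
    return h, J, offset

# Small arbitrary 3-variable QUBO, checked against the Ising form by brute force.
Q = np.array([[1.0, 2.0, 0.0], [2.0, 0.0, -1.0], [0.0, -1.0, 3.0]])
h, J, off = qubo_to_ising(Q)
for bits in range(8):
    x = np.array([(bits >> i) & 1 for i in range(3)], dtype=float)
    s = 2 * x - 1
    assert np.isclose(x @ Q @ x, s @ J @ s + h @ s + off)
print("QUBO and Ising energies agree on all assignments")
```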
Abstract: Cloud computing provides a diverse and adaptable resource pool over the internet, allowing users to tap into various resources as needed. It has been seen as a robust solution to relevant challenges. A significant delay can hamper the performance of IoT-enabled cloud platforms. However, efficient task scheduling can lower the cloud infrastructure's energy consumption, thus maximizing the service provider's revenue by decreasing user job processing times. The proposed Modified Chimp-Whale Optimization Algorithm (MCWOA) combines elements of the Chimp Optimization Algorithm (COA) and the Whale Optimization Algorithm (WOA). To enhance MCWOA's identification precision, the Sobol sequence is used in the population initialization phase, ensuring an even distribution of the population across the solution space. Moreover, the traditional MCWOA's local search capabilities are augmented by incorporating the whale optimization algorithm's bubble-net hunting and random search mechanisms into MCWOA's position-updating process. This study demonstrates the effectiveness of the proposed approach using a two-story rigid frame and a simply supported beam model. Simulated outcomes reveal that the new method outperforms the original MCWOA, especially in multi-damage detection scenarios. MCWOA excels in avoiding false positives and enhancing computational speed, making it an optimal choice for structural damage detection. The efficiency of the proposed MCWOA is assessed against metrics such as energy usage, computational expense, task duration, and delay. The simulated data indicate that the new MCWOA outpaces other methods across all metrics. The study also references the Whale Optimization Algorithm (WOA), Chimp Algorithm (CA), Ant Lion Optimizer (ALO), Genetic Algorithm (GA), and Grey Wolf Optimizer (GWO).
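The Sobol-sequence initialization mentioned above can be sketched with scipy.stats.qmc; the bounds, population size, and seed below are arbitrary, and the rest of MCWOA is omitted.

```python
import numpy as np
from scipy.stats import qmc

def init_population(pop_size, lower, upper, seed=0):
    """Initialize an optimization population from a scrambled Sobol sequence so that
    candidates cover the search space more evenly than plain uniform random sampling."""
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    sampler = qmc.Sobol(d=len(lower), scramble=True, seed=seed)
    unit = sampler.random(pop_size)            # low-discrepancy points in [0, 1)^d
    return qmc.scale(unit, lower, upper)       # map onto the task-specific bounds

pop = init_population(32, lower=[0.0] * 5, upper=[1.0, 10.0, 10.0, 5.0, 2.0])
print(pop.shape, pop.min(axis=0).round(2), pop.max(axis=0).round(2))
```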
Funding: Supported by the Beijing Academy of Quantum Information Sciences, the National Natural Science Foundation of China (Grant No. 92365206), the China Postdoctoral Science Foundation (Certificate Number: 2023M740272), the National Natural Science Foundation of China (Grant No. 12247168), and the China Postdoctoral Science Foundation (Certificate Number: 2022TQ0036).
Abstract: With the rapid advancement of quantum computing, hybrid quantum-classical machine learning has shown numerous potential applications at the current stage, with expectations of being achievable in the noisy intermediate-scale quantum (NISQ) era. Quantum reinforcement learning, as an indispensable study, has recently demonstrated its ability to solve standard benchmark environments with formally provable theoretical advantages over classical counterparts. However, despite the progress of quantum processors and the emergence of quantum computing clouds, implementing quantum reinforcement learning algorithms utilizing parameterized quantum circuits (PQCs) on NISQ devices remains infrequent. In this work, we take the first step towards executing benchmark quantum reinforcement problems on real devices equipped with at most 136 qubits on the BAQIS Quafu quantum computing cloud. The experimental results demonstrate that the policy agents can successfully accomplish objectives under modified conditions in both the training and inference phases. Moreover, we design hardware-efficient PQC architectures in the quantum model using a multi-objective evolutionary algorithm and develop a learning algorithm that is adaptable to quantum devices. We hope that Quafu-RL can be a guiding example of how to realize machine learning tasks by taking advantage of quantum computers on the quantum cloud platform.
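As a hedged illustration of the multi-objective selection idea behind the PQC architecture search (with invented objectives, not the paper's), the sketch below keeps only Pareto-optimal candidate layouts.

```python
# Toy multi-objective selection: candidate circuit layouts are scored on two competing
# objectives (here, circuit depth and trainable-parameter count, as illustrative stand-ins),
# and only Pareto-optimal layouts survive to the next evolutionary generation.
from dataclasses import dataclass

@dataclass(frozen=True)
class Candidate:
    name: str
    depth: int        # shallower is better on NISQ hardware
    n_params: int     # fewer trainable parameters is cheaper to optimize

def dominates(a, b):
    """a dominates b if it is no worse on both objectives and strictly better on at least one."""
    return (a.depth <= b.depth and a.n_params <= b.n_params) and (a.depth < b.depth or a.n_params < b.n_params)

def pareto_front(cands):
    return [c for c in cands if not any(dominates(o, c) for o in cands)]

pool = [Candidate("A", 12, 40), Candidate("B", 8, 64), Candidate("C", 8, 48), Candidate("D", 15, 36)]
print([c.name for c in pareto_front(pool)])
```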
Funding: National Natural Science Foundation of China (Nos. 42071444, 42101444).
Abstract: Cultural relic line graphics serve as a crucial form of traditional artifact information documentation; they are simple and intuitive products that are far cheaper to display than 3D models. Dimensionality reduction is undoubtedly necessary for line drawings. However, most existing methods for artifact drawing rely on the principles of orthographic projection, which cannot avoid angle occlusion and data overlap when the surface of a cultural relic is complex. Therefore, conformal mapping was introduced as a dimensionality reduction approach to compensate for the limitations of orthographic projection. Based on given criteria for assessing surface complexity, this paper proposes a three-dimensional feature guideline extraction method for complex cultural relic surfaces. A combined 2D and 3D factor that measures the importance of points in describing surface features, the vertex weight, was designed. The selection threshold for feature guideline extraction was then determined based on the differences between the vertex weight and shape index distributions. The feasibility and stability of the method were verified through experiments on real cultural relic surface data. The results demonstrate the ability of the method to address the challenges associated with the automatic generation of line drawings for complex surfaces. The extraction method and the obtained results will be useful for drawing, displaying, and publicizing line graphics of cultural relics.
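One ingredient can be sketched concretely: a common form of the shape index computed from principal curvatures, combined with a made-up 2D/3D blend standing in for the paper's vertex weight (the actual weighting in the paper differs).

```python
import numpy as np

def shape_index(k1, k2):
    """Shape index in [-1, 1] from principal curvatures: +1 for caps, 0 for saddles, -1 for cups
    (one common sign convention)."""
    k1, k2 = np.maximum(k1, k2), np.minimum(k1, k2)
    return (2.0 / np.pi) * np.arctan2(k1 + k2, k1 - k2)

def vertex_weight(si, curvedness, proj_gradient, alpha=0.5):
    """Illustrative 2D/3D combined importance of a vertex: a 3D term (curvature-based) blended
    with a 2D term (projected image gradient); the coefficients are invented."""
    return alpha * np.abs(si) * curvedness + (1.0 - alpha) * proj_gradient

k1 = np.array([0.8, 0.1, -0.2])
k2 = np.array([0.2, 0.05, -0.9])
si = shape_index(k1, k2)
w = vertex_weight(si, curvedness=np.hypot(k1, k2), proj_gradient=np.array([0.3, 0.9, 0.1]))
print(si.round(2), w.round(2))
```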
Abstract: Security issues in cloud networks and edge computing have become very common. This research focuses on analyzing such issues and developing the best solutions. A detailed literature review has been conducted in this regard. The findings show that many challenges are linked to edge computing, such as privacy concerns, security breaches, high costs, and low efficiency. Therefore, there is a need to implement proper security measures to overcome these issues. Emerging trends, such as machine learning, encryption, artificial intelligence, and real-time monitoring, can help mitigate security issues and contribute to a secure and safe future for cloud computing. It was concluded that the security implications of edge computing can be addressed with the help of new technologies and techniques.
Funding: Jointly funded by the Second Tibetan Plateau Scientific Expedition and Research Program of China under Grant 2019QZKK0604 and the National Natural Science Foundation of China (Grant Nos. 92044303 and 42001294).
Abstract: Cloud vertical structure (CVS) strongly affects atmospheric circulation and radiative transfer. Yet long-term, ground-based observations are scarce over the Tibetan Plateau (TP) despite its vital role in global climate. This study utilizes ground-based lidar and Ka-band cloud profiling radar (KaCR) measurements at Yangbajain (YBJ), TP, from October 2021 to September 2022 to characterize cloud properties. A novel anomaly detection algorithm (LevelShiftAD), which performs satisfactorily, is proposed for lidar and KaCR profiles to identify cloud boundaries. Cloud base heights (CBH) retrieved from KaCR and lidar observations show good consistency, with a correlation coefficient of 0.78 and a mean difference of -0.06 km. Cloud top heights (CTH) derived from KaCR match the FengYun-4A and Himawari-8 products well. Thus, KaCR measurements serve as the primary dataset for investigating CVSs over the TP. Different diurnal cycles occur in summer and winter. In winter, the diurnal cycle is characterized by a pronounced increase in cloud occurrence frequency in the afternoon and an early-morning decrease, while in summer cloud amounts remain high all day, with scattered nocturnal increases. Summer features more frequent clouds with larger geometrical thicknesses, a higher multi-layer ratio, and greater inter-cloud spacing. Around 26% of cloud bases occur below 0.5 km. Winter exhibits a bimodal distribution of cloud base heights, with peaks at 0-0.5 km and 2-2.5 km. Single-layer and geometrically thin clouds prevail at YBJ. This study enriches long-term measurements of CVSs over the TP, and the robust anomaly detection method helps quantify cloud macrophysical properties via synergistic lidar and radar observations.
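A generic rolling-window level-shift test on a synthetic reflectivity profile illustrates the kind of boundary detection LevelShiftAD performs; the window, threshold, profile, and range-gate spacing below are all assumptions, not the paper's algorithm or data.

```python
import numpy as np

def level_shift_boundaries(profile, window=5, threshold=3.0):
    """Flag indices where the mean of the next `window` samples jumps away from the mean of the
    previous `window` samples by more than `threshold` times the local within-window spread
    (a generic level-shift test, not the paper's exact LevelShiftAD algorithm)."""
    hits = []
    for i in range(window, len(profile) - window):
        before = profile[i - window:i]
        after = profile[i:i + window]
        spread = 0.5 * (before.std() + after.std()) + 1e-6
        if abs(after.mean() - before.mean()) > threshold * spread:
            hits.append(i)
    return hits

# Synthetic KaCR-like reflectivity profile (dB): clear air, a cloud layer between bins 40-70, clear air.
rng = np.random.default_rng(3)
profile = np.concatenate([rng.normal(-35, 1, 40), rng.normal(-10, 1, 30), rng.normal(-35, 1, 30)])
heights = np.arange(len(profile)) * 0.12   # km per range gate (assumed)
idx = level_shift_boundaries(profile)
print("candidate cloud boundaries at heights (km):", np.round(heights[idx], 2))
```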
Funding: Supported by the Second Tibetan Plateau Scientific Expedition and Research Program (STEP) (Grant No. 2019QZKK010203), the National Natural Science Foundation of China (Grant Nos. 42175174 and 41975130), the Natural Science Foundation of Sichuan Province (Grant No. 2022NSFSC1092), and the Sichuan Provincial Innovation Training Program for College Students (Grant No. S202210621009).
Abstract: In a convective scheme featuring a discretized cloud size density, the assumed lateral mixing rate is inversely proportional to the exponential coefficient of plume size. This coefficient typically follows an assumption of -1, but it carries inherent uncertainties, especially for deep-layer clouds. Addressing this knowledge gap, we conducted comprehensive large eddy simulations and comparative analyses focused on terrestrial regions. Our investigation revealed that cloud formation adheres to the tenets of Bernoulli trials, illustrating power-law scaling between cloud size and the number of clouds that remains consistent regardless of the inherent deep-layer cloud attributes. This scaling paradigm encompasses liquid, ice, and mixed phases in deep-layer clouds. The exponent characterizing the interplay between cloud scale and number in deep-layer clouds, specifically for liquid, ice, or mixed-phase clouds, resembles that of shallow convection, but converges closely to zero. This convergence signifies a propensity for diminished cloud numbers and sizes within deep-layer clouds. Notably, the infusion of abundant moisture and the release of latent heat by condensation within the lower atmospheric strata make substantial contributions; however, their role in ice phase formation is limited. The emergence of liquid and ice phases in deep-layer clouds is facilitated by the latent heat and influenced by the wind shear inherent in the middle levels. These interrelationships hold potential applications in formulating parameterizations and post-processing model outcomes.
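The size-number scaling exponent can be estimated with a simple log-log fit; the histogram below is synthetic and only illustrates the procedure, not the simulation results above.

```python
import numpy as np

def fit_power_law_exponent(sizes, counts):
    """Least-squares slope of log(count) vs. log(size): the exponent b in N(s) ~ a * s**b."""
    b, log_a = np.polyfit(np.log(sizes), np.log(counts), deg=1)
    return b, np.exp(log_a)

# Synthetic cloud-size histogram (size in km vs. number of clouds), roughly N ~ s^-1 with noise.
rng = np.random.default_rng(7)
sizes = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])
counts = 400.0 * sizes ** -1.0 * rng.lognormal(0.0, 0.1, sizes.size)
b, a = fit_power_law_exponent(sizes, counts)
print(f"fitted exponent b = {b:.2f} (the typical assumption is -1; the deep-layer result above is closer to 0)")
```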