The series of monthly cloudiness over the global ocean from COADS was compared with that from the Nimbus-7 satellite for April 1979 to March 1985. The correspondence between the two is good. Both observation methods provide useful information on the distribution of cloudiness, and the two data sets are mutually complementary.
The relationship between the δ13C of tree-ring cellulose from Pinus koraiensis and climate parameters was investigated. A significantly negative correlation between δ13C and mean low-cloud amount from May to July was found, which allows the mean May-July low-cloud amount at Antu over the past 200 years to be reconstructed. Quasi-8-year, quasi-4-year, and quasi-2-year periodicities were detected in both the δ13C series and the reconstructed low-cloud-amount series at the 95% confidence level. The quasi-8-year period may reflect the combined influence of solar activity, monsoon activity, and local regional factors; the quasi-4-year and quasi-2-year periods indicate the influences of ENSO and the Quasi-Biennial Oscillation (QBO) of the East Asian monsoon, respectively.
Based on the NOAA Advanced Very High Resolution Radiometer (AVHRR) Pathfinder Atmospheres Extended (PATMOS-x) monthly mean cloud amount data, variations of annual and seasonal mean cloud amount over the Yangtze River Delta (YRD), China were examined for the period 1982-2006 using linear regression analysis. Both total and high-level cloud amounts peak in June and reach a minimum in December; mid-level clouds peak during winter months and reach a minimum in summer; and low-level clouds vary weakly throughout the year, with a weak maximum from August to October. For the annual mean, a slightly decreasing tendency in total cloud amount (-0.6% sky cover per decade) is observed during the study period, mainly due to the reduction of annual mean high-level cloud amount (-2.2% sky cover per decade). Mid-level clouds occur least often (approximately 15% sky cover) and remain invariant, while the low-level cloud amount shows a significant increase during spring (1.5% sky cover per decade) and summer (3.0% sky cover per decade). Further analysis reveals that the increased low-level clouds during summer are mainly influenced by the local environment: compared with the low-level cloud amounts over adjacent rural areas (e.g., cropland, large water bodies, and forest-covered mountain areas), those over and around urban agglomerations rise more dramatically.
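The per-decade trends quoted above come from ordinary least-squares fits to the annual series. A minimal sketch of that calculation, using synthetic data since the PATMOS-x series itself is not reproduced here:

```python
import numpy as np

def trend_per_decade(years, values):
    """Least-squares linear trend of an annual series, expressed per decade."""
    slope, _intercept = np.polyfit(years, values, 1)  # slope in units per year
    return slope * 10.0

# Synthetic annual-mean cloud amount (% sky cover) declining 0.6% per decade.
years = np.arange(1982, 2007)
cloud = 60.0 - 0.06 * (years - 1982)
print(round(trend_per_decade(years, cloud), 2))  # -0.6
```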
A novel approach was developed for the determination of ultratrace amounts of copper in water samples by electrothermal atomic absorption spectrometry (ETAAS) after cloud point extraction (CPE). 1-(2-Pyridylazo)-2-naphthol was used as the chelating reagent and Triton X-114 as the micelle-forming surfactant. CPE was conducted in a pH 8.0 medium at 40 ℃ for 10 min. After separation of the phases by centrifugation, the surfactant-rich phase was diluted with 1 mL of a methanol solution of 0.1 mol/L HNO3. Then 20 μL of the diluted surfactant-rich phase was injected into the graphite furnace for atomization in the absence of any matrix modifier. The experimental conditions affecting the extraction and atomization processes were optimized. A detection limit of 5 ng/L was obtained after preconcentration. The linear dynamic range of the copper mass concentration was 0-2.0 ng/mL, and the relative standard deviation was less than 3.1% for a sample containing 1.0 ng/mL Cu(Ⅱ). The developed method was successfully applied to the determination of ultratrace amounts of Cu in drinking water, tap water, and seawater samples.
A new method based on cloud point extraction (CPE) for the separation and preconcentration of nickel(Ⅱ), followed by its determination by graphite furnace atomic absorption spectrometry (GFAAS), was proposed. 8-Hydroxyquinoline and Triton X-100 were used as the ligand and surfactant, respectively. Nickel(Ⅱ) forms a hydrophobic complex with 8-hydroxyquinoline, which can be extracted into the small-volume surfactant-rich phase at the cloud point temperature (CPT) for GFAAS determination. The factors affecting the cloud point extraction, such as pH, ligand concentration, surfactant concentration, and incubation time, were optimized. Under the optimal conditions, a detection limit of 12 ng/L and a relative standard deviation (RSD) of 2.9% were obtained for Ni(Ⅱ) determination. The enrichment factor was 25. The proposed method was successfully applied to the determination of nickel(Ⅱ) in a certified reference material and different types of water samples, with recoveries in the range of 95%-103%.
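Detection limits such as those quoted in the two CPE abstracts above are conventionally computed as three times the standard deviation of replicate blank measurements divided by the calibration slope. A small illustration with hypothetical blank signals and slope (the abstracts do not report these raw values):

```python
import statistics

def detection_limit(blank_signals, slope):
    """IUPAC-style limit of detection: 3 x stdev of blanks / calibration slope."""
    s_blank = statistics.stdev(blank_signals)  # sample standard deviation
    return 3.0 * s_blank / slope

# Hypothetical blank absorbances and calibration slope (absorbance per ng/L).
blanks = [0.0010, 0.0012, 0.0009, 0.0011, 0.0010]
slope = 0.000028
print(round(detection_limit(blanks, slope), 1))  # 12.2
```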
This paper analyzes the correlation between variations of total and low cloud amounts and the varying features of aerosols related to the urban development of Beijing, using a cubic spline fitting method applied to monthly meteorological data on temperature, humidity, precipitation, clouds, and aerosol optical depth (AOD) during 1950-2005. Statistics on the development of the city of Beijing in this period, including total industrial output, population, residential housing development, highway construction, and charcoal production, are also presented. Accompanying the rapid urban development of Beijing over the past 55 years or so, the urban aerosol concentration and composition have changed. The results indicate that: 1) there is a general trend of climate warming and drying in Beijing; 2) the total cloud amount in all seasons declines drastically, while the low cloud amount climbs slightly; 3) the high correlations between cloud amount and indices of Beijing's urban development, such as housing area, charcoal production, and road construction, show that the variation of cloud amount is closely related to urban development; and 4) the changing trend of AOD coincides more closely with the variation of low cloud amount. The evident drop in total cloud amount is in agreement with the trend of climate warming and drying, while the slight growth of low cloud amount is likely caused by more haze and fog occurrences in the lower troposphere, associated with the pollution responsible for the "darkening" of Beijing and the surrounding areas.
Recently, there have been attempts to apply Transformers to 3D point cloud classification. To reduce computation, most existing methods focus on local spatial attention, but they ignore point content and fail to establish relationships between distant but relevant points. To overcome this limitation of local spatial attention, we propose a point content-based Transformer architecture, called PointConT for short. It exploits the locality of points in the feature space (content-based): it clusters sampled points with similar features into the same class and computes self-attention within each class, enabling an effective trade-off between capturing long-range dependencies and computational complexity. We further introduce an Inception feature aggregator for point cloud classification, which uses parallel structures to aggregate high-frequency and low-frequency information in separate branches. Extensive experiments show that our PointConT model achieves remarkable performance on point cloud shape classification. In particular, our method achieves 90.3% Top-1 accuracy on the hardest setting of ScanObjectNN. Source code is available at https://github.com/yahuiliu99/PointConT.
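The content-based clustering idea can be sketched in a few lines of NumPy: group points by feature similarity, then attend only within each group. This is an illustrative toy (plain k-means, q = k = v, no learned projections), not the authors' implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def content_clustered_attention(feats, n_clusters=2, iters=10):
    """Cluster points by feature similarity, then run self-attention only
    inside each cluster (toy version of content-based attention)."""
    # Deterministic farthest-point initialization for k-means in feature space.
    centers = [feats[0]]
    for _ in range(n_clusters - 1):
        dists = np.min([((feats - c) ** 2).sum(-1) for c in centers], axis=0)
        centers.append(feats[np.argmax(dists)])
    centers = np.array(centers, dtype=float)
    for _ in range(iters):
        labels = np.argmin(((feats[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for k in range(n_clusters):
            if np.any(labels == k):
                centers[k] = feats[labels == k].mean(axis=0)
    # Self-attention restricted to each cluster, with q = k = v = features.
    out = np.empty_like(feats, dtype=float)
    for k in range(n_clusters):
        idx = np.where(labels == k)[0]
        if len(idx) == 0:
            continue
        x = feats[idx]
        attn = softmax(x @ x.T / np.sqrt(x.shape[1]))
        out[idx] = attn @ x
    return labels, out
```

Full attention over N points costs O(N²); attending within clusters of size m costs roughly O(N·m), which is the trade-off the abstract describes.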
Access control is an effective method to protect user data privacy. Access control schemes based on blockchain and ciphertext-policy attribute-based encryption (CP-ABE) can solve the problems of a single point of failure and lack of trust in centralized systems. However, they also bring new problems for health information in cloud storage environments, such as attribute leakage, low consensus efficiency, and complex permission updates. This paper proposes an access control scheme with fine-grained attribute revocation, keyword search, and traceability of the attribute private key distribution process. Blockchain technology tracks the authorization of attribute private keys, and a credit scoring method improves the consensus efficiency of the Raft protocol. In addition, the InterPlanetary File System (IPFS) addresses the capacity deficit of the blockchain. Under the premise of policy hiding, the research proposes a fine-grained access control method based on users, user attributes, and file structure, optimizing the data-sharing mode. At the same time, proxy re-encryption (PRE) technology is used to update access rights. The proposed scheme is proven secure. Comparative analysis and experimental results show that the proposed scheme offers higher efficiency and more functionality, and can meet the needs of medical institutions.
We introduce Quafu-Qcover, an open-source cloud-based software package developed for solving combinatorial optimization problems using quantum simulators and hardware backends. Quafu-Qcover provides a standardized and comprehensive workflow built on the quantum approximate optimization algorithm (QAOA). It automatically converts the original problem into a quadratic unconstrained binary optimization (QUBO) model and its corresponding Ising model, which can subsequently be transformed into a weighted graph. The core of Qcover relies on a graph-decomposition-based classical algorithm, which efficiently derives the optimal parameters for shallow QAOA circuits. Quafu-Qcover incorporates a dedicated compiler capable of translating QAOA circuits into physical quantum circuits executable on Quafu cloud quantum computers. Compared with a general-purpose compiler, our compiler generates shorter circuit depths and is also faster. Additionally, the Qcover compiler can dynamically build a library of qubit coupling substructures in real time, using the most recent calibration data from the superconducting quantum devices, ensuring that computational tasks are assigned to connected physical qubits with the highest fidelity. Quafu-Qcover allows quantum computing sampling results to be retrieved using a task ID at any time, enabling asynchronous processing. Moreover, it incorporates modules for result preprocessing and visualization, facilitating an intuitive display of solutions to combinatorial optimization problems. We hope that Quafu-Qcover can serve as an instructive example of how to explore application problems on Quafu cloud quantum computers.
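The QUBO-to-Ising conversion that Quafu-Qcover automates is a fixed change of variables, s = 2x - 1. A self-contained sketch of that mapping (not Qcover's API, whose interface is not shown in the abstract):

```python
import numpy as np

def qubo_to_ising(Q):
    """Map E(x) = x^T Q x with x in {0,1}^n to an Ising form
    E(s) = offset + h.s + sum_{i<j} J_ij s_i s_j, where s = 2x - 1."""
    Q = np.asarray(Q, dtype=float)
    off_diag = Q - np.diag(np.diag(Q))
    # Linear fields: diagonal terms plus row/column sums of off-diagonal terms.
    h = np.diag(Q) / 2 + (off_diag.sum(axis=1) + off_diag.sum(axis=0)) / 4
    # Pairwise couplings, kept strictly upper-triangular.
    J = np.triu(off_diag + off_diag.T, k=1) / 4
    offset = np.diag(Q).sum() / 2 + off_diag.sum() / 4
    return h, J, offset

# Demo: verify on a random 3-variable QUBO that energies agree for every x.
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 3))
h, J, offset = qubo_to_ising(Q)
for m in range(8):
    x = np.array([(m >> i) & 1 for i in range(3)], dtype=float)
    s = 2 * x - 1
    assert abs(x @ Q @ x - (offset + h @ s + s @ J @ s)) < 1e-9
print("QUBO and Ising energies match")
```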
Cloud computing provides a diverse and adaptable resource pool over the internet, allowing users to tap into various resources as needed, and has been seen as a robust solution to relevant challenges. However, significant delay can hamper the performance of IoT-enabled cloud platforms, while efficient task scheduling can lower the cloud infrastructure's energy consumption, thus maximizing the service provider's revenue by decreasing user job processing times. The proposed Modified Chimp-Whale Optimization Algorithm (MCWOA) combines elements of the Chimp Optimization Algorithm (COA) and the Whale Optimization Algorithm (WOA). To enhance MCWOA's identification precision, the Sobol sequence is used in the population initialization phase, ensuring an even distribution of the population across the solution space. Moreover, the local search capabilities are augmented by incorporating the whale optimization algorithm's bubble-net hunting and random search mechanisms into the position-updating process. This study demonstrates the effectiveness of the proposed approach using a two-story rigid frame and a simply supported beam model. Simulated outcomes reveal that the new method outperforms the original algorithm, especially in multi-damage detection scenarios, excelling in avoiding false positives and enhancing computational speed. The efficiency of the proposed MCWOA is further assessed against metrics such as energy usage, computational expense, task duration, and delay; the simulated data indicate that MCWOA outpaces other methods across all metrics. The compared methods include the Whale Optimization Algorithm (WOA), Chimp Algorithm (CA), Ant Lion Optimizer (ALO), Genetic Algorithm (GA), and Grey Wolf Optimizer (GWO).
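Low-discrepancy initialization spreads the starting population evenly over the search box rather than sampling it uniformly at random. The paper uses a Sobol sequence; the sketch below uses a Halton sequence instead, purely to keep the illustration dependency-free, so it shows the intent rather than the exact method:

```python
def halton(index, base):
    """Van der Corput radical inverse of `index` in the given base."""
    f, r = 1.0, 0.0
    while index > 0:
        f /= base
        r += f * (index % base)
        index //= base
    return r

def quasi_random_population(n_agents, bounds, bases=(2, 3, 5, 7, 11)):
    """Spread an initial population over the search box using a Halton
    low-discrepancy sequence (one prime base per dimension)."""
    pop = []
    for i in range(1, n_agents + 1):
        point = [lo + halton(i, bases[d]) * (hi - lo)
                 for d, (lo, hi) in enumerate(bounds)]
        pop.append(point)
    return pop

print(quasi_random_population(4, [(0.0, 1.0), (0.0, 1.0)]))
```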
With the rapid advancement of quantum computing, hybrid quantum-classical machine learning has shown numerous potential applications at the current stage, with expectations of being achievable in the noisy intermediate-scale quantum (NISQ) era. Quantum reinforcement learning, as an indispensable line of study, has recently demonstrated its ability to solve standard benchmark environments with formally provable theoretical advantages over classical counterparts. However, despite the progress of quantum processors and the emergence of quantum computing clouds, implementations of quantum reinforcement learning algorithms utilizing parameterized quantum circuits (PQCs) on NISQ devices remain infrequent. In this work, we take the first step towards executing benchmark quantum reinforcement problems on real devices equipped with at most 136 qubits on the BAQIS Quafu quantum computing cloud. The experimental results demonstrate that the policy agents can successfully accomplish objectives under modified conditions in both the training and inference phases. Moreover, we design hardware-efficient PQC architectures in the quantum model using a multi-objective evolutionary algorithm and develop a learning algorithm that is adaptable to quantum devices. We hope that Quafu-RL can be a guiding example of how to realize machine learning tasks by taking advantage of quantum computers on the quantum cloud platform.
Cultural relic line drawings serve as a crucial form of traditional artifact documentation: they are simple and intuitive products with a low display cost compared with 3D models. Dimensionality reduction is necessary for line drawings. However, most existing methods for artifact drawing rely on the principles of orthographic projection, which cannot avoid angle occlusion and data overlapping when the surface of a cultural relic is complex. Therefore, conformal mapping was introduced as a dimensionality reduction approach to compensate for the limitations of orthographic projection. Based on given criteria for assessing surface complexity, this paper proposes a three-dimensional feature guideline extraction method for complex cultural relic surfaces. A combined 2D-3D factor, the vertex weight, was designed to measure the importance of points in describing surface features. The selection threshold for feature guideline extraction was then determined from the differences between the vertex weight and shape index distributions. Feasibility and stability were verified through experiments on real cultural relic surface data. The results demonstrate the method's ability to address the challenges associated with the automatic generation of line drawings for complex surfaces. The extraction method and the obtained results will be useful for drawing, displaying, and promoting cultural relic line graphics.
Security issues in cloud networks and edge computing have become very common. This research focuses on analyzing such issues and developing the best solutions, based on a detailed literature review. The findings show that many challenges are linked to edge computing, such as privacy concerns, security breaches, high costs, and low efficiency. Therefore, proper security measures are needed to overcome these issues. Emerging trends, such as machine learning, encryption, artificial intelligence, and real-time monitoring, can help mitigate security issues and support a secure and safe future for cloud computing. It is concluded that the security implications of edge computing can be addressed with the help of new technologies and techniques.
Cloud vertical structure (CVS) strongly affects atmospheric circulation and radiative transfer, yet long-term, ground-based observations are scarce over the Tibetan Plateau (TP) despite its vital role in global climate. This study utilizes ground-based lidar and Ka-band cloud profiling radar (KaCR) measurements at Yangbajain (YBJ), TP, from October 2021 to September 2022 to characterize cloud properties. A novel, satisfactorily performing anomaly detection algorithm (LevelShiftAD) is proposed for lidar and KaCR profiles to identify cloud boundaries. Cloud base heights (CBH) retrieved from KaCR and lidar observations show good consistency, with a correlation coefficient of 0.78 and a mean difference of -0.06 km. Cloud top heights (CTH) derived from KaCR match the FengYun-4A and Himawari-8 products well. Thus, KaCR measurements serve as the primary dataset for investigating CVSs over the TP. Different diurnal cycles occur in summer and winter: in winter, the diurnal cycle is characterized by a pronounced increase in cloud occurrence frequency in the afternoon and an early-morning decrease, while in summer cloud amounts remain high all day, with scattered nocturnal increases. Summer features more frequent clouds with larger geometrical thicknesses, a higher multi-layer ratio, and greater inter-cloud spacing. Around 26% of cloud bases occur below 0.5 km. Winter exhibits a bimodal distribution of cloud base heights, with peaks at 0-0.5 km and 2-2.5 km. Single-layer and geometrically thin clouds prevail at YBJ. This study enriches long-term measurements of CVSs over the TP, and the robust anomaly detection method helps quantify cloud macrophysical properties via synergistic lidar and radar observations.
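Cloud boundaries appear in lidar or radar profiles as abrupt level shifts in the returned signal. The study's LevelShiftAD algorithm is not described in detail in the abstract, so the sketch below is a generic level-shift detector on a toy profile, comparing window means on either side of each range gate:

```python
import statistics

def level_shifts(signal, window=3, threshold=5.0):
    """Flag indices where the mean of the next `window` samples differs from
    the mean of the previous `window` samples by more than `threshold`.
    Illustrative stand-in for a level-shift anomaly detector, not the
    LevelShiftAD algorithm from the study."""
    shifts = []
    for i in range(window, len(signal) - window + 1):
        before = statistics.mean(signal[i - window:i])
        after = statistics.mean(signal[i:i + window])
        if abs(after - before) > threshold:
            shifts.append(i)
    return shifts

# Toy lidar-like profile: clear air (~0), a cloud layer (~20), clear air again.
profile = [0.1, 0.2, 0.1, 0.2, 20.0, 21.0, 20.5, 20.8, 0.2, 0.1, 0.2]
print(level_shifts(profile))
```

A real retrieval would group the flagged gates into cloud base and top heights; here the flags simply bracket the edges of the elevated layer.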
In a convective scheme featuring a discretized cloud size density, the assumed lateral mixing rate is inversely proportional to the exponential coefficient of plume size. This coefficient typically follows an assumed value of -1, but that assumption carries inherent uncertainties, especially for deep-layer clouds. Addressing this knowledge gap, we conducted comprehensive large eddy simulations and comparative analyses focused on terrestrial regions. Our investigation reveals that cloud formation adheres to the tenets of Bernoulli trials, exhibiting power-law scaling between cloud size and the number of clouds that remains consistent regardless of the inherent deep-layer cloud attributes. This scaling paradigm encompasses the liquid, ice, and mixed phases of deep-layer clouds. The exponent characterizing the interplay between cloud size and number in deep-layer clouds, whether liquid, ice, or mixed-phase, resembles that of shallow convection but converges closely to zero, signifying a propensity for diminished cloud numbers and sizes within deep-layer clouds. Notably, the infusion of abundant moisture and the release of latent heat by condensation within the lower atmospheric strata contribute substantially, although their role in ice-phase formation is limited. The emergence of the liquid and ice phases in deep-layer clouds is facilitated by latent heat and influenced by wind shear at middle levels. These interrelationships hold potential applications in formulating parameterizations and post-processing model outcomes.
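The size-number exponent discussed above is obtained by fitting a power law N(s) ∝ s^b on log-log axes. A minimal sketch with synthetic counts generated at the typical assumed exponent of -1:

```python
import numpy as np

def power_law_exponent(sizes, counts):
    """Fit N(s) = c * s^b by linear regression in log-log space; return b."""
    b, _log_c = np.polyfit(np.log(sizes), np.log(counts), 1)
    return b

# Synthetic size-number data following the typical assumption b = -1.
sizes = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
counts = 1000.0 * sizes ** -1.0
print(round(power_law_exponent(sizes, counts), 3))  # -1.0
```

An exponent fitted closer to zero, as the simulations above find for deep-layer clouds, would correspond to a flatter size distribution with fewer large clouds.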
The cloud type product 2B-CLDCLASS-LIDAR, based on CloudSat and CALIPSO data from June 2006 to May 2017, is used to examine the temporal and spatial distribution characteristics and interannual variability of eight cloud types (high cloud, altostratus, altocumulus, stratus, stratocumulus, cumulus, nimbostratus, and deep convection) and three phases (ice, mixed, and water) in the Arctic. Possible reasons for the observed interannual variability are also discussed. The main conclusions are as follows: (1) More water clouds occur on the Atlantic side, and more ice clouds occur over continents. (2) The average spatial and seasonal distributions of cloud types show three patterns: high clouds and most cumuliform clouds are concentrated at low-latitude locations and peak in summer; altostratus and nimbostratus are concentrated over and around continents and are less abundant in summer; stratocumulus and stratus are concentrated near the inner Arctic and peak during spring and autumn. (3) Regionally averaged interannual frequencies of ice clouds and altostratus clouds decrease significantly, while those of water clouds, altocumulus, and cumulus clouds increase significantly. (4) Significant features of the linear trends of cloud frequencies are mainly located over ocean areas. (5) The monthly water cloud frequency anomalies are positively correlated with air temperature in most of the troposphere, while those of ice clouds are negatively correlated. (6) The decrease in altostratus clouds is associated with the weakening of the Arctic front due to Arctic warming, while increased water vapor transport into the Arctic and higher atmospheric instability lead to more cumulus and altocumulus clouds.
Without proper security mechanisms, medical records stored electronically can be accessed more easily than physical files. Patient health information is scattered throughout the hospital environment, including laboratories, pharmacies, and daily medical status reports. The electronic format of medical reports ensures that all information is available in a single place. However, it is difficult to store and manage large amounts of data: dedicated servers and a data center are needed to store and manage patient data, and self-managed data centers are expensive for hospitals. Storing data in a cloud is a cheaper alternative, with the advantage that data can be retrieved anywhere and anytime using any device connected to the Internet. Therefore, doctors can easily access the medical history of a patient and diagnose diseases according to the context. It also helps prescribe the correct medicine to a patient in an appropriate way. The systematic storage of medical records could help reduce medical errors in hospitals. The challenge is to store medical records on a third-party cloud server while addressing privacy and security concerns, since these servers are often semi-trusted and sensitive medical information must be protected. Open access to records, and modifications performed on the information in those records, may even cause patient fatalities. Patient-centric health-record security is therefore a major concern. End-to-end file encryption before outsourcing data to a third-party cloud server ensures security. This paper presents a method combining the Advanced Encryption Standard (AES) and elliptic curve Diffie-Hellman, designed to increase the efficiency of medical record security for users.
Comparisons of existing and proposed techniques are presented at the end of the article, with a focus on analyzing the security approaches of the elliptic curve and secret-sharing methods. This study aims to provide a high level of security for patient health records.
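The AES plus elliptic-curve Diffie-Hellman combination follows the usual hybrid pattern: agree on a shared secret, derive a symmetric key from it, then encrypt the record. The sketch below illustrates only that pattern, using textbook finite-field Diffie-Hellman and a hash-derived keystream so it runs on the standard library alone; it is a toy, not a secure implementation, and not the paper's scheme:

```python
import hashlib
import secrets

P = 2**127 - 1  # a known Mersenne prime; toy modulus, not a secure group
G = 3

def keystream(key: bytes, n: int) -> bytes:
    """Derive n pseudo-random bytes from the key via SHA-256 in counter mode."""
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def xor_encrypt(key: bytes, data: bytes) -> bytes:
    """XOR stream cipher; applying it twice with the same key decrypts."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# Key agreement: each party publishes G^x mod P and combines the peer's
# public value with its own private exponent to reach the same secret.
a_priv = secrets.randbelow(P - 2) + 2
b_priv = secrets.randbelow(P - 2) + 2
a_pub, b_pub = pow(G, a_priv, P), pow(G, b_priv, P)
a_key = hashlib.sha256(str(pow(b_pub, a_priv, P)).encode()).digest()
b_key = hashlib.sha256(str(pow(a_pub, b_priv, P)).encode()).digest()

record = b"patient-id:42;bp:120/80"
ciphertext = xor_encrypt(a_key, record)
print(xor_encrypt(b_key, ciphertext) == record)  # True
```

A production system would replace the toy group with a standard elliptic curve and the keystream with AES-GCM from a vetted cryptography library.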
Funding (YRD cloud amount study): National Basic Research and Development (973) Program of China (2010CB428501); National Natural Science Foundation of China (41375014).
Funding (copper CPE study): Analysis and Testing Foundation of Zhejiang Province (No. 04045).
Funding (nickel CPE study): National Natural Science Foundation of China (No. 20075009).
Funding (Beijing cloud and aerosol study): Special Grant in Atmospheric Sciences of the China Meteorological Administration (GYHY200706036); National "973" Program of China (2011CB403404); International Cooperation Project on Monsoon Monitoring (2009DFA21430).
Abstract: This paper analyzes the correlation between variations of total and low cloud amounts and the changing characteristics of aerosols related to the urban development of Beijing, using the cubic spline fitting method on monthly meteorological data of temperature, humidity, precipitation, clouds, and aerosol optical depth (AOD) during 1950-2005. Statistics on the development of the city of Beijing in this period, including total industrial output, population, residential housing development, highway construction, and charcoal production, are presented. Accompanying the rapid urban development of Beijing over the past 55 years or so, the urban aerosol concentration and composition have changed. The results indicate that: 1) there is a general trend of climate warming and drying in Beijing; 2) the total cloud amount in all seasons declines drastically, while the low cloud amount climbs slightly; 3) the high correlations between cloud amount and indices of Beijing urban development, such as housing area, charcoal production, and road construction, show that the variation of cloud amount is closely related to urban development; 4) the changing trend of AOD coincides more closely with the variation of low cloud amount. The evident drop in total cloud amount is in agreement with the trend of climate warming and drying, while the slight growth in low cloud amount is likely caused by more haze and fog occurrences in the lower troposphere, in association with the pollution responsible for the "darkening" of Beijing and the surrounding areas.
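The trend analysis above rests on fitting a smooth curve to a noisy multi-decadal series. As a rough illustration of the idea (a plain least-squares cubic fit in NumPy is used here as a simplified stand-in for the paper's cubic spline method, and the annual series below is synthetic, not the Beijing record):

```python
import numpy as np

def cubic_trend(years, values):
    """Least-squares cubic fit: a simple stand-in for cubic-spline smoothing."""
    t = years - years.mean()          # center to keep the fit well conditioned
    coeffs = np.polyfit(t, values, deg=3)
    return np.polyval(coeffs, t)

# Synthetic annual total-cloud-amount series (percent sky cover) with a slow
# decline plus a periodic perturbation, loosely mimicking a 1950-2005 record.
years = np.arange(1950, 2006, dtype=float)
cloud = 60.0 - 0.05 * (years - 1950) + np.sin(years)
trend = cubic_trend(years, cloud)
```

The fitted curve separates the slow decline from year-to-year wiggles, which is the sense in which the abstract's "declines drastically" statements are read off the smoothed series.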
Funding: Supported in part by the National Natural Science Foundation of China (61876011), the National Key Research and Development Program of China (2022YFB4703700), the Key Research and Development Program 2020 of Guangzhou (202007050002), and the Key-Area Research and Development Program of Guangdong Province (2020B090921003)
Abstract: Recently, there have been attempts to apply Transformers to 3D point cloud classification. To reduce computation, most existing methods focus on local spatial attention, but they ignore point content and fail to establish relationships between distant but relevant points. To overcome the limitation of local spatial attention, we propose a point content-based Transformer architecture, called PointConT for short. It exploits the locality of points in the feature space (content-based), clustering sampled points with similar features into the same class and computing self-attention within each class, thus enabling an effective trade-off between capturing long-range dependencies and computational complexity. We further introduce an inception feature aggregator for point cloud classification, which uses parallel structures to aggregate high-frequency and low-frequency information in each branch separately. Extensive experiments show that our PointConT model achieves remarkable performance on point cloud shape classification. In particular, our method achieves 90.3% Top-1 accuracy on the hardest setting of ScanObjectNN. Source code for this paper is available at https://github.com/yahuiliu99/PointConT.
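The core idea (cluster points by feature similarity, then restrict self-attention to each cluster) can be sketched in a toy NumPy form. This is not the authors' implementation: plain k-means stands in for their feature-space clustering, and a single attention head with no learned projections stands in for the Transformer block.

```python
import numpy as np

def content_based_attention(feats, n_clusters=4, n_iter=10, seed=0):
    """Cluster points by feature ('content') similarity, then compute
    self-attention only among members of the same cluster."""
    rng = np.random.default_rng(seed)
    centers = feats[rng.choice(len(feats), n_clusters, replace=False)]
    for _ in range(n_iter):                      # plain k-means in feature space
        dist = ((feats[:, None, :] - centers[None]) ** 2).sum(-1)
        labels = dist.argmin(1)
        for k in range(n_clusters):
            if (labels == k).any():
                centers[k] = feats[labels == k].mean(0)
    out = np.empty_like(feats)
    for k in range(n_clusters):
        idx = np.where(labels == k)[0]
        x = feats[idx]                           # tokens of one cluster only
        attn = x @ x.T / np.sqrt(x.shape[1])     # scaled dot-product scores
        attn = np.exp(attn - attn.max(1, keepdims=True))
        attn /= attn.sum(1, keepdims=True)       # softmax over cluster members
        out[idx] = attn @ x                      # attention never leaves the cluster
    return out

feats = np.random.default_rng(1).normal(size=(64, 8))
out = content_based_attention(feats, n_clusters=4)
```

Because attention is computed per cluster, cost scales with cluster size rather than with the full point count, which is the trade-off the abstract describes.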
Funding: This research was funded by the National Natural Science Foundation of China (Grant Number 62162039) and the Shaanxi Provincial Key R&D Program, China (Grant Number 2020GY-041)
Abstract: An access control scheme is an effective method to protect user data privacy. Access control schemes based on blockchain and ciphertext-policy attribute-based encryption (CP-ABE) can solve the problems of single point of failure and lack of trust in centralized systems. However, they also bring new problems for health information in the cloud storage environment, such as attribute leakage, low consensus efficiency, and complex permission updates. This paper proposes an access control scheme with fine-grained attribute revocation, keyword search, and traceability of the attribute private key distribution process. Blockchain technology tracks the authorization of attribute private keys, and a credit scoring method improves the consensus efficiency of the Raft protocol. Besides, the InterPlanetary File System (IPFS) addresses the capacity deficit of blockchain. Under the premise of policy hiding, the research proposes a fine-grained access control method based on users, user attributes, and file structure, optimizing the data-sharing mode. At the same time, proxy re-encryption (PRE) technology is used to update access rights. The proposed scheme is proved to be secure. Comparative analysis and experimental results show that the proposed scheme offers higher efficiency and more functionality, and can meet the needs of medical institutions.
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 92365206 and 12247168) and the China Postdoctoral Science Foundation (Certificate Numbers 2023M740272 and 2022TQ0036)
Abstract: We introduce Quafu-Qcover, an open-source, cloud-based software package developed for solving combinatorial optimization problems using quantum simulators and hardware backends. Quafu-Qcover provides a standardized and comprehensive workflow built around the quantum approximate optimization algorithm (QAOA). It automatically converts the original problem into a quadratic unconstrained binary optimization (QUBO) model and its corresponding Ising model, which can subsequently be transformed into a weighted graph. The core of Qcover relies on a graph-decomposition-based classical algorithm, which efficiently derives the optimal parameters for the shallow QAOA circuit. Quafu-Qcover incorporates a dedicated compiler capable of translating QAOA circuits into physical quantum circuits that can be executed on Quafu cloud quantum computers. Compared to a general-purpose compiler, our compiler generates shorter circuit depths and runs faster. Additionally, the Qcover compiler can dynamically create a library of qubit coupling substructures in real time, using the most recent calibration data from the superconducting quantum devices; this ensures that computational tasks are assigned to connected physical qubits with the highest fidelity. Quafu-Qcover allows users to retrieve quantum computing sampling results using a task ID at any time, enabling asynchronous processing. Moreover, it incorporates modules for result preprocessing and visualization, facilitating an intuitive display of solutions to combinatorial optimization problems. We hope that Quafu-Qcover can serve as an instructive illustration of how to explore application problems on the Quafu cloud quantum computers.
Abstract: Cloud computing provides a diverse and adaptable resource pool over the internet, allowing users to tap into various resources as needed, and has been seen as a robust solution to relevant challenges. A significant delay can hamper the performance of IoT-enabled cloud platforms, but efficient task scheduling can lower the cloud infrastructure's energy consumption, maximizing the service provider's revenue by decreasing user job processing times. The proposed Modified Chimp-Whale Optimization Algorithm (MCWOA) combines elements of the Chimp Optimization Algorithm (COA) and the Whale Optimization Algorithm (WOA). To enhance MCWOA's identification precision, a Sobol sequence is used in the population initialization phase, ensuring an even distribution of the population across the solution space. Moreover, the local search capabilities are augmented by incorporating the whale optimization algorithm's bubble-net hunting and random search mechanisms into MCWOA's position-updating process. The efficiency of the proposed MCWOA is assessed against metrics such as energy usage, computational expense, task duration, and delay. The simulated data indicate that the new MCWOA outpaces the compared methods across all metrics, including the Whale Optimization Algorithm (WOA), Chimp Algorithm (CA), Ant Lion Optimizer (ALO), Genetic Algorithm (GA), and Grey Wolf Optimizer (GWO).
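One of the WOA mechanisms the abstract folds into MCWOA, the bubble-net (logarithmic spiral) move toward the best solution found so far, can be sketched as follows. This is a generic, standalone NumPy version of the classic WOA update, not the paper's MCWOA code; drawing the spiral parameter `l` per dimension (rather than as a single scalar) is a simplification made for the example.

```python
import numpy as np

def bubble_net_update(x, best, b=1.0, rng=None):
    """WOA bubble-net move: spiral a candidate solution toward the current best.
    x_new = D' * exp(b*l) * cos(2*pi*l) + best, where D' = |best - x| and
    l ~ Uniform(-1, 1); b controls the shape of the logarithmic spiral."""
    rng = np.random.default_rng() if rng is None else rng
    l = rng.uniform(-1.0, 1.0, size=np.shape(x))
    d = np.abs(np.asarray(best, dtype=float) - np.asarray(x, dtype=float))
    return d * np.exp(b * l) * np.cos(2.0 * np.pi * l) + best

# One spiral step for a 5-dimensional candidate toward a best-so-far point.
step = bubble_net_update(np.ones(5), np.zeros(5), rng=np.random.default_rng(0))
```

When a candidate already coincides with the best solution, the distance term D' is zero and the update leaves it in place, which is why the mechanism intensifies search around good solutions.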
Funding: Supported by the Beijing Academy of Quantum Information Sciences, the National Natural Science Foundation of China (Grant Nos. 92365206 and 12247168), and the China Postdoctoral Science Foundation (Certificate Numbers 2023M740272 and 2022TQ0036)
Abstract: With the rapid advancement of quantum computing, hybrid quantum-classical machine learning has shown numerous potential applications at the current stage, with expectations of being achievable in the noisy intermediate-scale quantum (NISQ) era. Quantum reinforcement learning, as an indispensable line of study, has recently demonstrated its ability to solve standard benchmark environments with formally provable theoretical advantages over classical counterparts. However, despite the progress of quantum processors and the emergence of quantum computing clouds, implementing quantum reinforcement learning algorithms utilizing parameterized quantum circuits (PQCs) on NISQ devices remains infrequent. In this work, we take the first step towards executing benchmark quantum reinforcement learning problems on real devices equipped with at most 136 qubits on the BAQIS Quafu quantum computing cloud. The experimental results demonstrate that the policy agents can successfully accomplish objectives under modified conditions in both the training and inference phases. Moreover, we design hardware-efficient PQC architectures in the quantum model using a multi-objective evolutionary algorithm and develop a learning algorithm that is adaptable to quantum devices. We hope that Quafu-RL can be a guiding example of how to realize machine learning tasks by taking advantage of quantum computers on the quantum cloud platform.
Funding: National Natural Science Foundation of China (Nos. 42071444 and 42101444)
Abstract: The cultural relic line graphic serves as a crucial form of traditional artifact documentation; compared with 3D models, it is a simple and intuitive product with a low display cost. Dimensionality reduction is undoubtedly necessary for line drawings. However, most existing methods for artifact drawing rely on the principles of orthographic projection, which cannot avoid angle occlusion and data overlapping when the surface of a cultural relic is complex. Therefore, conformal mapping was introduced as a dimensionality reduction approach to compensate for the limitations of orthographic projection. Based on given criteria for assessing surface complexity, this paper proposes a three-dimensional feature guideline extraction method for complex cultural relic surfaces. A combined 2D and 3D factor measuring the importance of points in describing surface features, the vertex weight, was designed. The selection threshold for feature guideline extraction was then determined from the differences between the vertex weight and shape index distributions. Feasibility and stability were verified through experiments on real cultural relic surface data. The results demonstrate the method's ability to address the challenges associated with the automatic generation of line drawings for complex surfaces. The extraction method and the obtained results will be useful for drawing, displaying, and publicizing line graphics of cultural relics.
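The "vertex weight plus distribution-based threshold" step can be illustrated schematically. The blend below is hypothetical: the paper does not publish its exact formula, so the weighting of a 3D cue (shape index) against a 2D cue (projected salience), the `alpha` mix, and the quantile threshold are all assumptions made for the example.

```python
import numpy as np

def select_feature_vertices(shape_index, proj_salience, alpha=0.5, quantile=0.9):
    """Blend a 3D cue (|shape index|) and a 2D cue (projected salience) into a
    single vertex weight, then keep vertices whose weight clears a threshold
    taken from the weight distribution itself."""
    w = alpha * np.abs(np.asarray(shape_index)) \
        + (1.0 - alpha) * np.asarray(proj_salience)
    thr = np.quantile(w, quantile)        # distribution-based selection threshold
    return np.where(w >= thr)[0]          # indices of candidate feature vertices

# When both cues agree, the weight reduces to the cue itself and the top
# decile of vertices is selected.
cue = np.linspace(0.0, 1.0, 100)
selected = select_feature_vertices(cue, cue)
```

Deriving the threshold from the observed weight distribution, rather than fixing it, mirrors the abstract's point that the cutoff comes from comparing vertex weight and shape index distributions.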
Abstract: Security issues in cloud networks and edge computing have become very common. This research focuses on analyzing such issues and developing suitable solutions. A detailed literature review was conducted in this regard. The findings show that many challenges are linked to edge computing, such as privacy concerns, security breaches, high costs, and low efficiency. Therefore, proper security measures need to be implemented to overcome these issues. Emerging techniques such as machine learning, encryption, artificial intelligence, and real-time monitoring can help mitigate security issues and support a secure and safe future for cloud computing. It is concluded that the security implications of edge computing can be addressed with the help of new technologies and techniques.
Funding: Jointly funded by the Second Tibetan Plateau Scientific Expedition and Research Program of China (Grant 2019QZKK0604) and the National Natural Science Foundation of China (Grant Nos. 92044303 and 42001294)
Abstract: Cloud vertical structure (CVS) strongly affects atmospheric circulation and radiative transfer, yet long-term, ground-based observations are scarce over the Tibetan Plateau (TP) despite its vital role in global climate. This study uses ground-based lidar and Ka-band cloud profiling radar (KaCR) measurements at Yangbajain (YBJ), TP, from October 2021 to September 2022 to characterize cloud properties. A novel, satisfactorily performing anomaly detection algorithm (LevelShiftAD) is proposed for lidar and KaCR profiles to identify cloud boundaries. Cloud base heights (CBH) retrieved from KaCR and lidar observations show good consistency, with a correlation coefficient of 0.78 and a mean difference of -0.06 km. Cloud top heights (CTH) derived from KaCR match the FengYun-4A and Himawari-8 products well. Thus, KaCR measurements serve as the primary dataset for investigating CVSs over the TP. Different diurnal cycles occur in summer and winter: in winter, the diurnal cycle is characterized by a pronounced increase in cloud occurrence frequency in the afternoon and an early-morning decrease, while in summer cloud amounts remain high all day, with scattered nocturnal increases. Summer features more frequent clouds with larger geometrical thicknesses, a higher multi-layer ratio, and greater inter-cloud spacing. Around 26% of cloud bases occur below 0.5 km. Winter exhibits a bimodal distribution of cloud base heights, with peaks at 0-0.5 km and 2-2.5 km. Single-layer and geometrically thin clouds prevail at YBJ. This study enriches long-term measurements of CVSs over the TP, and the robust anomaly detection method helps quantify cloud macrophysical properties via synergistic lidar and radar observations.
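The abstract does not spell out the internals of LevelShiftAD, but the general level-shift idea behind such detectors (a cloud boundary shows up as an abrupt jump in the mean signal level along a profile) can be illustrated minimally. The window size, threshold, and synthetic profile below are all invented for the example:

```python
import numpy as np

def level_shifts(profile, window=5, threshold=3.0):
    """Flag indices where the mean of the next `window` samples differs from the
    mean of the previous `window` samples by more than `threshold` pooled
    standard deviations: a crude level-shift (boundary) detector."""
    x = np.asarray(profile, dtype=float)
    hits = []
    for i in range(window, x.size - window + 1):
        left, right = x[i - window:i], x[i:i + window]
        pooled = np.sqrt((left.var() + right.var()) / 2.0) + 1e-9
        if abs(right.mean() - left.mean()) / pooled > threshold:
            hits.append(i)
    return hits

# Synthetic radar-like profile: clear air near -40 dBZ, one cloud layer near -5 dBZ.
rng = np.random.default_rng(0)
profile = np.full(100, -40.0) + rng.normal(0.0, 0.5, 100)
profile[30:60] += 35.0                    # cloud between range gates 30 and 59
boundaries = level_shifts(profile)
```

The flagged indices cluster around the cloud base and top; in a real retrieval the candidate shifts would then be paired and converted from range gates to heights.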
Funding: Supported by the Second Tibetan Plateau Scientific Expedition and Research Program (STEP) (Grant No. 2019QZKK010203), the National Natural Science Foundation of China (Grant Nos. 42175174 and 41975130), the Natural Science Foundation of Sichuan Province (Grant No. 2022NSFSC1092), and the Sichuan Provincial Innovation Training Program for College Students (Grant No. S202210621009)
Abstract: In a convective scheme featuring a discretized cloud size density, the assumed lateral mixing rate is inversely proportional to the exponential coefficient of plume size. This coefficient is typically assumed to be -1, but that assumption carries inherent uncertainties, especially for deep-layer clouds. Addressing this knowledge gap, we conducted comprehensive large eddy simulations and comparative analyses focused on terrestrial regions. Our investigation reveals that cloud formation adheres to the tenets of Bernoulli trials, exhibiting power-law scaling between cloud size and cloud number that remains consistent regardless of the inherent deep-layer cloud attributes. This scaling paradigm encompasses liquid, ice, and mixed phases in deep-layer clouds. The exponent characterizing the interplay between cloud size and number in deep-layer clouds, whether liquid, ice, or mixed-phase, resembles that of shallow convection but converges closely to zero. This convergence signifies a propensity for diminished cloud numbers and sizes within deep-layer clouds. Notably, the infusion of abundant moisture and the release of latent heat by condensation within the lower atmospheric strata contribute substantially, although their role in ice phase formation is limited. The emergence of liquid and ice phases in deep-layer clouds is facilitated by latent heat and influenced by the wind shear inherent at middle levels. These interrelationships hold potential applications in formulating parameterizations and post-processing model outcomes.
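The size-number power-law exponent discussed above is commonly estimated by a linear fit in log-log space. A minimal sketch with idealized numbers (real estimators applied to LES cloud statistics would need binning and uncertainty handling):

```python
import numpy as np

def powerlaw_exponent(sizes, counts):
    """Estimate b in counts ~ C * sizes**b by least squares in log-log space."""
    slope, _intercept = np.polyfit(np.log(sizes), np.log(counts), 1)
    return slope

# Idealized cloud-size distribution with exponent exactly -1
# (the 'typical assumption' the abstract questions for deep-layer clouds).
sizes = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
counts = 100.0 / sizes
b = powerlaw_exponent(sizes, counts)
```

An exponent near -1 means halving the cloud size doubles the count; the abstract's finding of an exponent converging toward zero instead implies comparatively few small clouds in the deep-layer regime.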
Funding: Supported in part by the National Natural Science Foundation of China (Grant No. 42105127), the Special Research Assistant Project of the Chinese Academy of Sciences, and the National Key Research and Development Plans of China (Grant Nos. 2019YFC1510304 and 2016YFE0201900-02)
Abstract: The cloud type product 2B-CLDCLASS-LIDAR, based on CloudSat and CALIPSO data from June 2006 to May 2017, is used to examine the temporal and spatial distribution characteristics and interannual variability of eight cloud types (high cloud, altostratus, altocumulus, stratus, stratocumulus, cumulus, nimbostratus, and deep convection) and three phases (ice, mixed, and water) in the Arctic. Possible reasons for the observed interannual variability are also discussed. The main conclusions are as follows: (1) More water clouds occur on the Atlantic side, and more ice clouds occur over continents. (2) The average spatial and seasonal distributions of cloud types show three patterns: high clouds and most cumuliform clouds are concentrated at low-latitude locations and peak in summer; altostratus and nimbostratus are concentrated over and around continents and are less abundant in summer; stratocumulus and stratus are concentrated near the inner Arctic and peak during spring and autumn. (3) Regionally averaged interannual frequencies of ice clouds and altostratus clouds decrease significantly, while those of water clouds, altocumulus, and cumulus clouds increase significantly. (4) Significant features of the linear trends of cloud frequencies are mainly located over ocean areas. (5) The monthly water cloud frequency anomalies are positively correlated with air temperature in most of the troposphere, while those of ice clouds are negatively correlated. (6) The decrease in altostratus clouds is associated with the weakening of the Arctic front due to Arctic warming, while increased water vapor transport into the Arctic and higher atmospheric instability lead to more cumulus and altocumulus clouds.
Abstract: Without proper security mechanisms, medical records stored electronically can be accessed more easily than physical files. Patient health information is scattered throughout the hospital environment, including laboratories, pharmacies, and daily medical status reports. The electronic format of medical records ensures that all information is available in a single place; however, it is difficult to store and manage large amounts of data. Dedicated servers and a data center are needed to store and manage patient data, but self-managed data centers are expensive for hospitals. Storing data in a cloud is a cheaper alternative whose advantage is that the data can be retrieved anywhere, anytime, from any device connected to the Internet. Doctors can therefore easily access a patient's medical history and diagnose diseases according to the context, which also helps them prescribe the correct medicine in an appropriate way. The systematic storage of medical records could help reduce medical errors in hospitals. The challenge is to store medical records on a third-party cloud server while addressing privacy and security concerns, since such servers are often semi-trusted; thus, sensitive medical information must be protected. Open access to records, or modification of the information in those records, may even cause patient fatalities. Patient-centric health record security is therefore a major concern, and end-to-end file encryption before outsourcing data to a third-party cloud server ensures security. This paper presents a method combining the Advanced Encryption Standard and the elliptic curve Diffie-Hellman method, designed to increase the efficiency of medical record security for users. Comparisons of existing and proposed techniques are presented at the end of the article, with a focus on analyzing the security approaches of the elliptic curve and secret-sharing methods. This study aims to provide a high level of security for patient health records.
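The AES-plus-elliptic-curve-Diffie-Hellman combination described above can be sketched with the third-party Python `cryptography` package. This is a generic hybrid-encryption illustration, not the paper's exact construction: the curve (SECP256R1), the HKDF info string, AES-GCM as the AES mode, and the hospital/doctor role names are all choices made for the example.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def derive_shared_key(my_private, peer_public):
    """ECDH key agreement followed by HKDF to obtain a 256-bit AES key."""
    shared_secret = my_private.exchange(ec.ECDH(), peer_public)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"medical-record-demo").derive(shared_secret)

def encrypt_record(key, plaintext):
    """AES-256-GCM encryption; returns (nonce, ciphertext-with-auth-tag)."""
    nonce = os.urandom(12)                     # fresh nonce per record
    return nonce, AESGCM(key).encrypt(nonce, plaintext, None)

def decrypt_record(key, nonce, ciphertext):
    """Decrypt and verify the authentication tag in one step."""
    return AESGCM(key).decrypt(nonce, ciphertext, None)

# Hospital and doctor each hold an EC key pair; both derive the same AES key,
# so records encrypted before upload are readable only by the two parties.
hospital_priv = ec.generate_private_key(ec.SECP256R1())
doctor_priv = ec.generate_private_key(ec.SECP256R1())
key_h = derive_shared_key(hospital_priv, doctor_priv.public_key())
key_d = derive_shared_key(doctor_priv, hospital_priv.public_key())
nonce, ct = encrypt_record(key_h, b"patient 42: BP 120/80")
```

The semi-trusted cloud stores only `nonce` and `ct`; without either private key it cannot derive the AES key, and GCM's authentication tag detects any modification of the stored record.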