Funding: Supported in part by the National Natural Science Foundation of China (61876011), the National Key Research and Development Program of China (2022YFB4703700), the Key Research and Development Program 2020 of Guangzhou (202007050002), and the Key-Area Research and Development Program of Guangdong Province (2020B090921003).
Abstract: Recently, there have been several attempts to apply Transformers to 3D point cloud classification. To reduce computation, most existing methods focus on local spatial attention, but they ignore point content and fail to establish relationships between distant but relevant points. To overcome this limitation of local spatial attention, we propose a point content-based Transformer architecture, called PointConT for short. It exploits the locality of points in the feature space (content-based): sampled points with similar features are clustered into the same class, and self-attention is computed within each class, enabling an effective trade-off between capturing long-range dependencies and computational complexity. We further introduce an inception feature aggregator for point cloud classification, which uses parallel structures to aggregate high-frequency and low-frequency information in separate branches. Extensive experiments show that PointConT achieves remarkable performance on point cloud shape classification; in particular, it reaches 90.3% Top-1 accuracy on the hardest setting of ScanObjectNN. Source code is available at https://github.com/yahuiliu99/PointConT.
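As a concrete illustration of this content-based attention, here is a minimal sketch assuming k-means clustering in feature space followed by plain scaled dot-product attention within each cluster; the cluster count, shapes, and function names are illustrative assumptions, not the authors' implementation:

```python
import numpy as np
from sklearn.cluster import KMeans

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def content_based_attention(feats, n_clusters=4):
    """Cluster points by feature similarity, then self-attend within each
    cluster only: distant but similar points still interact, yet no
    attention matrix ever spans all n points."""
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(feats)
    out = np.empty_like(feats)
    d = feats.shape[1]
    for c in range(n_clusters):
        idx = np.flatnonzero(labels == c)
        x = feats[idx]                          # (m, d), m << n per cluster
        attn = softmax(x @ x.T / np.sqrt(d))    # scaled dot-product scores
        out[idx] = attn @ x                     # cost O(m^2) instead of O(n^2)
    return out

feats = np.random.rand(1024, 64).astype(np.float32)
print(content_based_attention(feats).shape)     # (1024, 64)
```

Grouping by content rather than by spatial neighborhood is what lets attention link distant but relevant points while keeping the cost quadratic only in cluster size.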
Funding: This research was funded by the National Natural Science Foundation of China (Grant Number 62162039) and the Shaanxi Provincial Key R&D Program, China (Grant Number 2020GY-041).
Abstract: Access control is an effective method to protect user data privacy. Access control schemes based on blockchain and ciphertext-policy attribute-based encryption (CP-ABE) can solve the problems of single point of failure and lack of trust in centralized systems. However, they also bring new problems for health information in cloud storage environments, such as attribute leakage, low consensus efficiency, and complex permission updates. This paper proposes an access control scheme with fine-grained attribute revocation, keyword search, and traceability of the attribute private key distribution process. Blockchain technology tracks the authorization of attribute private keys, and a credit scoring method improves the consensus efficiency of the Raft protocol. In addition, the InterPlanetary File System (IPFS) addresses the capacity deficit of the blockchain. Under the premise of policy hiding, the scheme provides fine-grained access control based on users, user attributes, and file structure, optimizing the data-sharing mode. At the same time, proxy re-encryption (PRE) is used to update access rights. The proposed scheme is proven secure. Comparative analysis and experimental results show that it offers higher efficiency and more functionality, and can meet the needs of medical institutions.
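The attribute-policy gating at the heart of such a scheme can be illustrated with a plaintext sketch: the function below checks whether a user's attribute set satisfies an AND/OR policy tree. Real CP-ABE embeds the policy in the ciphertext and enforces it cryptographically, so this check, with its hypothetical attribute names, is only a conceptual stand-in:

```python
def satisfies(policy, attrs):
    """policy: either an attribute string (leaf) or a tuple
    ('AND'|'OR', [subpolicies]); attrs: the user's attribute set."""
    if isinstance(policy, str):
        return policy in attrs
    op, children = policy
    checks = (satisfies(child, attrs) for child in children)
    return all(checks) if op == "AND" else any(checks)

# Hypothetical policy: doctor AND (cardiology OR oncology department)
policy = ("AND", ["role:doctor", ("OR", ["dept:cardiology", "dept:oncology"])])
print(satisfies(policy, {"role:doctor", "dept:cardiology"}))  # True
print(satisfies(policy, {"role:nurse", "dept:oncology"}))     # False
```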
Abstract: Cloud computing provides a diverse and adaptable resource pool over the internet, allowing users to tap into various resources as needed, and has been seen as a robust solution to related challenges. Significant delay can hamper the performance of IoT-enabled cloud platforms, but efficient task scheduling can lower the cloud infrastructure's energy consumption, maximizing the service provider's revenue by decreasing user job processing times. The proposed Modified Chimp-Whale Optimization Algorithm (MCWOA) combines elements of the Chimp Optimization Algorithm (COA) and the Whale Optimization Algorithm (WOA). To enhance MCWOA's identification precision, the Sobol sequence is used in the population initialization phase, ensuring an even distribution of the population across the solution space. Moreover, local search capabilities are augmented by incorporating WOA's bubble-net hunting and random search mechanisms into the position-updating process. The study demonstrates the effectiveness of the approach using a two-story rigid frame and a simply supported beam model. Simulated outcomes reveal that the new method outperforms the original algorithm, especially in multi-damage detection scenarios, excelling at avoiding false positives and enhancing computational speed. The efficiency of MCWOA is also assessed against metrics such as energy usage, computational expense, task duration, and delay, and the simulated data indicate that it outpaces the other methods across all metrics, including the Whale Optimization Algorithm (WOA), Chimp Algorithm (CA), Ant Lion Optimizer (ALO), Genetic Algorithm (GA), and Grey Wolf Optimizer (GWO).
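The Sobol initialization step translates directly to SciPy's quasi-Monte Carlo module; the population size, dimensionality, and bounds below are arbitrary placeholders:

```python
import numpy as np
from scipy.stats import qmc

def init_population(n_agents, dim, lower, upper, seed=0):
    """Draw a low-discrepancy Sobol sample so the initial population is
    spread evenly across the search space, as the MCWOA setup describes."""
    sampler = qmc.Sobol(d=dim, scramble=True, seed=seed)
    sample = sampler.random(n_agents)        # points in [0, 1)^dim
    return qmc.scale(sample, lower, upper)   # rescale into the box bounds

# 32 agents (a power of two keeps the Sobol balance properties) in [0, 10]^5
pop = init_population(32, 5, lower=np.zeros(5), upper=10 * np.ones(5))
print(pop.shape)  # (32, 5)
```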
Funding: National Natural Science Foundation of China (Nos. 42071444, 42101444).
Abstract: Line graphics serve as a crucial form of traditional artifact documentation: compared with 3D models, they are simple, intuitive, and inexpensive to display, so dimensionality reduction is necessary for line drawings. However, most existing methods for artifact drawing rely on the principles of orthographic projection, which cannot avoid angle occlusion and data overlapping when the surface of a cultural relic is complex. Conformal mapping was therefore introduced as a dimensionality reduction approach to compensate for the limitations of orthographic projection. Based on given criteria for assessing surface complexity, this paper proposes a three-dimensional feature guideline extraction method for complex cultural relic surfaces. A combined 2D and 3D factor, the vertex weight, was designed to measure the importance of points in describing surface features. The selection threshold for feature guideline extraction was then determined from the differences between the vertex weight and shape index distributions. Feasibility and stability were verified through experiments on real cultural relic surface data. The results demonstrate that the method addresses the challenges of automatically generating line drawings for complex surfaces, and the extraction method and its results will be useful for the drawing, display, and promotion of cultural relic line graphics.
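The abstract does not give the exact form of the vertex weight, so the sketch below assumes a simple convex combination of a normalized 3D term (e.g., curvature) and a normalized 2D term (e.g., saliency of the conformally mapped point), with a percentile cut standing in for the distribution-based threshold selection:

```python
import numpy as np

def vertex_weight(curvature_3d, saliency_2d, alpha=0.5):
    """Assumed combined 2D/3D importance factor: a convex combination of
    min-max normalized 3D and 2D terms (illustrative form only)."""
    c = (curvature_3d - curvature_3d.min()) / np.ptp(curvature_3d)
    s = (saliency_2d - saliency_2d.min()) / np.ptp(saliency_2d)
    return alpha * c + (1.0 - alpha) * s

w = vertex_weight(np.random.rand(1000), np.random.rand(1000))
threshold = np.percentile(w, 90)                  # placeholder threshold
feature_vertices = np.flatnonzero(w > threshold)  # guideline candidates
```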
Abstract: Security issues in cloud networks and edge computing have become very common. This research focuses on analyzing such issues and developing the best solutions, and a detailed literature review was conducted in this regard. The findings show that many challenges are linked to edge computing, such as privacy concerns, security breaches, high costs, and low efficiency, so proper security measures must be implemented to overcome them. Emerging trends such as machine learning, encryption, artificial intelligence, and real-time monitoring can help mitigate security issues and support a secure and safe future for cloud computing. It was concluded that the security implications of edge computing can be addressed with the help of new technologies and techniques.
Funding: Supported by the Second Tibetan Plateau Scientific Expedition and Research Program (STEP) (Grant No. 2019QZKK010203), the National Natural Science Foundation of China (Grant Nos. 42175174 and 41975130), the Natural Science Foundation of Sichuan Province (Grant No. 2022NSFSC1092), and the Sichuan Provincial Innovation Training Program for College Students (Grant No. S202210621009).
Abstract: In a convective scheme featuring a discretized cloud size density, the assumed lateral mixing rate is inversely proportional to the exponential coefficient of plume size. This coefficient typically follows an assumed value of −1, but that assumption carries inherent uncertainties, especially for deep-layer clouds. Addressing this knowledge gap, we conducted comprehensive large eddy simulations and comparative analyses focused on terrestrial regions. Our investigation revealed that cloud formation adheres to the tenets of Bernoulli trials, exhibiting power-law scaling between cloud size and cloud number that remains consistent regardless of the inherent deep-layer cloud attributes. This scaling encompasses liquid, ice, and mixed phases in deep-layer clouds. The exponent characterizing the interplay between cloud scale and number in deep-layer clouds, whether liquid, ice, or mixed-phase, resembles that of shallow convection but converges closely to zero, signifying a propensity for diminished cloud numbers and sizes within deep-layer clouds. Notably, the infusion of abundant moisture and the release of latent heat by condensation within the lower atmospheric strata make substantial contributions, although their role in ice-phase formation is limited. The emergence of liquid and ice phases in deep-layer clouds is facilitated by latent heat and influenced by wind shear at middle levels. These interrelationships hold potential applications in formulating parameterizations and post-processing model outcomes.
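The size-number power law, N(s) ∝ s^b, can be estimated from a sample of cloud sizes with a straight-line fit in log-log space; the binning scheme and the synthetic sample below are illustrative assumptions, not the paper's procedure:

```python
import numpy as np

def fit_size_number_exponent(cloud_sizes, n_bins=20):
    """Fit log10(count) against log10(size) over logarithmic size bins and
    return the slope b of the size-number relation N(s) ~ s**b."""
    bins = np.logspace(np.log10(cloud_sizes.min()),
                       np.log10(cloud_sizes.max()), n_bins)
    counts, edges = np.histogram(cloud_sizes, bins=bins)
    centers = np.sqrt(edges[:-1] * edges[1:])   # geometric bin centers
    mask = counts > 0                           # skip empty bins
    b, _ = np.polyfit(np.log10(centers[mask]), np.log10(counts[mask]), 1)
    return b

sizes = np.random.pareto(1.5, 5000) + 1.0       # synthetic heavy-tailed sample
print(fit_size_number_exponent(sizes))          # fitted exponent b
```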
Funding: The Deputyship for Research & Innovation, Ministry of Education in Saudi Arabia, funded this research work through Project Number IFP-2022-34.
Abstract: In the cloud environment, a high level of data security is in high demand. Data planning and storage optimization are part of the overall security process: they support data security by avoiding the risks of data loss and data overlap. The development of data flow scheduling approaches that take security parameters into account is insufficient in the cloud environment. In this work, we propose a data scheduling model for the cloud environment, made up of three parts that together dispatch user data flows to the appropriate cloud VMs. The first component is the collector agent, which periodically collects information on the state of the network links. The second is the monitoring agent, which analyzes and classifies that information, makes a decision on the state of each link, and transmits the result to the scheduler. The third is the scheduler, which uses this information to transfer user data with fair distribution over reliable paths. Each part of the proposed model requires its own algorithms; in this article, we develop the data transfer algorithms, including fairness distribution under stable link states. These algorithms are based on grouping the transmitted files and on an iterative method, and they yield approximate solutions to the studied problem, which is NP-hard. The experimental results show that the best algorithm is the half-grouped minimum excluding (HME), with a percentage of 91.3%, an average deviation of 0.042, and an execution time of 0.001 s.
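The grouping-based algorithms (including HME) are not fully specified in the abstract, so the sketch below is a generic stand-in for the idea: sort files by size and iteratively assign each to the currently least-loaded link, approximating the fair-distribution goal; the function and variable names are hypothetical:

```python
def dispatch(file_sizes, n_links):
    """Greedy fair dispatch: largest files first, each to the link with
    the smallest accumulated transfer load."""
    loads = [0.0] * n_links
    plan = {link: [] for link in range(n_links)}
    for size in sorted(file_sizes, reverse=True):
        link = min(range(n_links), key=loads.__getitem__)
        loads[link] += size
        plan[link].append(size)
    return plan, loads

plan, loads = dispatch([70, 10, 25, 40, 55, 15, 30], n_links=3)
print(loads)  # per-link transfer volumes, roughly balanced
```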
Abstract: This article explores the evolution of cloud computing, its advantages over traditional on-premises infrastructure, and its impact on information security. The study presents a comprehensive literature review covering various cloud infrastructure offerings and security models. It also analyzes in depth real-life case studies of successful cloud migrations and highlights common information security threats in current cloud computing. The article concludes by offering recommendations that help businesses protect themselves from cloud data breaches and by providing insights into selecting a suitable cloud services provider from an information security perspective.
Funding: Funded by the National Natural Science Foundation of China (Grant Nos. 42305150 and 42325501) and the China Postdoctoral Science Foundation (Grant No. 2023M741774).
Abstract: Cloud base height (CBH) is a crucial parameter for cloud radiative effect estimates, climate change simulations, and aviation guidance. However, because passive satellite radiometer observations contain limited information on cloud vertical structure, few operational satellite CBH products are currently available. This study presents a new method for retrieving CBH from satellite radiometers. The method first uses combined measurements from satellite radiometers and ground-based cloud radars to develop a lookup table (LUT) of effective cloud water content (ECWC), representing the vertically varying cloud water content. This LUT allows the cloud water path to be converted to cloud geometric thickness (CGT), so that CBH can be estimated as the difference between cloud top height and CGT. A detailed comparative analysis of CBH estimates from the state-of-the-art ECWC LUT against four ground-based millimeter-wave cloud radar (MMCR) measurements shows a mean bias (correlation coefficient) of 0.18±1.79 km (0.73), which is lower (higher) than the 0.23±2.11 km (0.67) derived from the combined measurements of satellite radiometers and satellite radar-lidar (i.e., CloudSat and CALIPSO). Furthermore, the percentage of CBH biases within 250 m increases by 5% to 10%, varying by location, indicating that the CBH estimates from our algorithm are more consistent with ground-based MMCR measurements. The algorithm therefore shows great potential for further improving CBH retrievals as ground-based MMCRs are increasingly included in global surface meteorological observing networks, and the improved CBH retrievals will contribute to better cloud radiative effect estimates.
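The geometric relation behind the retrieval is simple arithmetic once the LUT has supplied an effective cloud water content; in the sketch below, a single constant ECWC value stands in for the paper's vertically varying lookup table:

```python
def cloud_base_height(cth_km, cwp_g_m2, ecwc_g_m3):
    """CBH = cloud top height - cloud geometric thickness (CGT), with
    CGT recovered from the cloud water path: (g m^-2) / (g m^-3) = m."""
    cgt_km = cwp_g_m2 / ecwc_g_m3 / 1000.0
    return cth_km - cgt_km

# e.g. a 6 km cloud top with CWP = 600 g m^-2 and ECWC = 0.3 g m^-3
print(cloud_base_height(6.0, 600.0, 0.3))  # CGT = 2 km, so CBH = 4 km
```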
Abstract: In Cloud Computing (CC), the Cloud Datacenter (DC) is a conglomerate of physical servers whose performance can be hindered by bottlenecks as CC services proliferate. The Cloud Service Broker (CSB), a linchpin of CC performance, orchestrates DC selection; if it fails to route user requests to suitable DCs, the CSB itself becomes a bottleneck and endangers service quality. Deploying an efficient CSB policy is therefore imperative, optimizing DC selection to meet stringent Quality-of-Service (QoS) demands. Although numerous CSB policies exist, their implementation grapples with challenges such as cost and availability. This article undertakes a holistic review of diverse CSB policies and surveys the predicaments confronting current policies, with the foremost objective of pinpointing research gaps and remedies to invigorate future policy development. It also clarifies the various DC selection methodologies employed in CC, enriching practitioners and researchers alike, and uses synthetic analysis to systematically assess and compare them. These analytical insights equip decision-makers with a pragmatic framework for discerning the technique apt for their needs. In sum, the article underscores the paramount importance of adept CSB policies in DC selection and their role in optimizing CC performance; by emphasizing these policies and their modeling implications, it contributes to both the general modeling discourse and its practical applications in the CC domain.
Abstract: Cloud computing is the new norm for business entities as they try to keep up with technological advancements and user needs. The concept is defined as a computing environment that allows remote outsourcing of storage and computing resources. A hybrid cloud environment is an excellent example: it provides organizations with increased scalability, greater control over their data, and support for a remote workforce. However, hybrid cloud systems are expensive, since organizations must operate multiple infrastructures, and they introduce complexity to the organization's activities. Data security is among the most vital concerns arising from the use of cloud computing, affecting the rate of user adoption and acceptance. Borrowing from the hybrid cloud computing model, this article recommends combining traditional and modern data security systems. Traditional data security systems have proven effective in their respective roles, with their main challenge lying in recognizing context and connectivity. Integrating traditional and modern designs is therefore recommended to enhance effectiveness, context awareness, connectivity, and efficiency.
Funding: Supported in part by the National Natural Science Foundation of China (Grant No. 42105127), the Special Research Assistant Project of the Chinese Academy of Sciences, and the National Key Research and Development Plans of China (Grant Nos. 2019YFC1510304 and 2016YFE0201900-02).
Abstract: The cloud type product 2B-CLDCLASS-LIDAR, based on CloudSat and CALIPSO from June 2006 to May 2017, is used to examine the temporal and spatial distribution characteristics and interannual variability of eight cloud types (high cloud, altostratus, altocumulus, stratus, stratocumulus, cumulus, nimbostratus, and deep convection) and three phases (ice, mixed, and water) in the Arctic. Possible reasons for the observed interannual variability are also discussed. The main conclusions are as follows: (1) More water clouds occur on the Atlantic side, and more ice clouds occur over continents. (2) The average spatial and seasonal distributions of cloud types show three patterns: high clouds and most cumuliform clouds are concentrated in low-latitude locations and peak in summer; altostratus and nimbostratus are concentrated over and around continents and are less abundant in summer; stratocumulus and stratus are concentrated near the inner Arctic and peak during spring and autumn. (3) Regionally averaged interannual frequencies of ice clouds and altostratus clouds decrease significantly, while those of water clouds, altocumulus, and cumulus clouds increase significantly. (4) Significant features of the linear trends of cloud frequencies are mainly located over ocean areas. (5) The monthly water cloud frequency anomalies are positively correlated with air temperature in most of the troposphere, while those of ice clouds are negatively correlated. (6) The decrease in altostratus clouds is associated with the weakening of the Arctic front due to Arctic warming, while increased water vapor transport into the Arctic and higher atmospheric instability lead to more cumulus and altocumulus clouds.
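Conclusion (5) rests on correlating deseasonalized monthly series; a minimal sketch of that computation, using hypothetical stand-in arrays for the cloud frequency and air temperature records (length assumed to be a whole number of years), is:

```python
import numpy as np
from scipy.stats import pearsonr

def monthly_anomalies(series):
    """Subtract each calendar month's long-term mean to remove the annual
    cycle (series assumed to start in January and span whole years)."""
    x = np.asarray(series, dtype=float)
    clim = np.array([x[m::12].mean() for m in range(12)])
    return x - np.tile(clim, len(x) // 12)

rng = np.random.default_rng(0)
cloud_freq = rng.standard_normal(132)   # 11 years of monthly cloud frequency
air_temp = rng.standard_normal(132)     # matching air temperature record
r, p = pearsonr(monthly_anomalies(cloud_freq), monthly_anomalies(air_temp))
print(f"r = {r:.2f}, p = {p:.3f}")
```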
Abstract: This study investigates how cybersecurity can be enhanced through cloud computing solutions in the United States, motivated by the rampant loss of data, breaches, and unauthorized access by internet criminals there. The study adopted a survey research design, collecting data from 890 cloud professionals with relevant knowledge of cybersecurity and cloud computing. A machine learning approach was adopted, specifically a random forest classifier, an ensemble of decision tree models. Ten important features were selected from the data using random forest feature importance, which helps to achieve the objective of the study: enabling organizations to develop suitable techniques to prevent cybercrime using random forest predictions as they relate to cloud services in the United States. The effectiveness of the models is evaluated using validation metrics that include recall, accuracy, and precision, in addition to F1 scores and confusion matrices. Based on evaluation scores (accuracy, precision, recall, and F1) of 81.9%, 82.6%, and 82.1%, the results demonstrated the effectiveness of the random forest model and the importance of machine learning algorithms in preventing cybercrime and boosting security in the cloud environment. The study recommends that other machine learning models be adopted to see how cybersecurity through cloud computing can be further improved.
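The described pipeline (random forest, feature-importance-based selection of ten features, then accuracy/precision/recall/F1 evaluation) maps onto scikit-learn as sketched below; the synthetic dataset stands in for the 890-respondent survey, and all hyperparameters are assumptions:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the study's 890-respondent survey data.
X, y = make_classification(n_samples=890, n_features=25, n_informative=10,
                           random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)

rf = RandomForestClassifier(n_estimators=200, random_state=42).fit(X_tr, y_tr)

# Keep the ten most important features, mirroring the selection step.
top10 = np.argsort(rf.feature_importances_)[::-1][:10]
rf10 = RandomForestClassifier(n_estimators=200, random_state=42)
rf10.fit(X_tr[:, top10], y_tr)

# Accuracy, precision, recall and F1, as in the paper's evaluation.
print(classification_report(y_te, rf10.predict(X_te[:, top10])))
```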
Abstract: This paper was motivated by existing problems of cloud data storage at Imo State University, Nigeria: outsourced data was being lost and customer information misused by unauthorized users or hackers, leaving customer/client data visible and unprotected, and exposing clients to enormous risk from defective equipment, bugs, faulty servers, and malicious actions. The aim of this paper is therefore to analyze a secure model that uses the Unicode Transformation Format (UTF) and Base64 algorithms to store data securely in the cloud. The Object-Oriented Hypermedia Analysis and Design Methodology (OOHADM) was adopted. Python was used to develop the security model; role-based access control (RBAC) and multi-factor authentication (MFA) algorithms were integrated to enhance security in an information system developed with HTML5, JavaScript, Cascading Style Sheets (CSS) version 3, and PHP 7. The paper also discusses related concepts, including the development of cloud computing, its characteristics, cloud deployment models, and cloud service models. The results showed that the proposed enhanced security model for corporate-platform information systems handles multiple authorization and authentication threats: a single login page directs all login requests from the different modules to one Single Sign-On Server (SSOS), which redirects authenticated users to their requested resources/modules, leveraging geo-location integration for physical location validation. The newly developed system solves the shortcomings of the existing systems and reduces the time and resources incurred in using them.
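The UTF-8/Base64 storage step is standard-library Python; the record content below is hypothetical. Note that Base64 is a reversible encoding, not encryption, so in the described model the RBAC, MFA, and single sign-on layers (not shown) carry the actual access control:

```python
import base64

def encode_for_storage(plaintext: str) -> str:
    """UTF-8 encode, then Base64 encode, per the described storage model."""
    return base64.b64encode(plaintext.encode("utf-8")).decode("ascii")

def decode_from_storage(stored: str) -> str:
    return base64.b64decode(stored).decode("utf-8")

record = "patient-id:4521;dept:records"   # hypothetical client record
blob = encode_for_storage(record)
print(blob)                               # opaque Base64 text in cloud storage
assert decode_from_storage(blob) == record
```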