The Advanced Metering Infrastructure (AMI), as a crucial subsystem in the smart grid, is responsible for measuring user electricity consumption and plays a vital role in communication between providers and consumers. However, with the advancement of information and communication technology, new security and privacy challenges have emerged for AMI. To address these challenges and enhance the security and privacy of user data in the smart grid, a Hierarchical Privacy Protection Model in Advanced Metering Infrastructure based on Cloud and Fog Assistance (HPPM-AMICFA) is proposed in this paper. The proposed model integrates cloud and fog computing with hierarchical threshold encryption, offering a flexible and efficient privacy protection solution that significantly enhances data security in the smart grid. The methodology involves setting user protection levels by processing missing data and utilizing fuzzy comprehensive analysis to evaluate user importance, thereby assigning appropriate protection levels. Furthermore, a hierarchical threshold encryption algorithm is developed to provide differentiated protection strategies for fog nodes based on user IDs, ensuring secure aggregation and encryption of user data. Experimental results demonstrate that HPPM-AMICFA effectively resists various attack strategies while minimizing time costs, thereby safeguarding user data in the smart grid.
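The fuzzy comprehensive analysis step described above can be illustrated with a minimal sketch: a weighted criteria vector is combined with a membership matrix, and the protection level with the largest composite membership is assigned. The criteria, weights, and membership values below are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

# Illustrative criteria weights (assumed): consumption volume,
# data sensitivity, account completeness.
weights = np.array([0.5, 0.3, 0.2])

def evaluate_user(R):
    """Fuzzy comprehensive evaluation: B = W . R, then pick the level
    with the highest composite membership."""
    B = weights @ R                      # composite membership vector
    levels = ["low", "medium", "high"]
    return levels[int(np.argmax(B))], B

# Membership matrix R: rows = criteria, columns = protection levels
# (low, medium, high) for one example user.
R = np.array([
    [0.1, 0.3, 0.6],   # heavy consumption suggests high protection
    [0.2, 0.5, 0.3],   # moderately sensitive data suggests medium protection
    [0.3, 0.4, 0.3],   # record completeness is roughly neutral
])

level, memberships = evaluate_user(R)
print(level, memberships)   # -> 'high', [0.17, 0.38, 0.45]
```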
Hybrid Power-line/Visible-light Communication (HPVC) network has been one of the most promising Cooperative Communication (CC) technologies for constructing the Smart Home due to its superior communication reliability and hardware efficiency. Current research on HPVC networks focuses on the performance analysis and optimization of the Physical (PHY) layer, where the Power Line Communication (PLC) component only serves as the backbone to provide power to Light Emitting Diode (LED) devices. Designing a Media Access Control (MAC) protocol that allows both the PLC and Visible Light Communication (VLC) components to carry data transmission, i.e., to achieve true CC in an HPVC network, therefore remains a great challenge. To solve this problem, we propose a new HPVC network MAC protocol (HPVC MAC) based on Carrier Sense Multiple Access/Collision Avoidance (CSMA/CA) by combining the IEEE 802.15.7 and IEEE 1901 standards. Firstly, we add an Additional Assistance (AA) layer to provide channel selection strategies for sensor stations, so that they can complete data transmission on the selected channel via the specified CSMA/CA mechanism, respectively. Based on this, we give a detailed working principle of the HPVC MAC, followed by the construction of a joint analytical model for mathematical validation of the HPVC MAC. In the modeling process, the impacts of PHY layer settings (including channel fading types and additive noise features), the CSMA/CA mechanisms of 802.15.7 and 1901, and practical configurations (such as traffic rate and transit buffer size) are comprehensively taken into consideration. Moreover, we prove that the proposed analytical model is solvable. Finally, through extensive simulations, we characterize the HPVC MAC performance under different system parameters and verify the correctness of the corresponding analytical model with an average error rate of 4.62% between the simulation and analytical results.
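To make the channel-selection-plus-backoff idea concrete, here is a minimal sketch of the kind of decision an AA layer might make followed by a binary-exponential-backoff CSMA/CA attempt. The selection rule, contention-window values, and idle-probability are illustrative assumptions and are not taken from IEEE 802.15.7, IEEE 1901, or the paper's protocol.

```python
import random

def select_channel(plc_busy_ratio, vlc_busy_ratio):
    """Pick the channel (PLC or VLC) observed idle more often."""
    return "PLC" if plc_busy_ratio < vlc_busy_ratio else "VLC"

def csma_ca_attempt(channel_idle, max_backoff_stage=4, cw_min=8):
    """Binary exponential backoff: return the slot at which transmission
    starts, or None if every backoff stage finds the channel busy."""
    cw = cw_min
    elapsed = 0
    for _ in range(max_backoff_stage + 1):
        elapsed += random.randint(0, cw - 1)   # random backoff in [0, CW-1]
        if channel_idle():                     # clear channel assessment
            return elapsed
        cw = min(cw * 2, 256)                  # busy: double the contention window
    return None

channel = select_channel(plc_busy_ratio=0.6, vlc_busy_ratio=0.3)
slot = csma_ca_attempt(lambda: random.random() < 0.7)   # channel idle 70% of the time
print(channel, slot)
```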
The use of Amazon Web Services is growing rapidly as more users adopt the technology. It has various functionalities that can be used by large corporates and individuals as well. Sentiment analysis is used to build an intelligent system that can study the opinions of people and help to classify the related emotions. In this research work, sentiment analysis is performed on the AWS Elastic Compute Cloud (EC2) through Twitter data. The data is delivered to the EC2 instances using elastic load balancing. The collected data is subjected to preprocessing approaches to clean the data, and then machine learning-based logistic regression is employed to categorize the sentiments into positive and negative sentiments. The proposed machine learning model achieves a high accuracy of 94.17%, which is higher than that of models developed using existing algorithms.
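A minimal sketch of the preprocessing-plus-logistic-regression step is shown below, assuming the tweets have already been collected into labeled (text, label) pairs; the cleaning rules, vectorizer settings, and toy data are illustrative and do not reproduce the paper's pipeline or its 94.17% result.

```python
import re
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def clean(text):
    # Strip URLs, mentions, and hashtag symbols, then keep letters only.
    text = re.sub(r"http\S+|@\w+|#", "", text.lower())
    return re.sub(r"[^a-z\s]", " ", text)

# Toy data standing in for the collected tweets (1 = positive, 0 = negative).
texts = ["Great service, love AWS EC2!", "EC2 instance keeps crashing, terrible"] * 50
labels = [1, 0] * 50

X_train, X_test, y_train, y_test = train_test_split(
    [clean(t) for t in texts], labels, test_size=0.2, random_state=42)

vec = TfidfVectorizer(ngram_range=(1, 2), min_df=1)
clf = LogisticRegression(max_iter=1000)
clf.fit(vec.fit_transform(X_train), y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(vec.transform(X_test))))
```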
Assimilation configurations have significant impacts on analysis results and subsequent forecasts. A squall line system that occurred on 23 April 2007 over southern China was used to investigate the impacts of the data assimilation frequency of radar data on analyses and forecasts. A three-dimensional variational system was used to assimilate radial velocity data, and a cloud analysis system was used for reflectivity assimilation with a 2-h assimilation window covering the initial stage of the squall line. Two operators of radar reflectivity for cloud analyses, corresponding to single- and double-moment schemes, were used. In this study, we examined the sensitivity to assimilation frequency using 10-, 20-, 30-, and 60-min assimilation intervals. The results showed that analysis fields were not consistent with model dynamics and microphysics in general; thus, model states, including dynamic and microphysical variables, required approximately 20 min to reach a new balance after data assimilation in all experiments. Moreover, a 20-min data assimilation interval generally produced better forecasts for both single- and double-moment schemes in terms of equitable threat and bias scores. We conclude that a higher data assimilation frequency can produce a more intense cold pool and rear inflow jets but does not necessarily lead to a better forecast.
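The verification metrics mentioned above, the equitable threat score (ETS) and the bias score, follow standard contingency-table definitions; the sketch below shows that computation with illustrative counts (the counts are not from this study).

```python
def ets_and_bias(hits, misses, false_alarms, correct_negatives):
    """Equitable threat score and bias score from a 2x2 contingency table."""
    total = hits + misses + false_alarms + correct_negatives
    hits_random = (hits + misses) * (hits + false_alarms) / total   # chance hits
    ets = (hits - hits_random) / (hits + misses + false_alarms - hits_random)
    bias = (hits + false_alarms) / (hits + misses)                  # >1 over-forecast
    return ets, bias

# Illustrative counts for one precipitation threshold.
print(ets_and_bias(hits=120, misses=40, false_alarms=30, correct_negatives=810))
```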
Oil spill-induced vapor cloud explosions in a confined space can cause catastrophic consequences. In this work, an investigation was conducted on the catastrophic pipeline leak, oil spill, and resulting vapor cloud explosion accident that occurred in China in 2013 by modeling analysis, field surveys, and numerical simulations. The total amount of the spilled oil was up to 2044.4 m³ due to improper disposal. The long residence time of the oil remaining in a confined space permitted the formation of explosive mixtures and caused the vapor cloud explosion. A numerical model was developed to estimate the consequence of the explosion based on volatilization testing results. The results show that the death-leading zone and the glass-breaking zone could be 18 m and 92 m, respectively, which are consistent with the field investigation. The severity of the explosion is related to the amount of the oil spill, the properties of the oil, and the volatilization time. It is recommended that a comprehensive risk assessment be conducted to analyze the possible consequences upon oil spilling into a confined space. Prompt collection and ventilation measures should be taken immediately after the spill occurs to reduce the time for oil volatilization and prevent the mixture from reaching its explosive limit.
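For readers unfamiliar with how damage distances of this kind are typically related to a vapor cloud, the sketch below shows a generic TNT-equivalency calculation (equivalent charge mass and Hopkinson-Cranz scaled distance). This is not the authors' numerical model, and every constant (yield factor, heats of combustion, assumed vapor mass) is an illustrative assumption; damage zones would then be read from standard overpressure-versus-scaled-distance correlations.

```python
def tnt_equivalent_mass(fuel_mass_kg, yield_factor=0.05,
                        h_fuel=46.0e6, h_tnt=4.68e6):
    """Equivalent TNT mass (kg) for a flammable vapor mass (generic values)."""
    return yield_factor * fuel_mass_kg * h_fuel / h_tnt

def scaled_distance(radius_m, w_tnt_kg):
    """Hopkinson-Cranz scaled distance Z = R / W^(1/3) in m/kg^(1/3)."""
    return radius_m / w_tnt_kg ** (1.0 / 3.0)

w = tnt_equivalent_mass(fuel_mass_kg=2000.0)        # assumed flammable vapor mass
for r in (18.0, 92.0):                              # zone radii reported in the study
    print(f"R = {r:5.1f} m  ->  Z = {scaled_distance(r, w):5.2f} m/kg^(1/3)")
```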
Chinese FengYun-2C (FY-2C) satellite data were combined into the Local Analysis and Prediction System (LAPS) model to obtain three-dimensional cloud parameters and rain content. These parameters analyzed by LAPS were used to initialize the Global/Regional Assimilation and Prediction System model (GRAPES) in China to predict precipitation in a rainstorm case in the country. Three prediction experiments were conducted and used to investigate the impacts of FY-2C satellite data on the cloud analysis of LAPS and on short-range precipitation forecasts. In the first experiment, the initial cloud fields were set to zero. In the second, the initial cloud fields were the cloud liquid water, cloud ice, and rain content derived from LAPS without combining the satellite data. In the third experiment, the initial cloud fields were the cloud liquid water, cloud ice, and rain content derived from LAPS including the satellite data. The results indicated that combining the FY-2C satellite data in LAPS can show more realistic cloud distributions, and the model simulation of precipitation in 1–6 h showed certain improvements over that obtained when satellite data and complex cloud analysis were not applied.
Cloud storage is essential for managing user data that is stored in and retrieved from distributed data centres. The storage service is provided on a pay-per-use basis, priced according to the amount of storage consumed. Because a data centre holds massive amounts of data with similar information and file structures kept in multiple copies, duplication inflates the required storage space. Existing deduplication systems do not achieve efficient data reduction because their analysis for finding similar data is inaccurate, which increases storage consumption and cost. To resolve this problem, this paper proposes an efficient storage reduction method called Hash-Indexing Block-based Deduplication (HIBD) based on Segmented Bind Linkage (SBL) methods for reducing storage in a cloud environment. Initially, preprocessing is done using a sparse augmentation technique. Further, the preprocessed files are segmented into blocks to build a hash index. The content of the blocks is compared with other files through Semantic Content Source Deduplication (SCSD), which identifies similar content shared between files. Based on the content presence count, the Distance Vector Weightage Correlation (DVWC) estimates the document similarity weight, and related files are grouped into a cluster. Finally, the segmented bind linkage compares the documents to find duplicate content in the cluster using the similarity weight based on the coefficient match case. This implementation helps identify data redundancy efficiently and reduces the service cost of distributed cloud storage.
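The block-segmentation and hash-indexing stage can be illustrated with a minimal sketch: files are cut into fixed-size blocks, each block is hashed, and a block whose digest is already in the index is stored only as a reference. The block size, hash function, and index structure are illustrative choices, and the SCSD/DVWC/SBL stages of HIBD are not reproduced here.

```python
import hashlib

BLOCK_SIZE = 4096
hash_index = {}            # block digest -> first (file, offset) where it was seen

def deduplicate(path):
    """Return (unique_blocks, duplicate_blocks) found while indexing one file."""
    unique, duplicate = 0, 0
    with open(path, "rb") as f:
        offset = 0
        while True:
            block = f.read(BLOCK_SIZE)
            if not block:
                break
            digest = hashlib.sha256(block).hexdigest()
            if digest in hash_index:
                duplicate += 1                     # keep only a reference
            else:
                hash_index[digest] = (path, offset)
                unique += 1
            offset += len(block)
    return unique, duplicate

# Example: a file whose two halves are identical collapses to one stored block.
with open("demo.bin", "wb") as f:
    f.write(b"x" * BLOCK_SIZE * 2)
print(deduplicate("demo.bin"))    # -> (1, 1)
```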
The availability and advancement of cloud computing service models such as IaaS, SaaS, and PaaS, introducing on-demand self-service, auto scaling, easy maintenance, and pay-as-you-go pricing, have dramatically transformed the way organizations design and operate their datacenters. However, some organizations still have many concerns, such as security, governance, lack of expertise, and migration. The purpose of this paper is to discuss cloud computing customers' opinions, feedback, attitudes, and emotions towards cloud computing services using sentiment analysis. The associated aim is to help people and organizations understand the benefits and challenges of cloud services from the general public's perspective, as well as opinions about existing cloud providers, focusing on three main cloud providers: Azure, Amazon Web Services (AWS), and Google Cloud. The methodology used in this paper is based on sentiment analysis applied to tweets that were extracted from the social media platform (Twitter) via its search API. We extracted a sample of 11,000 tweets, with each cloud provider having an almost equal proportion of the tweets based on relevant hashtags and keywords. The analysis starts by combining the tweets in order to find the overall polarity about cloud computing, then breaking the tweets down to find the specific polarity for each cloud provider. The Bing and NRC lexicons are employed to measure the polarity and emotion of the terms in the tweets. The overall polarity classification of the tweets across all cloud providers shows 68.5% positive and 31.5% negative percentages. More specifically, Azure shows 63.8% positive and 36.2% negative tweets, Google Cloud shows 72.6% positive and 27.4% negative tweets, and AWS shows 69.1% positive and 30.9% negative tweets.
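A minimal sketch of lexicon-based polarity scoring of the kind described above follows; the tiny word lists merely stand in for the Bing lexicon, and the tokenisation and example tweets are deliberately simple illustrative assumptions.

```python
import re
from collections import Counter

# Toy stand-ins for a sentiment lexicon such as Bing.
POSITIVE = {"good", "great", "love", "reliable", "fast", "secure"}
NEGATIVE = {"bad", "slow", "outage", "expensive", "broken", "hate"}

def polarity(tweet):
    """Count positive minus negative lexicon hits and map the sign to a label."""
    words = re.findall(r"[a-z']+", tweet.lower())
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

tweets = ["AWS is fast and reliable", "Another Azure outage, so slow today"]
print(Counter(polarity(t) for t in tweets))   # Counter({'positive': 1, 'negative': 1})
```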
Nowadays, users increasingly rely on Avatars, which are unique digital depictions, to access the Metaverse, a virtual reality environment, through multiple devices and for various purposes. The Avatar and Metaverse are therefore being developed with new theories, applications, and designs, necessitating the association of more personal data and devices of targeted users every day. This explosion of Avatar and Metaverse technology raises privacy and security concerns, leading to cyber attacks. An MV-Honeypot, or Metaverse-Honeypot, should be developed as a commercial off-the-shelf solution that can counter the vulnerabilities that enable these cyber attacks. To fill this gap, in this paper we study users' engagement with Avatars in the Metaverse, analyze possible security vulnerabilities, and create a model named Simplified Avatar Relationship Association with Non-linear Gradient (SARANG) that draws the full diagram of infrastructure components and data flow involved in accessing the Metaverse. We also determine, for each component, the most significant cyberattack threats that will affect user data and Avatars. As a result, a commercial off-the-shelf (COTS) MV-Honeypot must be established.
Cloud computing has drastically changed the delivery and consumption of live streaming content. The designs, challenges, and possible uses of cloud computing for live streaming are studied. A comprehensive overview of the technical and business issues surrounding cloud-based live streaming is provided, including the benefits of cloud computing, the various live streaming architectures, and the challenges that live streaming service providers face in delivering high-quality, real-time services. The different techniques used to improve the performance of video streaming, such as adaptive bit-rate streaming, multicast distribution, and edge computing, are discussed, and the necessity of low-latency and high-quality video transmission in cloud-based live streaming is underlined. Issues such as improving user experience and live streaming service performance using cutting-edge technology, like artificial intelligence and machine learning, are discussed. In addition, the legal and regulatory implications of cloud-based live streaming, including issues with network neutrality, data privacy, and content moderation, are addressed. The future of cloud computing for live streaming is examined in the section that follows, which looks at the most likely new developments in terms of trends and technology. For technology vendors, live streaming service providers, and regulators, the findings have major policy-relevant implications. Suggestions on how stakeholders should address these concerns and take advantage of the potential presented by this rapidly evolving sector, as well as insights into the key challenges and opportunities associated with cloud-based live streaming, are provided.
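To illustrate the adaptive bit-rate idea mentioned above, here is a minimal sketch of a throughput-based rendition-selection rule; the bitrate ladder and safety margin are illustrative values and not a specific vendor's or standard's algorithm.

```python
# Available renditions (kbps), lowest to highest quality.
BITRATE_LADDER_KBPS = [400, 800, 1600, 3000, 6000]

def pick_bitrate(measured_throughput_kbps, safety_margin=0.8):
    """Choose the highest rendition that fits within a fraction of the
    recently measured network throughput."""
    budget = measured_throughput_kbps * safety_margin
    candidates = [b for b in BITRATE_LADDER_KBPS if b <= budget]
    return candidates[-1] if candidates else BITRATE_LADDER_KBPS[0]

for tput in (500, 2500, 9000):
    print(tput, "kbps measured ->", pick_bitrate(tput), "kbps rendition")
```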
This paper proposes several quantitative characteristics to study convective systems using observations from Doppler weather radars and geostationary satellites. Specifically, in order to measure the convective intensity of each system, a new index, named the "Convective Intensity Ratio" (CIR), is defined as the ratio between the area of strong radar echoes at the upper level and the size of the convective cell itself. Based on these quantitative characteristics, the evolution of convective cells, surface rainfall intensity, rainfall area, and convectively generated anvil clouds can be studied, and the relationships between them can also be analyzed. After testing nine meso-β-scale convective systems over North China during 2006–2007, the results were as follows: (1) the CIR was highly correlated with surface rainfall intensity, and the correlation reached a maximum when the CIR led rainfall intensity by 6–30 min; the maximum CIR could occur at most ~30 min before the maximum rainfall intensity. (2) Convective systems with larger maximum CIRs usually had colder cloud tops. (3) The maximum area of anvil cloud appeared 0.5–1.5 h after rainfall intensity began to weaken. The maximum area of anvil cloud and the time lag between the maximum rainfall intensity and the maximum area of anvil cloud both increased with the CIR.
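Following the definition above, the CIR can be computed from a gridded radar field as the number of upper-level grid points with strong echo inside the cell divided by the number of grid points in the cell. The sketch below shows that ratio on a toy field; the reflectivity threshold and grid are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def cir(upper_level_dbz, cell_mask, strong_echo_dbz=40.0):
    """CIR = area of strong upper-level echoes inside the cell / cell area."""
    strong_area = np.sum((upper_level_dbz >= strong_echo_dbz) & cell_mask)
    cell_area = np.sum(cell_mask)
    return strong_area / cell_area if cell_area else 0.0

# Toy 2-D upper-level field: a 10 x 10 cell with a 3 x 3 strong-echo core aloft.
dbz = np.full((10, 10), 25.0)
dbz[4:7, 4:7] = 45.0
mask = np.ones((10, 10), dtype=bool)
print(cir(dbz, mask))   # 9 / 100 = 0.09
```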
Cloud computing plays a significant role in modern information technology, providing organizations with numerous benefits, including flexibility, scalability, and cost-efficiency. However, it has become essential for organizations to ensure the security of their applications, data, and cloud-based networks to use cloud services effectively. This systematic literature review aims to synthesize the latest information regarding cloud computing security, with a specific emphasis on threats and mitigation strategies. Additionally, it highlights some common threats related to cloud computing security, such as distributed denial-of-service (DDoS) attacks, account hijacking, malware attacks, and data breaches. This research also explores some mitigation strategies, including security awareness training, vulnerability management, security information and event management (SIEM), identity and access management (IAM), and encryption techniques. It discusses emerging trends in cloud security, such as the integration of artificial intelligence (AI) and machine learning (ML), serverless computing, and containerization, as well as the effectiveness of the shared responsibility model and its related challenges. The importance of user awareness and the impact of emerging technologies on cloud security have also been discussed in detail to mitigate security risks. A review of previous research and scholarly articles has also been conducted to provide insights regarding cloud computing security. The review shows the need for continuous research and innovation to address emerging threats and maintain a security-conscious culture in organizations.
Light Detection And Ranging (LiDAR) is a well-established active remote sensing technology that can provide accurate digital elevation measurements for the terrain and non-ground objects such as vegetation and buildings, etc. Non-ground objects need to be removed for the creation of a Digital Terrain Model (DTM), which is a continuous surface representing only ground surface points. This study aimed at a comparative analysis of three main filtering approaches for stripping off non-ground objects, namely a Gaussian low pass filter, a focal analysis mean filter, and a DTM slope-based filter of varying window sizes, in creating a reliable DTM from airborne LiDAR point clouds. A sample of LiDAR data provided by the ISPRS WG III/4, captured at Vaihingen in Germany over a purely residential area, has been used in the analysis. Visual analysis has indicated that the Gaussian low pass filter gave blurred DTMs with attenuated high-frequency objects and emphasized low-frequency objects, while it achieved improved removal of non-ground objects at larger window sizes. The focal analysis mean filter has shown better removal of non-ground objects compared to the Gaussian low pass filter, especially at large window sizes, where details of non-ground objects have almost diminished in the DTMs from window sizes of 25 × 25 and greater. The DTM slope-based filter has created bare earth models that are full of gaps at the positions of the non-ground objects, where the sizes and numbers of those gaps have increased with increasing filter window sizes. Those gaps have been closed through exploitation of the spline interpolation method in order to get a continuous surface representing the bare earth landscape. Comparative analysis has shown that the minimum elevations of the DTMs increase with increasing filter window sizes up to 21 × 21 and 31 × 31 for the Gaussian low pass filter and the focal analysis mean filter, respectively. On the other hand, the DTM slope-based filter has kept the minimum elevation of the original data, which could be due to noise in the LiDAR data, unchanged. Alternatively, the three approaches have produced DTMs of decreasing maximum elevation values, and consequently decreasing ranges of elevations, as the filter window sizes increase. Moreover, the standard deviations of the created DTMs from the three filters have decreased with increasing filter window sizes; however, the decreases have been continuous and steady in the cases of the Gaussian low pass filter and the focal analysis mean filter, while in the case of the DTM slope-based filter the standard deviations of the created DTMs have decreased at high rates up to a window size of 31 × 31 and then remained unchanged as the filter window sizes increased further.
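A minimal sketch of two of the compared filters applied to a gridded elevation raster is given below: a focal (moving-window) mean filter and a simple height-above-local-minimum variant standing in for a slope-based ground filter. The window sizes, height threshold, and toy raster are illustrative assumptions and do not reproduce the study's processing of the Vaihingen data.

```python
import numpy as np
from scipy import ndimage

def focal_mean(dsm, window=25):
    """Moving-window mean of the surface model (focal analysis mean filter)."""
    return ndimage.uniform_filter(dsm, size=window, mode="nearest")

def slope_based_ground_mask(dsm, window=31, max_rise=2.0):
    """Flag cells as ground only if they rise no more than max_rise metres above
    the minimum elevation inside the window; rejected cells leave gaps to be
    interpolated later (e.g. by splines)."""
    local_min = ndimage.minimum_filter(dsm, size=window, mode="nearest")
    return dsm - local_min <= max_rise

# Toy raster: flat terrain at 100 m with a 10 m high 'building'.
dsm = np.full((60, 60), 100.0)
dsm[20:30, 20:30] += 10.0
print(focal_mean(dsm).max())                 # smoothed building peak
print(slope_based_ground_mask(dsm).sum())    # number of cells kept as ground
```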
Background: Hypertension is a universal risk factor for cardiovascular diseases and is thus the leading cause of death worldwide. The identification of novel prognostic and pathogenesis biomarkers plays a key role in disease management. Methods: The GSE145854 and GSE164494 datasets were downloaded from the Gene Expression Omnibus (GEO) database and used for screening and validating hypertension signature genes, respectively. Gene Ontology (GO) enrichment analysis was performed on the differentially expressed genes (DEGs) related to calcium ion metabolism in patients with hypertension. The core genes related to immune infiltration were analyzed and screened, and the activity of the signature genes and related pathways was quantified using gene set enrichment analysis (GSEA). The infiltration of immune cells in the blood samples was analyzed, and the DEGs that were abnormally expressed in the clinical blood samples of patients with hypertension were verified via RT-qPCR. Results: A total of 176 DEGs were screened. GO analysis showed that the DEGs were involved in the regulation of calcium ion metabolism in biological processes (BP), actin-mediated cell contraction, negative regulation of cell movement, and calcium ion transmembrane transport, and in the regulation of protease activity in molecular functions (MF). KEGG analysis revealed that the DEGs were involved mainly in the cGMP-PKG signaling pathway, ubiquitin-protein transferase, tight junction-associated proteins, and the regulation of myocardial cells. MF analysis revealed the immune infiltration function of the cells. RT-qPCR revealed that the expression of Cacna1d, Serpine1, Slc8a3, and Trpc4 was upregulated in hypertension, while the expression of Myoz2 and Slc25a23 was downregulated. Conclusion: Cacna1d, Serpine1, Slc8a3, Trpc4, Myoz2, and Slc25a23 may be involved in the regulation of calcium metabolism pathways and play key roles in hypertension. These differentially expressed calcium metabolism-related genes may serve as prognostic markers of hypertension.
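The DEG screening step can be illustrated with a minimal sketch that applies fold-change and p-value cut-offs to case versus control expression matrices; the thresholds and simulated data are common defaults used for illustration only, not necessarily the settings applied to GSE145854/GSE164494.

```python
import numpy as np
from scipy import stats

def screen_degs(expr_case, expr_control, genes, lfc_cut=1.0, p_cut=0.05):
    """expr_* are (genes x samples) log2 expression matrices; return genes that
    pass both the log2 fold-change and t-test p-value cut-offs."""
    degs = []
    for i, gene in enumerate(genes):
        lfc = expr_case[i].mean() - expr_control[i].mean()     # log2 fold change
        p = stats.ttest_ind(expr_case[i], expr_control[i]).pvalue
        if abs(lfc) >= lfc_cut and p < p_cut:
            degs.append((gene, round(float(lfc), 2), float(p)))
    return degs

rng = np.random.default_rng(0)
genes = ["Cacna1d", "Myoz2", "Gapdh"]
control = rng.normal(5.0, 0.2, size=(3, 6))
case = control + np.array([[1.5], [-1.5], [0.0]])   # simulate up / down / no change
print(screen_degs(case, control, genes))
```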
Funding (HPPM-AMICFA study): This research was funded by the National Natural Science Foundation of China (Grant No. 61902069), the Natural Science Foundation of Fujian Province of China (Grant No. 2021J011068), the Research Initiation Fund Program of Fujian University of Technology (GY-S24002, GY-Z21048), and the Fujian Provincial Department of Science and Technology Industrial Guidance Project (Grant No. 2022H0025).
Funding (HPVC MAC study): Supported by the National Natural Science Foundation of China (No. 61772386), the National Key Research and Development Project (No. 2018YFB1305001), and the Fundamental Research Funds for the Central Universities (No. KJ02072021-0119).
Funding (AWS EC2 sentiment analysis study): This research project was supported by the Deanship of Scientific Research, Prince Sattam Bin Abdulaziz University, KSA, Project Grant No. 2021/01/17783, Sha M, www.psau.edu.sa.
Funding (radar data assimilation study): Supported by the National Key R&D Program of China (Grant No. 2017YFC1502104), the National Natural Science Foundation of China (Grant Nos. 41775099 and 41605026), Grant No. NJCAR2016ZD02, and the Priority Academic Program Development of Jiangsu Higher Education Institutions (PAPD).
Funding (oil spill vapor cloud explosion study): Supported by the China Scholarship Council (201809110035) and the State Key Research and Development Plan Project of China (2016YFC0801500).
Funding (FY-2C/LAPS cloud analysis study): Supported by the National Natural Science Foundation of China (41375025, 41275114, and 41275039), the National High Technology Research and Development Program of China (863 Program, 2012AA120903), the Public Benefit Research Foundation of the China Meteorological Administration (GYHY201106044 and GYHY201406001), and the China Meteorological Administration Torrential Flood Project.
Funding (Metaverse honeypot study): Supported by the Institute of Information & Communications Technology Planning & Evaluation (IITP) (Project Nos. 2022-0-00701, 10%; RS-2023-00228996, 10%; RS-2022-00165794, 10%), the ICT R&D Program of MSIT/IITP (Project No. 2021-0-01816, 10%), and a National Research Foundation of Korea (NRF) grant funded by the Korean Government (Project No. RS2023-00208460, 60%).
Funding (convective intensity ratio study): Supported by the National High Technology Research and Development Program of China (Grant No. 2006AA122106), the National Natural Science Foundation of China (Grant No. 40875019), and the Foundation of Basic Scientific Research and Operation of the Chinese Academy of Meteorological Science (Grant No. 2007Y004).