The objective of reliability-based design optimization (RBDO) is to minimize the optimization objective while satisfying the corresponding reliability requirements. However, the nested-loop structure reduces the efficiency of RBDO algorithms, which hinders their application to high-dimensional engineering problems. To address these issues, this paper proposes an efficient decoupled RBDO method combining high-dimensional model representation (HDMR) and the weight-point estimation method (WPEM). First, we decouple the RBDO model using HDMR and WPEM. Second, Lagrange interpolation is used to approximate each univariate function. Finally, based on the results of the first two steps, the original nested-loop reliability optimization model is completely transformed into a deterministic design optimization model that can be solved by a series of mature constrained optimization methods without any additional calculations. Two numerical examples, a planar 10-bar structure and an aviation hydraulic piping system with 28 design variables, are analyzed to illustrate the performance and practicability of the proposed method.
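As a small illustration of the Lagrange interpolation step mentioned above, the sketch below interpolates a univariate function from a few sample nodes; the nodes and the test function are illustrative and not taken from the paper.

```python
import numpy as np

def lagrange_interpolate(x_nodes, y_nodes, x):
    """Evaluate the Lagrange interpolating polynomial at x.
    x_nodes, y_nodes: samples of one univariate (HDMR-style) component
    function; these inputs are illustrative, not the paper's data."""
    x_nodes = np.asarray(x_nodes, dtype=float)
    y_nodes = np.asarray(y_nodes, dtype=float)
    total = 0.0
    for i, (xi, yi) in enumerate(zip(x_nodes, y_nodes)):
        # Basis polynomial L_i(x) = prod_{j != i} (x - x_j) / (x_i - x_j)
        others = np.delete(x_nodes, i)
        total += yi * np.prod((x - others) / (xi - others))
    return total

# Example: three nodes of g(u) = u**2 reproduce the quadratic exactly.
nodes = [-1.0, 0.0, 1.0]
values = [u**2 for u in nodes]
print(lagrange_interpolate(nodes, values, 0.5))  # -> 0.25
```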
Latent factor (LF) models are highly effective in extracting useful knowledge from High-Dimensional and Sparse (HiDS) matrices, which are commonly seen in various industrial applications. An LF model usually adopts iterative optimizers, which may consume many iterations to reach a local optimum, resulting in considerable time cost. Hence, determining how to accelerate the training process of LF models has become a significant issue. To address this, this work proposes a randomized latent factor (RLF) model. It incorporates the principle of randomized learning techniques from neural networks into the LF analysis of HiDS matrices, thereby greatly alleviating the computational burden. It also extends a standard learning process for randomized neural networks to the context of LF analysis so that the resulting model represents an HiDS matrix correctly. Experimental results on three HiDS matrices from industrial applications demonstrate that, compared with state-of-the-art LF models, RLF achieves significantly higher computational efficiency and comparable prediction accuracy for missing data. It provides an important alternative approach to LF analysis of HiDS matrices, which is especially desired for industrial applications demanding highly efficient models.
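The following sketch illustrates the general randomized-learning idea behind such a model: one factor matrix is generated randomly and kept fixed, and the other is obtained in closed form from the observed entries only. It shows the principle, not the paper's exact RLF algorithm; all sizes and data are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, k, lam = 50, 40, 5, 0.1            # users, items, latent dim, regularization
mask = rng.random((m, n)) < 0.1          # ~10% observed entries (HiDS-like)
R = mask * (rng.random((m, n)) * 5.0)    # toy rating matrix

P = rng.standard_normal((m, k))          # randomly assigned, never trained
Q = np.zeros((n, k))
for j in range(n):                       # closed-form ridge solve per item column
    idx = mask[:, j]
    if idx.any():
        A = P[idx]
        Q[j] = np.linalg.solve(A.T @ A + lam * np.eye(k), A.T @ R[idx, j])

pred = P @ Q.T                           # estimates for the missing entries
```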
To solve the problem of target damage assessment when fragments attack a target under uncertain projectile-target intersection conditions in an air defense intercept, this paper proposes a method for calculating the target damage probability that leverages a spatio-temporal finite multilayer fragment distribution, together with a target damage assessment algorithm based on cloud model theory. Drawing on the spatial dispersion characteristics of fragments from a projectile proximity explosion, we divide the fragments into a finite number of distribution planes based on the time series in space, and set up a fragment-layer dispersion model grounded in the time series together with an intersection criterion for determining the effective penetration of each layer of fragments into the target. Building on the precondition that the multilayer fragments of the time series effectively strike the target, we also establish damage criteria for perforation and penetration damage and derive a damage probability calculation model. Taking the damage probability of each fragment layer in the spatio-temporal sequence to the target as the input state variable, we introduce cloud model theory to develop the target damage assessment method. Combined with an equivalent simulation experiment, the scientific validity and rationality of the proposed method were verified through quantitative calculations and comparative analysis.
Because all the known integrable models possess Schwarzian forms with Möbius transformation invariance, starting from suitable Möbius-transformation-invariant equations may be one of the best ways to find new integrable models. In this paper, we study the Painlevé integrability of some special (3+1)-dimensional Schwarzian models.
This paper was motivated by the existing problems of cloud data storage at Imo State University, Nigeria, such as outsourced data causing loss of data and misuse of customer information by unauthorized users or hackers, thereby leaving customer/client data visible and unprotected. This also exposed clients/customers to enormous risk due to defective equipment, bugs, faulty servers, and suspicious actions. The aim of this paper, therefore, is to analyze a secure model using Unicode Transformation Format (UTF) Base64 algorithms for storing data securely in the cloud. The Object-Oriented Hypermedia Analysis and Design Methodology (OOHADM) was adopted. Python was used to develop the security model; role-based access control (RBAC) and multi-factor authentication (MFA) algorithms were integrated to enhance security in the information system, which was developed with HTML5, JavaScript, Cascading Style Sheets (CSS) version 3, and PHP 7. This paper also discusses concepts such as the development of cloud computing, characteristics of cloud computing, cloud deployment models, and cloud service models. The results showed that the proposed enhanced security model for corporate-platform information systems handles multiple authorization and authentication threats: a single login page directs all login requests from the different modules to one Single Sign-On Server (SSOS), which in turn redirects authenticated users to their requested resources/modules, leveraging geo-location integration for physical location validation. The newly developed system addresses the shortcomings of the existing systems and reduces the time and resources incurred while using them.
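As a minimal illustration of the UTF-8/Base64 transformation step described above (encoding only; it is not encryption and not the full security model), standard-library Python suffices:

```python
import base64

def encode_for_storage(plaintext: str) -> str:
    """Encode a string as UTF-8 bytes and then Base64 before sending it to cloud storage."""
    return base64.b64encode(plaintext.encode("utf-8")).decode("ascii")

def decode_from_storage(encoded: str) -> str:
    """Reverse the transformation when the record is retrieved."""
    return base64.b64decode(encoded.encode("ascii")).decode("utf-8")

record = "student_id=0001;gpa=4.2"        # illustrative record, not real data
stored = encode_for_storage(record)
assert decode_from_storage(stored) == record
```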
In view of the limitations of traditional measurement methods in the field of building information, such as complex operation, low timeliness, and poor accuracy, a new way of combining three-dimensional scanning technology with the BIM (Building Information Modeling) model was discussed. Focusing on the efficient acquisition of building geometric information using fast-developing 3D point cloud technology, an improved deep learning-based 3D point cloud recognition method was proposed. The method optimised the network structure based on RandLA-Net to adapt to large-scale point cloud processing requirements, while the semantic and instance features of the point cloud were integrated to significantly improve the recognition accuracy and provide a precise basis for BIM model reconstruction. In addition, a visual BIM model generation system was developed, which systematically transformed the point cloud recognition results into BIM component parameters, automatically constructed BIM models, and promoted the open sharing and secondary development of models. The research results not only effectively advance the automation of converting 3D point cloud data into refined BIM models, but also provide important technical support for promoting building informatisation and accelerating the construction of smart cities, showing wide application potential and practical value.
In 1995, the Intergovernmental Panel on Climate Change (IPCC) released a thermodynamic model based on the Greenhouse Effect, aiming to forecast global temperatures. This study delves into the intricacies of that model, and some interesting observations are revealed. The IPCC model equated average temperatures with average energy fluxes, which can cause significant errors. The model assumed that all energy fluxes remained constant and that the Earth emitted infrared radiation as if it were a blackbody; neither of those conditions exists. The IPCC's definition of Climate Change only includes events caused by human actions, excluding most causes. Satellite data aimed at the tops of clouds may have inferred a high Greenhouse Gas absorption flux. The model showed more energy coming from the atmosphere than absorbed from the sun, which may have caused a violation of the First and Second Laws of Thermodynamics. There were unexpectedly large gaps in the satellite data that aligned with various absorption bands of Greenhouse Gases, possibly caused by photon scattering associated with re-emissions. Based on science, we developed a cloud-based climate model that complied with the Radiation Laws and the First and Second Laws of Thermodynamics. The Cloud Model showed that 81.3% of the outgoing reflected and infrared radiation was attributable to the clouds and water vapor. In comparison, the contribution of CO₂ was only 0.04%, making it too minuscule to measure reliably.
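The first observation, that averaging temperatures is not the same as averaging energy fluxes, can be illustrated with the Stefan-Boltzmann relation; the temperatures below are illustrative only and are not taken from the study.

```latex
% Illustrative only: because F = \sigma T^4 is convex in T, the flux of an
% average temperature underestimates the average of the fluxes.
F(T) = \sigma T^{4}, \qquad
\overline{F} = \tfrac{1}{2}\,\sigma\left(T_1^{4} + T_2^{4}\right)
  \;\ge\; \sigma\left(\tfrac{T_1 + T_2}{2}\right)^{4} = F\!\left(\overline{T}\right)
% Example: T_1 = 250\,\mathrm{K}, T_2 = 310\,\mathrm{K} gives
% \overline{F} \approx 373\,\mathrm{W\,m^{-2}}, while
% F(\overline{T}) \approx 349\,\mathrm{W\,m^{-2}}.
```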
Purpose – In order to solve the problems of inaccurate calculation of index weights and the subjectivity and uncertainty of index assessment in the risk assessment process, this study aims to propose a scientific and reasonable centralized traffic control (CTC) system risk assessment method. Design/methodology/approach – First, system-theoretic process analysis (STPA) is used to conduct a risk analysis of the CTC system, and risk assessment indexes are constructed based on this analysis. Then, to enhance the accuracy of weight calculation, the fuzzy analytic hierarchy process (FAHP), the fuzzy decision-making trial and evaluation laboratory (FDEMATEL) and the entropy weight method are employed to calculate the subjective weight, relative weight and objective weight of each index. These three types of weights are combined using game theory to obtain the combined weight for each index. To reduce subjectivity and uncertainty in the assessment process, the backward cloud generator method is utilized to obtain the numerical characteristics (NCs) of the cloud model for each index. The NCs of the indexes are then weighted to derive the comprehensive cloud for risk assessment of the CTC system, which is used to obtain the CTC system's comprehensive risk assessment. The cloud model's similarity measurement method gauges the likeness between the comprehensive risk assessment cloud and the risk standard cloud. Finally, this process yields the risk assessment results for the CTC system. Findings – The cloud model handles the subjectivity and fuzziness in the risk assessment process well. The cloud model-based risk assessment method was applied to the CTC system risk assessment of a railway group and achieved good results. Originality/value – This study provides a cloud model-based method for risk assessment of CTC systems, which accurately calculates the weights of risk indexes and uses cloud models to reduce uncertainty and subjectivity in the assessment, achieving effective risk assessment of CTC systems. It can provide a reference and theoretical basis for risk management of the CTC system.
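One commonly used form of the backward cloud generator mentioned above estimates the three numerical characteristics (Ex, En, He) from a set of expert scores; the sketch below shows that form and is not necessarily the exact estimator used in the study.

```python
import numpy as np

def backward_cloud(samples):
    """Estimate the numerical characteristics of a normal cloud from samples:
    Ex (expectation), En (entropy), He (hyper-entropy)."""
    x = np.asarray(samples, dtype=float)
    Ex = x.mean()
    En = np.sqrt(np.pi / 2.0) * np.abs(x - Ex).mean()
    S2 = x.var(ddof=1)                      # sample variance
    He = np.sqrt(max(S2 - En ** 2, 0.0))
    return Ex, En, He

# Illustrative expert scores for one risk index
print(backward_cloud([0.62, 0.70, 0.66, 0.59, 0.73]))
```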
Cloud computing is an emerging technology in the rapidly growing IT world. The adoption of cloud computing is increasing rapidly, from very large business organizations to small institutions, due to its many advanced features, such as the SaaS, PaaS and IaaS service models. Consequently, many organizations are trying to implement cloud-based ERP systems to enjoy the benefits of cloud computing. To implement any ERP system, an organization usually faces many challenges. This research therefore shows how easily such a cloud-based system can be implemented in an organization. By using this ERP system, an organization can benefit in many ways; in particular, Small and Medium Enterprises (SMEs) can enjoy the highest possible benefits from this system.
This paper studies the re-adjusted cross-validation method and a semiparametric regression model called the varying index coefficient model (VICM). We use the profile spline modal estimator method to estimate the coefficients of the parametric part of the VICM, while the unknown function part is expanded with B-splines. Moreover, we combine the above two estimation methods under a high-dimensional data assumption. The results of simulation and empirical analysis show that, for the varying index coefficient model, the re-adjusted cross-validation method is better in terms of accuracy and stability than traditional methods based on ordinary least squares.
Model averaging has attracted increasing attention in recent years for the analysis of high-dimensional data. By suitably weighting several competing statistical models, model averaging attempts to achieve stable and improved prediction. To obtain a better understanding of the available model averaging methods, their properties and the relationships between them, this paper reviews some recent progress in high-dimensional model averaging from the frequentist perspective. Some future research topics are also discussed.
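A toy illustration of the mechanics of frequentist model averaging follows: candidate predictions are combined with weights derived from validation error. Real weighting criteria (e.g. Mallows- or cross-validation-based weights) are more principled; the data here are made up.

```python
import numpy as np

preds_val = {                              # each candidate model's validation predictions
    "m1": np.array([1.1, 2.0, 2.9]),
    "m2": np.array([0.8, 2.3, 3.4]),
}
y_val = np.array([1.0, 2.0, 3.0])

mse = {k: np.mean((p - y_val) ** 2) for k, p in preds_val.items()}
w = {k: 1.0 / v for k, v in mse.items()}   # inverse-error weights
total = sum(w.values())
w = {k: v / total for k, v in w.items()}   # normalize to sum to one

preds_test = {"m1": np.array([4.1]), "m2": np.array([4.4])}
averaged = sum(w[k] * preds_test[k] for k in preds_test)
print(w, averaged)
```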
The method of a cloud model with entropy weight was adopted for the prediction of rock burst classification. Several main factors of rock burst, including the uniaxial compressive strength (σc), the tensile strength (σt), the tangential stress (σθ), the rock brittleness coefficient (σc/σt), the stress coefficient (σθ/σc) and the elastic energy index (Wet), are chosen to establish the evaluation index system. The entropy-cloud model and its criteria are obtained from 209 sets of rock burst samples from underground rock projects. The sensitivity of the indicators is analyzed, and the 209 sets of rock burst samples are discriminated by this model. The discriminant results of the entropy-cloud model are compared with those of the Bayes, K-Nearest Neighbor (KNN) and Random Forest (RF) methods. The results show that the sensitivity order of the factors from high to low is σθ/σc, σθ, Wet, σc/σt, σt, σc, and the entropy-cloud model has higher accuracy than the Bayes, KNN and RF methods.
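For reference, one common form of the entropy weight method used above can be sketched as follows; it assumes positive, benefit-type indicators and uses made-up data.

```python
import numpy as np

def entropy_weights(X):
    """Entropy weights for an (n samples x m indicators) decision matrix."""
    X = np.asarray(X, dtype=float)
    P = X / X.sum(axis=0)                      # column-wise proportions
    n = X.shape[0]
    with np.errstate(divide="ignore", invalid="ignore"):
        logP = np.where(P > 0, np.log(P), 0.0)
    E = -(P * logP).sum(axis=0) / np.log(n)    # entropy of each indicator
    d = 1.0 - E                                # degree of divergence
    return d / d.sum()                         # normalized weights

X = [[120.0, 8.5, 55.0],
     [150.0, 6.2, 60.0],
     [ 95.0, 9.1, 40.0]]
print(entropy_weights(X))
```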
Aiming at the real-time fluctuation and nonlinear characteristics of expressway short-term traffic flow forecasting, the parameter projection pursuit regression (PPPR) model is applied to forecast the expressway traffic flow, where orthogonal Hermite polynomials are used to fit the ridge functions and the least squares method is employed to determine the polynomial weight coefficients c. In order to efficiently optimize the projection directions a and the number M of ridge functions of the PPPR model, the chaos cloud particle swarm optimization (CCPSO) algorithm is applied to optimize the parameters. The CCPSO-PPPR hybrid optimization model for expressway short-term traffic flow forecasting is established, in which the CCPSO algorithm is used to optimize the projection direction a in the inner layer while the number M of ridge functions is optimized in the outer layer. Traffic volume, weather factors, and the travel date of the previous several time intervals of the road section are taken as the input influencing factors. Example forecasts and model comparison results indicate that the proposed model can obtain a better forecasting effect, and its absolute error is controlled within [-6, 6], which can meet the application requirements of expressway traffic flow forecasting.
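One building block of the PPPR model described above, fitting a ridge function with Hermite polynomials by least squares for a fixed projection direction, can be sketched as follows; in the paper the projection direction and the number of ridge functions are optimized by CCPSO, whereas here they are simply fixed and illustrative.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 4))                    # illustrative inputs
a = np.array([0.5, -0.3, 0.7, 0.2])
a = a / np.linalg.norm(a)                            # unit projection direction
y = np.sin(X @ a)                                    # illustrative target

z = X @ a                                            # projected variable a.x
H = hermevander(z, 5)                                # Hermite basis, degree <= 5
c, *_ = np.linalg.lstsq(H, y, rcond=None)            # least-squares coefficients c
y_hat = H @ c                                        # fitted ridge function values
print(np.mean((y_hat - y) ** 2))                     # fitting error of this single ridge term
```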
Model reconstruction from points scanned on existing physical objects is important in a variety of situations, such as reverse engineering for mechanical products, computer vision, and the recovery of biological shapes from two-dimensional contours. With the development of measuring equipment, point clouds that contain more details of the object can be obtained conveniently. On the other hand, the large quantity of sampled points brings difficulties to model reconstruction methods. This paper first presents an algorithm to automatically reduce the number of points under a given tolerance. A triangle mesh surface is then reconstructed from the simplified data set by the marching cubes algorithm. For various reasons, the reconstructed mesh usually contains unwanted holes. An approach is proposed to create new triangles, with optimized shape, for covering the unexpected holes in triangle meshes. After hole filling, a watertight triangle mesh can be directly output in STL format, which is widely used in rapid prototyping. Practical examples are included to demonstrate the method.
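The point-reduction idea can be illustrated with a simple voxel-grid decimation that keeps one representative point per cell of edge length tol; this is a hedged sketch of the data-reduction step, not the paper's exact tolerance-driven algorithm.

```python
import numpy as np

def simplify_points(points, tol):
    """Keep one representative point per cubic cell of edge length tol."""
    pts = np.asarray(points, dtype=float)
    keys = np.floor(pts / tol).astype(np.int64)              # cell index of each point
    _, first_idx = np.unique(keys, axis=0, return_index=True)
    return pts[np.sort(first_idx)]                           # one point per occupied cell

cloud = np.random.default_rng(2).random((10000, 3))          # illustrative point cloud
print(len(simplify_points(cloud, 0.1)))                      # substantially fewer than 10000
```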
In order to reduce the amount of data storage and improve the processing capacity of the system, this paper proposes a new classification method for data sources by combining the phase synchronization model used in network clustering with the cloud model. First, taking the data source as a complex network, after the topology of the network is obtained, the cloud model of each node's data is determined by the fuzzy analytic hierarchy process (AHP). Second, by calculating the expectation, entropy and hyper-entropy of the cloud model, the comprehensive coupling strength is obtained and then regarded as the edge weight of the topology. Finally, the distribution curve is obtained by iterating the phase of each node by means of the phase synchronization model, and the classification of the data sources is thus completed. This method can not only provide convenience for the storage, cleaning and compression of data, but also improve the efficiency of data analysis.
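The phase-updating step can be illustrated with a Kuramoto-type iteration, where each node's phase is driven by weighted phase differences to its neighbours; in the paper the coupling weights come from the cloud-model characteristics, while here they are random and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
n, steps, dt, K = 12, 500, 0.05, 1.5
W = rng.random((n, n)); W = (W + W.T) / 2        # symmetric coupling strengths (toy values)
omega = rng.normal(0.0, 0.1, n)                  # natural frequencies of the nodes
theta = rng.uniform(0.0, 2 * np.pi, n)           # initial phases

for _ in range(steps):
    # d(theta_i)/dt = omega_i + (K/n) * sum_j W_ij * sin(theta_j - theta_i)
    coupling = (W * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
    theta = theta + dt * (omega + K / n * coupling)

# Final phase distribution: nodes whose phases end up close are grouped together.
print(np.round(np.mod(theta, 2 * np.pi), 2))
```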
Uncertainties existing in the process of dam deformation negatively influence deformation prediction. However, existing deformation prediction models seldom consider uncertainties. In this study, a cloud-Verhulst hybrid prediction model was established by combining a cloud model with the Verhulst model. The expectation, one of the cloud characteristic parameters, was obtained using the Verhulst model, and the other two cloud characteristic parameters, entropy and hyper-entropy, were calculated by introducing an inertia weight. The hybrid prediction model was used to predict the dam deformation in a hydroelectric project. Comparison of the prediction results of the hybrid prediction model with those of a traditional statistical model and with the monitoring values shows that the proposed model has higher prediction accuracy than the traditional statistical model. It provides a new approach to predicting dam deformation under uncertain conditions.
The motion of particle clouds formed by dumping dredged material into quiescent waters is studied experimentally and numerically. In the numerical model, the particle phase is modeled by a dispersion model, and turbulence is calculated by large eddy simulation. The governing equations, including the filtered Navier-Stokes equations and the mass transport equation, are solved based on the operator-splitting algorithm and an implicit cubic spline interpolation scheme. The eddy viscosity is evaluated by the modified Smagorinsky model including the buoyancy term. Comparisons of the main flow characteristics, including shape, size, average density excess, moving speed and the amount of particles deposited on the bed, between experimental and computational results show that the numerical model predicts the motion of the cloud well from the falling to the spreading stage. The effects of a silt fence on the motion of the particle cloud are also investigated.
The accuracy of existing flatness prediction methods is limited, and they are restricted to software simulation. In order to improve on this, a novel flatness prediction model based on a T-S cloud reasoning network and implemented on a digital signal processor (DSP) is proposed. First, a combination of the genetic algorithm (GA) and the simulated annealing algorithm (SAA), called the GA-SA algorithm, is put forward, which makes full use of the global search ability of GA and the local search ability of SA. Then, based on the T-S cloud reasoning neural network, the flatness prediction model is designed on the DSP and applied to a 900HC reversible cold rolling mill. Experimental results demonstrate that the flatness prediction model based on the T-S cloud reasoning network runs on the DSP TMS320F2812 hardware with high accuracy and robustness when the GA-SA algorithm is used to optimize the model parameters.
With a focus on the difficulty of quantitatively describing the degree of nonuniformity of the temporal and spatial distributions of water resources, quantitative research was carried out on the temporal and spatial distribution characteristics of water resources in Guangdong Province from 1956 to 2000 based on a cloud model. The spatial variation of the temporal distribution characteristics and the temporal variation of the spatial distribution characteristics were both analyzed. In addition, the relationships between the numerical characteristics of the cloud model of the temporal and spatial distributions of water resources and precipitation were also studied. The results show that, using a cloud model, it is possible to intuitively describe the temporal and spatial distribution characteristics of water resources in cloud images. Water resources in Guangdong Province and their temporal and spatial distribution characteristics are differentiated by geographic location. Downstream and coastal areas have a larger amount of water resources with greater uniformity and stronger stability in terms of temporal distribution. Regions with more precipitation possess larger amounts of water resources, and years with more precipitation show greater nonuniformity in the spatial distribution of water resources. The correlation between the nonuniformity of the temporal distribution and local precipitation is small, and no correlation is found between the stability of the nonuniformity of the temporal and spatial distributions of water resources and precipitation. The amount of water resources in Guangdong Province shows an increasing trend from 1956 to 2000, the nonuniformity of the spatial distribution of water resources declines, and the stability of the nonuniformity of the spatial distribution of water resources is enhanced.