Funding: Funded by Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia, under grant No. (PNURSP2022R161).
Abstract: The analysis of remote sensing image areas is needed for climate detection and management, especially for monitoring flood disasters in critical environments and applications. Satellites are mostly used to detect disasters on Earth, and they have advantages in capturing Earth images. Using the control technique, Earth images can be used to obtain detailed terrain information. Since the acquisition of satellite and aerial imagery became possible, such systems have been able to detect floods, and with increasing convenience, flood detection has become more desirable in the last few years. In this paper, a Big Data Set-based Progressive Image Classification Algorithm (PICA) system is introduced to implement an image processing technique, detect disasters, and determine results, allowing disaster analysis to be extracted more effectively. The PICA helps overcome strong shadows, provides proper access to disaster characteristics, and reduces the operator false positives and false predictions that distort the assessed impact of the disaster. The PICA performs tailoring and adjustment of satellite images before training, together with post-disaster aerial image data patches. Two types of proposed PICA systems detect disasters faster and more accurately (95.6%).
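The tailoring of satellite and aerial images into fixed-size training patches, as described above, could in outline look like the following sketch. The function name, patch size, and stride are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def tile_patches(image, patch=64, stride=64):
    """Cut an H x W x C image array into fixed-size training patches.
    Patch size and stride are illustrative assumptions; edge remainders
    smaller than `patch` are simply dropped in this sketch."""
    h, w = image.shape[:2]
    patches = []
    for y in range(0, h - patch + 1, stride):
        for x in range(0, w - patch + 1, stride):
            patches.append(image[y:y + patch, x:x + patch])
    return np.stack(patches)   # shape: (num_patches, patch, patch, C)
```

For example, a 128 x 128 x 3 image with the default settings yields four non-overlapping 64 x 64 patches.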
Funding: The 2009 Excellent Going Abroad Experts Training Program in Hebei Province, China (No. [2010]76).
Abstract: The progressive and mixing algorithm (PAMA) is a method for surface modeling and editing, developed for effective and flexible application in many environments, such as computer-aided design (CAD) and computer-aided geometric design (CAGD). In this paper, the construction scheme and continuities of PAMA are discussed, providing a mathematical analysis of PAMA. The analysis and results show that PAMA offers a new method of surface modeling and editing, with four more degrees of freedom for designers to manipulate a 3D object.
Funding: Projects (51325903, 51279218) supported by the National Natural Science Foundation of China; Project (cstc2013kjrcljrccj0001) supported by the Natural Science Foundation Project of CQ CSTC, China; Project (20130191110037) supported by the Research Fund for the Doctoral Program of Higher Education of China.
Abstract: In order to resolve grid distortions in the finite element method (FEM), a meshless numerical method called general particle dynamics (GPD) was presented to simulate the large deformation and failure of geomaterials. The Mohr-Coulomb strength criterion was implemented into the code to describe the elasto-brittle behaviour of geomaterials, while the solid-structure (reinforcing pile) interaction was simulated as an elasto-brittle material. The Weibull statistical approach was applied to describe the heterogeneity of geomaterials. As an application of general particle dynamics to slopes, the interaction between the slopes and the reinforcing pile was modelled. The contact between the geomaterials and the reinforcing pile was modelled using a coupling condition associated with a Lennard-Jones repulsive force. The safety factor, corresponding to the minimum shear strength reduction factor R, was obtained, and the slip surface of the slope was determined. The numerical results are in good agreement with those obtained from the limit equilibrium method and the finite element method, indicating that the proposed geomaterial-structure interaction algorithm works well in the GPD framework.
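A Lennard-Jones-type repulsive contact force of the kind mentioned above, active only within a cutoff radius, can be sketched as follows. The function name, the (12, 6) exponents, and the parameters D and r0 are illustrative assumptions based on the usual form of such penalty contacts in particle methods, not the paper's actual code.

```python
import numpy as np

def lj_repulsive_force(x_particle, x_structure, r0, D, n1=12, n2=6):
    """Repulsive-only Lennard-Jones-type contact force between a
    geomaterial particle and a structure node; all parameter values
    are illustrative assumptions. Active only when the separation
    r is below the cutoff r0."""
    dx = x_particle - x_structure
    r = np.linalg.norm(dx)
    if r >= r0 or r == 0.0:
        return np.zeros_like(dx)          # out of contact range: no force
    # Penalty magnitude grows sharply as the particle approaches the structure
    scale = D * ((r0 / r) ** n1 - (r0 / r) ** n2) / r**2
    return scale * dx                      # directed away from the structure
```

Because r < r0 implies (r0/r)^12 > (r0/r)^6, the force is always repulsive inside the cutoff, which is what prevents particle-structure interpenetration.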
Funding: Supported by the National Natural Science Foundation of China (71571144, 71401134, 71171164, 11701406) and the International Cooperation and Exchanges in Science and Technology Program of Shaanxi Province (2016KW-033).
Abstract: In this paper, we construct a Bayesian framework combining a Type-I progressively hybrid censoring scheme with competing risks that are independently distributed as the exponentiated Weibull distribution with one scale parameter and two shape parameters. Since there are unknown hyper-parameters in the prior density functions of the shape parameters, we consider hierarchical priors to obtain the individual marginal posterior density functions, Bayesian estimates, and highest posterior density (HPD) credible intervals. As explicit expressions for the estimates cannot be obtained, the componentwise updating algorithm of the Metropolis-Hastings method is employed to compute the numerical results. Finally, it is concluded that the Bayesian estimates perform well.
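The componentwise updating Metropolis-Hastings scheme referred to above can be sketched generically: each coordinate is updated in turn with a random-walk proposal while the others are held fixed. The function name, step sizes, and the use of a normal proposal are illustrative assumptions; the target's unnormalised log posterior is passed in as a callable.

```python
import numpy as np

def componentwise_mh(log_post, theta0, prop_sd, n_iter=5000, rng=None):
    """Componentwise Metropolis-Hastings: propose a change to one
    coordinate at a time from a normal random walk, accepting with the
    usual Metropolis ratio. A generic sketch, not the paper's code."""
    rng = np.random.default_rng(rng)
    theta = np.array(theta0, dtype=float)
    draws = np.empty((n_iter, theta.size))
    lp = log_post(theta)
    for t in range(n_iter):
        for j in range(theta.size):
            prop = theta.copy()
            prop[j] += rng.normal(0.0, prop_sd[j])     # perturb one coordinate
            lp_prop = log_post(prop)
            if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject step
                theta, lp = prop, lp_prop
        draws[t] = theta                               # record one full sweep
    return draws
```

Updating one coordinate at a time is convenient when, as here, the full conditional of each parameter is known only up to a constant and the joint proposal would be hard to tune.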
Abstract: The primary focus of this paper is to design a progressive restoration plan for an enterprise data center environment following a partial or full disruption. Repairing and restoring disrupted components in an enterprise data center requires a significant amount of time and human effort. Following a major disruption, the recovery process involves multiple stages, and during each stage the partially recovered infrastructure can provide limited services to users at some degraded service level. However, how fast and efficiently an enterprise infrastructure can be recovered depends on how the recovery mechanism restores the disrupted components, considering the inter-dependencies between services along with the limitations of expert human operators. The entire problem turns out to be NP-hard and rather complex, and we devise an efficient meta-heuristic to solve it. Using some real-world examples, we show that the proposed meta-heuristic provides very accurate results and still runs 600-2800 times faster than the optimal solution obtained from a general-purpose mathematical solver [1].
Abstract: To improve the performance of Saitou and Nei's algorithm (SN) and Studier and Keppler's improved algorithm (SK) for constructing neighbor-joining phylogenetic trees and to reduce the time complexity of the computation, a fast algorithm is proposed. The proposed algorithm includes three techniques. First, a linear array A[N] is introduced to store the sum of every row of the distance matrix (the same as SK), which eliminates many repeated computations. Second, the value of A[i] is computed only once at the beginning of the algorithm and is updated using only three elements in each iteration. Third, a very compact formula for the sum of all the branch lengths of operational taxonomic units (OTUs) i and j is designed, and the correctness of the formula is proved. The experimental results show that the proposed algorithm is tens to hundreds of times faster than SN and roughly two times faster than SK as N increases, constructing a tree with 2000 OTUs in 3 min on a current desktop computer. Trading space for time and reducing the computations in the innermost loop are the basic strategies for accelerating algorithms with many loops.
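The row-sum bookkeeping described above (compute A[i] once, then patch it with three elements per join) might look like this in outline. The function names are illustrative, and the Q-criterion shown is the standard neighbor-joining form, assumed here rather than taken from the paper.

```python
def init_row_sums(d):
    """A[k] = sum of row k of the distance matrix, computed once."""
    return [sum(row) for row in d]

def q_value(d, A, i, j, n):
    """Standard neighbor-joining selection criterion, using the cached
    row sums instead of re-summing rows: Q(i,j) = (n-2)*d[i][j] - A[i] - A[j]."""
    return (n - 2) * d[i][j] - A[i] - A[j]

def update_row_sum(A, d, k, i, j, d_ku):
    """After joining OTUs i and j into a new node u, row k loses its
    distances to i and j and gains d_ku = d(k, u): only three elements
    change, so each row sum is patched in O(1) instead of O(N)."""
    return A[k] - d[k][i] - d[k][j] + d_ku
```

This is exactly the space-for-time trade the abstract's closing sentence refers to: one extra O(N) array removes an O(N) inner summation from every candidate-pair evaluation.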
Abstract: For the parameter estimation problem of the Burr Type X distribution under progressively Type-II censored data, a new Bayesian estimate of the model parameter and the corresponding highest posterior density (HPD) credible interval are proposed. A gamma distribution is assumed as the prior for the parameter to be estimated. Since the conditional posterior of the parameter has no closed form but is unimodal and approximately symmetric, a Metropolis-Hastings (MH) algorithm with a normal proposal distribution is chosen to generate posterior samples; based on these samples, the Bayesian estimate under the squared-error loss function and the HPD credible interval are obtained. The Bayesian estimates and HPD intervals from the MH algorithm are compared with the maximum likelihood estimates and confidence intervals from the EM algorithm in terms of the mean squared error criterion and precision. Monte Carlo simulation results show that the MH-based estimates outperform the EM-based maximum likelihood estimates under the mean squared error criterion, and that the MH-based HPD intervals are shorter than the EM-based confidence intervals.
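A common sample-based approximation of the HPD interval used in comparisons like the one above: among all intervals containing the desired posterior mass, take the shortest. This is a generic sketch of that approximation, not the paper's code; the function name is an assumption.

```python
import numpy as np

def hpd_interval(samples, cred=0.95):
    """Shortest interval containing a `cred` fraction of the sorted MCMC
    draws; for a unimodal posterior this approximates the HPD interval."""
    x = np.sort(np.asarray(samples))
    n = len(x)
    m = int(np.ceil(cred * n))           # points the interval must cover
    widths = x[m - 1:] - x[: n - m + 1]  # width of every candidate window
    k = int(np.argmin(widths))           # shortest window wins
    return x[k], x[k + m - 1]
```

For a symmetric unimodal posterior the HPD interval coincides with the central credible interval; for skewed posteriors it is strictly shorter, which is why the abstract's interval-length comparison is made against HPD intervals.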