In reliability engineering, observations of the variables of interest are often limited due to cost or schedule constraints. Consequently, epistemic uncertainty, which derives from a lack of knowledge and information, has a vital influence on reliability evaluation. Belief reliability is a new reliability metric that takes the impact of epistemic uncertainty into consideration, and the belief reliability distribution is fundamental to belief reliability applications. This paper develops a new method, called the graduation formula, to construct the belief reliability distribution from limited observations. The method constructs the distribution by determining the belief degrees corresponding to the observations. Because the graduation formula is a set of transcendental equations whose analytical solution is difficult to determine, an algorithm is designed to solve it numerically. The method and the algorithm are illustrated by two numerical examples that demonstrate their efficiency and future applications.
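The abstract does not reproduce the graduation formula itself, so the sketch below is only an illustrative placeholder: it assigns the classical empirical belief degrees i/(n+1) to sorted observations and interpolates linearly between them. The function name and the ranking rule are assumptions, not the paper's method.

```python
def empirical_belief_distribution(observations):
    """Hypothetical sketch: build a belief-degree curve from limited data.

    The paper solves transcendental equations for the belief degrees; here
    the classical empirical ranking i/(n+1) stands in as a placeholder,
    with linear interpolation between the observed points."""
    xs = sorted(observations)
    n = len(xs)
    points = [(x, i / (n + 1)) for i, x in enumerate(xs, start=1)]

    def phi(x):
        if x <= points[0][0]:
            return 0.0
        if x >= points[-1][0]:
            return 1.0
        for (x0, b0), (x1, b1) in zip(points, points[1:]):
            if x0 <= x <= x1:
                # linear interpolation inside the bracketing segment
                return b0 + (b1 - b0) * (x - x0) / (x1 - x0)
    return phi
```

With four observations, for example, the second-smallest value receives belief degree 2/5.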
As the share of wind power in power systems continues to increase, the limited predictability of wind power generation brings serious potential risks to power system reliability. Previous research has generally described the uncertainty of wind power forecast errors (WPFEs) with normal or other standard distribution models, which characterize only the aleatory uncertainty. In fact, the epistemic uncertainty in WPFE modeling due to limited data and knowledge should also be addressed. This paper proposes a multi-source information fusion method (MSIFM) to quantify WPFEs while considering both aleatory and epistemic uncertainties. An extended focal element (EFE) selection method based on the adequacy of historical data is developed to capture the characteristics of WPFEs. Two supplementary expert information sources are modeled to improve accuracy when historical data are insufficient. An operation reliability evaluation technique is also developed on top of the proposed WPFE model. Finally, a double-layer Monte Carlo simulation method is introduced to generate a time-series output of the wind power. The effectiveness and accuracy of the proposed MSIFM are demonstrated through simulation results.
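A double-layer Monte Carlo scheme of the kind the abstract mentions can be sketched as follows. This is a minimal illustration, not the paper's MSIFM: the outer (epistemic) layer draws a forecast-error standard deviation from an interval, and the inner (aleatory) layer draws the error at each time step. All names and distributions are assumptions.

```python
import random

def sample_wind_series(forecast, capacity, sigma_lo, sigma_hi, rng):
    """Illustrative double-layer Monte Carlo sketch (assumed model):
    outer layer  - epistemic draw of the error std from [sigma_lo, sigma_hi]
    inner layer  - aleatory Gaussian error per time step, clipped to
                   the physically feasible range [0, capacity]."""
    sigma = rng.uniform(sigma_lo, sigma_hi)      # outer (epistemic) layer
    series = []
    for p in forecast:
        err = rng.gauss(0.0, sigma)              # inner (aleatory) layer
        series.append(min(max(p + err, 0.0), capacity))
    return series
```

Repeating the outer draw many times yields an ensemble of time series that reflects both uncertainty layers.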
To address acoustic fields with epistemically uncertain parameters, an evidence theory-based finite element method (ETFEM) is proposed, in which focal elements and basic probability assignments (BPAs) are used to describe the epistemic uncertainty. To reduce the computational cost, an interval analysis technique based on the perturbation method is adopted to acquire approximate sound pressure response bounds for each focal element. The corresponding formulations for the intervals of the expectation and standard deviation of the sound pressure response under epistemic uncertainty are deduced. The sound pressure responses of a 2D acoustic tube and a 2D car acoustic cavity with epistemically uncertain parameters are analyzed by the proposed method, which is verified by comparing the results for a random acoustic field with those for an epistemically uncertain acoustic field. Numerical results show that the proposed method analyzes 2D acoustic fields with epistemic uncertainty effectively and has good prospects for engineering application.
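The evidence-theory bookkeeping behind such a method can be illustrated with a toy example (this is not the paper's finite element machinery). Given interval focal elements with BPAs and a response that is monotone over each interval, the belief and plausibility that the response stays below a threshold accumulate over the focal elements; all names here are assumptions.

```python
def belief_plausibility(focal_elements, response, threshold):
    """Toy evidence-theory sketch (not the paper's ETFEM):
    focal_elements - list of ((lo, hi), bpa) pairs
    response       - scalar function assumed monotone on each interval
    Returns (Belief, Plausibility) that response(x) <= threshold."""
    bel = pl = 0.0
    for (lo, hi), bpa in focal_elements:
        r_lo, r_hi = sorted((response(lo), response(hi)))  # response bounds
        if r_hi <= threshold:   # the whole focal element satisfies it
            bel += bpa
        if r_lo <= threshold:   # the focal element can satisfy it
            pl += bpa
    return bel, pl
```

Belief and plausibility then bracket the (unknown) probability of satisfying the criterion.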
In this paper, a data-driven prognostic model capable of dealing with different sources of uncertainty is proposed. The main novelty is the application of a mathematical framework, the Random Fuzzy Variable (RFV) approach, for representing and propagating the different uncertainty sources affecting Prognostic Health Management (PHM) applications: measurement, future, and model uncertainty. In this way, it is possible to deal not only with measurement noise and model-parameter uncertainty due to the stochastic nature of the degradation process, but also with systematic effects, such as systematic errors in the measurement process, incomplete knowledge of the degradation process, and subjective beliefs about model parameters. Furthermore, the low analytical complexity of the prognostic model makes it easy to propagate the measurement and parameter uncertainty into the Remaining Useful Life (RUL) forecast without extensive Monte Carlo loops, so little computing power is required. The model has been applied to two real cases with highly accurate output, making it a potentially effective tool for predictive maintenance in different industrial sectors.
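To show why analytic propagation of parameter uncertainty into an RUL forecast needs no Monte Carlo loop, consider a deliberately simple stand-in for the paper's RFV machinery: a linear degradation model whose rate is only known to lie in an interval. The RUL is then itself an interval, obtained in closed form at the rate endpoints. Everything here is an assumption for illustration.

```python
def rul_interval(level, threshold, slope_lo, slope_hi):
    """Sketch under assumed linear degradation d(t) = level + slope * t:
    with the rate only bounded to [slope_lo, slope_hi], the Remaining
    Useful Life (time until d reaches threshold) is an interval computed
    analytically at the two rate endpoints - no sampling required."""
    margin = threshold - level
    return margin / slope_hi, margin / slope_lo   # (fastest, slowest) case
```

For example, 8 units of margin with a rate between 1 and 2 units per cycle gives an RUL between 4 and 8 cycles.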
Prior research on the resilience of critical infrastructure usually uses a network model to characterize the structure of the components so that a quantitative representation of resilience can be obtained. In particular, network component importance is used to express each component's significance in shaping the resilience of the whole system. Owing to the intrinsic complexity of the problem, idealized assumptions are often imposed on the resilience-optimization problem to find partial solutions. This paper addresses the dynamic aspect of system resilience: the scheduling of link recovery in the post-disruption phase. The aim is to analyze the system's recovery strategy under more practical assumptions, especially inhomogeneous time costs among links. To this end, the resilience-maximizing recovery plan is translated into dynamic, runtime decision-making over recovery options, and a heuristic scheme is devised to treat the core problem of link selection in an ongoing fashion. In Monte Carlo simulations, the link recovery order produced by the proposed scheme demonstrates excellent resilience performance while accommodating epistemic uncertainty.
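One common shape for such an ongoing link-selection heuristic is a greedy ratio rule: at each step, repair the failed link with the best performance gain per unit of repair time. The sketch below assumes this rule and a caller-supplied gain function; it is not the paper's specific scheme.

```python
def greedy_recovery_order(links, time_cost, gain):
    """Heuristic sketch (assumed interface, not the paper's algorithm):
    links     - iterable of failed links
    time_cost - dict: link -> repair time (inhomogeneous across links)
    gain      - gain(link, repaired_so_far) -> performance improvement
    Repeatedly picks the link maximizing gain / time, i.e. an ongoing
    runtime decision rather than a precomputed plan."""
    remaining = set(links)
    order = []
    while remaining:
        best = max(remaining, key=lambda l: gain(l, order) / time_cost[l])
        order.append(best)
        remaining.remove(best)
    return order
```

Because `gain` sees the links repaired so far, the rule adapts as the network state evolves.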
Caisson breakwaters are mainly constructed in deep waters to protect an area against waves. These breakwaters are conventionally designed based on the concept of the safety factor. However, the wave loads and the resistance of structures carry epistemic and aleatory uncertainties, and sliding is one of the most important failure modes of caisson breakwaters. Most previous assessments ignored uncertainties such as variation in wave height and wave period. Therefore, in this study, Bayesian reliability analysis is used to assess the probability of sliding failure of the Tombak port breakwater in the Persian Gulf. The means and standard deviations are taken as random variables to account for the previously dismissed uncertainties. The first-order reliability method (FORM), and FORM with first principal curvature correction, are used to calculate the reliability index, and their performance is verified by importance sampling within Monte Carlo simulation (MCS). In addition, the reliability index sensitivities of each random variable are calculated to evaluate the importance of the different random variables in caisson sliding. The results show that the reliability index is most sensitive to the coefficient of friction, the wave height, and the caisson weight (or concrete density). The sensitivity of the failure probability to each random variable and its uncertainty is calculated by the derivative method. Finally, Bayesian regression with non-informative priors is used to predict the statistical properties of breakwater sliding, which are compared with Goda's formulation used in breakwater design standards. The analysis shows that the model posterior for the sliding of a caisson breakwater has a mean of 0.039 and a standard deviation of 0.022. A normal quantile analysis and a residual analysis are also performed to evaluate the correctness of the model responses.
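The Monte Carlo side of such an analysis can be sketched with a textbook sliding limit state, g = mu·W − F: failure occurs when the horizontal wave force exceeds the friction resistance. The distributions below are made up for illustration and are not the Tombak data.

```python
import random

def sliding_failure_probability(n, rng):
    """Crude Monte Carlo sketch of a caisson sliding limit state
    (illustrative distributions, not the paper's calibrated model):
    failure when mu * W - F < 0."""
    fails = 0
    for _ in range(n):
        mu = rng.gauss(0.6, 0.05)      # friction coefficient
        w = rng.gauss(5000.0, 250.0)   # effective caisson weight [kN]
        f = rng.gauss(2500.0, 400.0)   # horizontal wave force [kN]
        if mu * w - f < 0.0:
            fails += 1
    return fails / n
```

FORM would instead locate the most probable failure point of this limit state and report a reliability index; plain MCS, as here, simply counts failures.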
Uncertainty-based design can account for both aleatory and epistemic uncertainty in optimization. Both kinds of uncertainty can be expressed uniformly in evidence theory, which is therefore used here to describe them. Uncertainty transfer and response analysis with evidence theory for structural optimal design are introduced, and a principle for response evaluation is established. Finally, a cantilever beam in a test system is optimized by the introduced procedure, and the results are assessed by the evaluation principle, validating the optimization process.
This paper proposes a novel model, the imprecise stochastic process model, to handle dynamic uncertainty with insufficient sample information in real-world problems. In this model, an imprecise probabilistic model, rather than a precise probability distribution function, characterizes the uncertainty of a time-variant parameter at each time point, providing an effective tool for problems with limited experimental samples. The linear correlation between variables at different time points is described by defining auto-correlation and cross-correlation coefficient functions. For convenience of analysis, the paper defines the P-box-based imprecise stochastic process and categorizes it into two classes: parameterized and non-parameterized. A time-variant reliability analysis approach is then developed based on the P-box-based imprecise stochastic process model, through which the interval of dynamic reliability for a structure under uncertain dynamic excitations or time-variant factors can be obtained. Finally, the effectiveness of the proposed method is verified on three numerical examples.
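The parameterized P-box idea at a single time point can be shown with a small sketch: if a quantity is normal but its mean is only known to lie in an interval, then P(X ≤ threshold) is no longer a number but an interval, obtained at the parameter endpoints. This is a one-point illustration, not the paper's time-variant analysis.

```python
import math

def normal_cdf(x, mu, sigma):
    """Standard normal CDF evaluated via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def pbox_reliability_interval(threshold, mu_lo, mu_hi, sigma):
    """Parameterized P-box sketch: the mean lies in [mu_lo, mu_hi], so
    P(X <= threshold) becomes an interval. The CDF is monotone decreasing
    in mu, so the bounds sit at the two mean endpoints."""
    lo = normal_cdf(threshold, mu_hi, sigma)   # largest mean -> lower bound
    hi = normal_cdf(threshold, mu_lo, sigma)   # smallest mean -> upper bound
    return lo, hi
```

Sweeping the threshold traces the two bounding CDFs that make up the P-box; the time-variant case in the paper strings such descriptions over time with correlation functions.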
Model calibration adjusts unknown parameters to fit a model to experimental data and improve its predictive capability. The procedure is difficult to implement in the presence of aleatory uncertainty. In this paper, a new method of model calibration based on uncertainty propagation is investigated. The calibration process is cast as an optimization problem, and a two-stage nested uncertainty propagation method is proposed to solve it: Monte Carlo simulation in the inner loop propagates the aleatory uncertainty, while optimization in the outer loop propagates the epistemic uncertainty. The optimization objective is the consistency between the inner-loop result and the experimental data, and consistency measures are proposed for both scalar and multivariate outputs. Finally, the thermal challenge problem is used to validate the reasonableness and effectiveness of the proposed method.
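The nested structure can be sketched in a few lines. Here the outer loop is a plain grid search over the epistemic parameter, the inner loop is Monte Carlo over aleatory noise, and consistency is scored as the gap between simulated and observed means, a deliberately simple stand-in for the paper's consistency measures. All names are assumptions.

```python
import random
import statistics

def calibrate(data, theta_grid, n_inner, rng):
    """Two-stage nested sketch (not the paper's exact measures):
    outer loop - search the epistemic parameter theta over a grid
    inner loop - Monte Carlo propagation of aleatory Gaussian noise
    objective  - |simulated mean - observed mean|, minimized."""
    data_mean = statistics.fmean(data)

    def score(theta):
        sims = [theta + rng.gauss(0.0, 1.0) for _ in range(n_inner)]
        return abs(statistics.fmean(sims) - data_mean)

    return min(theta_grid, key=score)
```

A real implementation would replace the grid search with a proper optimizer and the mean gap with a statistically grounded consistency measure, but the loop nesting is the same.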
In this work we consider a general notion of distributional sensitivity, which measures the variation in the solutions of a physical or mathematical system with respect to variation in the probability distribution of the inputs. This is distinctly different from classical sensitivity analysis, which studies changes in solutions with respect to the values of the inputs. The general idea, measuring the sensitivity of outputs with respect to probability distributions, is well studied in related disciplines. We adapt these ideas to present a quantitative framework, in the context of uncertainty quantification, for measuring this kind of sensitivity, together with a set of efficient algorithms to approximate it numerically. A remarkable feature of the algorithms is that they require no computational effort beyond a one-time stochastic solve: an accurate stochastic computation with respect to a prior input distribution is needed only once, and the distributional sensitivity computation for different input distributions is a post-processing step. We prove that an accurate numerical model leads to accurate calculations of this sensitivity, which applies not only to slowly converging Monte Carlo estimates but also to exponentially convergent spectral approximations. Computational examples demonstrate the ease of application and verify the convergence claims.
Fragility curves are commonly used in civil engineering to assess the vulnerability of structures to earthquakes. The probability of failure associated with a prescribed criterion (e.g., the maximal inter-storey drift of a building exceeding a certain threshold) is represented as a function of the intensity of the earthquake ground motion (e.g., peak ground acceleration or spectral acceleration). The classical approach relies on assuming a lognormal shape of the fragility curves; it is thus parametric. In this paper, we introduce two non-parametric approaches to establish the fragility curves without employing this assumption, namely binned Monte Carlo simulation and kernel density estimation. As an illustration, we compute the fragility curves for a three-storey steel frame using a large number of synthetic ground motions. The curves obtained with the non-parametric approaches are compared with the respective curves based on the lognormal assumption. A similar comparison is presented for a case in which only a limited number of recorded ground motions is available. It is found that the accuracy of the lognormal curves depends on the ground motion intensity measure, the failure criterion and, most importantly, the method employed for estimating the parameters of the lognormal shape.
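The binned Monte Carlo estimator is the simpler of the two non-parametric approaches and fits in a few lines: group the (intensity measure, failure) pairs into intensity bins and take the empirical failure fraction in each bin. The interface below is an assumption for illustration.

```python
from collections import defaultdict

def binned_fragility(ims, failures, bin_width):
    """Binned Monte Carlo sketch of a non-parametric fragility curve:
    ims      - intensity measure of each ground motion (e.g., PGA)
    failures - whether each run exceeded the failure criterion
    Returns {(bin_lo, bin_hi): empirical failure fraction}."""
    counts = defaultdict(lambda: [0, 0])   # bin index -> [failures, total]
    for im, failed in zip(ims, failures):
        b = int(im // bin_width)
        counts[b][0] += int(failed)
        counts[b][1] += 1
    return {(b * bin_width, (b + 1) * bin_width): f / n
            for b, (f, n) in sorted(counts.items())}
```

Kernel density estimation smooths the same information instead of binning it; the lognormal approach would instead fit two parameters to all the data at once.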
This paper mainly deals with models of wireless communication channels, and in particular with the difficulty of finding an exact model that is both mathematically tractable and physically accurate. It is argued that the uncertainty that necessarily follows from the use of inaccurate channel models should be accepted, and that suitable tools should be used to evaluate its effects on analysis and design. With our approach, lower and upper bounds on the required performance parameters are derived without assuming exact knowledge of the underlying probability distributions, and the effects of model uncertainty are propagated throughout the calculations. Special attention is given to deriving upper and lower bounds on system performance when it is determined by random variables whose dependence is not exactly known.
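A classical example of bounding performance when dependence is unknown, in the spirit of this abstract, is the Fréchet–Hoeffding inequality: with P(A) and P(B) known but their dependence unspecified, the joint probability P(A and B) can only be bracketed. This is a textbook bound offered as an illustration, not the paper's specific derivation.

```python
def frechet_bounds(p_a, p_b):
    """Frechet-Hoeffding bounds on P(A and B) when only the marginals
    P(A) and P(B) are known and the dependence between A and B is not:
    max(0, P(A) + P(B) - 1) <= P(A and B) <= min(P(A), P(B))."""
    return max(0.0, p_a + p_b - 1.0), min(p_a, p_b)
```

For instance, two events that each hold with probability 0.9 and 0.8 jointly hold with probability somewhere between 0.7 and 0.8, regardless of their dependence.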
Funding (belief reliability paper): supported by the National Natural Science Foundation of China (61573043, 71671009).
Funding (wind power paper): supported by the Joint Research Fund in Smart Grid (No. U1966601) under a cooperative agreement between the National Natural Science Foundation of China (NSFC) and the State Grid Corporation of China.
Funding (acoustic field paper): supported by the National Natural Science Foundation of China (11572121) and an Independent Research Project of the State Key Laboratory of Advanced Design and Manufacturing for Vehicle Body (71375004).
Funding (infrastructure resilience paper): supported by the National Natural Science Foundation of China (51479158) and the Fundamental Research Funds for the Central Universities (WUT: 2018III061GX).
Funding (imprecise stochastic process paper): supported by the Science Challenge Project, China (No. TZ2018007), the National Science Fund for Distinguished Young Scholars, China (No. 51725502), the Foundation for Innovative Research Groups of the National Natural Science Foundation of China (No. 51621004), the Fundamental Research Foundation of China (No. JCKY2020110C105), and the National Natural Science Foundation of China (No. 52105253).
Funding (model calibration paper): supported by the National Natural Science Foundation of China (Grant No. 61403097) and the Fundamental Research Funds for the Central Universities (Grant No. HIT.NSRIF.2015035).
Funding (distributional sensitivity paper): supported by NSF awards DMS-0645035 and IIS-0914447, AFOSR award FA9550-08-1-0353, and DOE award DE-FC52-08NA28617.