In time-variant reliability problems, there are many uncertain variables from different sources. Therefore, it is important to consider these uncertainties in engineering. In addition, time-variant reliability problems typically involve a complex multilevel nested optimization problem, which can result in an enormous amount of computation. To this end, this paper studies the time-variant reliability evaluation of structures with stochastic and bounded uncertainties using a mixed probability and convex set model. In this method, the stochastic process of a limit-state function with mixed uncertain parameters is first discretized and then converted into a time-independent reliability problem. Further, to solve the double nested optimization problem in hybrid reliability calculation, an efficient iterative scheme is designed in standard uncertainty space to determine the most probable point (MPP). The limit-state function is linearized at these points, and an innovative random variable is defined to solve the equivalent static reliability analysis model. The effectiveness of the proposed method is verified by two benchmark numerical examples and a practical engineering problem.
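The discretization step described above can be illustrated with a minimal sketch (this is not the paper's mixed probability-convex-set algorithm): discretize a time-variant limit state g(t) = R - S(t) on a grid and estimate the interval failure probability P(min_t g(t) < 0) by Monte Carlo, then sanity-check it against the elementary series-system bounds. The resistance value, grid, and correlation function are hypothetical.

```python
import numpy as np
from scipy.stats import norm

# Illustrative sketch: constant resistance R, stationary standard Gaussian
# load process S(t) with exponential correlation, discretized in time.
rng = np.random.default_rng(0)
R = 3.0
t = np.linspace(0.0, 5.0, 20)                    # discretization grid
cov = np.exp(-np.abs(t[:, None] - t[None, :]))   # exponential correlation
L = np.linalg.cholesky(cov + 1e-12 * np.eye(len(t)))

N = 200_000
S = rng.standard_normal((N, len(t))) @ L.T       # process samples on the grid
pf_interval = np.mean((R - S).min(axis=1) < 0.0) # first-passage estimate

# Sanity bounds: the interval probability must lie between the largest
# pointwise failure probability and the sum over all grid points.
pf_point = norm.cdf(-R)                          # P(S(t_k) > R) at any t_k
print(pf_point, pf_interval, len(t) * pf_point)
```

The interval probability exceeds any instantaneous one because failure at any grid point counts, which is exactly why the time-variant problem cannot be reduced to a single static check without the conversion the paper describes.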
Traditional structural reliability analysis methods adopt precise probabilities to quantify uncertainties and are suitable for systems with sufficient statistical data. However, the problem of insufficient data is often encountered in practical engineering. Thus, structural reliability analysis methods under insufficient data have attracted more and more attention in recent years, and many non-probabilistic reliability analysis methods have been put forward to deal with this problem. Non-probabilistic structural reliability analysis methods based on fuzzy sets, Dempster-Shafer theory, interval analysis, and other theories have achieved a great deal both theoretically and practically, and they have been successfully applied to the structural reliability analysis of large-scale complex systems with small samples and few statistical data. In addition to non-probabilistic methods, structural reliability analysis based on imprecise probability theory is a new method proposed in recent years. The study of structural reliability analysis using imprecise probability theory is still at an early stage, so the generalization of the imprecise structural reliability model is very important. In this paper, imprecise probability was developed as an effective way to handle uncertainties, the detailed procedure of imprecise structural reliability analysis was introduced, and several specific imprecise structural reliability models that are most effective for engineering systems were given. Finally, an engineering example of a cantilever beam was given to illustrate the effectiveness of the method emphasized here. Compared with interval structural reliability analysis, the result obtained from the imprecise structural reliability model is slightly more conservative, because the imprecise model considers that the probability of each value is taken from an interval.
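The core of the interval/imprecise idea can be sketched in a few lines (hypothetical numbers, not the cantilever-beam example): when a distribution parameter is only known to lie in an interval, sweeping it over that interval yields an interval of failure probabilities rather than a single value.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical sketch: limit state g = R - S with R ~ N(mu_R, sig_R^2) and
# S ~ N(mu_S, sig_S^2), where the load mean mu_S is imprecise (interval-valued).
mu_R, sig_R = 10.0, 1.0
sig_S = 1.5
mu_S_lo, mu_S_hi = 4.0, 6.0                        # imprecise load mean

def pf(mu_S):
    beta = (mu_R - mu_S) / np.hypot(sig_R, sig_S)  # reliability index
    return norm.cdf(-beta)

grid = np.linspace(mu_S_lo, mu_S_hi, 201)
pf_grid = pf(grid)
print([pf_grid.min(), pf_grid.max()])   # interval of failure probabilities
```

Because the failure probability is monotone in the load mean here, the bounds are attained at the interval endpoints; reporting the upper bound is what makes the imprecise model more conservative than a single-point analysis.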
The application of the saddlepoint approximation to the reliability analysis of dynamic systems is investigated. The failure event in reliability problems is formulated as the exceedance of a single performance variable over a prescribed threshold level. The saddlepoint approximation technique provides a way to estimate the cumulative distribution function (CDF) of the performance variable. The failure probability is obtained as the value of the complementary CDF at a specified threshold. The method requires computing the saddlepoint from a simple algebraic equation that depends on the cumulant generating function (CGF) of the performance variable. A method for calculating the saddlepoint using random samples of the performance variable is presented. The applicable region of the saddlepoint approximation is discussed in detail. A 10-story shear building model with white noise excitation illustrates the accuracy and efficiency of the proposed methodology.
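A minimal worked instance of the saddlepoint tail approximation (Lugannani-Rice form) uses a case where the CGF is known in closed form: X ~ Gamma(k, 1), with K(s) = -k log(1 - s) for s < 1. The saddlepoint solves K'(s) = x, here analytically; in the sample-based setting the abstract mentions, the same equation is solved numerically on an empirical CGF.

```python
import numpy as np
from scipy.stats import norm, gamma

def saddlepoint_sf(x, k):
    """Lugannani-Rice tail approximation P(X > x) for X ~ Gamma(k, 1)."""
    s = 1.0 - k / x                        # solves K'(s) = k/(1 - s) = x
    K = -k * np.log(1.0 - s)
    w = np.sign(s) * np.sqrt(2.0 * (s * x - K))
    u = s * np.sqrt(k / (1.0 - s) ** 2)    # s * sqrt(K''(s))
    return norm.sf(w) + norm.pdf(w) * (1.0 / u - 1.0 / w)

approx = saddlepoint_sf(6.0, 3.0)
exact = gamma(3.0).sf(6.0)
print(approx, exact)   # relative error well under 1% this far in the tail
```

Note the formula is inaccurate near the mean (w, u → 0), one aspect of the "applicable region" the paper discusses.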
Statistical analysis was performed on simultaneous wave and wind data recorded by a discus-shaped wave buoy. The area is located in the southern Caspian Sea near the Anzali Port. Wave data were obtained through directional spectrum wave analysis, and wind direction and wind speed were obtained from the related time series. For 12 months of measurements (May 25, 2007 to May 25, 2008), statistical calculations were carried out to quantify the nonlinear autocorrelation of wave and wind using the probability distribution function of wave characteristics and statistical analysis over various time periods. The paper also presents and analyzes the wave energy for the area on the basis of the available database. The analyses compared the amounts of wave energy in different seasons and identified the period with the largest wave energy. Results showed that over the research period, the mean wave and wind autocorrelation time was about three hours. Among the probability distribution functions considered, i.e., Weibull, Normal, Lognormal, and Rayleigh, the Weibull distribution had the best consistency with the empirical distribution function shown in the diagrams for each season. Results also showed that the mean wave energy in the research period was about 49.88 kW/m, and the maximum density of wave energy was found in February and March 2010.
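The distribution-selection step can be sketched as follows (synthetic data, not the buoy records): fit each candidate family and rank them by Kolmogorov-Smirnov distance to the empirical distribution.

```python
import numpy as np
from scipy import stats

# Synthetic "wave height" sample drawn from a Weibull, mimicking the skewed
# shape of sea-state data; the parameters here are hypothetical.
rng = np.random.default_rng(1)
hs = stats.weibull_min.rvs(1.6, scale=1.2, size=3000, random_state=rng)

candidates = {
    "weibull": stats.weibull_min,
    "lognorm": stats.lognorm,
    "norm": stats.norm,
    "rayleigh": stats.rayleigh,
}
ks = {}
for name, dist in candidates.items():
    params = dist.fit(hs)                       # maximum-likelihood fit
    ks[name] = stats.kstest(hs, dist.name, args=params).statistic
print(sorted(ks, key=ks.get))                   # best-fitting family first
```

On real data the ranking would be repeated per season, as the abstract describes.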
Based on statistical principles, random error and systematic error were considered, and the volumetric properties of two mixture types, namely A and B, were statistically analyzed using different distribution models. Seventy-two samples of mixture A and fifty-two of mixture B were fabricated using the Marshall method. The probability distributions were compared on the basis of goodness of fit. The Weibull model was found to be the most appropriate for describing the distribution of the volumetric properties of the asphalt mixtures. The two-parameter Weibull distribution modeled the bulk specific gravity and voids-filled-with-asphalt data well, whereas the three-parameter Weibull distribution appeared more appropriate for the air voids and voids in mineral aggregate. The experimental results revealed that, compared with the mean value, the peak value of the Weibull distribution is a more powerful alternative parameter for describing the distribution characteristics of the test data. The analysis also revealed significant differences in the volumetric properties of the two tested mixtures at the same confidence level. The confidence interval decreased with decreasing reliability.
Security is a vital parameter for conserving energy in wireless sensor networks (WSN). Trust management in the WSN is a crucial process, as trust is utilized when collaboration is important for accomplishing trustworthy data transmission. However, available routing techniques do not incorporate security into the routing design. This study develops a novel statistical analysis with dingo optimizer enabled reliable routing scheme (SADO-RRS) for WSN. The proposed SADO-RRS technique aims to detect the existence of attacks and find optimal routes in WSN. In addition, the presented SADO-RRS technique derives a new statistics-based linear discriminant analysis (LDA) for attack detection. Moreover, a trust-based dingo optimizer (TBDO) algorithm is applied for optimal route selection in the WSN and accomplishes secure data transmission. The TBDO algorithm involves the derivation of a fitness function based on different WSN input variables. To demonstrate the enhanced outcomes of the SADO-RRS technique, a wide range of simulations was carried out, and the results confirmed its improved performance.
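The LDA step can be sketched with a minimal two-class discriminant (the features and class means below are hypothetical, not the paper's traffic statistics): project onto w = Σ⁻¹(μ₁ − μ₀) and threshold at the projected midpoint.

```python
import numpy as np

# Two synthetic classes of node-traffic features: benign vs. attack.
rng = np.random.default_rng(2)
normal = rng.normal([0.0, 0.0], 1.0, size=(500, 2))   # benign nodes
attack = rng.normal([3.0, 3.0], 1.0, size=(500, 2))   # hypothetical attackers
X = np.vstack([normal, attack])
y = np.r_[np.zeros(500), np.ones(500)]

mu0, mu1 = normal.mean(axis=0), attack.mean(axis=0)
pooled = 0.5 * (np.cov(normal.T) + np.cov(attack.T))  # pooled covariance
w = np.linalg.solve(pooled, mu1 - mu0)                # LDA direction
threshold = w @ (mu0 + mu1) / 2.0

pred = (X @ w > threshold).astype(float)
acc = (pred == y).mean()
print(acc)   # accuracy on well-separated synthetic data
```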
This paper presents data on operational reliability indices and relevant analyses of China's conventional power generating units in 2009. The units included in the statistical analysis are thermal generating units of 100 MW or above, hydro generating units of 40 MW or above, and all nuclear generating units. The reliability indices covered include utilization hours, times and hours of scheduled outages, times and hours of unscheduled outages, equivalent forced outage rate, and equivalent availability factor.
Slope stability assessment is a geotechnical problem characterized by many sources of uncertainty. In classical reliability analysis, only the randomness of uncertainties is taken into account, while their fuzziness is ignored. In this paper, a fuzzy probability approach and a fuzzy JC method are presented for reliability analysis. The two methods have been applied to the stability analysis of a slope of the permanent ship lock in the Three Gorges Project. The results obtained from the two methods are basically the same. However, compared with the fuzzy probability approach, the fuzzy JC method reflects the real situation better because it applies a fuzzy-based analysis not only to the limit state equation but also to the mechanical parameters.
Through a series of compression and shear tests on a large number of specimens, the reliability of T300/QY8911 laminated composite was studied and dispersibility models were described. The results show that the stress is linearly dependent on the strain and that the damage mode of the specimens is brittle fracture for both kinds of tests. The dispersibility models of compression and shear strength are expressed as Rc ~ N(415.39, 6586.36) and Rs ~ ln(5.0718, 0.1553), respectively. When normal and lognormal distributions are used to describe the dispersibility models of compression and shear strength, and the compression or shear load follows the normal distribution, almost the same failure probability is obtained from the different reliability analysis methods.
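The stress-strength failure-probability calculation behind such a comparison can be sketched with the reported compression-strength model Rc ~ N(415.39, 6586.36) (variance, so σ ≈ 81.2). The load model N(300, 40²) below is hypothetical, chosen only to illustrate the analytic formula and its Monte Carlo check.

```python
import numpy as np
from scipy.stats import norm

mu_R, var_R = 415.39, 6586.36       # reported strength model (mean, variance)
mu_S, sig_S = 300.0, 40.0           # hypothetical normal load

# For independent normal R and S, P(R < S) has a closed form.
beta = (mu_R - mu_S) / np.sqrt(var_R + sig_S**2)   # reliability index
pf_analytic = norm.cdf(-beta)

rng = np.random.default_rng(3)
N = 400_000
pf_mc = np.mean(rng.normal(mu_R, np.sqrt(var_R), N)
                < rng.normal(mu_S, sig_S, N))
print(pf_analytic, pf_mc)
```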
The reliability of radiotherapy was evaluated, and effective approaches for improving radiotherapy quality were identified, using the Probabilistic Safety Assessment (PSA) method. This study investigated the feasibility of applying the PSA method to radiotherapy through image-guided radiotherapy (IGRT) and chest tumor irradiation. A fault tree was constructed after analyzing the causal relationships of the events. After calculating Risk A, the total probability of inaccurate radiotherapy and the importance of all basic events were obtained. The probability of inaccurate radiotherapy was 2.87%. Under the condition that the target delineation was perfectly correct, the accuracy of radiotherapy improved significantly. When the calculation omitted cone-beam computed tomography (CBCT) correction before irradiation, the accuracy decreased significantly. The most important events were connected with the human factor, so improving operators' technical level could efficiently enhance radiotherapy quality control.
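The fault-tree calculation can be sketched with a toy tree (the event names and probabilities are hypothetical, not the paper's data): the top event is an OR of independent branches, one of which is an AND gate, with the analytic gate formulas checked by Monte Carlo.

```python
import numpy as np

# Toy fault tree for "inaccurate radiotherapy" (hypothetical basic events).
p_delineation = 0.02     # wrong target delineation
p_setup = 0.015          # patient setup error
p_no_cbct = 0.30         # CBCT correction skipped
p_drift = 0.04           # positional drift (only matters without CBCT)

p_and = p_no_cbct * p_drift                     # AND gate, independent events
p_top = 1.0 - (1.0 - p_delineation) * (1.0 - p_setup) * (1.0 - p_and)

rng = np.random.default_rng(4)
N = 1_000_000
u = rng.random((N, 4))
top = ((u[:, 0] < p_delineation) | (u[:, 1] < p_setup)
       | ((u[:, 2] < p_no_cbct) & (u[:, 3] < p_drift)))
print(p_top, top.mean())
```

Importance measures of the kind the study reports can then be obtained by recomputing p_top with each basic event set to 0 or 1.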
A simplified bi-variable human error probability calculation method is developed by incorporating two common performance condition (CPC) factors, which are modified from the factors employed in the cognitive reliability and error analysis method (CREAM) to take into account the characteristics of shipping operations. After the influencing factors are identified, the Markov method is used to calculate the values of human reliability. The proposed method relies neither on the involvement of experts in the field of human factors nor on historical accident or human error statistics. It is applied to the case of the crew on board an ocean-going dry bulk carrier. The calculated results agree with the actual case, which verifies the validity of the model.
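A minimal sketch of the Markov idea (the states and transition probabilities are hypothetical, not the CREAM-derived values): model the operator as a discrete-time chain over working/error/recovered states, and read the long-run error probability from the stationary distribution.

```python
import numpy as np

# States: 0 = working correctly, 1 = error made, 2 = error recovered.
P = np.array([
    [0.95, 0.05, 0.00],   # correct -> error with prob 0.05
    [0.00, 0.70, 0.30],   # error persists or is recovered
    [0.90, 0.00, 0.10],   # recovered -> back to correct (mostly)
])

# Stationary distribution: left eigenvector of P for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()
print(pi)   # long-run probabilities of [correct, error, recovered]
```

For this toy chain the stationary distribution is (18/22, 3/22, 1/22), so the long-run human error probability is about 0.136.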
Human reliability analysis (HRA) is an important part of the safety assessment of large complex systems. The human cognitive reliability (HCR) model, widely used in HRA, evaluates the probability that operators fail to complete diagnostic decision making within a limited time. In applying this method, human cognitive patterns must be considered and classified, and this process often relies on the evaluation opinions of experts, which are highly subjective and uncertain. How to effectively express and process this uncertain and subjective information plays a critical role in improving the accuracy and applicability of HCR. In this paper, a new model was proposed to deal with the uncertain information that arises in the classification of cognitive patterns in HCR. First, an evaluation panel was constructed based on expert opinions, including setting corresponding anchor points and qualitative indicators of different cognitive patterns and mapping them to fuzzy numbers and unit intervals. Second, based on the evaluation panel, different analysts judge the cognitive pattern types of actual specific events and state their confidence in those judgments. Finally, the evaluation opinions of multiple analysts were expressed and fused based on Dempster-Shafer evidence theory (DSET), and the fused results were applied to the HCR model to obtain the human error probability (HEP). A case study was used to demonstrate the procedure and effectiveness of the proposed method.
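The fusion step can be illustrated with a minimal implementation of Dempster's rule of combination over a two-pattern frame (the masses below are hypothetical analyst opinions, not the case-study values):

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule for two mass functions with frozenset focal elements."""
    raw = {}
    conflict = 0.0
    for (a, p), (b, q) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            raw[inter] = raw.get(inter, 0.0) + p * q
        else:
            conflict += p * q          # mass on empty intersections
    return {k: v / (1.0 - conflict) for k, v in raw.items()}

S, R = frozenset({"skill"}), frozenset({"rule"})
theta = S | R                          # "unsure between the two patterns"
m1 = {S: 0.6, theta: 0.4}              # analyst 1: leans skill-based
m2 = {R: 0.3, theta: 0.7}              # analyst 2: mildly rule-based
m = dempster_combine(m1, m2)
print({tuple(sorted(k)): round(v, 4) for k, v in m.items()})
```

The conflict here is 0.6 × 0.3 = 0.18, so the combined masses are renormalized by 0.82; more analysts are fused by applying the rule repeatedly.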
The aim of this paper is to present a newly developed approach for reliability-based design optimization. It is based on a double-loop framework, where the outer loop covers the optimization part of the process and the reliability constraints are calculated in the inner loop. The innovation of the suggested approach lies in a newly developed optimization strategy based on multilevel simulation using an advanced Latin hypercube sampling technique. This method, called aimed multilevel sampling, is designed for optimizing problems in which only a limited number of simulations can be performed due to enormous computational demands.
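Basic Latin hypercube sampling, the building block the approach refines, can be sketched in a few lines (this is plain LHS, not the paper's aimed multilevel sampling): each of the n samples falls in a distinct 1/n stratum of every dimension, which stabilizes estimates compared with plain Monte Carlo.

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """n points in [0, 1)^d, exactly one per 1/n stratum in each dimension."""
    strata = np.tile(np.arange(n), (d, 1))            # stratum indices per dim
    u = (rng.permuted(strata, axis=1).T               # shuffle each dimension
         + rng.random((n, d))) / n                    # jitter within stratum
    return u

rng = np.random.default_rng(5)
x = latin_hypercube(8, 2, rng)
print(np.sort(np.floor(x * 8).astype(int), axis=0))   # each column is 0..7
```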
Traditional Bayesian software reliability models assume that all probabilities are precise. In practical applications, the parameters of the probability distributions are often uncertain due to strong dependence on subjective expert judgments and sparse statistical data. In this paper, a quasi-Bayesian software reliability model is presented that uses interval-valued probabilities to clearly quantify experts' prior beliefs about possible intervals of the distribution parameters. The model integrates expert judgments with statistical data to obtain more convincing assessments of software reliability from small samples. For some actual data sets, the presented model yields better predictions than the Jelinski-Moranda (JM) model using maximum likelihood (ML).
In modeling reliability data, the exponential distribution is commonly used due to its simplicity. For estimating its parameter, classical estimators, including the maximum likelihood estimator, are the most commonly used and are well known to be efficient. However, the maximum likelihood estimator is highly sensitive to contamination and outliers. In this study, a robust and efficient estimator of the exponential distribution parameter was proposed based on the probability integral transform statistic. To examine the robustness of this new estimator, its asymptotic variance, breakdown point, and gross error sensitivity were derived. The new estimator offers reasonable protection against outliers while being simple to compute. Furthermore, a simulation study was conducted to compare its performance with the maximum likelihood estimator, weighted likelihood estimator, and M-scale estimator in the presence of outliers. Finally, a statistical analysis of three reliability data sets was conducted to demonstrate the performance of the proposed estimator.
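The fragility of the MLE that motivates this work is easy to demonstrate. The sketch below contrasts it with a simple median-based estimator (λ̂ = ln 2 / median); note this is a generic robust alternative used for illustration, not the paper's probability-integral-transform estimator.

```python
import numpy as np

rng = np.random.default_rng(6)
lam_true = 1.0
x = rng.exponential(1.0 / lam_true, size=200)
x_contaminated = np.append(x, 1000.0)     # one gross outlier

# MLE for the exponential rate is 1/mean: a single outlier drags it to ~0.17.
mle = 1.0 / x_contaminated.mean()

# Median-based estimate: the exponential median is ln(2)/lambda, and the
# sample median barely moves when one observation is corrupted.
robust = np.log(2.0) / np.median(x_contaminated)
print(mle, robust)
```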
Using spreadsheets and ranges for pairwise judgments, candidate probability distributions are generated for the decision-maker to consider. This replaces event-by-event determination of probabilities. Basic statistics of the distributions are then used to determine a final distribution for decision purposes, as in buy, sell, or hold.
Certain building structures may encounter both seismic and blast loads during their service life. With the development of the economy and the increase in security demand, the need to design building structures against multiple hazards is becoming more and more evident. Therefore, damage analysis of building structures under the combined action of multiple hazards has become an urgent requirement for disaster prevention and mitigation. In this paper, a refined finite element model of reinforced concrete (RC) columns is established using the explicit dynamic analysis software LS-DYNA. Combined with the Monte Carlo method, the damage law of RC columns under random single earthquake or explosion events and under multi-hazard action is studied, and damage groups are distinguished according to the damage index. Based on the support vector machine (SVM) algorithm, the dividing line between groups of different damage degrees is determined, and a rapid method for determining the damage degree of RC columns under combined seismic and blast loads is proposed. Finally, suggestions for the multi-hazard design of RC columns are put forward.
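The "dividing line between damage groups" idea can be sketched with a simple linear classifier. The code below uses a perceptron as a lightweight stand-in for the SVM, on hypothetical features (peak ground acceleration vs. scaled blast distance) and hypothetical damage labels; none of it comes from the paper's simulations.

```python
import numpy as np

rng = np.random.default_rng(7)
light = rng.normal([0.2, 3.0], 0.15, size=(100, 2))   # lightly damaged group
severe = rng.normal([0.8, 1.0], 0.15, size=(100, 2))  # severely damaged group
X = np.vstack([light, severe])
y = np.r_[-np.ones(100), np.ones(100)]

Xb = np.hstack([X, np.ones((200, 1))])   # append bias term
w = np.zeros(3)
for _ in range(200):                     # perceptron updates until separated
    for xi, yi in zip(Xb, y):
        if yi * (w @ xi) <= 0:
            w += yi * xi
acc = np.mean(np.sign(Xb @ w) == y)
print(acc)   # training accuracy on the separable toy groups
```

An SVM would additionally maximize the margin of this line, which is why the paper prefers it for drawing robust boundaries between damage classes.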
Given the increasing maturity of Bayesian survival analysis theory, and the deficiencies of traditional methods for storage reliability evaluation, a Bayesian survival analysis method is proposed to build regression models for reliability in randomly truncated tests. These models can reflect the influence of different environments on ammunition storage lifetime. As an example, the common exponential distribution is used, and a Markov chain Monte Carlo (MCMC) method based on Gibbs sampling dynamically simulates the Markov chain of the parameters' posterior distribution. The parameters' Bayesian estimates are calculated under the random truncation condition. The simulation results show that the proposed method is effective and intuitive.
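A conjugate special case makes the Bayesian setup concrete (hypothetical rates; the paper's Gibbs/MCMC machinery generalizes this when regression covariates are added): exponential lifetimes with a Gamma(a, b) prior on the failure rate and type-I censoring give the posterior Gamma(a + failures, b + total exposure time) in closed form.

```python
import numpy as np

rng = np.random.default_rng(8)
lam_true = 0.02                # failures per storage month (hypothetical)
t_censor = 60.0                # test truncated at 60 months
life = rng.exponential(1.0 / lam_true, size=100)
observed = np.minimum(life, t_censor)   # censored observation times
failed = life <= t_censor               # which units actually failed

a, b = 1.0, 1.0                # weak Gamma(a, b) prior on the rate
a_post = a + failed.sum()      # prior shape + number of failures
b_post = b + observed.sum()    # prior rate + total exposure time
print(a_post / b_post)         # posterior mean of the failure rate
```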
Based on the generalized probabilistic finite element method, this paper presents an approximate solution technique for general multi-degree-of-freedom nonlinear random vibration systems with random parameters. The fourth-moment technique, maximum entropy theory, and incomplete probability information theory are employed to systematically develop a reliability analysis method for dynamic random structural systems with correlated failure modes when the joint probability density functions of the basic random variables are unavailable. The first-passage problem of multi-degree-of-freedom nonlinear random vibration systems is solved.
The application of reliability analysis and reliability sensitivity analysis methods to complicated structures faces two main challenges: small failure probabilities (typically less than 10⁻⁵) and time-demanding mechanical models. This paper proposes an improved active learning surrogate model method that combines the advantages of the classical active Kriging-Monte Carlo simulation (AK-MCS) procedure and the adaptive linked importance sampling (ALIS) procedure. The proposed procedure can, on the one hand, adaptively produce a series of intermediate sampling densities approaching the quasi-optimal importance sampling (IS) density and, on the other hand, adaptively generate a set of intermediate surrogate models approaching the true failure surface of the rare failure event. The small failure probability and the corresponding reliability sensitivity indices are then efficiently estimated by IS estimators based on the quasi-optimal IS density and the surrogate models. Compared with the classical AK-MCS and active Kriging-importance sampling (AK-IS) procedures, the proposed method needs neither a very large sample pool when the failure probability is extremely small nor estimation of the most probable points (MPPs); it is therefore computationally more efficient and more applicable, especially for problems with multiple MPPs.
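Why importance sampling matters for such small failure probabilities can be shown with a one-dimensional sketch (a textbook shifted-proposal example, not the paper's adaptive scheme): estimating P(U > 4.5) for U ~ N(0, 1), where plain Monte Carlo with 10⁴ samples almost always returns zero, while sampling from N(4.5, 1) and reweighting recovers the answer.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(10)
N = 10_000
shift = 4.5                                      # failure threshold

y = rng.standard_normal(N) + shift               # proposal: N(shift, 1)
weights = norm.pdf(y) / norm.pdf(y, loc=shift)   # target/proposal density ratio
pf_is = np.mean((y > shift) * weights)           # IS estimate of P(U > shift)

print(pf_is, norm.sf(shift))   # both around 3.4e-6
```

The paper's contribution is constructing such a quasi-optimal sampling density adaptively, in many dimensions, around a surrogate of the failure surface.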
Funding (time-variant reliability study): partially supported by the National Natural Science Foundation of China (52375238), the Science and Technology Program of Guangzhou (202201020213, 202201020193, 202201010399), and the GZHU-HKUST Joint Research Fund (YH202109).
Funding (imprecise structural reliability study): Joint Funds of the National Natural Science Foundation of China (NSAF) (No. U1330130).
Funding (saddlepoint approximation study): Research Committee of the University of Macao, Grant No. G074/05-06S/YKV/FST UMAC.
Funding (asphalt mixture study): National Natural Science Foundation of China (No. S50778057) and the Research Fund for the Doctoral Program of Higher Education (No. 20060213002).
Funding: This project was funded by the Deanship of Scientific Research (DSR), King Abdulaziz University, Jeddah, Saudi Arabia, under Grant No. KEP-81-130-42. The authors therefore acknowledge with thanks the technical and financial support of DSR.
Abstract: Security is a vital parameter for conserving energy in wireless sensor networks (WSN). Trust management in a WSN is a crucial process, as trust is utilized whenever collaboration is needed to accomplish trustworthy data transmission. However, available routing techniques do not incorporate security into the design of the routing scheme. This study develops a novel statistical analysis with dingo optimizer enabled reliable routing scheme (SADO-RRS) for WSN. The proposed SADO-RRS technique aims to detect the existence of attacks and find optimal routes in the WSN. In addition, the presented SADO-RRS technique derives a new statistics-based linear discriminant analysis (LDA) for attack detection. Moreover, a trust-based dingo optimizer (TBDO) algorithm is applied for optimal route selection and accomplishes secure data transmission in the WSN. Besides, the TBDO algorithm involves the derivation of a fitness function over different input variables of the WSN. To demonstrate the enhanced outcomes of the SADO-RRS technique, a wide range of simulations was carried out, and the results confirmed its superior performance.
Abstract: This paper presents data on operational reliability indices, and the relevant analyses, for China's conventional power generating units in 2009. The units included in the statistical analysis are thermal generating units of 100 MW or above, hydro generating units of 40 MW or above, and all nuclear generating units. The reliability indices covered include utilization hours, times and hours of scheduled outages, times and hours of unscheduled outages, equivalent forced outage rate, and equivalent availability factor.
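The last two indices can be illustrated with a simplified sketch of the standard (IEEE Std 762-style) definitions; the hour counts below are hypothetical, not the 2009 statistics, and derating effects are folded into single "equivalent" hour terms:

```python
def equivalent_availability_factor(period_h, planned_outage_h, forced_outage_h, eq_derated_h):
    """EAF (%): share of period hours the unit was effectively available.
    Simplified: equivalent derated hours lumped into one term."""
    return 100.0 * (period_h - planned_outage_h - forced_outage_h - eq_derated_h) / period_h

def equivalent_forced_outage_rate(forced_outage_h, eq_forced_derated_h, service_h):
    """EFOR (%): forced-outage (and equivalent forced-derated) hours relative to
    forced-outage plus service hours. Simplified form."""
    return 100.0 * (forced_outage_h + eq_forced_derated_h) / (forced_outage_h + service_h)

# Hypothetical 8760-hour year for one thermal unit
eaf = equivalent_availability_factor(8760, 500, 120, 40)     # ~92.5 %
efor = equivalent_forced_outage_rate(120, 30, 7800)          # ~1.9 %
```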
Abstract: Slope stability assessment is a geotechnical problem characterized by many sources of uncertainty. In classical reliability analysis, only the randomness of uncertainties is taken into account, while their fuzziness is ignored. In this paper, a fuzzy probability approach and a fuzzy JC method are presented for reliability analysis. The two methods have been applied to the stability analysis of a slope of the permanent ship lock in the Three Gorges Project. The results obtained from the two methods are basically the same. However, compared with the fuzzy probability approach, the fuzzy JC method reflects the real situation better, because it applies a fuzzy-based analysis not only to the limit state equation but also to the mechanical parameters.
Funding: Project (51175424) supported by the National Natural Science Foundation; Project (B07050) supported by the 111 Project, China; Project (JC20110257) supported by the Basic Research Foundation of Northwestern Polytechnical University.
Abstract: Through a series of compression and shear tests on a large number of specimens, the reliability of T300/QY8911 laminated composite was studied and dispersibility (scatter) models of its strength were established. The results show that stress is linearly dependent on strain and that the damage mode of the specimens is brittle fracture in both kinds of tests. The dispersibility models of compression and shear strength are expressed as Rc ~ N(415.39, 6586.36) and Rs ~ LN(5.0718, 0.1553), respectively. When normal and lognormal distributions are used to describe the dispersibility of compression and shear strength, and the compression or shear load follows a normal distribution, almost the same failure probability is obtained from the different reliability analysis methods.
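Given the normal strength model Rc ~ N(415.39, 6586.36) and a normally distributed load, the failure probability follows from classical stress-strength interference: for independent normals, the reliability index is beta = (mu_R - mu_S) / sqrt(var_R + var_S) and Pf = 1 - Phi(beta). The load parameters below are assumptions for illustration only:

```python
import math

def std_normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def normal_stress_strength_pf(mu_r, var_r, mu_s, var_s):
    """Failure probability P(S > R) for independent normal strength R and load S."""
    beta = (mu_r - mu_s) / math.sqrt(var_r + var_s)
    return beta, 1.0 - std_normal_cdf(beta)

# Compression strength from the abstract: Rc ~ N(415.39, 6586.36), variance in MPa^2.
# The normal load N(250, 40^2) is a hypothetical assumption, not the paper's data.
beta, pf = normal_stress_strength_pf(415.39, 6586.36, 250.0, 40.0 ** 2)
```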
Funding: Supported by the National Natural Science Foundation of China (No. 81101132) and the Natural Science Foundation of Anhui Province (No. 11040606Q55).
Abstract: The reliability of radiotherapy was evaluated, and effective approaches to improving radiotherapy quality were obtained, using the Probabilistic Safety Assessment (PSA) method. This study investigated the feasibility of applying the PSA method to radiotherapy through image-guided radiotherapy (IGRT) and chest tumor irradiation. A fault tree was constructed after analyzing the causal relationships of the events. After calculating Risk A, the total probability of inaccurate radiotherapy and the importance of all the basic events were obtained. The probability of inaccurate radiotherapy was 2.87%. Under the condition that the target delineation was perfectly right, the accuracy of radiotherapy improved significantly; when the calculation assumed no cone-beam computed tomography (CBCT) correction before irradiation, the accuracy decreased significantly. The most important events were connected with human factors, so improving human technical level could efficiently enhance radiotherapy quality control.
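The fault-tree calculation can be sketched with independent basic events combined through AND/OR gates; the tree layout and event probabilities below are illustrative assumptions, not the paper's model:

```python
from functools import reduce

def or_gate(probs):
    """Top probability of an OR gate with independent basic events."""
    return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

def and_gate(probs):
    """Top probability of an AND gate with independent basic events."""
    return reduce(lambda acc, p: acc * p, probs, 1.0)

# Hypothetical basic-event probabilities (for illustration only):
p_delineation_error = 0.01   # target delineation wrong
p_setup_error = 0.02         # patient setup error
p_no_cbct_check = 0.3        # CBCT correction skipped before irradiation
p_machine_fault = 0.005

# A setup error leads to inaccurate treatment only if the CBCT check is
# also missed (AND gate); any branch causes the top event (OR gate).
p_inaccurate = or_gate([p_delineation_error,
                        and_gate([p_setup_error, p_no_cbct_check]),
                        p_machine_fault])
```

Setting one basic-event probability to zero (e.g. perfect delineation) and recomputing the top event reproduces the kind of importance comparison described in the abstract.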
Funding: Supported by the National Basic Research Program of China ("973" Program, No. 2014CB046804), the National Natural Science Foundation of China (No. 51239008), the Foundation of the State Key Laboratory of Marine Engineering of Shanghai Jiaotong University, and the Foundation for Innovative Research Groups of the National Natural Science Foundation of China (No. 51021004).
Abstract: A simplified bi-variable human error probability calculation method is developed by incorporating two common performance condition (CPC) factors, which are adapted from the factors employed in the cognitive reliability and error analysis method (CREAM) to take the characteristics of shipping operations into account. After the influencing factors are identified, the Markov method is used to calculate the human reliability values. The proposed method relies neither on the involvement of human-factor experts nor on historical accident or human error statistics. It is applied to the case of the crew on board an ocean-going dry bulk carrier; the calculated results agree with the actual case, which verifies the validity of the model.
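The Markov calculation of human reliability can be sketched with a small hypothetical chain; the three operator states and the CPC-shaped transition probabilities below are assumptions for illustration, not the paper's model:

```python
def step(dist, P):
    """One step of the Markov chain: row vector times transition matrix."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

def stationary(P, iters=500):
    """Stationary distribution by power iteration (for an ergodic chain)."""
    dist = [1.0 / len(P)] * len(P)
    for _ in range(iters):
        dist = step(dist, P)
    return dist

# Hypothetical 3-state operator model: 0 = normal, 1 = degraded, 2 = error.
# Each row sums to 1; values would in practice be shaped by the two CPC factors.
P = [[0.95, 0.04, 0.01],
     [0.60, 0.30, 0.10],
     [0.80, 0.10, 0.10]]
pi = stationary(P)
hep = pi[2]   # long-run human error probability under this model
```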
Funding: Supported by the Shanghai Natural Science Foundation (Grant No. 19ZR1420700), sponsored by the Shanghai Rising-Star Program (Grant No. 21QA1403400) and the Shanghai Key Laboratory of Power Station Automation Technology (Grant No. 13DZ2273800).
Abstract: Human Reliability Analysis (HRA) is an important part of the safety assessment of a large complex system. The Human Cognitive Reliability (HCR) model, widely used in HRA, evaluates the probability that operators fail to complete diagnostic decision-making within a limited time. In applying this method, the cognitive patterns of humans must be considered and classified, and this process often relies on the evaluation opinions of experts, which are highly subjective and uncertain. How to effectively express and process this uncertain, subjective information plays a critical role in improving the accuracy and applicability of HCR. In this paper, a new model is proposed to deal with the uncertain information that arises in the classification of cognitive patterns in HCR. First, an evaluation panel is constructed from expert opinions by setting anchor points and qualitative indicators for the different cognitive patterns and mapping them to fuzzy numbers on the unit interval. Second, based on the evaluation panel, different analysts judge the cognitive pattern type of an actual event and state the level of confidence they have in their judgments. Finally, the evaluation opinions of the multiple analysts are expressed and fused using Dempster-Shafer Evidence Theory (DSET), and the fused results are applied to the HCR model to obtain the Human Error Probability (HEP). A case study demonstrates the procedure and effectiveness of the proposed method.
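The evidence-fusion step can be sketched with Dempster's rule of combination; the cognitive-pattern frame and the two analysts' mass assignments below are hypothetical:

```python
def dempster_combine(m1, m2):
    """Combine two basic probability assignments keyed by frozenset focal elements."""
    combined = {}
    conflict = 0.0
    for a, v1 in m1.items():
        for b, v2 in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + v1 * v2
            else:
                conflict += v1 * v2          # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: evidence cannot be combined")
    # Normalize by 1 - K to redistribute the conflicting mass
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Hypothetical frame of cognitive patterns: skill, rule, knowledge.
S, R, K = "skill", "rule", "knowledge"
# Two analysts' mass assignments, with partial ignorance on composite sets:
m1 = {frozenset({S}): 0.6, frozenset({S, R}): 0.3, frozenset({S, R, K}): 0.1}
m2 = {frozenset({S}): 0.5, frozenset({R}): 0.2, frozenset({S, R, K}): 0.3}
fused = dempster_combine(m1, m2)   # mass concentrates on "skill"
```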
基金support of projects of Ministry of Education of Czech Republic KONTAKT No.LH12062previous achievements worked out under the project of Technological Agency of Czech Republic No.TA01011019.
Abstract: The aim of this paper is to present a newly developed approach for reliability-based design optimization. It is based on a double-loop framework, in which the outer loop covers the optimization part of the process and the reliability constraints are evaluated in the inner loop. The innovation of the suggested approach lies in a newly developed optimization strategy based on multilevel simulation using an advanced Latin hypercube sampling technique. The method, called Aimed multilevel sampling, is designed for optimizing problems in which only a limited number of simulations can be performed due to enormous computational demands.
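The advanced variant used by Aimed multilevel sampling is not specified in the abstract, but basic Latin hypercube sampling, on which such strategies build, can be sketched directly: each dimension is split into n equal strata and each stratum receives exactly one sample.

```python
import random

def latin_hypercube(n, dims, rng=random):
    """n samples in [0,1)^dims: each dimension has exactly one point per 1/n stratum."""
    samples = [[0.0] * dims for _ in range(n)]
    for d in range(dims):
        strata = list(range(n))
        rng.shuffle(strata)                         # random pairing of strata to samples
        for i, s in enumerate(strata):
            samples[i][d] = (s + rng.random()) / n  # uniform draw inside stratum s
    return samples

random.seed(0)
pts = latin_hypercube(10, 2)   # 10 points, each marginal stratified into tenths
```

Because every stratum is hit exactly once in every dimension, far fewer samples are needed than with plain Monte Carlo to cover the input space, which is the point when each simulation is expensive.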
Funding: Supported by the National High-Technology Research and Development Program of China (Grant Nos. 2006AA01Z187, 2007AA040605).
Abstract: Traditional Bayesian software reliability models assume that all probabilities are precise. In practical applications, however, the parameters of the probability distributions are often uncertain due to their strong dependence on the subjective judgments that experts make on sparse statistical data. In this paper, a quasi-Bayesian software reliability model is presented that uses interval-valued probabilities to quantify experts' prior beliefs about the possible intervals of the distribution parameters. The model integrates experts' judgments with statistical data to obtain more convincing assessments of software reliability from small samples. For some actual data sets, the presented model yields better predictions than the Jelinski-Moranda (JM) model fitted by maximum likelihood (ML).
Funding: This work is supported by the Universiti Kebangsaan Malaysia [Grant Number DIP-2018-038].
Abstract: In modeling reliability data, the exponential distribution is commonly used due to its simplicity. For estimating its parameter, classical estimators, most notably the maximum likelihood estimator, are the most commonly used and are well known to be efficient. However, the maximum likelihood estimator is highly sensitive to contamination and outliers. In this study, a robust and efficient estimator of the exponential distribution parameter is proposed based on the probability integral transform statistic. To examine the robustness of the new estimator, its asymptotic variance, breakdown point, and gross error sensitivity are derived. The new estimator offers reasonable protection against outliers while remaining simple to compute. Furthermore, a simulation study compares the performance of the new estimator with the maximum likelihood estimator, the weighted likelihood estimator, and the M-scale estimator in the presence of outliers. Finally, a statistical analysis of three reliability data sets demonstrates the performance of the proposed estimator.
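The paper's probability-integral-transform estimator is not reproduced here; a simpler median-based estimator illustrates the same robustness idea. Since the median of Exp(lam) is ln(2)/lam, one gross outlier barely moves the median-based estimate, while the MLE n/sum(x) collapses toward zero:

```python
import math

def mle_rate(data):
    """Maximum likelihood estimate of the exponential rate: n / sum(x)."""
    return len(data) / sum(data)

def median_rate(data):
    """Robust rate estimate: the median of Exp(lam) equals ln(2)/lam."""
    s = sorted(data)
    n = len(s)
    med = s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])
    return math.log(2.0) / med

# A clean Exp(1)-like sample contaminated by one gross outlier:
clean = [0.3, 0.5, 0.6, 0.7, 0.7, 0.8, 1.0, 1.2, 1.5]
data = clean + [500.0]
lam_mle = mle_rate(data)      # dragged toward zero by the outlier (~0.02)
lam_med = median_rate(data)   # stays near the true rate of about 1
```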
Abstract: Using spreadsheets and ranges for pairwise judgments, candidate probability distributions are generated for the decision-maker to consider. This replaces event-by-event determination of probabilities. Basic statistics of the candidate distributions are then used to determine a final distribution for decision purposes, such as buy, sell, or hold.
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 51878445, 51938011 and 51908405).
Abstract: Certain building structures may encounter both seismic and blast loads during their service life. With the development of the economy and the increase in security demands, the need to design building structures against multiple hazards is becoming more and more obvious, and damage analysis of building structures under the combined action of multiple hazards has become an urgent requirement for disaster prevention and reduction. In this paper, a refined finite element model of reinforced concrete (RC) columns is established using the explicit dynamic analysis software LS-DYNA. Combined with the Monte Carlo method, the damage law of RC columns under a random single hazard (earthquake or explosion) and under combined multi-hazard action is studied, and damage groups are distinguished according to the damage index. Based on the support vector machine (SVM) algorithm, the dividing line between groups of different damage degrees is determined, and a rapid method for determining the damage degree of RC columns under combined seismic and blast loads is proposed. Finally, suggestions for the design of RC columns against multiple disasters are put forward.
Funding: Sponsored by the National Natural Science Foundation of China (70771038).
Abstract: Given the gradual maturation of Bayesian survival analysis theory, and the defects of traditional methods for storage reliability evaluation, a Bayesian survival analysis method is proposed to build regression models for reliability under randomly truncated testing. These models can reflect the influence of different environments on ammunition storage lifetime. As an example, the common exponential distribution is used, and a Markov chain Monte Carlo (MCMC) method based on Gibbs sampling dynamically simulates the Markov chain of the parameters' posterior distribution. The Bayesian estimates of the parameters are calculated under the random truncation condition. The simulation results show that the proposed method is effective and intuitive.
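The Gibbs-sampled MCMC for randomly truncated data is not reproduced here; a minimal conjugate sketch for complete (uncensored) exponential lifetimes with a Gamma prior shows the posterior machinery such a sampler targets. The prior parameters and lifetimes are assumptions for illustration:

```python
import random

def exp_gamma_posterior(a0, b0, lifetimes):
    """Gamma(a0, b0) prior on the exponential failure rate (b0 = rate parameter);
    the posterior after observing complete lifetimes is Gamma(a0 + n, b0 + sum(t))."""
    return a0 + len(lifetimes), b0 + sum(lifetimes)

random.seed(3)
a0, b0 = 1.0, 1.0                       # weak prior, assumed for illustration
lifetimes = [4.2, 5.1, 3.8, 6.0, 4.9]   # hypothetical storage lifetimes (years)
a_post, b_post = exp_gamma_posterior(a0, b0, lifetimes)

# Draw from the posterior; gammavariate takes shape and *scale* = 1/rate.
draws = [random.gammavariate(a_post, 1.0 / b_post) for _ in range(5000)]
post_mean = sum(draws) / len(draws)     # should sit near (a0 + n) / (b0 + sum(t))
```

Under random truncation the posterior loses this closed form, which is exactly why the paper resorts to Gibbs sampling.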
Funding: This work was supported by the National Natural Science Foundation of China (Grant Nos. 50175043, 19990510), the 973 Project Foundation of China (1998020320), and the Foundation for University Key Teachers of the Ministry of Education of China.
Abstract: Based on the generalized probabilistic finite element method, this paper presents an approximate solution technique for general multi-degree-of-freedom nonlinear random vibration systems with random parameters. The fourth-moment technique, maximum entropy theory, and incomplete probability information theory are employed to systematically develop a reliability analysis method for dynamic random structural systems with correlated failure modes when the joint probability density functions of the basic random variables are unavailable. The first-passage problem of multi-degree-of-freedom nonlinear random vibration systems is solved.
Funding: Supported by the National Natural Science Foundation of China (Nos. 51905430, 51608446), the Fundamental Research Fund for Central Universities (No. 3102018zy011), the Alexander von Humboldt Foundation of Germany, and the Top International University Visiting Program for Outstanding Young Scholars of Northwestern Polytechnical University.
Abstract: The application of reliability analysis and reliability sensitivity analysis methods to complicated structures faces two main challenges: small failure probabilities (typically less than 10^-5) and time-demanding mechanical models. This paper proposes an improved active learning surrogate model method that combines the advantages of the classical Active Kriging-Monte Carlo Simulation (AK-MCS) procedure and the Adaptive Linked Importance Sampling (ALIS) procedure. The proposed procedure adaptively produces, on the one hand, a series of intermediate sampling densities approaching the quasi-optimal Importance Sampling (IS) density and, on the other hand, a set of intermediate surrogate models approaching the true failure surface of the rare failure event. The small failure probability and the corresponding reliability sensitivity indices are then efficiently estimated by their IS estimators based on the quasi-optimal IS density and the surrogate models. Compared with the classical AK-MCS and Active Kriging-Importance Sampling (AK-IS) procedures, the proposed method needs neither to build a very large sample pool when the failure probability is extremely small, nor to estimate the Most Probable Points (MPPs); it is therefore computationally more efficient and more applicable, especially for problems with multiple MPPs. The effectiveness and engineering applicability of the proposed method are demonstrated by one numerical test example and two engineering applications.
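The importance-sampling estimator at the heart of such procedures can be sketched for a toy linear limit state g(u) = beta - u with standard normal u, where the exact answer Phi(-beta) is available for checking. The Kriging surrogate and the adaptive density sequence are not reproduced; only the reweighted sampling step is shown:

```python
import math, random

def std_normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def is_failure_probability(beta, n=20000, rng=random):
    """Estimate Pf = P(U > beta), U ~ N(0,1), by sampling from N(beta, 1)
    (density shifted to the MPP) and reweighting by the likelihood ratio
    phi(u) / phi(u - beta) = exp(beta^2/2 - beta*u)."""
    total = 0.0
    for _ in range(n):
        u = rng.gauss(beta, 1.0)                          # importance sample
        if u > beta:                                      # failure: g(u) < 0
            total += math.exp(0.5 * beta * beta - beta * u)
    return total / n

random.seed(4)
beta = 4.0
pf_est = is_failure_probability(beta)
pf_exact = std_normal_cdf(-beta)   # about 3.17e-5
```

Shifting the sampling density to the failure region makes roughly half the samples count, so a few thousand evaluations suffice where crude Monte Carlo would need tens of millions for a probability of this size.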