Funding: supported by the National Natural Science Foundation of China under Grant Nos. 61371113 and 61401240; the Graduate Student Research Innovation Program Foundation of Jiangsu Province (No. YKC16006); the Graduate Student Research Innovation Program Foundation of Nantong University (No. KYZZ160354); and the Top-notch Academic Programs Project of Jiangsu Higher Education Institutions (PPZY2015B135).
Abstract: Probabilistic model checking has been widely applied to the quantitative analysis of stochastic systems, e.g., analyzing the performance, reliability and survivability of computer and communication systems. In this paper, we extend the application of probabilistic model checking to vehicle-to-vehicle (V2V) networks. We first develop a continuous-time Markov chain (CTMC) model for the considered V2V network; the PRISM language is then adopted to describe the CTMC model, and continuous-time stochastic logic is used to specify the survivability properties of interest. In the analysis, two typical failures are considered, namely node failure and link failure, induced respectively by external malicious attacks on a target V2V node and by interruption of a communication link. The impacts of these failures on network survivability are demonstrated. It is shown that network survivability decreases as failure strength increases, whereas it improves as the repair rate increases. The proposed probabilistic model checking-based approach can be effectively used for survivability analysis of V2V networks, and it is anticipated that the approach can be conveniently extended to other networks.
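The trend highlighted in this abstract, survivability falling with failure strength and rising with repair rate, can be reproduced with a small CTMC outside of PRISM. The sketch below uses a four-state node/link availability model with assumed failure and repair rates (none of which come from the paper) and computes the transient probability that the network is operational; in PRISM the analogous question would be posed as a CSL property over the same chain.

```python
# Minimal CTMC transient-analysis sketch (illustrative model and rates, not the paper's).
# States: 0 = node and link up, 1 = node down, 2 = link down, 3 = both down.
import numpy as np
from scipy.linalg import expm

lam_node, lam_link = 0.02, 0.05   # failure rates per hour (assumed)
mu_node, mu_link = 0.5, 1.0       # repair rates per hour (assumed)

# Generator matrix Q: Q[i, j] is the transition rate from state i to state j.
Q = np.array([
    [0.0,     lam_node, lam_link, 0.0     ],
    [mu_node, 0.0,      0.0,      lam_link],
    [mu_link, 0.0,      0.0,      lam_node],
    [0.0,     mu_link,  mu_node,  0.0     ],
])
np.fill_diagonal(Q, -Q.sum(axis=1))       # rows of a generator sum to zero

p0 = np.array([1.0, 0.0, 0.0, 0.0])       # start fully operational
for t in (1.0, 10.0, 100.0):
    pt = p0 @ expm(Q * t)                 # transient state distribution at time t
    print(f"t = {t:6.1f} h   P(operational) = {pt[0]:.4f}")
```

Raising mu_node or mu_link pushes P(operational) up, while raising the failure rates pulls it down, matching the qualitative conclusion above.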
Funding: Projects (51375032, 51175017, 51245027) supported by the National Natural Science Foundation of China.
Abstract: In order to describe and control the stress distribution and total deformation of bladed disk assemblies used in aeroengines, a highly efficient and precise probabilistic analysis method called the extremum response surface method (ERSM) is developed based on previous deterministic analysis results obtained with a finite element model (FEM). In this work, many key nonlinear factors, such as the dynamic nature of the temperature load, the centrifugal force and the boundary conditions, are taken into consideration in the model. The time-varying patterns of stress distribution and total deformation of the bladed disk assemblies are obtained from the deterministic analysis, and the nodes with the largest deformation and stress are identified and taken as the inputs of the probabilistic analysis in a scientific and reasonable way. Their reliability, sample history, extreme response surface (ERS) and cumulative probability distribution function, as well as their sensitivity and effect probability, are obtained. The main factors affecting the stress distribution and total deformation of the bladed disk assemblies are investigated through a sensitivity analysis of the model. Finally, comparison with the response surface method (RSM) and Monte Carlo simulation (MCS) shows that the new approach is effective.
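As a rough illustration of the extremum response surface idea, the sketch below fits a quadratic surface to the extremum (maximum) response returned by a placeholder function standing in for a full FE run, and then performs Monte Carlo sampling on the surface instead of on the FE model. The function toy_fe_max_stress, the input distributions and the 590 MPa threshold are assumptions for illustration, not the paper's bladed-disk model.

```python
# Extremum-response-surface sketch: quadratic surrogate of the maximum response,
# then cheap Monte Carlo on the surrogate.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

def toy_fe_max_stress(temp, speed):
    # stand-in for "largest nodal stress from one deterministic FE run" (MPa)
    return 300 + 0.4 * temp + 2e-7 * speed**2 + 1e-6 * temp * speed

# 1) small design of experiments around nominal temperature (deg C) and speed (rpm)
X = np.column_stack([rng.normal(600, 30, 40), rng.normal(10000, 300, 40)])
y = np.array([toy_fe_max_stress(t, s) for t, s in X])

# 2) fit the quadratic extremum response surface
poly = PolynomialFeatures(degree=2)
surface = LinearRegression().fit(poly.fit_transform(X), y)

# 3) Monte Carlo on the surface (far cheaper than repeated FE runs)
Xmc = np.column_stack([rng.normal(600, 30, 100_000), rng.normal(10000, 300, 100_000)])
stress = surface.predict(poly.transform(Xmc))
print("P(max stress > 590 MPa) =", np.mean(stress > 590.0))
```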
Funding: Supported by the National Natural Science Foundation of China (61374140) and the Shanghai Pujiang Program (12PJ1402200).
Abstract: A novel approach named aligned mixture probabilistic principal component analysis (AMPPCA) is proposed in this study for fault detection in multimode chemical processes. In order to exploit within-mode correlations, the AMPPCA algorithm first estimates a statistical description for each operating mode by applying mixture probabilistic principal component analysis (MPPCA). As a comparison, a combined MPPCA is employed in which monitoring results are softly integrated according to the posterior probabilities of the test sample in each local model. To exploit the cross-mode correlations, which may be useful but are inadvertently neglected when monitoring models are built separately, a global monitoring model is constructed by aligning all local models together. In this way, both within-mode and cross-mode correlations are preserved in the integrated space. Finally, the utility and feasibility of AMPPCA are demonstrated on a non-isothermal continuous stirred tank reactor and the TE benchmark process.
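A minimal mixture-of-local-PCA monitoring sketch is shown below. It illustrates the "soft integration by posterior probabilities" idea that the abstract uses as its comparison baseline, not the AMPPCA alignment step itself; the two synthetic operating modes, the faulty sample and the SPE statistic are illustrative choices.

```python
# Mixture-of-local-PCA monitoring sketch: identify operating modes with a Gaussian
# mixture, fit one PCA per mode, and softly combine per-mode squared prediction
# errors (SPE) using the posterior probabilities of a new sample.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
# two synthetic operating modes of a 5-variable process
mode1 = rng.normal([0, 0, 0, 0, 0], 1.0, size=(500, 5))
mode2 = rng.normal([8, 8, 0, 0, 0], 1.5, size=(500, 5))
X = np.vstack([mode1, mode2])

gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
labels = gmm.predict(X)
pcas = [PCA(n_components=2).fit(X[labels == k]) for k in range(2)]

def spe(pca, x):
    # squared prediction error of x against one local PCA model
    r = x - pca.inverse_transform(pca.transform(x.reshape(1, -1)))
    return float((r ** 2).sum())

x_new = np.array([8.0, 8.0, 6.0, 0.0, 0.0])      # faulty sample (illustrative)
post = gmm.predict_proba(x_new.reshape(1, -1))[0]
combined_spe = sum(p * spe(pca, x_new) for p, pca in zip(post, pcas))
print("posterior-weighted SPE:", round(combined_spe, 2))
```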
Funding: supported by the National Natural Science Foundation of China (51408433), the Fundamental Research Funds for the Central Universities of China, and the Chenguang Program sponsored by the Shanghai Education Development Foundation and the Shanghai Municipal Education Commission.
Abstract: Select link analysis provides information on where the traffic on selected links comes from and goes to. This disaggregate information has wide applications in practice. State-of-the-art planning software packages often adopt the user equilibrium (UE) model for select link analysis. However, empirical studies have repeatedly revealed that the stochastic user equilibrium model predicts the observed mean and variance of choices more accurately than the UE model. This paper proposes an alternative select link analysis method that makes use of the recently developed logit-weibit hybrid model to alleviate the drawbacks of both the logit and weibit models while keeping a closed-form route choice probability expression. To enhance applicability in large-scale networks, Bell's stochastic loading method, originally developed for the logit model, is adapted to the hybrid model. The features of the proposed method are twofold: (1) a unique O-D-specific link flow pattern and more plausible behavioral realism attributed to the hybrid route choice model, and (2) applicability in large-scale networks due to the link-based stochastic loading method. An illustrative network example and a case study on a large-scale network are conducted to demonstrate the efficiency and effectiveness of the proposed select link analysis method as well as applications of O-D-specific link flow information. A visualization method is also proposed to enhance the understanding of O-D-specific link flow, originally in the form of a matrix.
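For readers unfamiliar with the two component models, the sketch below evaluates standard logit (additive disutility) and weibit (multiplicative disutility) route choice probabilities on a toy three-route example. The hybrid specification and Bell's link-based loading are not reproduced here, and theta, beta and the route costs are assumed values.

```python
# Route-choice sketch contrasting the logit and weibit probability forms that the
# hybrid model combines.
import numpy as np

costs = np.array([10.0, 12.0, 15.0])   # travel costs of three alternative routes
theta, beta = 0.5, 4.0                 # logit scale / weibit shape (assumed values)

p_logit = np.exp(-theta * costs) / np.exp(-theta * costs).sum()
p_weibit = costs ** (-beta) / (costs ** (-beta)).sum()

for k, (pl, pw) in enumerate(zip(p_logit, p_weibit), start=1):
    print(f"route {k}: logit {pl:.3f}   weibit {pw:.3f}")
# Note: logit depends on absolute cost differences, weibit on cost ratios; the
# hybrid aims to temper both drawbacks while staying closed-form.
```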
Funding: Sponsored by the National Natural Science Foundation of China (Grant Nos. 11432002, 11372025 and 11602012), the National Key Research and Development Program (Grant No. 2016YFB0200704), the Defense Industrial Technology Development Program (Grant Nos. JCKY2013601B001 and JCKY2016601B001), and the 111 Project (Grant No. B07009).
Abstract: Generally, the finite element analysis of a structure is carried out under deterministic inputs. However, uncertainties in geometrical dimensions, material properties and boundary conditions cannot be neglected in engineering applications. Probabilistic methods are the most popular techniques for handling these uncertain parameters, but subjective results may be obtained when sufficient information is unavailable. Non-probabilistic methods can alternatively be employed, which has led to procedures for non-probabilistic finite element analysis. Each non-probabilistic finite element analysis method consists of two individual parts, the core algorithm and the pre-processing procedure. In this context, three types of algorithms and two typical pre-processing procedures, as well as their effectiveness, are described in detail, based on which novel hybrid algorithms can be conceived for specific problems and future work in this research field can be fostered.
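A minimal sketch of interval (non-probabilistic) propagation is given below, using the vertex method on a toy cantilever-deflection response. The response function and interval bounds are assumptions, and the vertex method is only one of the algorithm families such a survey covers.

```python
# Interval propagation sketch using the vertex method: evaluate a response at every
# corner of the interval parameter box and take the envelope. The response function
# is a toy stand-in for an FE solve.
from itertools import product

def tip_deflection(E, L, I, P=1_000.0):
    # cantilever tip deflection P*L^3 / (3*E*I) -- illustrative response
    return P * L**3 / (3.0 * E * I)

intervals = {                      # assumed interval (non-probabilistic) inputs
    "E": (195e9, 210e9),           # Young's modulus [Pa]
    "L": (1.95, 2.05),             # length [m]
    "I": (7.8e-6, 8.4e-6),         # second moment of area [m^4]
}

values = [tip_deflection(E, L, I) for E, L, I in product(*intervals.values())]
print(f"deflection bounds: [{min(values):.4e}, {max(values):.4e}] m")
# The vertex method is exact only for responses monotonic in each parameter;
# otherwise interval arithmetic or optimization-based bounding is needed.
```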
Abstract: New sequencing technologies such as Illumina/Solexa, SOLiD/ABI and 454/Roche have revolutionized biological research. In this context, the SOLiD platform offers a particular sequencing type, known as a multiplex run, which enables the sequencing of several samples in a single run. This reduces costs and simplifies the analysis of related samples. However, this sequencing type requires an additional filtering step to ensure the reliability of the results. Thus, we propose in this paper a probabilistic model that considers the intrinsic characteristics of each sequencing run to characterize multiplex runs and filter low-quality data, increasing the reliability of data analysis for multiplex sequencing performed on SOLiD. The results show that the proposed model proves to be satisfactory due to: 1) identification of faults in the sequencing process; 2) adaptation and development of new protocols for sample preparation; 3) the assignment of a degree of confidence to the data generated; and 4) guiding a filtering process without discarding useful sequences in an arbitrary manner.
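As a simple illustration of probabilistic read filtering (not the authors' SOLiD-specific model), the sketch below converts Phred-style base qualities to error probabilities and discards reads whose expected number of errors exceeds a threshold; the quality values and the threshold are illustrative.

```python
# Expected-error filtering sketch: p(error) = 10**(-Q/10) per base, keep a read if
# the sum of per-base error probabilities stays below a threshold.
def expected_errors(quals):
    return sum(10.0 ** (-q / 10.0) for q in quals)

def keep_read(quals, max_expected_errors=1.0):
    return expected_errors(quals) <= max_expected_errors

reads = {                                   # illustrative per-base quality scores
    "read_1": [30, 32, 28, 31, 29, 30, 27, 33],
    "read_2": [12, 10, 15, 9, 11, 14, 10, 8],
}
for name, quals in reads.items():
    print(name, "kept" if keep_read(quals) else "filtered",
          f"(E[errors] = {expected_errors(quals):.2f})")
```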
Abstract: Background: With mounting global environmental, social and economic pressures, the resilience and stability of forests, and thus the provisioning of vital ecosystem services, are increasingly threatened. Intensified monitoring can help to detect ecological threats and changes earlier, but monitoring resources are limited. Participatory forest monitoring with the help of "citizen scientists" can provide additional resources for forest monitoring and at the same time help to communicate with stakeholders and the general public. Examples of citizen science projects in the forestry domain can be found, but a solid, applicable larger framework to utilise public participation in the area of forest monitoring seems to be lacking. We propose that a better understanding of shared and related topics in citizen science and forest monitoring might be a first step towards such a framework. Methods: We conduct a systematic meta-analysis of 1015 publication abstracts addressing "forest monitoring" and "citizen science" in order to explore the combined topical landscape of these subjects. We employ topic modelling, an unsupervised probabilistic machine learning method, to identify latent shared topics in the analysed publications. Results: We find that large shared topics exist, but that these are primarily topics that would be expected in scientific publications in general. Common domain-specific topics are under-represented and indicate a topical separation of the two document sets on "forest monitoring" and "citizen science" and thus the represented domains. While topic modelling as a method proves to be a scalable and useful analytical tool, we propose that our approach could deliver even more useful data if a larger document set and full-text publications were available for analysis. Conclusions: We propose that these results, together with the observation of non-shared but related topics, point at under-utilised opportunities for public participation in forest monitoring. Citizen science could be applied as a versatile tool in forest ecosystem monitoring, complementing traditional forest monitoring programmes, assisting early threat recognition and helping to connect forest management with the general public. We conclude that our presented approach should be pursued further, as it may aid the understanding and setup of citizen science efforts in the forest monitoring domain.
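A compact topic-modelling sketch follows, using latent Dirichlet allocation as one common choice for this kind of unsupervised probabilistic model. The toy abstracts, vectorizer settings and number of topics are placeholders rather than the study's configuration.

```python
# Topic-modelling sketch: bag-of-words counts plus LDA over a handful of toy
# "abstracts"; real analyses would use the full abstract corpus and tuned settings.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

abstracts = [
    "citizen science volunteers collect forest observation data",
    "remote sensing supports national forest monitoring programmes",
    "volunteers and smartphone apps enable participatory monitoring",
    "forest inventory plots track growth biomass and disturbance",
]

vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(abstracts)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

terms = vec.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[-4:][::-1]]
    print(f"topic {k}: {', '.join(top)}")
```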
Abstract: The recent outbreak of COVID-19 has caused millions of deaths worldwide and a huge societal and economic impact in virtually all countries. A large variety of mathematical models describing the dynamics of COVID-19 transmission have been reported. Among them, Bayesian probabilistic models of COVID-19 transmission dynamics have been very efficient in the interpretation of early data from the beginning of the pandemic, helping to estimate the impact of non-pharmacological measures in each country and to forecast the evolution of the pandemic under different potential scenarios. These models use probability distribution curves to describe key dynamic aspects of transmission, such as the probability that each infected person infects other individuals, dies or recovers, with parameters obtained from experimental epidemiological data. However, the impact of vaccine-induced immunity, which has been key to controlling the public health emergency caused by the pandemic, has been more challenging to describe in these models due to the complexity of the experimental data. Here we report different probability distribution curves to model the acquisition and decay of immunity after vaccination. We discuss the mathematical background and how these models can be integrated into existing Bayesian probabilistic models to provide a good estimation of the dynamics of COVID-19 transmission during the entire pandemic period.
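The sketch below shows one plausible way to encode immunity acquisition and waning as probability curves: a gamma CDF for the ramp-up of protection after vaccination and an exponential decay for waning. The functional forms, peak efficacy and time constants are illustrative assumptions, not the distributions reported by the authors.

```python
# Vaccine-immunity curve sketch: protection(t) = peak efficacy * P(immunity acquired
# by day t) * waning factor. All parameter values are assumed for illustration.
import numpy as np
from scipy import stats

t = np.linspace(0, 360, 361)                 # days since vaccination

ramp = stats.gamma(a=3.0, scale=5.0).cdf(t)  # probability immunity has been acquired by day t
waning = np.exp(-t / 180.0)                  # exponential decay of acquired protection
protection = 0.9 * ramp * waning             # 0.9 = assumed peak vaccine efficacy

for day in (14, 90, 180, 360):
    print(f"day {day:3d}: protection = {protection[day]:.2f}")
```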
Funding: funded by the National Key R&D Program (Grant No. 2018YFC1504601), the National Natural Science Foundation of China (Grant Nos. 41572313 and 41702343), and the China Geological Survey Project (Grant No. DD20190717).
Abstract: Probabilistic analysis in the field of seismic landslide hazard assessment is often based on an estimate of the uncertainties of geological, geotechnical, geomorphological and seismological parameters. However, real situations are very complex, and the uncertainties of some parameters, such as water content conditions and critical displacement, are difficult to describe with accurate mathematical models. In this study, we present a probabilistic methodology based on the probabilistic seismic hazard analysis method and Newmark's displacement model. The Tianshui seismic zone (105°00′-106°00′ E, 34°20′-34°40′ N) in the northeastern Tibetan Plateau was used as an example. Arias intensities with three standard probabilities of exceedance (63%, 10%, and 2% in 50 years), in accordance with building design provisions, were used to compute Newmark displacements, incorporating the effects of topographic amplification. Probable scenarios of water content were considered, and three water content conditions (dry, wet and saturated) were adopted to simulate the effect of pore water on slopes. The influence of 5 cm and 10 cm critical displacements was investigated in order to analyze the sensitivity of the probability of earthquake-induced landslide occurrence to the critical displacement. The results show that water content, in particular, has a great influence on the distribution of high seismic landslide hazard areas. Generally, the dry coverage analysis represents a lower bound for susceptibility and hazard assessment, and the saturated coverage analysis represents an upper bound to some extent. Moreover, high seismic landslide hazard areas are also influenced by the critical displacement. The slope failure probabilities during future earthquakes with a critical displacement of 5 cm can increase by a factor of 1.2 to 2.3 compared with 10 cm. This suggests that more effort is required to obtain reasonable threshold values for slope failure. Considering the probable water content scenarios, which vary with the seasons, seismic landslide hazard assessments are carried out for frequent, occasional and rare earthquake occurrences in the Tianshui region, which can provide a valuable reference for landslide hazard management and infrastructure design in mountainous seismic zones.
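A back-of-the-envelope Newmark calculation is sketched below, using an empirical Arias-intensity/critical-acceleration regression of the Jibson (2007) type and the 5 cm and 10 cm thresholds discussed above. The choice of regression, its coefficients and the input scenarios are indicative only and may differ from the models used in this study.

```python
# Newmark-displacement sketch: displacement from Arias intensity and critical
# acceleration via an empirical regression, then comparison with critical displacements.
import math

def newmark_displacement_cm(Ia_ms, ac_g):
    # log10(Dn) = 2.401*log10(Ia) - 3.481*log10(ac) - 3.230  (Dn in cm, Ia in m/s, ac in g)
    return 10 ** (2.401 * math.log10(Ia_ms) - 3.481 * math.log10(ac_g) - 3.230)

for Ia in (0.5, 1.5, 3.0):          # Arias intensity scenarios, m/s (illustrative)
    for ac in (0.05, 0.10, 0.20):   # critical acceleration of the slope, g (illustrative)
        dn = newmark_displacement_cm(Ia, ac)
        if dn > 10:
            status = "exceeds 10 cm"
        elif dn > 5:
            status = "exceeds 5 cm"
        else:
            status = "below thresholds"
        print(f"Ia = {Ia:.1f} m/s  ac = {ac:.2f} g  Dn = {dn:7.1f} cm  {status}")
```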
Abstract: It is necessary to pay particular attention to the uncertainties that exist in an engineering problem in order to reduce the risk of seismic damage to infrastructure during natural hazards. Moreover, certain structural performance levels should be satisfied during strong earthquakes. However, these performance levels have only been well described for aboveground structures. This study investigates the main uncertainties involved in the performance-based seismic analysis of a multi-story subway station. More than 100 pulse-like and non-pulse-like ground motions have been selected. An effective framework is presented, based on a set of nonlinear static and dynamic analyses performed with the OpenSees code. Probabilistic seismic demand models for computing the free-field shear strain of the soil and the racking ratio of the structure are proposed. These models result in less variability compared with existing relations and make it possible to evaluate a wider range of uncertainties through reliability analysis in the Rtx software using Monte Carlo sampling. This work is performed for three different structural performance levels (denoted PL1-PL3). It is demonstrated that the error terms related to the magnitude and location of earthquake excitations, and also the corresponding attenuation relationships, are the most important parameters. Therefore, using a fault-structure model would be inevitable for the reliability analysis of subway stations. It is found that the higher performance level (i.e. PL3) is more sensitive to the random variables than the others. In this condition, pulse-like ground motions make a major contribution to the vulnerability of subway stations.
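Probabilistic seismic demand models are commonly fitted in the power-law form EDP = a * IM^b, i.e. linear in log-log space. The sketch below fits such a model to synthetic intensity/demand pairs and reports the lognormal dispersion; the numbers are illustrative and do not represent the proposed racking-ratio or shear-strain models.

```python
# PSDM sketch: ln(EDP) = ln(a) + b*ln(IM) + eps, fitted by least squares.
import numpy as np

rng = np.random.default_rng(2)
im = rng.uniform(0.1, 1.0, 60)                            # e.g. PGA in g (synthetic)
edp = 0.004 * im ** 0.9 * np.exp(rng.normal(0, 0.3, 60))  # e.g. racking ratio (synthetic)

b, ln_a = np.polyfit(np.log(im), np.log(edp), 1)
resid = np.log(edp) - (ln_a + b * np.log(im))
beta_d = resid.std(ddof=2)                                # lognormal dispersion of demand

print(f"ln(EDP) = {ln_a:.3f} + {b:.3f} ln(IM),  dispersion = {beta_d:.3f}")
```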
Abstract: This study investigates which factors affect tour (trip chain) behavior and how. The key issue is the understanding and definition of a tour and of the tour-level mode; these definitions should also fit the data. A semi-home-based tour definition is stated, and a competing-mode-based tour mode is defined. Based on these definitions, this study used Madison area data from the National Household Survey to estimate an MNL-structured model. It is found that travel distance can be a positive factor for the car mode. The number of trips is also a positive factor for choosing the car.
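A minimal MNL sketch follows: tour-mode choice probabilities from linear-in-parameters utilities, with assumed (not estimated) coefficients for travel distance and number of trips.

```python
# Multinomial-logit sketch: softmax over mode utilities for one tour.
import numpy as np

def mnl_probabilities(utilities):
    e = np.exp(utilities - utilities.max())   # subtract max for numerical stability
    return e / e.sum()

distance_km, n_trips = 12.0, 4                # attributes of one tour (assumed)
utilities = np.array([
    0.5 + 0.06 * distance_km + 0.30 * n_trips,   # car
    0.0 - 0.02 * distance_km + 0.05 * n_trips,   # transit
    -0.8 - 0.15 * distance_km,                   # walk/bike
])
for mode, p in zip(("car", "transit", "walk/bike"), mnl_probabilities(utilities)):
    print(f"{mode:10s} P = {p:.3f}")
```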
Abstract: The objective of this paper is to present a review of different calibration and classification methods for functional data in the context of chemometric applications. In chemometrics, it is usual to measure certain parameters in terms of a set of spectrometric curves that are observed at a finite set of points (functional data). Although the predictor variable is clearly functional, this problem is usually solved by using multivariate calibration techniques that treat it as a finite set of variables associated with the observed points (wavelengths or times). But these explicative variables are highly correlated, and it is therefore more informative to first reconstruct the true functional form of the predictor curves. Although several articles have been published on the implementation of functional data analysis techniques in chemometrics, their power to solve real problems is not yet well known. For this reason, the extension of multivariate calibration techniques (linear regression, principal component regression and partial least squares) and classification methods (linear discriminant analysis and logistic regression) to the functional domain, together with some relevant chemometric applications, is reviewed in this paper.
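As an example of one of the multivariate calibration techniques listed, the sketch below calibrates a partial least squares model on synthetic spectra. The functional step of first reconstructing smooth predictor curves is deliberately omitted, and the spectral model is an assumption.

```python
# PLS calibration sketch on synthetic near-infrared-style spectra.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
wavelengths = np.linspace(900, 1700, 200)              # nm
conc = rng.uniform(0, 1, 120)                          # analyte concentration (synthetic)
peak = np.exp(-((wavelengths - 1200) / 60.0) ** 2)     # absorbing band shape
spectra = conc[:, None] * peak + rng.normal(0, 0.01, (120, 200))

Xtr, Xte, ytr, yte = train_test_split(spectra, conc, random_state=0)
pls = PLSRegression(n_components=3).fit(Xtr, ytr)
print("R^2 on held-out spectra:", round(pls.score(Xte, yte), 3))
```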
Funding: funded by the Sino-US international collaboration project funded by the U.S. Department of Agriculture Risk Management Agency through the Center for Agribusiness Excellence, Tarleton State University, Texas A&M University System (No. 53-3151-2-00017); the general project of Humanities and Social Sciences of the Education Department of Liaoning Province (W2014284); and the China National Natural Science Foundation (Ref. No. 71271040).
Abstract: The important role played by innovative rural cooperative financial organizations in spreading farmers' operational risk, increasing farmers' income and developing the rural economy is well recognized. However, insufficient research has been conducted on the factors that may affect farmers' willingness to participate in new rural financial organizations. This paper tries to fill this gap by identifying various factors with potential influence on farmers' willingness to participate in the new type of rural financial cooperatives. In the process, 442 farmer households and small and micro enterprises are sampled from the cooperative finance experiment villages in Panjin municipality of Liaoning province. The potential influencing factors are classified into four categories: the farmer household's characteristics, financial cooperative reputation, transaction costs, and service quality. A discrete logit model is used for parameter estimation. The results show that most of the assumed factors have a statistically significant effect on farmers' willingness to take part in rural cooperative financing organizations, but with different levels of sensitivity. The cause and effect are fully discussed, followed by policy issues related to rural financing cooperative reforms.
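A binary logit of the kind described can be estimated as sketched below on synthetic data; the explanatory variables, coefficients and sample construction are assumptions used only to show the estimation workflow, not results from the Panjin survey.

```python
# Binary-logit sketch for a willingness-to-participate model on synthetic data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 442
income = rng.normal(0, 1, n)          # standardized household income (synthetic)
trust = rng.normal(0, 1, n)           # perceived cooperative reputation (synthetic)
cost = rng.normal(0, 1, n)            # perceived transaction cost (synthetic)

# latent propensity and Bernoulli participation decision
latent = 0.3 + 0.8 * income + 1.1 * trust - 0.9 * cost
willing = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-latent))).astype(int)

# columns: const, income, reputation, transaction cost
X = sm.add_constant(np.column_stack([income, trust, cost]))
result = sm.Logit(willing, X).fit(disp=False)
print(result.params)                  # estimated coefficients
```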
Funding: supported by the Special Fund of the National Priority Basic Research of China (No. 2013CB228204) and the National Science Foundation of China (No. 50977021).
Abstract: A probabilistic equivalent method for doubly fed induction generator (DFIG) based wind farms is proposed in this paper. First, the wind farm equivalent model is assumed to be composed of three types of equivalent DFIGs with different dynamic characteristics. The structure of the equivalent model remains constant, whereas the parameters change with the migration between different scenarios in the wind farm. Then, historical meteorological data are utilized to investigate the probability distributions of key equivalent parameters, such as capacity, wind speed and the electrical impedance to the point of common coupling. Each type of equivalent DFIG is further clustered into several groups according to its active power output. Combinations are created to generate representative scenarios. The probabilistic equivalent model of the wind farm is finally obtained after removing invalid combinations. The most closely matched representative scenarios can be predicted from real-time measurements. The equivalent model is applied to probabilistic power flow calculation and stability analysis of test systems.
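The clustering-by-active-power step can be illustrated as follows: turbines are grouped by output and each group is aggregated into one equivalent unit. The synthetic turbine data, the crude power curve and the aggregation rules are assumptions and do not reproduce the paper's probabilistic scenario construction.

```python
# Aggregation sketch: cluster turbines by active-power output, then build one
# equivalent unit per cluster (summed capacity, capacity-weighted wind speed).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)
n_turbines = 30
wind_speed = rng.uniform(4, 14, n_turbines)                  # m/s (synthetic)
rated_mw = np.full(n_turbines, 2.0)                          # MW
power = np.clip(rated_mw * ((wind_speed - 3) / 9) ** 3, 0, rated_mw)  # crude power curve

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(power.reshape(-1, 1))
for k in range(3):
    idx = labels == k
    print(f"equivalent DFIG {k}: capacity {rated_mw[idx].sum():.1f} MW, "
          f"mean output {power[idx].mean():.2f} MW, "
          f"weighted wind speed {np.average(wind_speed[idx], weights=rated_mw[idx]):.1f} m/s")
```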
Funding: the National Natural Science Foundation of China (51408444, 51708428).
Abstract: The aim of this paper is to propose a theoretical approach for performing non-probabilistic reliability analysis of structures. Due to the large number of uncertainties and the limited measured data in engineering practice, the uncertain structural parameters were described as interval variables. The theoretical analysis model was developed starting from the 2-D plane and the 3-D space. In order to avoid the loss of probable failure points, the 2-D plane and the 3-D space were divided into two parts and three parts, respectively, for further analysis. The study pointed out that the probable failure points exist only among the extreme points and root points of the limit state function. Furthermore, the low-dimensional analytical scheme was extended to the high-dimensional case. Using the proposed approach, it is easy to find the most probable failure point and to acquire the reliability index directly through simple comparison. A number of equations used for calculating the extreme points and root points were also evaluated. This result is useful for avoiding the loss of probable failure points and meaningful for optimizing searches in this research field. Finally, two kinds of examples were presented and compared with existing computations. The good agreement shows that the proposed theoretical analysis approach is correct. These efforts were conducted to improve the optimization method, to indicate the search direction and path, and to avoid searching only for a local optimal solution, which would result in missed probable failure points.
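A minimal sketch of the non-probabilistic reliability index for interval variables is given below: the limit-state interval is bounded by vertex enumeration and the index is the ratio of its midpoint to its radius, with a value above one indicating reliability. The limit-state function and interval bounds are toy assumptions, and the extreme-point/root-point search of the paper is not reproduced.

```python
# Non-probabilistic reliability-index sketch with interval variables. Vertex
# enumeration bounds the limit state (adequate for a monotonic g); then
# eta = midpoint / radius of the limit-state interval.
from itertools import product

def g(R, S):
    return R - S                        # toy limit state: resistance minus load effect

intervals = {"R": (90.0, 110.0), "S": (40.0, 70.0)}   # assumed interval bounds

vals = [g(R, S) for R, S in product(*intervals.values())]
M_min, M_max = min(vals), max(vals)
M_c, M_r = (M_max + M_min) / 2.0, (M_max - M_min) / 2.0
eta = M_c / M_r
print(f"M in [{M_min}, {M_max}],  eta = {eta:.2f} ->",
      "reliable" if eta > 1 else "not guaranteed")
```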