Funding: supported by the National Natural Science Foundation of China (No. U1333202).
Abstract: A clustering algorithm and a probability statistics method were applied to different phases of a flight to analyze operation time during aircraft ground taxiing and airborne flight. The clustering patterns, distribution characteristics, and dynamically changing rules of the two phases were identified. Further, an estimation method was established to measure the operation time of flight legs, with the novel step of calculating each individual segment separately and then integrating the results. The method measures operation time both objectively and dynamically, and accurately reflects the real situation. It helps to better utilize airport slot resources and provides strong support for air traffic flow management when scheduling flight plans in the strategic and pre-tactical phases.
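The clustering step can be illustrated with a minimal sketch (assumed details: a 1-D k-means over synthetic taxi-out times; the paper's actual algorithm, features, and data are not specified here):

```python
import numpy as np

def kmeans_1d(x, k, iters=50):
    """Minimal 1-D k-means with deterministic quantile initialization."""
    centers = np.quantile(x, np.linspace(0.0, 1.0, k))
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = x[labels == j].mean()
    return centers, labels

# Hypothetical taxi-out times (minutes): an off-peak and a peak regime
rng = np.random.default_rng(1)
times = np.concatenate([rng.normal(12, 2, 200), rng.normal(25, 3, 100)])
centers, labels = kmeans_1d(times, k=2)
print(np.sort(centers))  # two cluster centers near 12 and 25 minutes
```

The per-cluster means and spreads would then feed the probability-statistics step for each flight phase.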
Abstract: The problems related to agricultural structure engineering for crack simulation and reliability analysis are complicated because the variables involved span wide ranges of mean and standard deviation. This paper describes an integrated model to perform crack simulation and reliability analysis of a continuum structure. The structure is assumed to be under two-dimensional plane stress, and the deformation is infinitesimal. A truss structure model with the same behaviour as the continuum structure was developed using irregular triangular truss components, where each element consists of two hinges with an axial degree of freedom at each end. A Monte Carlo simulation (MCS) was adopted for the reliability analysis. If the length of one side of the irregular triangular mesh is shorter than the thickness of the structure, the slenderness associated with compressive failure needs to be examined only for the short column. For that reason, the failure criterion suitable for the equivalent truss structure model was established by checking only the axial stresses acting on truss members. Since the nodes of the equivalent truss structure model consist of hinges, the development of plastic hinges during crack propagation was not considered in this model. To simulate crack development, at each completed structural analysis time step the truss member carrying the largest stress beyond the allowable tensile or compressive stress was removed sequentially. Since irregular triangular meshes carry an inherent uncertainty compared with regular meshes, the equivalent truss structure model could describe crack propagation more realistically. The failure probabilities of structures under various loads and boundary conditions were in good agreement with the analytical solutions solved directly from the limit state equations expressed in the form of moments.
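The Monte Carlo reliability estimate can be sketched for a simple limit state g = R - S (a generic illustration with assumed normal resistance R and load effect S, not the paper's truss model):

```python
import numpy as np
from math import erf, sqrt

def norm_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

rng = np.random.default_rng(42)
n = 200_000
R = rng.normal(300.0, 30.0, n)  # resistance samples
S = rng.normal(200.0, 40.0, n)  # load-effect samples
pf_mcs = np.mean(R - S < 0.0)   # MCS failure probability

# Analytical check from the limit state moments: beta = (muR - muS) / sqrt(sR^2 + sS^2)
beta = (300.0 - 200.0) / np.hypot(30.0, 40.0)  # = 2.0
pf_exact = norm_cdf(-beta)                     # ~ 0.0228
print(pf_mcs, pf_exact)
```

The closing comparison in the abstract is of this kind: the sampled failure rate should match the moment-based analytical solution.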
Funding: supported by Zhejiang Provincial Natural Science Foundation of China (LY19F030003), Key Research and Development Project of Zhejiang Province (2021C04030), the National Natural Science Foundation of China (62003306), and Educational Commission Research Program of Zhejiang Province (Y202044842).
Abstract: In practical process industries, a variety of online and offline sensors and measuring instruments are used for process control and monitoring, which means that measurements coming from different sources are collected at different sampling rates. To build a complete process monitoring strategy, all these multi-rate measurements should be considered for data-based modeling and monitoring. In this paper, a novel kernel multi-rate probabilistic principal component analysis (K-MPPCA) model is proposed to extract the nonlinear correlations among different sampling rates. In the proposed model, the model parameters are calibrated using the kernel trick and the expectation-maximization (EM) algorithm. The corresponding fault detection methods based on the nonlinear features are also developed. Finally, a simulated nonlinear case and an actual pre-decarburization unit in an ammonia synthesis process are tested to demonstrate the efficiency of the proposed method.
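The kernel trick the model relies on can be illustrated with plain kernel PCA (a simplified stand-in; the paper's K-MPPCA additionally handles multi-rate data probabilistically via EM, which is not reproduced here):

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_pca(X, n_comp=2, gamma=1.0):
    """Project data onto the leading components in RBF feature space."""
    n = X.shape[0]
    K = rbf_kernel(X, gamma)
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one  # center in feature space
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_comp]      # largest eigenvalues first
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))  # hypothetical process measurements
scores = kernel_pca(X, n_comp=2, gamma=0.5)
print(scores.shape)  # (50, 2)
```

Monitoring statistics such as T² would then be computed on these nonlinear scores.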
Abstract: It is pointed out in this paper that the concept of a scenario earthquake, expectant earthquake, or proposed earthquake suggested by Kameda and Nojima (1988) is not probability consistent, owing to an inadequate understanding of the aseismic design standard of the probabilistic method. The corresponding concept proposed by QI FENG LUO meets the meaning of probability consistency, but it is still an average in nature, so the result is not good enough. On the basis of the above analysis, the concept of probability-consistent conservative earthquakes is suggested, and a new method with physical meaning for selecting aseismic objective earthquakes is proposed on the basis of the probabilistic method. After the seismic hazard is analysed with certain control parameters, such as peak acceleration, the aseismic standard can be determined according to a certain probability level. Based on the attenuation law and the potential sources, earthquakes, or combinations of magnitudes and distances, can be found that are probability consistent for this control parameter. On that basis, we suggest considering the destructive effects of other parameters (such as the response spectrum) and selecting conservative earthquakes to replace the average earthquake, thereby better meeting the requirements of aseismic design.
Abstract: Statistical analysis was performed on simultaneous wave and wind data recorded by a discus-shaped wave buoy. The area is located in the southern Caspian Sea near the Anzali Port. Wave data were obtained through directional spectrum wave analysis, and wind direction and speed were obtained from the related time series. For the 12-month measurement period (May 25, 2007-2008), statistical calculations were performed to quantify the nonlinear auto-correlation of wave and wind using the probability distribution functions of wave characteristics and statistical analysis over various time periods. The paper also presents and analyzes the wave energy for the area on the basis of the available database. The analyses allowed a useful comparison of wave energy across seasons, identifying the period with the largest amount of wave energy. Results showed that over the research period, the mean wave and wind auto-correlation time was about three hours. Among the probability distribution functions considered, i.e., Weibull, Normal, Lognormal, and Rayleigh, the Weibull distribution had the best consistency with the empirical distribution function, as shown in diagrams for each season. Results also showed that the mean wave energy over the research period was about 49.88 kW/m, and the maximum wave energy density was found in February and March 2010.
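Fitting a Weibull distribution to wave heights, as done in the seasonal comparison, can be sketched via regression on the Weibull probability plot (synthetic data stand in for the buoy measurements, which are not reproduced here):

```python
import numpy as np

def fit_weibull(x):
    """Fit a 2-parameter Weibull by regression on the Weibull probability plot."""
    xs = np.sort(x)
    n = len(xs)
    F = (np.arange(1, n + 1) - 0.5) / n      # empirical CDF (median-rank style)
    y = np.log(-np.log(1.0 - F))             # linearized Weibull CDF
    k, c = np.polyfit(np.log(xs), y, 1)      # slope = shape k, intercept = -k*ln(scale)
    lam = np.exp(-c / k)
    return k, lam

# Synthetic significant wave heights: shape 2.0, scale 1.5 m
rng = np.random.default_rng(0)
hs = 1.5 * rng.weibull(2.0, 5000)
k, lam = fit_weibull(hs)
print(round(k, 2), round(lam, 2))  # recovers roughly 2.0 and 1.5
```

The fitted parameters can then be compared season by season against the empirical distribution, as the abstract describes.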
Abstract: An analysis of nearly 250 years of flood records on the River Eden at Appleby-in-Westmorland has enabled a flood frequency relationship to be established. The most severe floods occurred in the late 18th and early 19th centuries. With such a long history of flooding, some remedial measures would have been expected, but the local people have, to some extent, adapted to the flood hazard by means of temporary and permanent flood-proofing methods, such as a cemented board across a doorway and removable flood boards. These measures were overwhelmed during the 2015 flood, as were the flood gates installed by the Environment Agency in 1998. A higher level of protection from floods at Appleby is called for.
Abstract: According to the Central Limit Theorem, random errors in a data set are expected to follow a Gaussian distribution. Type Ia supernova data have played a crucial role in major discoveries in cosmology. Unlike laboratory experiments, astronomical measurements cannot be performed under controlled conditions. Thus, errors in astronomical data can be more severe in terms of systematics and non-Gaussianity than those of laboratory experiments. In this paper, we use the Kolmogorov-Smirnov statistic to test for non-Gaussianity in high-z supernova data. We apply this statistic to four data sets, i.e., the Gold data (2004), the Gold data (2007), the Union2 catalog, and the Union2.1 data set. Our results show that in all four data sets the errors are consistent with a Gaussian distribution.
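The Kolmogorov-Smirnov check can be sketched as follows (synthetic standardized residuals stand in for the supernova error normalizations):

```python
import numpy as np
from math import erf, sqrt

def ks_statistic_normal(z):
    """One-sample KS statistic of standardized data against N(0, 1)."""
    z = np.sort(np.asarray(z))
    n = len(z)
    cdf = np.array([0.5 * (1.0 + erf(t / sqrt(2.0))) for t in z])
    d_plus = np.max(np.arange(1, n + 1) / n - cdf)
    d_minus = np.max(cdf - np.arange(0, n) / n)
    return max(d_plus, d_minus)

rng = np.random.default_rng(3)
resid = rng.normal(0.0, 1.0, 1000)  # hypothetical normalized residuals
D = ks_statistic_normal(resid)
crit_5pct = 1.36 / np.sqrt(1000)    # large-sample 5% critical value
print(D, crit_5pct)
```

If D stays below the critical value, the Gaussian hypothesis is not rejected at the 5% level, which is the sense in which the four catalogs are found "consistent with a Gaussian distribution."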
Funding: supported by the Open Research Foundation of Science and Technology on Aerospace Flight Dynamics Laboratory (No. 2012afd1010).
Abstract: Conventional trajectory optimization techniques have been challenged by their inability to handle threats with irregular shapes and their sensitivity to control variations of the aircraft. To overcome these difficulties, this paper presents an alternative approach to trajectory optimization, in which the problem is formulated as a parametric optimization of the maneuver variables under a tactics template framework. To reduce the size of the problem, global sensitivity analysis (GSA) is performed to identify the less-influential maneuver variables. The probability collectives (PC) algorithm, which is well suited to discrete and discontinuous optimization, is applied to solve the trajectory optimization problem. The robustness of the trajectory is assessed through multiple sampling around the chosen values of the maneuver variables. Meta-models based on radial basis functions (RBF) are created to evaluate the means and deviations of the problem objectives and constraints. To guarantee approximation accuracy, the meta-models are adaptively updated during optimization. The proposed approach is demonstrated on a typical air-ground attack mission scenario. Results reveal that the proposed approach is capable of generating robust and optimal trajectories with both accuracy and efficiency.
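The RBF meta-modeling step can be sketched with a plain Gaussian-RBF interpolant (a toy objective stands in for the expensive trajectory evaluation; the adaptive updating loop is omitted):

```python
import numpy as np

def rbf_fit(X, y, eps=1.0):
    """Fit a Gaussian RBF interpolant through the training points."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
    Phi = np.exp(-eps * d2)          # symmetric positive-definite for distinct points
    return np.linalg.solve(Phi, y)   # interpolation weights

def rbf_predict(Xq, X, w, eps=1.0):
    d2 = np.sum((Xq[:, None, :] - X[None, :, :]) ** 2, axis=2)
    return np.exp(-eps * d2) @ w

# Toy objective standing in for an expensive trajectory evaluation
f = lambda X: np.sin(X[:, 0]) + X[:, 1] ** 2
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (40, 2))          # sampled maneuver-variable settings
w = rbf_fit(X, f(X))
Xq = rng.uniform(-0.8, 0.8, (5, 2))      # query points inside the training hull
err = np.max(np.abs(rbf_predict(Xq, X, w) - f(Xq)))
print(err)
```

In the paper's setting, one such surrogate would be built for each objective and constraint statistic (mean and deviation) and refined as new samples arrive.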
Funding: supported by the National Natural Science Foundation of China (No. NSFC51608446) and the Fundamental Research Fund for Central Universities of China (No. 3102016ZY015).
Abstract: Traditional Global Sensitivity Analysis (GSA) focuses on ranking inputs according to their contributions to the output uncertainty. However, information about how specific regions inside an input affect the output is beyond traditional GSA techniques. To fully address this issue, two regional moment-independent importance measures, the Regional Importance Measure based on the Probability Density Function (RIMPDF) and the Regional Importance Measure based on the Cumulative Distribution Function (RIMCDF), are introduced in this work to determine the contributions of specific regions of an input to the whole output distribution. The two regional importance measures prove to be reasonable supplements to traditional GSA techniques. The ideas of RIMPDF and RIMCDF are applied in two engineering examples to demonstrate that regional moment-independent importance analysis can add information concerning the contributions of model inputs.
Funding: supported by the National Natural Science Foundation of China (51408444, 51708428).
Abstract: The aim of this paper is to propose a theoretical approach for performing non-probabilistic reliability analysis of structures. Owing to the many uncertainties and limited measured data in engineering practice, the structural uncertain parameters are described as interval variables. The theoretical analysis model was developed starting from the 2-D plane and the 3-D space. To avoid missing probable failure points, the 2-D plane and 3-D space were divided into two and three parts, respectively, for further analysis. The study pointed out that the probable failure points exist only among the extreme points and root points of the limit state function. Furthermore, the low-dimensional analytical scheme was extended to the high-dimensional case. Using the proposed approach, it is easy to find the most probable failure point and to acquire the reliability index through direct comparison. A number of equations used for calculating the extreme points and root points were also evaluated. This result is useful for avoiding the loss of probable failure points and meaningful for optimizing searches in this research field. Finally, two kinds of examples were presented and compared with existing computations. The good agreement shows that the proposed theoretical analysis approach is correct. These efforts improve the optimization method, indicate the search direction and path, and avoid searching only for a local optimal solution, which would result in missed probable failure points.
Funding: supported by the Equipment Development Department "13th Five-year" Equipment Research Field Foundation of China Central Military Commission (No. 6140244010216HT15001).
Abstract: Estimating the Probability Density Function (PDF) of the performance function is a direct way to perform structural reliability analysis, as the failure probability can then be easily obtained by integration over the failure domain. However, efficiently estimating the PDF remains an open problem. The existing fractional-moment-based maximum entropy method is a very advanced approach to PDF estimation, but its main shortcoming is that it restricts the reliability analysis to structures with independent inputs. In fact, structures with correlated inputs are common in engineering, so this paper improves the maximum entropy method by applying the Unscented Transformation (UT) technique to compute the fractional moments of the performance function for structures with correlations; UT is a very efficient moment estimation method for models with any inputs. The proposed method can precisely estimate the probability distributions of performance functions for structures with correlated inputs. Moreover, the number of function evaluations required by the proposed method, which is determined by UT, is very small. Several examples are employed to illustrate the accuracy and advantages of the proposed method.
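The Unscented Transformation used to compute moments can be sketched with the standard 2n+1 sigma-point scheme (shown here propagating a simple function; the paper applies it to fractional moments of the performance function):

```python
import numpy as np

def unscented_moments(mean, cov, func, kappa=1.0):
    """Approximate mean/variance of func(X), X ~ N(mean, cov), via sigma points."""
    n = len(mean)
    L = np.linalg.cholesky((n + kappa) * cov)
    pts = [mean] + [mean + L[:, i] for i in range(n)] + [mean - L[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2.0 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    vals = np.array([func(p) for p in pts])
    m = w @ vals
    v = w @ (vals - m) ** 2
    return m, v

# For a linear performance function the UT is exact: g = X1 + X2
mean = np.array([1.0, 2.0])
cov = np.diag([1.0, 4.0])
m, v = unscented_moments(mean, cov, lambda x: x[0] + x[1])
print(m, v)  # 3.0 and 5.0
```

Note the evaluation count: only 2n+1 calls to the performance function, which is why the abstract can claim such a small number of function evaluations.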
Funding: supported by Shanghai Science and Technology Committee (13231204002) and National Key Technology R&D Program of China (2015BAA01B02).
Abstract: A hybrid energy storage system (HESS) plays an important role in balancing cost against performance when stabilizing the fluctuating power of wind farms and photovoltaic (PV) stations. To further bring down the cost and actually implement the dispatchability of wind/PV plants, the major factors that contribute to the cost of any HESS need to be examined. This paper first discusses hybrid energy storage systems, as well as the chemical properties of different storage media, treating the ramp rate as one of the determinants that must be observed in the cost calculation. Then, a mathematical tool, the Copula, is explained in detail for the purpose of unscrambling the dependences between the power of wind and PV plants. To lower the cost, the basic rule for the allocation of buffered power is also put forward, with the minimum energy capacities of the battery ESS (BESS) and the supercapacitor ESS (SC-ESS) simultaneously determined by integration. The paper also introduces a probability method to analyze how power and energy are compensated at a certain confidence level. After that, two coefficients are defined, separately describing the energy storage status and the wind curtailment level. Finally, the paper gives a numerical example stemming from real data acquired from wind farms and PV stations in Belgium. The conclusion is that the cost of a hybrid energy storage system is greatly affected by the ramp rate and by the dependence between the power of wind farms and photovoltaic stations, and that this dependence can easily be determined by Copulas.
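The copula step for capturing wind-PV dependence can be sketched with a bivariate Gaussian copula (a generic illustration with an assumed correlation; the paper fits the copula family to measured plant data):

```python
import numpy as np
from math import erf, sqrt

def gaussian_copula_sample(rho, n, rng):
    """Draw n pairs of uniforms whose dependence follows a Gaussian copula."""
    cov = np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    phi = np.vectorize(lambda t: 0.5 * (1.0 + erf(t / sqrt(2.0))))  # standard normal CDF
    return phi(z)  # marginals are Uniform(0, 1); dependence is preserved

rng = np.random.default_rng(7)
u = gaussian_copula_sample(0.8, 20000, rng)
# Each uniform margin can then be pushed through any marginal model,
# e.g. a Weibull wind-power curve or a clear-sky-indexed PV distribution.
r = np.corrcoef(u[:, 0], u[:, 1])[0, 1]
print(round(r, 2))  # close to (6/pi)*arcsin(rho/2) for a Gaussian copula
```

Separating the dependence structure from the marginals in this way is what lets the paper size the BESS/SC-ESS capacities against the joint wind-PV fluctuation rather than against each plant in isolation.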
Abstract: To improve the nitrogen removal performance of wastewater treatment plants (WWTPs), it is essential to understand the behavior of nitrogen cycling communities, which comprise various microorganisms. This study characterized the quantity and diversity of nitrogen cycling genes in various processes of municipal WWTPs by employing two molecular-based methods: most probable number-polymerase chain reaction (MPN-PCR) and DNA microarray. MPN-PCR analysis revealed that gene quantities were not statistically different among processes, suggesting that conventional activated sludge processes (CAS) are similar to nitrogen removal processes in their ability to retain an adequate population of nitrogen cycling microorganisms. Furthermore, most processes in the WWTPs studied shared a pattern: the nirS and the bacterial amoA genes were more abundant than the nirK and archaeal amoA genes, respectively. DNA microarray analysis revealed that several kinds of nitrification and denitrification genes were detected in both CAS and anaerobic-oxic (AO) processes, whereas limited genes were detected in nitrogen removal processes. The results of this study suggest that CAS maintains a diverse community of nitrogen cycling microorganisms; moreover, the microbial communities in nitrogen removal processes may be specific.