The conventional method, which assumes that the soil cover is continuous, is unsuitable for estimating soil organic carbon density (SOCD) in karst areas because of their discontinuous soil distribution. Accurate estimation of SOCD in karst areas is essential for carbon sequestration assessment in China. In this study, a modified method, which considers the vertical proportion of soil area in the profile when calculating the SOCD, was developed to estimate the SOCD in a typical karst peak-cluster depression area in southwest China. In the modified method, ground-penetrating radar (GPR) was used to detect the distribution and thickness of the soil. The accuracy of the method was confirmed by comparison with a validation method in which soil thickness was measured by excavation. Compared with the conventional method and the average-soil-depth method, the SOCD estimated using the GPR method showed the smallest relative error with respect to that obtained using the validation method. At the regional scale, the average SOCDs at depths of 0-20 cm and 0-100 cm, interpolated by ordinary kriging, were 1.49 (range 0.03-5.65) and 2.26 (range 0.09-11.60) kg m-2, respectively, based on the GPR method over the 393.6 hm2 study area. Therefore, the modified method can be applied to the accurate estimation of SOCD in areas with discontinuous soil cover, such as karst regions.
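As an illustration of the layer-weighted calculation described above, the sketch below computes a profile SOCD with each layer weighted by the proportion of its horizontal area actually occupied by soil (obtained from GPR in the paper). All numbers, including the `soil_area_fraction` values, are hypothetical and not the study's data.

```python
import numpy as np

# Illustrative layer data for one profile (NOT the study's measurements).
thickness = np.array([20.0, 20.0, 20.0, 20.0, 20.0])          # 0-100 cm in 20 cm layers
soc = np.array([22.0, 15.0, 9.0, 6.0, 4.0])                   # organic carbon, g C per kg soil
bulk_density = np.array([1.20, 1.30, 1.35, 1.40, 1.40])       # g/cm3
soil_area_fraction = np.array([0.85, 0.60, 0.40, 0.25, 0.10]) # hypothetical GPR-derived fractions

# Conventional SOCD (kg C/m2), assuming continuous soil in every layer:
# SOCD_layer = SOC (g/kg) * BD (g/cm3) * thickness (cm) / 100
socd_conventional = np.sum(soc * bulk_density * thickness) / 100.0

# Modified SOCD: each layer is weighted by its soil-area proportion,
# so rock-occupied volume contributes nothing.
socd_modified = np.sum(soc * bulk_density * thickness * soil_area_fraction) / 100.0

print(f"conventional SOCD 0-100 cm: {socd_conventional:.2f} kg/m2")
print(f"area-weighted SOCD 0-100 cm: {socd_modified:.2f} kg/m2")
```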
With the recent advent of Intelligent Transportation Systems (ITS) and their associated data collection and archiving capabilities, transportation professionals now have a rich data source for developing capacity values for their own jurisdictions. Unfortunately, there is no consensus on the best approach for estimating capacity from ITS data. The motivation of this paper is to compare and contrast four of the most popular capacity estimation techniques in terms of (1) data requirements, (2) modeling effort required, (3) estimated parameter values, (4) theoretical background, and (5) statistical differences across time and over geographically dispersed locations. Specifically, the first method is the maximum observed value; the second is a standard fundamental-diagram curve-fitting approach using the popular Van Aerde model; the third is the breakdown identification approach; and the fourth is the survival-probability approach based on the product limit method. These four approaches were tested on two test beds: one located in San Diego, California, U.S., with data from 112 work days, and the other in Shanghai, China, with data from 81 work days. It was found that, irrespective of the estimation methodology and the definition of capacity, the estimated capacity can vary considerably over time. The second finding was that, as expected, the different approaches yielded different capacity results; the estimated capacities varied by as much as 26% at the San Diego test site and by 34% at the Shanghai test site. It was also found that each of the methodologies has advantages and disadvantages, and that the best method will be a function of the available data, the application, and the goals of the modeler. Consequently, it is critical for users of automatic capacity estimation techniques that utilize ITS data to understand the underlying assumptions of each approach.
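The sketch below illustrates, on synthetic interval data, three of the estimation routes named above: the maximum observed flow, a breakdown-identification average, and a product-limit (Kaplan-Meier type) capacity distribution. The breakdown model, thresholds and rates are assumptions for illustration, not the paper's data or exact formulations.

```python
import numpy as np

# Synthetic per-interval flows with a flow-dependent breakdown probability
# stand in for real detector data; all values are made up.
rng = np.random.default_rng(0)
flow = rng.uniform(800, 2400, 2000)                      # veh/h/lane in fluid traffic
p_breakdown = 1 / (1 + np.exp(-(flow - 2000) / 80))      # breakdown more likely at high flow
breakdown_follows = rng.random(flow.size) < p_breakdown  # event indicator (uncensored)

# Method 1: maximum observed flow.
cap_max_obs = flow.max()

# Method 2: breakdown identification -- mean of flows immediately preceding a breakdown.
cap_breakdown = flow[breakdown_follows].mean()

# Method 3: product-limit estimate of the capacity distribution; fluid
# intervals not followed by a breakdown are treated as censored.
order = np.argsort(flow)
q, event = flow[order], breakdown_follows[order]
at_risk = np.arange(q.size, 0, -1)                       # intervals with flow >= q_i
survival = np.cumprod(np.where(event, (at_risk - 1) / at_risk, 1.0))
cap_median = np.interp(0.5, 1.0 - survival, q)           # flow at which P(capacity <= q) = 0.5

print(round(cap_max_obs), round(cap_breakdown), round(cap_median))
```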
GMM inference procedures based on the square of the modulus of the model characteristic function are developed, using sample moments selected via estimating function theory and bypassing the empirical characteristic function used in other GMM procedures in the literature. The procedures are relatively simple to implement and are less simulation-oriented than simulated methods of inference, yet have the potential for good efficiency for models whose densities lack a closed form. The procedures also yield better estimators than method-of-moments estimators for models with more than three parameters, since higher-order sample moments tend to be unstable.
The objective of reliability-based design optimization (RBDO) is to minimize the optimization objective while satisfying the corresponding reliability requirements. However, the nested-loop structure reduces the efficiency of RBDO algorithms, which hinders their application to high-dimensional engineering problems. To address this issue, this paper proposes an efficient decoupled RBDO method combining high-dimensional model representation (HDMR) and the weight-point estimation method (WPEM). First, the RBDO model is decoupled using HDMR and WPEM. Second, Lagrange interpolation is used to approximate each univariate function. Finally, based on the results of the first two steps, the original nested-loop reliability optimization model is transformed into a deterministic design optimization model that can be solved by a range of mature constrained optimization methods without any additional calculations. Two numerical examples, a planar 10-bar structure and an aviation hydraulic piping system with 28 design variables, are analyzed to illustrate the performance and practicability of the proposed method.
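The following sketch shows only the Lagrange-interpolation step applied to a hypothetical univariate component function; the HDMR decomposition and the weight-point estimation itself are not reproduced.

```python
import numpy as np
from scipy.interpolate import lagrange

# Hypothetical univariate component of a performance function, of the kind
# HDMR would produce; only the interpolation step is illustrated here.
def g_univariate(x):
    return np.exp(-0.5 * x) + 0.1 * x**3

nodes = np.linspace(-2.0, 2.0, 5)                # interpolation nodes
poly = lagrange(nodes, g_univariate(nodes))      # Lagrange interpolating polynomial

x_test = np.linspace(-2.0, 2.0, 9)
print("max interpolation error:", np.max(np.abs(poly(x_test) - g_univariate(x_test))))
```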
This paper discusses the main sources of scatter in experimental results for material parameters. They can be divided into two parts. One comprises the experimental errors introduced by the inaccuracy of the experimental equipment, the experimental techniques, etc.; the form of scatter caused by this source is called the external distribution. The other is due to the irregularity and inhomogeneity of the material structure and the randomness of the deformation process; the scatter caused by this source is inherent, and this form of scatter is called the internal distribution. The experimental distribution of a material parameter combines these two distributions; it is therefore a sum distribution of the external and internal distributions. In view of this, a general method for analysing the influence of experimental errors on the experimental results is presented, and three criteria for evaluating this influence are defined. An example in which the fracture toughness KIC is analysed shows that the method is reasonable, convenient and effective.
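A minimal numerical sketch of the sum-distribution idea, assuming the external (error) and internal (material) scatter are independent so that their variances add; the toughness values and error level are made up.

```python
import numpy as np

# If measurement error and intrinsic material scatter are independent, the
# observed variance is their sum, so the internal scatter can be backed out.
observed = np.array([58.1, 61.4, 55.9, 63.2, 59.7, 57.5, 62.0, 60.3])  # made-up KIC tests
sigma_error = 1.5      # assumed standard deviation of the experimental (external) error

var_internal = observed.var(ddof=1) - sigma_error**2
print("estimated internal (material) std:", round(float(np.sqrt(max(var_internal, 0.0))), 2))
```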
Two aspects of a new method that can be used for seismic zoning are introduced in this paper. On the one hand, the approach proposed by Kijko and Sellevoll for estimating the b value and the annual activity rate requires an earthquake catalogue. The existing catalogue contains both historical and recent instrumental data sets, and it is inadequate to use only one part. By combining the large number of historical events with the recent complete records and taking magnitude uncertainty into account, Kijko's method gives maximum likelihood estimates of the b value and the annual activity rate, which may be more realistic. On the other hand, the method considers source-zone boundary uncertainty in seismic hazard analysis, so that the earthquake activity rate changes smoothly rather than abruptly across a source-zone boundary, avoiding too large a gradient in the calculated results.
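For context, the sketch below computes the standard Aki-Utsu maximum-likelihood b-value and a simple annual rate for a complete catalogue; Kijko's extension to mixed historical/instrumental catalogues with magnitude uncertainty is not reproduced. The magnitudes and observation window are illustrative.

```python
import numpy as np

# Aki-Utsu maximum-likelihood b-value for a COMPLETE catalogue above Mc.
mags = np.array([4.1, 4.3, 4.0, 5.2, 4.6, 4.8, 4.0, 4.2, 5.5, 4.4])  # illustrative magnitudes
Mc = 4.0        # magnitude of completeness
dM = 0.1        # magnitude binning

b = np.log10(np.e) / (mags.mean() - (Mc - dM / 2))
rate = len(mags) / 30.0    # annual activity rate above Mc over an assumed 30-year complete window
print(f"b = {b:.2f}, annual rate above Mc = {rate:.2f}")
```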
This article compares the probability method and the least squares method in the design of linear predictive models. It points out that the two approaches have distinct theoretical foundations and can lead to different or similar results in terms of precision and performance, depending on the underlying assumptions. The article underlines the importance of comparing the two approaches in order to choose the one best suited to the context, the available data and the modeling objectives.
Studies in the coastal area of Bohai Bay, China, from July 2006 to October 2007 suggest that the method used to estimate meiofaunal biomass affects the meiofaunal analysis. Conventional estimation methods that apply a single mean individual weight for nematodes when calculating total biomass may bias the results. A modified estimation method, named the Subsection Count Method (SCM), was also used to calculate meiofaunal biomass; it entails only a slight increase in workload but generates more accurate results. Results obtained with the two methods were compared in the present study. The results show that the biomass estimated by the conventional method generally deviated from that estimated by the SCM, and the difference between the two estimation methods was highly significant (P < 0.01) for the spring and winter cruises. Furthermore, the estimation method for meiofaunal biomass affected the analysis of horizontal distribution and of correlations with environmental factors. These findings highlight the importance of the estimation method for meiofaunal biomass and will hopefully stimulate further investigation and discussion of the topic.
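A hedged sketch of the contrast described above: the conventional estimate applies one mean individual weight to the total count, while a subsection-based estimate applies a separate mean weight per subsection (assumed here to be body-size classes; the paper's actual subsection scheme may differ). All numbers are invented.

```python
import numpy as np

# Conventional route: total count x one mean individual weight.
# Subsection route: per-subsection counts x per-subsection mean weights.
counts = np.array([420, 310, 150, 40])                 # individuals per sample, by subsection
weight_per_class = np.array([0.05, 0.15, 0.40, 1.20])  # assumed mean dry weight per individual (ug)
single_mean_weight = 0.40                              # ug, the conventional fixed value (assumed)

biomass_conventional = counts.sum() * single_mean_weight
biomass_subsection = np.sum(counts * weight_per_class)
print(biomass_conventional, biomass_subsection)        # ug dry weight per sample
```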
In this study, a multivariate local quadratic polynomial regression (MLQPR) method is proposed to model the sludge volume index (SVI). In MLQPR, a quadratic polynomial regression function is established to describe the relationship between SVI and the relevant variables, and the important terms of the function are determined by significance tests of the corresponding coefficients. Moreover, a local estimation method is introduced to adjust the weights of the quadratic polynomial regression function and improve model accuracy. Finally, the proposed method is applied to predict SVI values in a real wastewater treatment process (WWTP). The experimental results demonstrate that the proposed MLQPR method is faster in testing and more accurate than several existing methods.
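A generic sketch of locally weighted quadratic regression with two inputs, illustrating the kernel weighting and the weighted least-squares fit; the paper's term selection by significance testing and its exact weighting scheme are not reproduced, and the data are synthetic.

```python
import numpy as np

def local_quadratic_predict(X, y, x0, bandwidth=1.0):
    """Predict y at x0 with a locally weighted quadratic polynomial (two inputs)."""
    d = np.linalg.norm(X - x0, axis=1)
    w = np.exp(-0.5 * (d / bandwidth) ** 2)            # Gaussian kernel weights

    def design(Z):
        x1, x2 = Z[:, 0], Z[:, 1]
        return np.column_stack([np.ones(len(Z)), x1, x2, x1**2, x2**2, x1 * x2])

    A, W = design(X), np.diag(w)
    beta = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)   # weighted least squares
    return design(x0[None, :]) @ beta

# Synthetic example: an SVI-like response driven by two process variables.
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(200, 2))
y = 80 + 40 * X[:, 0] ** 2 - 25 * X[:, 0] * X[:, 1] + rng.normal(0, 2, 200)
print(local_quadratic_predict(X, y, np.array([0.5, 0.5])))
```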
The variation of deep rock stress in gravitational and tectonic stress fields is analyzed on the basis of the Hoek-Brown strength criterion. In a gravitational stress field, the rocks in the shallow zone are in an elastic state, while deep, relatively soft rock may be in a plastic state. In a tectonic stress field, by contrast, the relatively soft rock in the shallow zone is in a plastic state and the deep rock is in an elastic state. A method is proposed to estimate stress values in coal and soft rock from in-situ measurements in hard rock. The estimation method depends on the type of stress field and the stress state. Equations for the rock stress are presented for the elastic, plastic and critical states. The critical state is a special stress state that marks the conversion from the elastic to the plastic state in a gravitational stress field, and from the plastic to the elastic state in a tectonic stress field. Two case studies show that the estimation method is feasible.
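The sketch below evaluates the generalised Hoek-Brown strength envelope at an assumed confinement and compares it with a gravitational stress estimate to flag elastic versus yielded conditions; the rock parameters and the lateral stress ratio are illustrative assumptions, not the paper's values.

```python
import numpy as np

def hoek_brown_sigma1(sigma3, sigma_ci, mb, s, a):
    """Generalised Hoek-Brown major principal stress at failure (MPa)."""
    return sigma3 + sigma_ci * (mb * sigma3 / sigma_ci + s) ** a

# Illustrative parameters for a relatively soft rock (not from the paper).
sigma_ci, mb, s, a = 10.0, 1.5, 1e-4, 0.5
gamma = 0.027                                # unit weight, MN/m3
depth = np.array([300.0, 800.0, 1500.0])     # m
sigma_v = gamma * depth                      # vertical (gravitational) stress, MPa
sigma_h = 0.5 * sigma_v                      # assumed lateral stress in a gravitational field

strength = hoek_brown_sigma1(sigma_h, sigma_ci, mb, s, a)
state = np.where(sigma_v < strength, "elastic", "plastic (yielded)")
print(list(zip(depth, np.round(sigma_v, 1), np.round(strength, 1), state)))
```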
Significant wave height is an important criterion in the design of coastal and offshore structures. Based on the orthogonality principle, the linear mean square estimation method is applied in this paper to calculate significant wave height. Twenty-eight-year time series of wave data collected from three ocean buoys near San Francisco along the California coast are analyzed. It is shown theoretically that the computation error is reduced by using as many measured data as possible in the calculation. The measured significant wave height at one buoy location is compared with the value calculated from the data of the two other adjacent buoys. The results indicate that the linear mean square estimation method is well suited to the calculation and prediction of significant wave height in coastal regions.
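A minimal sketch of the linear mean square estimator: by the orthogonality principle the optimal weights satisfy the normal equations, solved here for synthetic wave-height series standing in for the buoy records.

```python
import numpy as np

# Synthetic, correlated significant-wave-height records at three buoys (m).
rng = np.random.default_rng(2)
common = rng.gamma(shape=2.0, scale=1.2, size=5000)
h0 = common + rng.normal(0, 0.15, 5000)        # target buoy
h1 = 0.9 * common + rng.normal(0, 0.2, 5000)   # neighbouring buoy 1
h2 = 1.1 * common + rng.normal(0, 0.2, 5000)   # neighbouring buoy 2

# Orthogonality principle: the optimal linear weights satisfy R a = r.
X = np.column_stack([np.ones_like(h1), h1, h2])   # include a constant term
a = np.linalg.solve(X.T @ X, X.T @ h0)

h0_hat = X @ a
rmse = np.sqrt(np.mean((h0_hat - h0) ** 2))
print("weights:", np.round(a, 3), " RMSE:", round(float(rmse), 3), "m")
```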
A new method is presented for calibrating array gain and phase uncertainties, which severely degrade the performance of spatial spectrum estimation. The method is based on the idea of the instrumental sensors method (ISM): two well-calibrated sensors are added to the original array. By applying the principle of estimation of signal parameters via rotational invariance techniques (ESPRIT), the directions of arrival (DOAs) and the uncertainties can be estimated simultaneously through eigen-decomposition. Compared with conventional methods, the new method has lower computational complexity and higher estimation precision, and it overcomes the ambiguity problem. Both theoretical analysis and computer simulations show the effectiveness of the proposed method.
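For reference, the sketch below runs plain least-squares ESPRIT on an ideal uniform linear array; the paper's instrumental-sensor calibration of gain and phase errors is not reproduced, and all scenario values are made up.

```python
import numpy as np

rng = np.random.default_rng(3)
M, d, snapshots = 8, 0.5, 500                        # d = element spacing in wavelengths
true_doas = np.deg2rad([-20.0, 15.0])

m = np.arange(M)[:, None]
A = np.exp(-2j * np.pi * d * m * np.sin(true_doas)[None, :])   # ideal steering matrix
S = rng.normal(size=(2, snapshots)) + 1j * rng.normal(size=(2, snapshots))
N = 0.1 * (rng.normal(size=(M, snapshots)) + 1j * rng.normal(size=(M, snapshots)))
X = A @ S + N

Rxx = X @ X.conj().T / snapshots
eigvals, eigvecs = np.linalg.eigh(Rxx)
Es = eigvecs[:, -2:]                                 # signal subspace (two sources)
Phi = np.linalg.lstsq(Es[:-1], Es[1:], rcond=None)[0]   # rotation between the two subarrays
phases = np.angle(np.linalg.eigvals(Phi))
doas = np.rad2deg(np.arcsin(-phases / (2 * np.pi * d)))
print(np.sort(doas))                                 # should be close to -20 and 15 degrees
```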
Based on the principle of conservative matter removal in estuaries, a new method is proposed for estimating the ratio of sediment resuspension in estuaries with fine suspended sediments, and is applied to the turbidity maximum zone (TMZ) of the Changjiang (Yangtze) estuary during 2005. The results show that resuspended sediment accounted for between 18.7% ± 27.9% and 73.9% ± 22.5% of the total suspended particulate matter (SPM) over the year, with an average of 49.2%; nearly half of the particulate matter in the TMZ therefore originates from sediment resuspension. This indicates that sediment resuspension is one of the major mechanisms in the formation of the TMZ. Compared with the traditional method for calculating these ratios, the new method accounts for the dynamic variation of the SPM content carried by river runoff from the river mouth to the ocean. The new method produced more reliable results than the traditional one and can give a better estimate of the resuspension flux of particulate matter in estuaries.
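A hedged sketch of one reading of the conservative-removal principle: the SPM expected from conservative mixing (salinity as the tracer) is subtracted from the observed SPM, and the excess is attributed to resuspension. The end-member values and station data are invented, and the paper's actual formulation may differ.

```python
import numpy as np

# End-members of the mixing line (illustrative values, g/L).
sal_river, spm_river = 0.1, 0.45
sal_sea, spm_sea = 32.0, 0.02

salinity = np.array([2.0, 8.0, 15.0, 22.0])      # hypothetical stations across the TMZ
spm_obs = np.array([1.10, 1.60, 0.90, 0.30])     # observed SPM, g/L

frac_sea = (salinity - sal_river) / (sal_sea - sal_river)
spm_conservative = spm_river + frac_sea * (spm_sea - spm_river)
resuspension_ratio = (spm_obs - spm_conservative) / spm_obs
print(np.round(resuspension_ratio, 2))           # fraction of SPM attributed to resuspension
```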
Based on a review of existing force-coefficient estimation methods, a new frequency-domain method, the revised cross-spectrum estimation method, is presented in this paper. Experiments on wave-current forces on inclined cylinders are also described, and the wave-current force coefficients are estimated with the revised cross-spectrum estimation method. The results show that the wave and current directions have a regular effect on the coefficients. Based on these results, empirical formulas are obtained for converting the wave-current force coefficients on inclined cylinders into a unified coefficient. Comparisons show that the unified coefficients are in good agreement with other published results.
Following the principle that "failure data are the basis of software reliability analysis", we built a software reliability expert system (SRES) using artificial intelligence technology. By reasoning over the results of fitting the failure data of a software project, the SRES can recommend to users the most suitable model as the software reliability measurement model. We believe that the SRES can largely overcome the inconsistency seen in applications of software reliability models. We also report investigation results on the singularity and parameter estimation methods of the experimental models in the SRES.
A method of source depth estimation based on multi-path time-delay differences is proposed. When the earliest arrivals at all receiver depths are snapped to a common time on the time delay-depth plane, the delay curves of the surface-bottom and bottom-surface reflections intersect at the source depth. At least two hydrophones deployed vertically with a certain spacing are required. If the receiver depths are known, the pair of time delays can be used to estimate the source depth. With the proposed method, the source depth can be estimated successfully at moderate ranges in the deep ocean, without complicated matched-field calculations, in both simulations and experiments.
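An isovelocity, image-source sketch of the idea: surface- and bottom-reflection delays relative to the direct arrival are predicted at two receiver depths and matched to the observed delays by a grid search over candidate source depths. The geometry and numbers are illustrative, not the paper's scenario.

```python
import numpy as np

c, D, r = 1500.0, 4000.0, 8000.0          # sound speed (m/s), water depth (m), range (m)
zr = np.array([1000.0, 1500.0])           # two receiver depths (m)
zs_true = 300.0                           # "unknown" source depth used to fake measurements

def delays(zs, zr):
    direct = np.hypot(r, zr - zs)
    surface = np.hypot(r, zr + zs)               # image source mirrored above the surface
    bottom = np.hypot(r, 2 * D - zr - zs)        # image source mirrored below the bottom
    return (surface - direct) / c, (bottom - direct) / c

tau_obs = np.concatenate(delays(zs_true, zr))    # stand-in for measured multipath delays

candidates = np.arange(10.0, 1000.0, 1.0)
cost = [np.sum((np.concatenate(delays(z, zr)) - tau_obs) ** 2) for z in candidates]
print("estimated source depth:", candidates[int(np.argmin(cost))], "m")
```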
Accurate assessment of herbage mass (HM) in pasture is key to budgeting forage in grazing systems worldwide. Different non-destructive techniques for measuring pasture yield are reviewed. The methods compared include visual estimation, manual and electronic pasture meters, and remote sensing. All methods are associated with moderate to high error, showing that some indirect methods of yield estimation are appropriate under certain conditions. In general terms, no method was found to be the most appropriate, because many factors such as climate variation, soil characteristics, plant phenology, pasture management and species composition must be taken into account when making local calibrations from a general model. The best results were obtained by modifying general methods with local calibrations under local conditions. In order to give farmers the best method for managing their own grazing systems adequately, researchers must select the most suitable technique considering the scale of operation, the desired accuracy and the resources available.
Source term identification is very important in contaminant gas emission events, so it is necessary to develop source parameter estimation methods with high computational efficiency, high estimation accuracy and reasonable confidence intervals. The Tikhonov regularization method is potentially a good tool for identifying source parameters, but it is invalid for nonlinear inverse problems such as the gas emission process. The 2-step nonlinear and linear PSO (particle swarm optimization)-Tikhonov regularization methods proposed previously estimate emission source parameters successfully, but problems remain in computational efficiency and confidence intervals. Hence, a new 1-step nonlinear method combining Tikhonov regularization and the PSO algorithm with a nonlinear forward dispersion model is proposed. First, the method was tested on simulation and experimental cases; the results showed that the 1-step nonlinear hybrid method is able to estimate multiple source parameters with reasonable confidence intervals. Then, the estimation performance of the different methods was compared across cases. The estimates from the 1-step nonlinear method were close to those from the 2-step nonlinear and linear PSO-Tikhonov regularization methods, and the 1-step method even performed better in some cases, especially for the estimation of source strength and downwind distance. Compared with the 2-step nonlinear method, the 1-step method has higher computational efficiency, and its confidence intervals appear more reasonable than those of the other two methods. Finally, the single PSO algorithm was compared with the 1-step nonlinear PSO-Tikhonov hybrid regularization method. The skill scores of the 1-step nonlinear hybrid method were close to those of the single PSO method and even better in some cases; moreover, the 1-step method provides reasonable confidence intervals, which the single PSO algorithm does not. Therefore, the 1-step nonlinear hybrid regularization method proposed in this paper is potentially a good method for estimating contaminant gas emission source terms.
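The sketch below shows only the linear Tikhonov-regularised step for source strength, assuming the source-receptor relation has been linearised into a hypothetical sensitivity matrix; the nonlinear dispersion model and the PSO coupling of the 1-step method are not reproduced.

```python
import numpy as np

# Hypothetical linearised forward model: A[i, j] = concentration at sensor i
# per unit emission rate of candidate source j. All values are invented.
rng = np.random.default_rng(4)
A = rng.uniform(0.01, 0.2, size=(12, 3))      # 12 sensors, 3 candidate sources
q_true = np.array([5.0, 0.0, 2.0])            # "true" emission rates, g/s
y = A @ q_true + rng.normal(0, 0.02, 12)      # noisy sensor readings

lam = 0.05                                    # Tikhonov regularisation parameter
q_hat = np.linalg.solve(A.T @ A + lam * np.eye(3), A.T @ y)
print(np.round(q_hat, 2))                     # regularised source-strength estimates
```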
Stem diameter distribution information is useful in forest management planning. The Weibull function is flexible and has been used worldwide to characterise diameter distributions, especially in single-species planted stands. We evaluated several Weibull parameter estimation methods for stem diameter characterisation in the multi-species Oban Forest in southern Nigeria. Four study sites (Aking, Ekang, Erokut and Ekuri) were selected. In each location, four 2 km long transects were laid 600 m apart, and five 50 m x 50 m plots were laid alternately along each transect at 400 m intervals (20 plots per location) using a systematic sampling technique. Tree growth variables, namely diameter at breast height (Dbh), diameters at the base, middle and merchantable limit, total height, merchantable height, stem straightness, crown length and crown diameter, were measured on all trees > 10 cm to compute model response variables such as mean diameters, basal area and stem volume. The Weibull parameter estimation methods used were moment-based, percentile-based, hybrid and maximum-likelihood (ML). Data were analysed using descriptive statistics, regression models and ANOVA at α = 0.05. The percentile-based method was the best for estimating the Weibull location (a), scale (b) and shape (c) parameters, with mLogL = 116.66 ± 21.89, while the hybrid method was least suitable (mLogL = 690.14 ± 128.81). Quadratic mean diameter (Dq) was the only suitable predictor of Weibull parameters in Oban Forest.
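The sketch below fits a three-parameter Weibull to synthetic diameters by maximum likelihood (scipy) and also sketches a percentile-based route for the scale and shape with the location fixed; the paper's exact percentile estimators may differ, and the data are simulated.

```python
import numpy as np
from scipy import stats

# Synthetic Dbh sample (cm) from a 3-parameter Weibull; not field data.
rng = np.random.default_rng(5)
dbh = stats.weibull_min.rvs(c=2.2, loc=10.0, scale=25.0, size=400, random_state=rng)

# Maximum likelihood: shape c, location a, scale b.
c_ml, a_ml, b_ml = stats.weibull_min.fit(dbh)

# Percentile-based route with the location fixed at the 10 cm limit (an assumption):
# the 63.2nd percentile of (x - a) gives b, and the 25th/75th percentile spread gives c.
a0 = 10.0
p25, p632, p75 = np.percentile(dbh - a0, [25, 63.2, 75])
b_pct = p632
c_pct = np.log(np.log(0.25) / np.log(0.75)) / np.log(p75 / p25)

print("ML:", round(c_ml, 2), round(a_ml, 2), round(b_ml, 2),
      "| percentile:", round(c_pct, 2), round(b_pct, 2))
```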
A common statistical procedure for describing an observed data set is to use its conventional moments or cumulants. When an appropriate parametric distribution is chosen for the data, its parameters are typically estimated by the method of moments, i.e., by setting the sample conventional moments equal to the corresponding moments of the theoretical distribution. However, moment-based parameter estimation is not always convenient, especially for small samples. An alternative approach is based on other characteristics, which the author calls L-moments. L-moments are analogous to conventional moments, but they are based on linear combinations of order statistics, i.e., L-statistics. L-moments are theoretically preferable to conventional moments in that they characterise a wider range of distributions. When estimated from a sample, L-moments are more robust to the presence of outliers in the data. Experience also shows that, compared with conventional moments, L-moment estimates are less prone to bias. Parameter estimates obtained from L-moments, particularly for small samples, are often even more accurate than maximum likelihood estimates. The use of L-moments for small data sets, for example in meteorology, is well known in the statistical literature. This paper deals with the use of L-moments for large data sets of income distribution (individual data) and wage distribution (data grouped into an interval frequency distribution with open extreme intervals). The paper also compares the accuracy of the method of L-moments with that of other methods of point estimation of the parameters of parametric probability distributions for such large data sets.
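A small sketch of computing the first three sample L-moments directly from probability-weighted moments, on synthetic income-like data; the parametric fitting step itself is not shown.

```python
import numpy as np

def sample_l_moments(x):
    """Return the sample L-mean, L-scale and L-skewness ratio t3."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    # Unbiased probability-weighted moments b0, b1, b2.
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    l1 = b0
    l2 = 2 * b1 - b0
    l3 = 6 * b2 - 6 * b1 + b0
    return l1, l2, l3 / l2

rng = np.random.default_rng(6)
income = rng.lognormal(mean=10.0, sigma=0.6, size=10_000)   # synthetic income-like data
l1, l2, t3 = sample_l_moments(income)
print(round(l1), round(l2), round(t3, 3))
```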