In order to reveal the complex network characteristics and evolution principles of the China aviation network, the probability distribution and evolution trace of the average degree of edge vertices of the network were studied based on statistical data for the Chinese civil aviation network in 1988, 1994, 2001, 2008 and 2015. Following the theory and methods of complex networks, the network was constructed with the city where an airport is located as a node and the route between cities as an edge. From the statistical data, the average degrees of edge vertices in the China aviation network in 1988, 1994, 2001, 2008 and 2015 were calculated. Using probability statistical analysis and regression analysis, it was found that the average degree of edge vertices followed a normal probability distribution, and that the location and scale parameters of this distribution followed a linear evolution trace.
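A minimal sketch of these two steps, fitting a normal distribution to one year's average-degree sample and regressing the fitted location parameter against the survey years, is given below; all numbers are illustrative placeholders, not the paper's data.

```python
import numpy as np
from scipy import stats

# Hypothetical average edge-vertex degrees for one survey year
# (illustrative values, not the paper's data).
rng = np.random.default_rng(0)
degrees = rng.normal(loc=12.0, scale=4.0, size=200)

# MLE fit of a normal distribution gives the location and scale parameters.
mu, sigma = stats.norm.fit(degrees)

# With one (mu, sigma) pair per survey year, a linear regression over time
# tests the claimed linear evolution trace of each parameter.
years = np.array([1988, 1994, 2001, 2008, 2015])
mus = np.array([6.1, 7.8, 9.5, 11.2, 13.0])   # placeholder per-year locations
slope, intercept, r, p, se = stats.linregress(years, mus)
print(f"fit: mu={mu:.2f}, sigma={sigma:.2f}; trend: {slope:.3f}/year (r={r:.3f})")
```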
In order to improve the fitting accuracy of college students' test scores, this paper proposes a two-component mixed generalized normal distribution, uses the maximum likelihood method and the Expectation Conditional Maximization (ECM) algorithm to estimate its parameters and conduct numerical simulation, and performs a fitting analysis of the test scores in Linear Algebra and Advanced Mathematics at F University. The empirical results show that the two-component mixed generalized normal distribution fits college students' test data better than the commonly used two-component mixed normal distribution and has good application value.
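The sketch below fits such a two-component generalized normal mixture with a plain EM loop whose M-steps are solved numerically, a simplification of the paper's ECM algorithm; the scores and starting values are synthetic.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(1)
# Synthetic scores from two generalized normal components (placeholder parameters).
x = np.concatenate([stats.gennorm.rvs(2.5, loc=60, scale=12, size=300, random_state=rng),
                    stats.gennorm.rvs(1.5, loc=85, scale=6,  size=200, random_state=rng)])

def neg_wll(params, x, w):
    # Negative weighted log-likelihood of one generalized normal component.
    beta, loc, log_scale = params
    if beta <= 0.1:
        return np.inf
    return -np.sum(w * stats.gennorm.logpdf(x, beta, loc=loc, scale=np.exp(log_scale)))

pi = np.array([0.5, 0.5])
theta = [np.array([2.0, 55.0, np.log(10.0)]), np.array([2.0, 90.0, np.log(10.0)])]
for _ in range(50):
    # E-step: responsibilities under the current parameters.
    dens = np.array([pi[k] * stats.gennorm.pdf(x, theta[k][0], loc=theta[k][1],
                                               scale=np.exp(theta[k][2])) for k in range(2)])
    r = dens / dens.sum(axis=0)
    pi = r.mean(axis=1)
    # M-steps: update each component by numerical weighted MLE.
    for k in range(2):
        theta[k] = optimize.minimize(neg_wll, theta[k], args=(x, r[k]),
                                     method="Nelder-Mead").x
print("mixing weights:", np.round(pi, 3))
```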
The aim of this paper is to present the generalized log-Lindley (GLL) distribution as a new model and to derive the doubly truncated generalized log-Lindley (DTGLL) distribution; truncation of probability distributions may occur in many studies, such as life testing and reliability. We illustrate the applicability of the GLL and DTGLL distributions with a real data application. The GLL distribution can handle risk rate functions in the form of panich and increase, a property that makes the GLL useful in survival analysis. Various statistical and reliability measures are obtained for the model, including the hazard rate function, moments, the moment generating function, mean and variance, the quantile function, skewness and kurtosis, mean deviations, mean inactivity time and strong mean inactivity time. The model parameters are estimated by the maximum likelihood method. An application to real data shows that the DTGLL distribution can fit better than the GLL and some other known distributions.
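Since the GLL density is not reproduced in this abstract, the sketch below shows the generic doubly truncated likelihood machinery with a log-normal stand-in for the base model: for base pdf f and cdf F truncated to [a, b], each observation contributes f(x)/(F(b) − F(a)).

```python
import numpy as np
from scipy import stats, optimize

# Doubly truncated MLE sketch with a log-normal stand-in base distribution
# (the paper's GLL pdf would replace it).
a, b = 1.0, 10.0
rng = np.random.default_rng(2)
raw = stats.lognorm.rvs(0.6, scale=3.0, size=5000, random_state=rng)
x = raw[(raw >= a) & (raw <= b)]          # observed doubly truncated sample

def nll(params):
    # Negative log-likelihood of the truncated model f(x) / (F(b) - F(a)).
    s, scale = params
    if s <= 0 or scale <= 0:
        return np.inf
    mass = stats.lognorm.cdf(b, s, scale=scale) - stats.lognorm.cdf(a, s, scale=scale)
    return -(np.sum(stats.lognorm.logpdf(x, s, scale=scale)) - len(x) * np.log(mass))

res = optimize.minimize(nll, [1.0, 2.0], method="Nelder-Mead")
print("truncated-MLE estimates (s, scale):", np.round(res.x, 3))
```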
Modeling time headways between vehicles has recently attracted increasing interest in the traffic flow research field, because the corresponding statistics help to reveal the intrinsic interactions governing vehicle dynamics. However, most previous micro-simulation models cannot yield the observed log-normally distributed headways. This paper designs a new car-following model inspired by the Galton board to reproduce the observed time-headway distributions as well as complex traffic phenomena. The consistency between the empirical data and the simulation results indicates that this new car-following model provides a reasonable description of car-following behaviours.
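The Galton-board intuition can be illustrated independently of the full car-following model: many small independent multiplicative perturbations to a headway produce a log-normal distribution, by the central limit theorem applied to the log headway. A hedged sketch with arbitrary kick sizes:

```python
import numpy as np
from scipy import stats

# Galton-board intuition: repeated binary multiplicative kicks to a headway
# give a log-normal distribution (CLT in the log domain). Kick sizes and the
# 2-second base headway are illustrative, not the paper's calibration.
rng = np.random.default_rng(3)
n_steps, n_vehicles = 200, 10000
kicks = rng.choice([0.99, 1.01], size=(n_vehicles, n_steps))
headways = 2.0 * kicks.prod(axis=1)                  # seconds

shape, loc, scale = stats.lognorm.fit(headways, floc=0)
ks = stats.kstest(headways, "lognorm", args=(shape, loc, scale))
print(f"log-normal fit: sigma={shape:.3f}, median={scale:.2f}s, KS p={ks.pvalue:.3f}")
```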
In order to determine the life of a vacuum fluorescent display (VFD) in a shorter time and to reduce the cost of life prediction, a constant-step stress accelerated life test was performed with the cathode temperature increased. Statistical analysis was done by applying the logarithmic normal distribution to describe the life and the least squares method (LSM) to estimate the logarithmic normal parameters. Self-designed software was used to predict the VFD life. Numerical results verify that the VFD life follows a logarithmic normal distribution and that the life-stress relationship satisfies the linear Arrhenius equation. Accurate calculation of the key parameters enables rapid estimation of the VFD life.
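A sketch of the statistical step, least squares on the linearized Arrhenius relation ln L = a + b/T assuming log-normal life at each stress, is shown below with placeholder temperatures and lives.

```python
import numpy as np
from scipy import stats

# Arrhenius life-stress sketch: if life is log-normal at each stress level and
# ln(median life) = a + b / T, least squares on (1/T, ln life) recovers a and b.
# Temperatures (K) and median lives (h) are illustrative placeholders.
T = np.array([720.0, 760.0, 800.0, 840.0])
median_life = np.array([9000.0, 4200.0, 2100.0, 1100.0])

slope, intercept, r, p, se = stats.linregress(1.0 / T, np.log(median_life))
T_use = 650.0                                  # assumed normal operating temperature, K
predicted = np.exp(intercept + slope / T_use)  # extrapolated median life
print(f"b={slope:.1f} K, a={intercept:.2f}, life at {T_use:.0f} K: {predicted:.0f} h")
```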
Microlatex particles of emulsion explosives, measured by microphotography, were studied with the logarithmic Gaussian (log-normal) distribution law, and the results showed that the microlatex particles indeed follow this law. The statistical average particle diameters, such as DNL, DNS, DLS, DSV and DVM, were calculated from the logarithmic Gaussian distribution diagram of the microlatex particles of the emulsion explosives, as was SW.
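Assuming the DNL, DNS, DLS, DSV and DVM notations correspond to the usual moment-ratio mean diameters, these averages follow in closed form from the log-normal parameters via E[D^k] = exp(kμ + k²σ²/2), a Hatch-Choate-style relation; a sketch:

```python
import numpy as np

# For a log-normal size distribution with log-mean mu and log-sd sigma,
# E[D^k] = exp(k*mu + (k*sigma)**2 / 2), so any moment-ratio mean diameter
# D_pq = (E[D^p] / E[D^q]) ** (1 / (p - q)) has a closed form.
mu, sigma = np.log(2.0), 0.5        # illustrative parameters (D in micrometres)

def moment(k):
    return np.exp(k * mu + 0.5 * (k * sigma) ** 2)

def d_pq(p, q):
    return (moment(p) / moment(q)) ** (1.0 / (p - q))

print("number mean D10:", round(d_pq(1, 0), 3))
print("surface-length D21:", round(d_pq(2, 1), 3))
print("Sauter mean D32:", round(d_pq(3, 2), 3))
print("volume-moment D43:", round(d_pq(4, 3), 3))
```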
In this paper, we study the connectivity of multihop wireless networks under the log-normal shadowing model by investigating the precise distribution of the number of isolated nodes. Under such a realistic shadowing model, all previously known results on the distribution of the number of isolated nodes were obtained only from simulation studies or by ignoring the important boundary effect to avoid the challenging technical analysis, and thus cannot be applied to practical wireless networks. Taking the complicated boundary effect into consideration under such a realistic model is extremely challenging because the transmission area of each node is an irregular region rather than a circular area. Assume that the wireless nodes are represented by a Poisson point process with density n over a unit-area disk, and that the transmission power is properly chosen so that the expected node degree of the network equals ln n + ξ(n), where ξ(n) approaches a constant ξ as n → ∞. Under such a shadowing model with the boundary effect taken into consideration, we prove that the total number of isolated nodes is asymptotically Poisson with mean e^{-ξ}. Brun's sieve is utilized to derive the precise asymptotic distribution. Our results can serve as design guidelines for any practical multihop wireless network in which both the shadowing and boundary effects must be taken into consideration.
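A Monte Carlo sketch of the isolated-node count is given below. The link model used, connecting nodes at distance d with probability Φ(ln(r0/d)/σs), is an assumed stand-in for the paper's log-normal shadowing model, and r0 is calibrated for the non-shadowed disk model, so the simulated mean only roughly tracks the e^{-ξ} prediction.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n, xi, sigma_s = 500, 0.0, 0.3
R = 1.0 / np.sqrt(np.pi)                    # disk of unit area

def simulate_isolated():
    # Poisson point process with density n on the unit-area disk.
    m = rng.poisson(n)
    r = R * np.sqrt(rng.random(m))
    th = 2 * np.pi * rng.random(m)
    pts = np.c_[r * np.cos(th), r * np.sin(th)]
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    # r0 chosen so the expected degree is ln(n) + xi under the plain disk model.
    r0 = np.sqrt((np.log(n) + xi) / (np.pi * n))
    # Assumed shadowing link probability (a stand-in, not the paper's exact model).
    p = stats.norm.cdf(np.log(r0 / np.maximum(d, 1e-12)) / sigma_s)
    np.fill_diagonal(p, 0.0)
    links = np.triu(rng.random(p.shape) < p, 1)
    links = links | links.T
    return np.count_nonzero(links.sum(axis=1) == 0)

counts = [simulate_isolated() for _ in range(200)]
print("mean isolated:", np.mean(counts), "| Poisson prediction e^(-xi):", np.exp(-xi))
```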
About 170 nations have been affected by the Coronavirus Disease 2019 (COVID-19) epidemic. COVID-19 has placed great stress on governing bodies across the globe, as the count of patients testing positive keeps rising and the situation is challenging to tackle. In this context, most researchers concentrate on COVID-19 data analysis using the machine learning paradigm. In previous works, Long Short-Term Memory (LSTM) was used to predict future COVID-19 cases; according to the LSTM network, the outbreak was expected to finish by June 2020. However, LSTM is prone to over-fitting and may not produce the required true-positive results, and the existing system shows lower accuracy and a higher error rate on the COVID-19 dataset. The proposed method is introduced to overcome these issues. For COVID-19 prediction, a Linear Decreasing Inertia Weight-based Cat Swarm Optimization with Half Binomial Distribution based Convolutional Neural Network (LDIWCSO-HBDCNN) approach is presented. In this study, the COVID-19 prediction dataset is employed as input and normalized with the min-max approach. Optimum features are selected using the Linear Decreasing Inertia Weight-based Cat Swarm Optimization (LDIWCSO) algorithm, enhancing the classification accuracy; the convergence of the Cat Swarm Optimization (CSO) algorithm is improved by the inertia weight, which is used to select the essential features via the best fitness function values. For a specified period across India, deaths and confirmed cases are predicted using the Half Binomial Distribution based Convolutional Neural Network (HBDCNN) technique on the selected features. Empirical observations demonstrate that the proposed system performs significantly well in terms of f-measure, recall, precision, and accuracy.
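Of the pipeline steps, only the min-max normalization is standard enough to sketch without guessing at the proposed architecture; the input series is a placeholder.

```python
import numpy as np

# Min-max normalization used to preprocess the COVID-19 series: rescale each
# feature to [0, 1]. The case counts below are illustrative, not the dataset.
x = np.array([132.0, 650.0, 1024.0, 2301.0, 5407.0])
x_norm = (x - x.min()) / (x.max() - x.min())
print(np.round(x_norm, 4))
```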
Firstly, using a damage model for rock based on the Lemaitre strain-equivalence hypothesis, a new technique for measuring the strength of rock micro-cells with the Mohr-Coulomb criterion was developed. After discussing the characteristics of the random distribution of micro-cell strength, a statistical damage evolution equation was established from the property that micro-cell strength follows a normal distribution function, and a statistical damage constitutive model that can simulate the full process of rock strain softening under a specific confining pressure was set up. Secondly, a new method to determine the model parameters, applicable under different confining pressures, was proposed by studying the relations between the model parameters and the characteristic parameters of the full stress-strain curve under different confining pressures. A unified statistical damage constitutive model for rock softening that reflects the effect of different confining pressures was thus established. This model makes the physical meaning of the model parameters explicit, contains only conventional mechanical parameters, and is therefore more convenient to apply. Finally, the rationality of the model and its parameter-determination method was verified via comparative analyses between theoretical and experimental curves.
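A minimal sketch of such a statistical damage curve is shown below, using axial strain itself as the micro-cell strength measure (a simplification of the paper's Mohr-Coulomb-based measure): damage is the normal cdf evaluated at the current strain, and the nominal stress follows strain equivalence, σ = Eε(1 − D).

```python
import numpy as np
from scipy import stats

# Statistical damage sketch: micro-cell strength ~ Normal(mu, s); the damage
# variable D is the fraction of failed micro-cells, and the nominal stress
# follows Lemaitre strain equivalence, sigma = E * eps * (1 - D), which
# reproduces strain softening. All parameter values are placeholders.
E = 20e3                      # Young's modulus, MPa
mu, s = 0.004, 0.001          # normal-distribution parameters of micro-cell strength
eps = np.linspace(0.0, 0.01, 200)
D = stats.norm.cdf(eps, loc=mu, scale=s)      # damaged fraction of micro-cells
sigma = E * eps * (1 - D)                     # softening stress-strain response
print(f"peak stress {sigma.max():.1f} MPa at strain {eps[sigma.argmax()]:.4f}")
```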
Joint location and scale models of the skew-normal distribution provide a useful extension of joint mean and variance models of the normal distribution when the data set under consideration involves asymmetric outcomes. This paper focuses on the maximum likelihood estimation of joint location and scale models of the skew-normal distribution. The proposed procedure can simultaneously estimate the parameters of the location model and the scale model. Simulation studies and a real example are used to illustrate the proposed methodologies.
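A minimal joint location-scale skew-normal MLE sketch, with mu_i = b0 + b1·x_i, log sigma_i = g0 + g1·z_i and a common shape parameter, all on synthetic data:

```python
import numpy as np
from scipy import stats, optimize

# Joint location-scale skew-normal MLE sketch: location linear in x, log scale
# linear in z, common shape alpha. Covariates and true values are synthetic.
rng = np.random.default_rng(10)
n = 400
x, z = rng.random(n), rng.random(n)
y = stats.skewnorm.rvs(3.0, loc=1.0 + 2.0 * x,
                       scale=np.exp(-0.5 + 1.0 * z), random_state=rng)

def nll(p):
    b0, b1, g0, g1, alpha = p
    return -np.sum(stats.skewnorm.logpdf(y, alpha, loc=b0 + b1 * x,
                                         scale=np.exp(g0 + g1 * z)))

res = optimize.minimize(nll, [0.0, 1.0, 0.0, 0.0, 1.0], method="Nelder-Mead",
                        options={"maxiter": 20000, "maxfev": 20000})
print("estimates (b0, b1, g0, g1, alpha):", np.round(res.x, 2))
```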
The estimation of the function θ = exp{αμ + bσ²} of the parameters (μ, σ²) of the normal distribution N(μ, σ²) is discussed. When the prior distributions of μ and σ² are independent, under the loss function L(θ, δ) = (θ⁻¹δ − 1)², the Bayesian estimator and the existence and computation of the minimax estimator are discussed in depth.
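Under this loss, setting the derivative of the posterior expected loss to zero gives the Bayes rule δ* = E[θ⁻¹ | data] / E[θ⁻² | data]; the Monte Carlo sketch below uses placeholder posterior draws standing in for whatever independent priors are assumed.

```python
import numpy as np

# Bayes rule under L(theta, delta) = (delta/theta - 1)^2: minimizing
# E[delta^2/theta^2 - 2*delta/theta + 1] over delta gives
# delta* = E[1/theta] / E[1/theta^2], estimated here by Monte Carlo.
rng = np.random.default_rng(11)
a, b = 1.0, 0.5                                   # illustrative (alpha, b)
mu_draws = rng.normal(2.0, 0.1, size=100_000)     # placeholder posterior draws
sig2_draws = 1.0 / rng.gamma(50.0, 1.0 / 45.0, size=100_000)
theta = np.exp(a * mu_draws + b * sig2_draws)
delta = np.mean(1.0 / theta) / np.mean(1.0 / theta**2)
print("Bayes estimate of theta:", round(delta, 4))
```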
The Sierra de San Miguelito is a relatively uplifted area constituted by a large amount of silicic volcanic rocks with ages from middle to late Cenozoic. The normal faults of the Sierra de San Miguelito are domino-style and nearly parallel. The cumulative length and displacement of the faults obey power-law distributions. The fractal dimension of the fault traces is ~1.49. Using multi-line one-dimensional sampling, the calculated exponent of cumulative fault displacements is -0.66. A cumulative curve combining measurements of all four sections yielded a slope of -0.63. The displacement-length plot shows a non-linear relationship and a large dispersion of data, mainly due to fault linkage during faulting. The extensional strain due to the normal faults is estimated at ca. 0.1830. The bed extension strain is always less than or equal to the horizontal extension strain. The deformation in the Sierra de San Miguelito occurred near the surface, producing pervasive faults, many of which are too small to appear in maps and sections at common scales. The stretching produced by small faults reaches ca. 33% of the total horizontal elongation.
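A sketch of power-law (Pareto) exponent estimation: for survival function P(X ≥ x) = (x/xmin)^(−c), the continuous MLE is c = n / Σ ln(xi/xmin); the sample below is synthetic, generated with the abstract's exponent.

```python
import numpy as np

# Power-law exponent sketch: synthetic displacements with cumulative exponent
# 0.66 (as in the abstract), recovered by the Hill-type MLE
# c = n / sum(log(x_i / xmin)). Not the Sierra de San Miguelito measurements.
rng = np.random.default_rng(12)
xmin = 1.0
u = rng.random(2000)
x = xmin * u ** (-1.0 / 0.66)          # inverse-transform Pareto sample
c_hat = len(x) / np.sum(np.log(x / xmin))
print("estimated cumulative exponent:", round(c_hat, 3))
```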
Leaf normal distribution is an important structural characteristic of the forest canopy. Although terrestrial laser scanners (TLS) have potential for estimating canopy structural parameters, distinguishing between leaves and nonphotosynthetic structures to retrieve leaf normals has been challenging. Here we used an approach to accurately retrieve the leaf normals of camphorwood (Cinnamomum camphora) from TLS point cloud data. First, nonphotosynthetic structures were filtered out using a curvature threshold for each point. Then, the point cloud data were segmented by a voxel method and clustered by a Gaussian mixture model in each voxel. Finally, the normal vector of each cluster was computed by principal component analysis to obtain the leaf normal distribution. We collected leaf inclination angles and estimated their distribution, which we compared with the retrieved leaf normal distribution. The correlation coefficient between measurements and retrieved results was 0.96, indicating good agreement.
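The PCA step can be sketched in isolation: for a roughly planar cluster of leaf points, the covariance eigenvector with the smallest eigenvalue estimates the leaf normal. The synthetic cluster below stands in for one voxel's Gaussian-mixture cluster.

```python
import numpy as np

# PCA leaf-normal sketch: sample points on a plane with known normal plus a
# little out-of-plane noise, then recover the normal as the eigenvector of the
# covariance matrix with the smallest eigenvalue.
rng = np.random.default_rng(13)
true_normal = np.array([0.2, 0.3, 0.93])
true_normal /= np.linalg.norm(true_normal)
basis = np.linalg.svd(true_normal[None, :])[2][1:]       # two in-plane directions
pts = rng.normal(size=(500, 2)) @ basis \
      + 0.002 * rng.normal(size=(500, 1)) * true_normal  # thin planar cluster

cov = np.cov(pts.T)
eigvals, eigvecs = np.linalg.eigh(cov)                   # ascending eigenvalues
normal = eigvecs[:, 0]                                   # smallest-variance direction
print("angle error (deg):", round(np.degrees(np.arccos(abs(normal @ true_normal))), 3))
```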
We introduce a new class of slash distribution based on the epsilon half normal distribution. The newly defined model extends the slashed half normal distribution and has more kurtosis than the ordinary half normal distribution. We study its characterization and properties, including moments and some moment-based measures. A simulation is conducted to investigate the asymptotic bias properties of the estimators of the parameters. We illustrate its use on a real data set using maximum likelihood estimation.
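A hedged construction sketch: slash families arise as X/U^(1/q) with U uniform on (0, 1), and smaller q means heavier tails. A plain half normal stands in for the epsilon half normal base (the epsilon-skew mechanism is omitted), so this only illustrates the added kurtosis.

```python
import numpy as np
from scipy import stats

# Slash construction sketch: divide the base variate by U**(1/q), U ~ U(0, 1).
# q = 5 keeps the fourth moment finite so the sample kurtosis is stable.
rng = np.random.default_rng(14)
x = stats.halfnorm.rvs(size=200_000, random_state=rng)   # stand-in base variate
u = rng.random(200_000)
q = 5.0
slashed = x / u ** (1.0 / q)
print("half-normal excess kurtosis:", round(stats.kurtosis(x), 2),
      "| slashed excess kurtosis:", round(stats.kurtosis(slashed), 2))
```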
The original data of the Nilsson-Ehle experiment in wheat were analyzed with current genetic knowledge. The analysis indicated that the core of the polygenic hypothesis drawn from this experiment was that character similarity produced by the additive effect of multiple genes is the basis of continuous variation. Its precondition was that the effective genes have equal effects, show merodominance and a binomial distribution, and be inherited independently. In fact, quantitative characters are determined by many genes with different properties, effects and behaviors, so it is difficult to explain continuous variation entirely by the polygenic hypothesis, and researchers should seek new approaches. Taking the Mendelian population as the research object and by means of the Lyapunov central limit theorem, it was proved that both the genotypic value G and the environmental effect in a niche E are normally distributed. By the additivity of the normal distribution, the phenotype P = G + E also obeys a normal distribution, and quantitative characters show continuous variation, whether or not linkage is present, whether or not every gene effect is equal, and whether or not there is dominance (and of what kind) between alleles. Moreover, it was discussed that quantitative characters in self-fertilized organisms and clones also follow the normal distribution and present continuous variation.
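The Lyapunov argument can be echoed numerically: summing many independent per-locus effects, with unequal effects allowed, plus an environmental term yields an approximately normal phenotype. A sketch with arbitrary placeholder effect sizes:

```python
import numpy as np
from scipy import stats

# CLT sketch for a polygenic trait: phenotype P = G + E, where G sums unequal
# per-locus additive effects over allele dosages and E is environmental noise.
# Effect sizes and allele frequencies are placeholders.
rng = np.random.default_rng(15)
n_ind, n_loci = 20_000, 60
effects = rng.exponential(1.0, size=n_loci)              # unequal per-locus effects
genotypes = rng.binomial(2, 0.5, size=(n_ind, n_loci))   # allele dosage 0/1/2
G = genotypes @ effects
E = rng.normal(0.0, 2.0, size=n_ind)                     # environmental deviation
P = G + E
print(f"skewness {stats.skew(P):.3f}, excess kurtosis {stats.kurtosis(P):.3f} (both near 0)")
```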
Partition curves of gravity separation have been studied extensively. However, existing models simulate the density distribution only within a single size fraction; they cannot predict the distribution of materials depending on the compound feature of density and size. Accordingly, an improved partition-curve model based on the cumulative normal distribution, distinguished from the conventional cumulative-normal partition-curve model, is proposed in this paper. It simulates the density distribution at different size fractions by using a density-size compound index and conflating the partition curves of the different size fractions into one partition curve. The feasibility of three compound indexes, namely the mass index, settlement index and transformation index, was investigated, and specific forms of the improved model were proposed. The transformation index leads to the best fitting results, with a fitting error of only 1.75 according to the fitted partition curve.
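A sketch of a cumulative-normal partition curve fit is given below, with the partition number modelled as Φ((ρ − ρ50)/s); the data points are illustrative, and the paper's density-size compound index would replace the plain density here.

```python
import numpy as np
from scipy import stats, optimize

# Cumulative-normal partition curve: fraction reporting to sinks as a function
# of density, P(rho) = Phi((rho - rho50) / s); rho50 is the cut density and s
# the imperfection scale. Data points are illustrative placeholders.
rho = np.array([1.3, 1.4, 1.5, 1.6, 1.7, 1.8])            # g/cm^3
part = np.array([0.03, 0.10, 0.38, 0.74, 0.93, 0.99])     # fraction to sinks

def model(r, rho50, s):
    return stats.norm.cdf((r - rho50) / s)

(rho50, s), _ = optimize.curve_fit(model, rho, part, p0=[1.55, 0.1])
print(f"cut density rho50 = {rho50:.3f} g/cm^3, scale s = {s:.3f}")
```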
The mean difference of the inverse normal distribution can be calculated by a transformation of variable or by a laborious integration by parts. This paper presents a simpler formula for the mean difference of the inverse normal distribution that highlights the role of the two parameters in the mean difference of the model. It makes it easier to study the relation of the mean difference to the other indexes of variability of the inverse normal distribution.
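The closed-form result itself is not reproduced in this abstract, but a Monte Carlo estimate of the mean difference E|X − Y| for the inverse normal (inverse Gaussian) distribution provides a numeric cross-check for any such formula:

```python
import numpy as np
from scipy import stats

# Monte Carlo mean difference E|X - Y| for an inverse Gaussian with mean mu and
# shape lam; scipy's invgauss(mu/lam, scale=lam) is that parametrization.
# Parameter values are illustrative.
rng = np.random.default_rng(17)
mu, lam = 2.0, 3.0
x = stats.invgauss.rvs(mu / lam, scale=lam, size=200_000, random_state=rng)
y = stats.invgauss.rvs(mu / lam, scale=lam, size=200_000, random_state=rng)
print("Monte Carlo mean difference:", round(np.mean(np.abs(x - y)), 4))
```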
In the reliability design procedure for vehicle components, when the distribution styles of the random variables are unknown or non-normal, the result evaluated by the existing method contains great error or is even wrong, to the point that the computed reliability value R can exceed 1, in which case the formula must be revised. This is obviously inconvenient for programming. Combining reliability-based optimization theory, the robust design method and reliability-based sensitivity analysis, a new method for reliability-based robust design is proposed, from which the influence of changes in the design parameters on the reliability of vehicle components can be obtained. The reliability sensitivity with respect to the design parameters is viewed as a sub-objective function in a multi-objective optimization problem satisfying reliability constraints. Given the first four moments of the basic random variables, the fourth-moment technique and the proposed optimization procedure yield reliability-based robust designs of automobile components with non-normally distributed parameters accurately and quickly. With the proposed method, the distributional assumptions on the random parameters are relaxed, so the formulation is much closer to actual reliability problems. The numerical examples indicate the following: (1) the reliability value obtained by the proposed robust method increases (>0.04%) compared with the value obtained by the ordinary optimization algorithm; (2) the absolute value of the reliability-based sensitivity decreases (>0.01%), and the robustness of product quality improves accordingly. Utilizing the reliability-based optimization and robust design method in the reliability design procedure reduces manufacturing cost and provides a theoretical basis for the reliability and robust design of vehicle components.
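A first-order sketch of reliability sensitivity is given below; it uses only the first two moments of a simple limit state g = R − S (the paper's fourth-moment technique also uses skewness and kurtosis), with placeholder moment values.

```python
import numpy as np
from scipy import stats

# First-order reliability sensitivity sketch for g = R - S with independent
# strength R and load S: beta = mu_g / sigma_g, reliability = Phi(beta), and
# the sensitivities show how design-parameter changes move the reliability.
mu_R, sd_R = 320.0, 25.0     # strength moments (placeholder units)
mu_S, sd_S = 240.0, 30.0     # load moments

mu_g = mu_R - mu_S
sd_g = np.hypot(sd_R, sd_S)
beta = mu_g / sd_g
Rel = stats.norm.cdf(beta)

# dRel/d(mu_R) = phi(beta) / sd_g
# dRel/d(sd_R) = -phi(beta) * beta * sd_R / sd_g**2
dRel_dmuR = stats.norm.pdf(beta) / sd_g
dRel_dsdR = -stats.norm.pdf(beta) * beta * sd_R / sd_g**2
print(f"reliability = {Rel:.5f}, dR/dmu_R = {dRel_dmuR:.5f}, dR/dsd_R = {dRel_dsdR:.6f}")
```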