Background: The signal-to-noise ratio (SNR) is recognized as an index of measurement reproducibility. We derive the maximum likelihood estimators of the SNR and discuss confidence interval construction for the difference between two correlated SNRs when the readings come from bivariate normal and bivariate lognormal distributions. We use the Pearson system of curves to approximate the distribution of the difference between the two estimates and use bootstrap methods to validate the approximate distributions of the statistic of interest. Methods: The paper uses the delta method to find the first four central moments, and hence the skewness and kurtosis, which are important in determining the parameters of the Pearson distribution. Results: The approach is illustrated with two examples: one on veterinary microbiology and food safety data, and the other on data from clinical medicine. We derived the four central moments of the target statistics and used the bootstrap method to evaluate the parameters of the Pearson distribution. Fitted Pearson curves of Types I and II were recommended based on the available data. R code is also provided for ready use by readers.
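The bootstrap validation step can be sketched as follows. All data, sample sizes, and noise levels below are invented for illustration; only the normal-readings case with a percentile interval is shown, and the paper's Pearson-curve approximation is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def snr(x):
    # ML estimator of SNR for normal readings: sample mean over the ML
    # (divide-by-n) standard deviation
    return x.mean() / x.std()

# Hypothetical paired readings: two assays applied to the same n subjects,
# which makes the two SNR estimates correlated
n = 200
truth = rng.normal(10.0, 1.0, size=n)
assay1 = truth + rng.normal(0.0, 0.5, size=n)
assay2 = truth + rng.normal(0.0, 0.8, size=n)

# Percentile bootstrap for the difference of the two correlated SNRs:
# resample subjects jointly so the correlation is preserved
diffs = np.empty(2000)
for b in range(2000):
    idx = rng.integers(0, n, size=n)
    diffs[b] = snr(assay1[idx]) - snr(assay2[idx])
ci = np.percentile(diffs, [2.5, 97.5])
point = snr(assay1) - snr(assay2)
print(point, ci)
```

The joint resampling of subject indices is what keeps the two SNR estimates correlated inside each bootstrap replicate.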
Deep neural networks are gaining importance and popularity in applications and services. Due to the enormous number of learnable parameters and the size of the datasets, training neural networks is computationally costly. Parallel and distributed computation strategies are used to accelerate this training process. Generative Adversarial Networks (GANs) are a recent achievement in deep learning. These generative models are computationally expensive because a GAN consists of two neural networks and trains on enormous datasets. Typically, a GAN is trained on a single server. Conventional deep learning accelerator designs are challenged by the unique properties of GANs, such as their many computation stages with non-traditional convolution layers. This work addresses the issue of distributing GANs so that they can train on datasets spread over many TPUs (Tensor Processing Units). Distributed training accelerates the learning process and decreases computation time. In this paper, a Generative Adversarial Network is accelerated on a multi-core TPU in a distributed, data-parallel, synchronous model. For adequate acceleration of the GAN, data-parallel SGD (Stochastic Gradient Descent) is implemented on a multi-core TPU using distributed TensorFlow with mixed precision, bfloat16, and XLA (Accelerated Linear Algebra). The study was conducted on the MNIST dataset for batch sizes varying from 64 to 512 over 30 epochs with distributed SGD on a TPU v3 with a 128×128 systolic array. A large-batch technique is implemented in bfloat16 to decrease storage cost and speed up floating-point computations. Accelerated learning curves for the generator and discriminator networks are obtained. Training time was reduced by 79% when varying the batch size from 64 to 512 on the multi-core TPU.
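The synchronous data-parallel SGD scheme can be illustrated in miniature: each worker computes a gradient on its shard of the batch, and an all-reduce (here a simple mean over equal-sized shards) recovers exactly the single-device large-batch gradient. This NumPy sketch uses an invented toy regression problem, not the paper's GAN/TensorFlow/TPU setup.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linear regression: loss L(w) = 1/(2N) * ||Xw - y||^2
N, d, workers = 512, 8, 4
X = rng.normal(size=(N, d))
y = X @ rng.normal(size=d) + 0.1 * rng.normal(size=N)
w = np.zeros(d)

def grad(Xb, yb, w):
    # mean-squared-error gradient on a (mini-)batch
    return Xb.T @ (Xb @ w - yb) / len(yb)

# Synchronous data parallelism: shard the global batch, compute local
# gradients, all-reduce (here: mean over equal shards), apply one shared step
shards_X = np.array_split(X, workers)
shards_y = np.array_split(y, workers)
local = [grad(Xb, yb, w) for Xb, yb in zip(shards_X, shards_y)]
g_sync = np.mean(local, axis=0)

# Equivalent single-device large-batch gradient
g_full = grad(X, y, w)
print(np.max(np.abs(g_sync - g_full)))
```

The equivalence holds because the shards are equal-sized; with unequal shards the all-reduce would need per-shard weights.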
To date, most ship detection approaches for single-pol synthetic aperture radar (SAR) imagery try to ensure a constant false-alarm rate (CFAR). A high-performance ship detector relies on two key components: an accurate estimate of the sea surface distribution and a well-designed CFAR algorithm. First, a novel nonparametric sea surface distribution estimation method is developed based on an n-order Bézier curve. To estimate the sea surface distribution with an n-order Bézier curve, an explicit analytical solution is derived from a least-squares optimization, and an optimal selection procedure is presented for two essential parameters: the order n of the Bézier curve and the number m of sample points. Next, to validate the ship detection performance of the estimated sea surface distribution, it is combined with a cell-averaging CFAR (CA-CFAR). To eliminate possible interfering ship targets in the background window, an improved automatic censoring method is applied. Comprehensive experiments show that, in terms of sea surface estimation performance, the proposed method is as good as the traditional nonparametric Parzen window kernel method and, in most cases, outperforms two widely used parametric methods, the K and G0 models. In terms of computation speed, a major advantage of the proposed method is that its running time depends only on the number m of sample points and is independent of image size, which yields a significant speedup over the Parzen window kernel method and, in some cases, makes it even faster than the two parametric methods. In terms of ship detection performance, the experiments show that the ship detector built from the proposed sea surface distribution model and the given CA-CFAR algorithm adapts well to different SAR sensors, resolutions, and sea surface homogeneities, and achieves leading performance on the test dataset.
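The least-squares Bernstein-basis (Bézier) fit at the heart of the estimator can be sketched as follows, using invented gamma-distributed stand-in data rather than real SAR clutter; the order n = 8 and the bin count are arbitrary choices here, not the paper's optimal selection.

```python
import numpy as np
from math import comb

rng = np.random.default_rng(2)

# Histogram of synthetic intensity data standing in for sea clutter
data = rng.gamma(3.0, 1.0, size=20000)
hist, edges = np.histogram(data, bins=64, range=(0, 12), density=True)
t = (edges[:-1] + edges[1:]) / 2.0 / 12.0   # bin centers mapped to [0, 1]

# Bernstein design matrix for an n-order Bézier curve
n = 8
B = np.stack([comb(n, k) * t**k * (1 - t)**(n - k) for k in range(n + 1)],
             axis=1)

# Least-squares control ordinates (the paper derives this in closed form)
coef, *_ = np.linalg.lstsq(B, hist, rcond=None)
fit = B @ coef
rmse = np.sqrt(np.mean((fit - hist) ** 2))
print(rmse)
```

Because the design matrix depends only on the m sample abscissas, the fitting cost is independent of image size, which is the speed property the abstract highlights.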
In the study of complex networks almost all theoretical models have the property of infinite growth, but the size of actual networks is finite. Using statistics on China Internet IPv4 (Internet Protocol version 4) addresses, this paper proposes a forecasting model based on the S curve (logistic curve) and forecasts the growth trend of IPv4 addresses in China. The results offer reference values for optimizing the allocation of IPv4 address resources and for the development of IPv6. Based on the observed laws of IPv4 growth, namely bulk growth and a finite growth limit, the paper proposes a finite network model with bulk growth, called the S-curve network. Analysis demonstrates that the analytic method based on uniform distributions (i.e., the Barabási–Albert method) is not suitable for this network. An approximate method is developed to predict the growth dynamics of individual nodes and is used to calculate analytically the degree distribution and the scaling exponents. The analytical result agrees well with simulations, following an approximately power-law form. This method overcomes a shortcoming of the Barabási–Albert method commonly used in current network research.
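The S-curve forecasting idea reduces to fitting a three-parameter logistic function to the cumulative count. The sketch below fits synthetic data; all numbers are invented and do not represent the actual Chinese IPv4 series.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    # S curve: saturating growth toward carrying capacity K
    return K / (1.0 + np.exp(-r * (t - t0)))

# Synthetic "address count" series standing in for the IPv4 data
t = np.arange(0, 30, dtype=float)
rng = np.random.default_rng(3)
true_K, true_r, true_t0 = 330.0, 0.4, 15.0
y = logistic(t, true_K, true_r, true_t0) + rng.normal(0.0, 2.0, t.size)

# Nonlinear least squares recovers the capacity, rate, and inflection time
(K, r, t0), _ = curve_fit(logistic, t, y, p0=(300.0, 0.3, 12.0))
print(K, r, t0)
```

The fitted K is the finite growth limit that motivates the finite-network model in the abstract.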
Extensive studies of partition curves for gravity separation have been conducted, but the resulting models only simulate the density distribution within a single size fraction; they cannot predict the distribution of material as a joint function of density and size. To address this, an improved partition-curve model based on the cumulative normal distribution, distinct from the conventional cumulative-normal partition-curve model, is proposed in this paper. It can simulate the density distribution across different size fractions by using a density-size compound index and merging the partition curves of different size fractions into a single partition curve. The feasibility of three compound indexes, a mass index, a settlement index, and a transformation index, was investigated, and specific forms of the improved model were proposed. The transformation index is found to give the best fit, with a fitting error of only 1.75 for the fitted partition curve.
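A conventional cumulative-normal partition curve, the baseline the paper improves on, can be fitted and summarized as follows; the separation densities and partition numbers are invented for illustration.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import curve_fit

# Conventional model: partition number = normal CDF of particle density
def partition(rho, rho50, sigma):
    return norm.cdf((rho - rho50) / sigma)

rho = np.linspace(1.3, 2.1, 17)        # particle densities, g/cm^3 (invented)
obs = partition(rho, 1.65, 0.08)       # idealized observed partition numbers
(rho50, sigma), _ = curve_fit(partition, rho, obs, p0=(1.6, 0.1))

# Probable error Ep = (rho75 - rho25) / 2; for a normal model Ep = 0.6745*sigma
Ep = sigma * (norm.ppf(0.75) - norm.ppf(0.25)) / 2.0
print(rho50, Ep)
```

rho50 is the separation density (50% partition point) and Ep summarizes the sharpness of separation.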
In this paper the tracking behavior of passenger cars was studied. The vehicle driving trajectory and driving direction were defined, and a classification of vehicle trajectory types along curves was developed. The statistical parameters of vehicle trajectory samples in free flow, together with their frequency curves and cumulative frequency curves, were obtained. K-S and chi-square tests were used to test the collected samples against normal and gamma distributions, and the corresponding probability density functions were given. Finally, the dispersion between the vehicle trajectory random variable and the characteristic value of the cumulative frequency curve was analyzed at each key cross-section of the curves. The conclusions can provide theoretical support for the reasonable optimization of curve widening, alignment design, and the management of opposing-flow conflicts.
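The two goodness-of-fit tests named above can be run with SciPy as sketched here, on an invented stand-in for lateral-position samples; note the caveat that estimating the normal parameters from the same data makes the K-S p-value optimistic (the Lilliefors issue), which the sketch does not correct for.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Invented stand-in for lateral-position samples at one cross-section (m)
sample = rng.normal(0.4, 0.15, size=300)

# K-S test against a normal with parameters estimated from the data
mu, sd = sample.mean(), sample.std(ddof=1)
ks = stats.kstest(sample, "norm", args=(mu, sd))

# Chi-square goodness-of-fit on decile bins
edges = np.quantile(sample, np.linspace(0, 1, 11))
observed, _ = np.histogram(sample, bins=edges)
expected = np.diff(stats.norm.cdf(edges, mu, sd))
expected = expected / expected.sum() * observed.sum()
chi2 = stats.chisquare(observed, f_exp=expected, ddof=2)  # 2 fitted params
print(ks.pvalue, chi2.pvalue)
```

The same pattern with `stats.gamma` in place of `stats.norm` covers the gamma-distribution test mentioned in the abstract.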
An investigation of the errors resulting from distribution curve transformations using six different methods was made on the basis of 61 sets of jig performance test data from coal preparation plants in China. The results indicate that the minimum error occurred when distribution curves were transformed by keeping the imperfection I constant. Generalized distribution curves are developed for jigs and their applications are discussed.
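The constant-imperfection transformation can be stated in two lines once the standard jig definition I = Ep / (rho50 - 1) is adopted; the numerical values below are invented for illustration.

```python
# Transforming a jig distribution curve by holding the imperfection constant.
# Definition assumed here (standard for jigs): I = Ep / (rho50 - 1).
rho50_ref, Ep_ref = 1.60, 0.110          # reference curve (g/cm^3)
I = Ep_ref / (rho50_ref - 1.0)           # imperfection, held constant

rho50_new = 1.80                         # target separation density
Ep_new = I * (rho50_new - 1.0)           # transformed probable error
print(I, Ep_new)
```

Holding I constant scales Ep with the separation density, which is the transformation the study found to minimize error.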
The characteristics of the density yield curve of coal and the distribution curve of products can be described with the median, the quartile deviation, the quartile measure of skewness, and a kurtosis-like parameter K. On the basis of 16 groups of coal density composition data and their jigging stratification data from a pilot jig, a regression analysis was performed on the relationship between the characteristic values of the density curve and those of the distribution curve. The results show that: (1) the larger the skewness of the density curve, the larger the probable error (Ep) and imperfection (I); (2) the larger the median of the density curve, the smaller the probable error and imperfection; and (3) the characteristic values of the density curve have no influence on the kurtosis K of the distribution curve.
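The quartile-based descriptors can be computed directly from a curve's quantile function. The abstract does not define its kurtosis-like K, so a standard percentile kurtosis is used below as a stand-in, and the normal curve parameters are invented.

```python
from scipy.stats import norm

# Quartile-based descriptors of a distribution curve, given its quantile
# function ppf: median, quartile deviation QD, Bowley skewness Sk, and a
# percentile kurtosis K (assumed definition, not necessarily the paper's K)
def quartile_stats(ppf):
    q1, q2, q3 = ppf(0.25), ppf(0.50), ppf(0.75)
    p10, p90 = ppf(0.10), ppf(0.90)
    qd = (q3 - q1) / 2.0
    sk = ((q3 - q2) - (q2 - q1)) / (q3 - q1)   # Bowley quartile skewness
    k = qd / (p90 - p10)                        # percentile kurtosis
    return q2, qd, sk, k

# Example: a normal density curve with invented median 1.65 and scale 0.08
med, qd, sk, k = quartile_stats(lambda p: norm.ppf(p, loc=1.65, scale=0.08))
print(med, qd, sk, k)
```

For any normal curve the percentile kurtosis evaluates to about 0.263 and the quartile skewness to 0, which is a handy sanity check.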
Based on theoretical analysis and a study of other methods, the P-III curve is transformed into an incomplete gamma function by means of a mathematical expression transformation, from which the mathematical model of a fast, general-purpose algorithm is derived. Algorithm comparisons and practice demonstrate that the model has a simple algorithm, a flexible solution process, very good generality, a faster convergence rate, and better calculation accuracy, and it can be applied in other settings.
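The transformation reduces evaluation of the P-III (three-parameter gamma) frequency curve to the regularized incomplete gamma function. A sketch with invented parameters, cross-checked against SciPy's shifted gamma CDF:

```python
from scipy.special import gammainc
from scipy.stats import gamma

# P-III (three-parameter gamma) CDF via the regularized incomplete gamma
# function: F(x) = P(alpha, (x - a0) / beta)
def p3_cdf(x, alpha, beta, a0):
    return gammainc(alpha, (x - a0) / beta)

x, alpha, beta, a0 = 5.0, 2.5, 1.2, 0.5   # invented parameters
val = p3_cdf(x, alpha, beta, a0)
ref = gamma.cdf(x, a=alpha, scale=beta, loc=a0)  # independent cross-check
print(val, ref)
```

Having the CDF as a single special-function call is what makes the algorithm fast and easy to invert numerically for frequency analysis.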
The particle size distribution (PSD) curve is an important expression of the basic properties of a soil. Characteristic parameters such as the median particle size D50 and the effective particle size D10, and combinations of such parameters, represent only a few points of the PSD curve and fail to capture all of its information. In this paper, a fraction characteristic parameter F, which reflects the fraction-size variation of the PSD curve, is introduced based on grading entropy, and a new A-B-F three-parameter presentation method is proposed for a refined representation of the PSD curve. The newly constructed model not only better represents differences in the width of PSD curves but also has a higher sensitivity than the A-B two-parameter model proposed by Lőrincz. For PSD curves with similar distributions, the new presentation method discriminates better than the D50-Cu-Cc three-parameter combination or the four-parameter combination Gc proposed by Arshad. Finally, the application of the new method to describing the spatial nonuniformity of depositions and predicting soil hydraulic conductivity is discussed. The results can serve as a reference for the refined representation of PSD curves.
The shape of the body and the ease of a garment are important elements in constructing garment style. However, the distribution of ease in a garment is uneven, and curves showing different cross-sections of garment styles with ease are unavailable in the literature. This information is crucial when a three-dimensional garment style is transformed into two-dimensional garment patterns. Thus, several different X-line style garments were produced by 3D draping technology, and cubic splines were used to fit the cross-sectional curves of the BL and WL of the body and the garment. By comparing cross-sections of body and garment, the ease at different locations can be derived. Garments in 10 different dimensions and quantities of style ease are included in the current study. By numerical operations on the discrete ease data at different locations, fitted functions of the ease distribution are obtained.
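The spline-fitting and ease-derivation steps can be sketched with periodic cubic splines on polar samples of a cross-section; the body and garment radii below are invented stand-ins, not measured draping data.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Invented polar samples of body and garment cross-sections at one line;
# a periodic cubic spline closes each curve, and ease is the radial gap
theta = np.linspace(0.0, 2.0 * np.pi, 13)
r_body = 1.0 + 0.05 * np.cos(2.0 * theta)
r_garment = r_body + 0.08 + 0.02 * np.sin(theta)   # uneven ease
r_body[-1] = r_body[0]          # enforce exact closure for periodic splines
r_garment[-1] = r_garment[0]

body = CubicSpline(theta, r_body, bc_type="periodic")
garment = CubicSpline(theta, r_garment, bc_type="periodic")

# Ease distribution around the section, evaluated on a fine angular grid
a = np.linspace(0.0, 2.0 * np.pi, 361)
ease = garment(a) - body(a)
print(ease.min(), ease.max())
```

Evaluating the two splines on a common angular grid gives the continuous ease distribution that the discrete measurements alone cannot provide.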
Gap debris, as a discharge product, is closely related to the machining process in electrical discharge machining (EDM). Much recent research has focused on the relationships among debris size, surface texture, removal rate, and machining stability. The statistical distribution of debris size contributes to this research, but current studies of it remain superficial. To obtain the distribution law of debris particle size, a laser particle size analyzer (LPSA) combined with a scanning electron microscope (SEM) is used to analyze EDM debris. First, a heat-drying method is applied to obtain the debris particles. Second, the measuring range of the LPSA is determined as 0.5–100 μm by SEM observation, and the frequency distribution histogram and cumulative frequency distribution scattergram of debris size are obtained with the LPSA. Third, according to the shape of the frequency distribution histogram, the lognormal, exponentially modified Gaussian (EMG), Gamma, and Weibull statistical distribution functions are used to fit the histogram. Finally, the distribution law of the debris size is obtained from the fitting results. Experiments with different discharge parameters are carried out on an EDM machine designed by the authors, with a red-copper tool electrode, an ANSI 1045 workpiece, and de-ionized water as the working fluid. The experimental results indicate that the debris sizes of all experimental samples obey the Weibull distribution. The obtained distribution law is important for all models based on debris particle size.
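The Weibull-fitting step can be sketched as follows on synthetic debris diameters (all values invented); the location parameter is fixed at zero since particle sizes are non-negative.

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(5)

# Synthetic debris diameters (um) drawn from a Weibull law as a stand-in
true_c, true_scale = 1.8, 12.0
sizes = weibull_min.rvs(true_c, scale=true_scale, size=5000, random_state=rng)

# Maximum-likelihood fit with the location fixed at zero
c, loc, scale = weibull_min.fit(sizes, floc=0)
print(c, scale)
```

In practice one would fit all four candidate families (lognormal, EMG, Gamma, Weibull) the same way and compare goodness-of-fit, as the paper does with its histograms.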
It is well known that a close link exists between the soil-water retention curve (SWRC) and the pore size distribution (PSD). Theoretically, a mercury intrusion porosimetry (MIP) test simulates a soil drying path, and the test results can be used to deduce the SWRC (termed SWRCMIP). However, the SWRCMIP does not include the effect of volume change, unlike the conventional SWRC determined directly by suction measurement or suction control techniques. For deformable soils, there is a significant difference between the conventional SWRC and the SWRCMIP. In this study, a drying test was carried out on a reconstituted silty soil, and the volume change, suction, and PSD were measured on samples with different water contents. The change in the deduced SWRCMIP and its relationship with the conventional SWRC were analyzed. The results showed that the volume change of the soil is the main cause of the difference between the conventional SWRC and the SWRCMIP. Based on the test results, a transformation model between the conventional SWRC and the SWRCMIP was proposed, taking the soil state with no volume change as a reference. Comparison between experimental and predicted SWRCs showed that the proposed model accounts well for the influence of soil volume change on water retention behavior.
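The deduction of the SWRCMIP rests on converting pore entry diameters to equivalent suctions; a minimal sketch of the Young-Laplace/Washburn conversion, with illustrative constants for the air-water pair:

```python
import math

# Equivalent suction of a pore entry diameter d: s = 4 * Ts * cos(theta) / d
Ts = 0.0728          # surface tension of water at 20 C, N/m
theta = 0.0          # contact angle, rad (perfect wetting assumed)
d = 1.0e-6           # pore entry diameter, m

s = 4.0 * Ts * math.cos(theta) / d   # suction, Pa
print(s / 1000.0)                    # suction in kPa
```

Applying this conversion across the whole MIP intrusion curve yields the suction axis of the SWRCMIP, onto which the paper's volume-change correction is then applied.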
Current freeform illumination optical designs are mostly focused on producing prescribed irradiance distributions on planar targets. Here, we aim to design freeform optics that generate a desired illumination on a curved target from a point source, which remains a challenge. We reduce the difficulties arising from the curved target by incorporating its varying z-coordinates into the iterative wavefront tailoring (IWT) procedure. The new IWT-based method is developed in a stereographic coordinate system with a special mesh transformation of the source domain, which suits light sources that emit into a half space, such as LEDs. The first example demonstrates that a rectangular flat-top illumination can be generated on an undulating surface by a spherical-freeform lens for a Lambertian source. The second example shows that the method can also produce a non-uniform irradiance distribution in a circular region of the undulating surface.
The objective of this paper is to present a Bayesian approach based on Kullback-Leibler divergence for assessing local influence in a growth curve model with a general covariance structure. Under certain prior distribution assumptions, the Kullback-Leibler divergence is used to measure the influence of a minor perturbation on the posterior distribution of the unknown parameters. This leads to a diagnostic statistic for detecting which responses are locally influential. As an application, the common covariance-weighted perturbation scheme is considered in detail.
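The Kullback-Leibler building block is simple in the univariate normal case, shown here as a sketch; the paper works with the posterior of the growth curve model, not this toy case.

```python
import math

# KL divergence between two univariate normals, KL(N(mu1, s1^2) || N(mu2, s2^2)):
# the closed form underlying KL-based influence comparisons of two posteriors
def kl_normal(mu1, s1, mu2, s2):
    return math.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2.0 * s2**2) - 0.5

print(kl_normal(0.0, 1.0, 0.0, 1.0))   # identical distributions -> 0
print(kl_normal(0.0, 1.0, 1.0, 2.0))
```

A local-influence diagnostic compares the perturbed posterior against the unperturbed one with exactly this kind of divergence, flagging responses whose removal or reweighting moves the posterior most.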
A regional analysis of design storms, defined as the expected rainfall intensity for a given storm duration and return period, is conducted to determine rainfall Intensity-Duration-Frequency (IDF) relationships. The ultimate purpose was to determine IDF curves for homogeneous regions identified in Botswana. Three homogeneous regions were identified based on topographic and rainfall characteristics using the K-means clustering algorithm. Using the mean annual rainfall and the 24-hr annual maximum rainfall as indicators of rainfall intensity for each homogeneous region, IDF curves and maps of rainfall intensities for durations of 1 to 24 hr and beyond were produced. The Gamma and Lognormal probability distribution functions provided estimates of rainfall depths for low and medium return periods (up to 100 years) at any location in each homogeneous region of Botswana.
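The frequency-analysis step behind each IDF point can be sketched as follows: fit a Gamma distribution to annual maxima and read the T-year design depth as the (1 - 1/T) quantile. The rainfall series below is synthetic, not the Botswana data.

```python
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(6)

# Invented 24-hr annual-maximum rainfall depths (mm) for one homogeneous region
annual_max = gamma.rvs(a=4.0, scale=15.0, size=60, random_state=rng)

# Fit a Gamma distribution; the T-year event is the (1 - 1/T) quantile
a, loc, scale = gamma.fit(annual_max, floc=0)
depths = {T: gamma.ppf(1.0 - 1.0 / T, a, loc=loc, scale=scale)
          for T in (2, 10, 50, 100)}
print({T: round(v, 1) for T, v in depths.items()})
```

Repeating this for each duration and dividing depth by duration gives the intensity points through which the IDF curves are drawn.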
Recently, the Darna distribution has been introduced as a new lifetime distribution. The two-parameter Darna distribution is a mixture of the well-known gamma and exponential distributions. A manufacturer or engineer conducts life testing to examine whether the quality level of products meets customer requirements, such as reliability or minimum lifetime. In this article, an attribute modified chain sampling inspection plan based on a time-truncated life test is proposed for items whose lifetime follows the Darna distribution. The plan parameters, including the sample size, the acceptance number, and the past lot results of the proposed sampling plan, are determined with the two-point approach, considering the acceptable quality level (AQL) and the limiting quality level (LQL). The plan parameters and corresponding operating characteristic functions of the new plan are tabulated for various Darna distribution parameters, and a few illustrative examples are presented. The usefulness of the proposed plan is investigated using two real failure-time datasets. The results indicate that the proposed sampling plan can reduce the sample size as the termination ratio increases, for fixed values of the producer's risk and acceptance number. Hence, the proposed attribute modified chain sampling inspection plan is recommended to practitioners in the field.
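The operating-characteristic calculation underlying such plans can be sketched for the plain single-sampling case (the paper's modified chain plan adds dependence on past lot results, which is not reproduced here); n, c, and the quality levels below are invented.

```python
from scipy.stats import binom

# OC value of a single-sampling attribute plan: accept the lot when at most
# c of the n items on the truncated life test fail before the truncation time
def oc(p, n, c):
    # p = probability an item fails before the test is truncated
    return binom.cdf(c, n, p)

n, c = 30, 2
p_aql, p_lql = 0.02, 0.20     # illustrative AQL- and LQL-side failure rates
print(oc(p_aql, n, c), oc(p_lql, n, c))
```

The two-point design mentioned in the abstract searches for the smallest (n, c) whose OC curve passes above 1 minus the producer's risk at the AQL and below the consumer's risk at the LQL.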
In order to study the temperature field distribution in burnt surrounding rock and to determine the extent of burnt surrounding rock, the coal-wall coking cycle, and the heat-affected zone in an underground coal gasification (UCG) stope, we derived, based on the Laplace transform and its inversion formula, the analytical temperature solution of one-dimensional unsteady heat conduction for multi-layer overlying strata under boundary conditions of the first and fourth kinds, and we also carried out a numerical simulation of two-dimensional unsteady heat conduction with COMSOL Multiphysics. The results show that when the boundary temperature of the surrounding rock decreases linearly, owing to the directional movement of the heat source at the UCG flame working face, the temperature in the surrounding rock first increases and then decreases with time; the peak of the temperature curve decreases gradually, and its position moves from the boundary into the surrounding rock. In the surrounding rock of a UCG stope there is an envelope of the cluster of temperature curves. We analyzed the influence of thermophysical parameters on these envelopes and propose taking the envelope as the basis for calculating the extent of burnt surrounding rock, the coal-wall coking cycle, and the heat-affected zone. Finally, concrete numerical values are given by fixing the judgment standards and temperature thresholds, and they basically agree with field geophysical prospecting results.
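The single-layer half-space case gives a quick sanity check for such solutions: an explicit finite-difference march reproduces the classical erfc temperature profile. All material values are invented, and the paper's multi-layer Laplace-transform solution is not reproduced here.

```python
import numpy as np
from math import erfc, sqrt

# 1-D unsteady conduction in a half-space whose boundary is suddenly raised
# to T0 = 1; analytic solution: T(x, t) = erfc(x / (2 * sqrt(alpha * t)))
alpha = 1e-6             # thermal diffusivity, m^2/s (invented rock value)
dx, dt = 0.01, 20.0      # grid spacing (m) and time step (s)
r = alpha * dt / dx**2   # explicit-scheme stability requires r <= 0.5

T = np.zeros(200)        # relative temperature, initially 0 everywhere
T[0] = 1.0               # boundary held at T0
steps = 5000             # total simulated time: 100000 s
for _ in range(steps):
    T[1:-1] = T[1:-1] + r * (T[2:] - 2.0 * T[1:-1] + T[:-2])
    T[0], T[-1] = 1.0, 0.0   # re-impose boundary values

t_end = steps * dt
x = 0.05                 # probe depth, m
analytic = erfc(x / (2.0 * sqrt(alpha * t_end)))
numeric = T[int(x / dx)]
print(numeric, analytic)
```

The same marching scheme extends to multi-layer strata by making alpha piecewise and enforcing flux continuity at the interfaces, which is where the fourth-kind boundary condition of the paper enters.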
To determine the smoothing factor (also known as the regularization parameter) in co-seismic slip distribution inversion, the trade-off curve between model roughness and data-fitting residual is generally used; to distinguish it from the method proposed in this paper, this method is called the "L curve" after its shape. Building on the L curve, the Eclectic Intersection curve is proposed here as a new method to determine the smoothing factor. Simulated experiments show that the inversion accuracy of the seismic slip distribution parameters with the smoothing factor determined by the Eclectic Intersection curve method is better than that obtained with the L curve method. Moreover, both methods were used to determine the smoothing factor for the slip distribution inversions of the L'Aquila earthquake and the Taiwan Meinong earthquake, and the inversion results were compared and analyzed. The results show that the L'Aquila and Taiwan Meinong slip distributions are within the range reported by other scholars at home and abroad, and that, compared with the L curve method, the Eclectic Intersection curve method offers higher computational efficiency, no dependence on the degree of data fitting, and a more appropriate smoothing factor.
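The L-curve idea can be sketched on a small Tikhonov-regularized problem: sweep the smoothing factor and trace residual norm against model roughness. The corner-picking rule below (closest point to the origin of the normalized log-log curve) is a pragmatic stand-in, not the paper's Eclectic Intersection method, and the problem itself is invented.

```python
import numpy as np

rng = np.random.default_rng(7)

# Small ill-posed least-squares problem; Tikhonov regularization with a swept
# smoothing factor traces the trade-off ("L") curve between the data-fitting
# residual and the model roughness (here simply the solution norm)
A = np.vander(np.linspace(0.0, 1.0, 30), 12, increasing=True)  # ill-conditioned
x_true = rng.normal(size=12)
b = A @ x_true + 1e-3 * rng.normal(size=30)

lams = np.logspace(-10, 0, 60)
res = np.empty(lams.size)
rough = np.empty(lams.size)
for i, lam in enumerate(lams):
    x = np.linalg.solve(A.T @ A + lam * np.eye(12), A.T @ b)
    res[i] = np.linalg.norm(A @ x - b)
    rough[i] = np.linalg.norm(x)

# Simple corner pick: the point of the normalized log-log curve closest to
# the origin (a stand-in for curvature-based corner detection)
u = np.log(res / res.min())
v = np.log(rough / rough.min())
corner = int(np.argmin(u**2 + v**2))
print(lams[corner])
```

A stronger smoothing factor always lowers roughness and raises the residual, which is why the curve has its characteristic L shape and why the corner is taken as the compromise.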
Funding (SAR ship detection study): supported by the National Natural Science Foundation of China under contract No. 61471024, the National Marine Technology Program for Public Welfare under contract No. 201505002-1, and the Beijing Higher Education Young Elite Teacher Project under contract No. YETP0514.
Funding (complex networks study): supported by the National Natural Science Foundation of China (Grant No. 70871082) and the Shanghai Leading Academic Discipline Project (Grant No. S30504).
Funding: Financial support from the National Natural Science Foundation of China (No. 51221462).
Abstract: Partition curves for gravity separation have been studied extensively. However, existing models only simulate the density distribution within a single size fraction; they cannot predict the distribution of material according to a compound characteristic of density and size. To address this, an improved partition-curve model based on the cumulative normal distribution, distinct from the conventional cumulative-normal partition-curve model, is proposed in this paper. It simulates the density distribution across different size fractions by using a density-size compound index and merging the partition curves of different size fractions into a single partition curve. The feasibility of three compound indexes, namely a mass index, a settlement index, and a transformation index, was investigated, and specific forms of the improved model were proposed. The transformation index gives the best fitting results, with a fitting error of only 1.75 for the fitted partition curve.
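The conventional cumulative-normal partition curve that this paper improves on can be sketched as follows: the partition number is modelled as a normal CDF of density, parameterized by the separation density d50 and the probable error Ep. The partition data below are hypothetical, and this is the single-size-fraction baseline model, not the paper's compound density-size version.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def partition_curve(density, d50, ep):
    """Partition number (fraction reporting to sinks) as a cumulative
    normal; for a normal curve Ep = 0.6745 * sigma (half the d25-d75 gap)."""
    sigma = ep / 0.6745
    return norm.cdf(density, loc=d50, scale=sigma)

# Hypothetical data: mean fraction density vs. fraction reporting to sinks.
dens = np.array([1.3, 1.4, 1.5, 1.6, 1.7, 1.8, 1.9])
part = np.array([0.02, 0.08, 0.30, 0.55, 0.85, 0.96, 0.99])

(d50, ep), _ = curve_fit(partition_curve, dens, part, p0=[1.6, 0.1])
```

The compound-index approach in the abstract replaces the density axis with a density-size index so that one such curve covers all size fractions.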
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 50978114 and 50808093).
Abstract: In this paper, the tracking behavior of passenger cars was studied. Vehicle driving trajectory and driving direction were defined, and a classification of vehicle trajectory types along curves was developed. The statistical parameters of vehicle trajectory samples in free flow, together with their frequency and cumulative frequency curves, were obtained. Kolmogorov-Smirnov and chi-square tests were used to test the collected sample data against normal and gamma distributions, and the probability density functions were given. Finally, the dispersion between the vehicle trajectory random variable and the characteristic value of the cumulative frequency curve at each key cross-section of the curves was analyzed. The conclusions provide theoretical support for the rational optimization of curve widening, alignment design, and the management of opposing-flow conflicts.
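The distribution tests described above can be sketched with scipy: fit the candidate distributions, then run a Kolmogorov-Smirnov test against each fitted CDF. The lateral-position sample below is synthetic, standing in for the field trajectory data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical lateral-position sample standing in for field data (m).
sample = rng.normal(loc=1.8, scale=0.3, size=200)

# Fit each candidate distribution, then K-S test against the fitted CDF.
mu, sigma = stats.norm.fit(sample)
ks_norm = stats.kstest(sample, "norm", args=(mu, sigma))

a, loc, scale = stats.gamma.fit(sample)
ks_gamma = stats.kstest(sample, "gamma", args=(a, loc, scale))
```

Note that using fitted parameters in `kstest` biases the p-value optimistically; the paper's chi-square test is a common complementary check.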
Abstract: An investigation of the errors resulting from distribution curve transformations using six different methods was made on the basis of 61 sets of jig performance test data from coal preparation plants in China. The results indicate that the minimum error occurs when distribution curves are transformed while keeping the imperfection I constant. Generalized distribution curves are developed for jigs, and their applications are discussed.
Abstract: The characteristics of the density-yield curve of coal and the distribution curve of its products can be described by the median, the quartile deviation, the quartile measure of skewness, and the kurtosis-like measure K. On the basis of 16 groups of coal density composition data and the corresponding jigging stratification data from a pilot jig, a regression analysis was performed on the relationship between the characteristic values of the density curve and those of the distribution curve. The results show that: (1) the larger the skewness of the density curve, the larger the probable error (Ep) and the imperfection (I); (2) the larger the median of the density curve, the smaller the probable error and the imperfection; and (3) the characteristic values of the density curve have no influence on the kurtosis K of the distribution curve.
Abstract: Based on theoretical analysis and a study of other methods, the P-III (Pearson type III) curve is transformed into an incomplete gamma function by means of a mathematical transformation, from which the mathematical model of a fast, general-purpose algorithm is derived. Algorithm comparisons and practical applications demonstrate that the model is simple to implement, flexible in its solution process, highly general, faster to converge, and more accurate in its calculations, and that it can be applied in other settings.
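The reduction of the P-III curve to the incomplete gamma function can be sketched directly: a Pearson type III variable with shape a, location b, and scale s is a shifted gamma variable, so its quantiles follow from the inverse of the regularized incomplete gamma function P(a, x). The parameter values below are hypothetical.

```python
from scipy import stats
from scipy.special import gammaincinv

a, b, s = 2.5, 10.0, 4.0  # hypothetical fitted P-III parameters

def p3_quantile(prob, a, b, s):
    """Value not exceeded with probability `prob` under P-III(a, b, s):
    b + s * P^{-1}(a, prob), via the inverse incomplete gamma function."""
    return b + s * gammaincinv(a, prob)

# Cross-check against scipy's shifted gamma distribution.
q = p3_quantile(0.9, a, b, s)
q_ref = stats.gamma.ppf(0.9, a, loc=b, scale=s)
```

Because `gammaincinv` is evaluated by a fast, well-conditioned routine, the quantile is obtained without iterating over the frequency curve itself.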
Funding: This research was supported by the National Natural Science Foundation of China (Grant No. 41977239) and the Sichuan Science and Technology Program (Grant No. 2022YFS0539).
Abstract: The particle size distribution (PSD) curve is an important expression of the basic properties of soil. Characteristic parameters such as the median particle size D_(50) and the effective particle size D_(10), and certain combinations of characteristic parameters, represent only individual points of the PSD curve and fail to capture all of its information. In this paper, a fraction characteristic parameter F, which reflects the variation in fraction size along the PSD curve, was introduced based on grading entropy, and a new A-B-F three-parameter presentation method was proposed for a refined representation of the PSD curve. The newly constructed model not only better represents differences in the width of PSD curves but also has higher sensitivity than the A-B two-parameter model proposed by Lőrincz. For PSD curves with similar distributions, the new presentation method offers a higher degree of discrimination than the D_(50)-C_(u)-C_(c) three-parameter combination or the four-parameter combination G_(c) proposed by Arshad. Finally, the application of the new method to describing the spatial nonuniformity of depositions and to predicting soil hydraulic conductivity was discussed. The results can serve as a reference for the refined representation of PSD curves.
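For context, the conventional D_(50)-C_(u)-C_(c) parameters that the abstract compares against are read off the PSD curve by interpolation; a minimal sketch with hypothetical sieve data follows. This illustrates the baseline parameters only, not the grading-entropy A-B-F method itself.

```python
import numpy as np

# Hypothetical sieve data: particle diameter (mm) vs. percent passing.
diam = np.array([0.075, 0.15, 0.3, 0.6, 1.18, 2.36, 4.75])
passing = np.array([5.0, 12.0, 30.0, 55.0, 75.0, 90.0, 100.0])

def d_percent(p):
    """Diameter at which `p` percent of the soil passes, by log-linear
    interpolation of the PSD curve (standard geotechnical practice)."""
    return 10 ** np.interp(p, passing, np.log10(diam))

d10, d30, d50, d60 = (d_percent(p) for p in (10, 30, 50, 60))
cu = d60 / d10              # coefficient of uniformity C_u
cc = d30**2 / (d10 * d60)   # coefficient of curvature C_c
```

As the abstract argues, these three numbers pin down only a few points of the curve, which is why two soils with similar C_u and C_c can still have visibly different gradings.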
Abstract: The shape of the body and the ease of the garment are important elements in constructing garment style. However, the distribution of ease over a garment is uneven, and curves showing different cross-sections of garment styles with ease are unavailable in the literature. This information is crucial when a three-dimensional garment style is transformed into two-dimensional garment patterns. Therefore, several different X-line style garments were produced by 3D draping technology, and cubic splines were used to fit the cross-sectional curves at the BL and WL of the body and the garment. By comparing the cross-sections of the body and the garment, the ease at different locations can be derived. In addition, garments of 10 different dimensions and quantities of style ease were included in the study. By numerical operations on the discrete ease data at different locations, fitted functions of the ease distribution were obtained.
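Fitting a closed cross-section with a cubic spline and reading off the ease can be sketched as below. The elliptical "garment" and "body" contours and their dimensions are hypothetical stand-ins for the scanned cross-section data; the key detail is the periodic boundary condition, which keeps the fitted contour smooth where it closes.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical closed garment contour: radii sampled at equal angles
# around an ellipse, fitted with a periodic cubic spline.
theta = np.linspace(0, 2 * np.pi, 13)          # 12 intervals, endpoint repeated
r = np.sqrt((45 * np.cos(theta)) ** 2 + (35 * np.sin(theta)) ** 2)
r[-1] = r[0]                                   # enforce exact periodicity
garment = CubicSpline(theta, r, bc_type="periodic")

# Body contour sketched as a slightly smaller ellipse; the ease at any
# angle is the garment radius minus the body radius there.
body = lambda t: np.sqrt((42 * np.cos(t)) ** 2 + (33 * np.sin(t)) ** 2)
ease = garment(np.pi / 4) - body(np.pi / 4)
```

Evaluating the difference of the two fitted contours over a grid of angles gives the discrete ease data from which the ease-distribution functions are fitted.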
Funding: Supported by the Research Fund for the Doctoral Program of the Ministry of Education of China (Grant No. 20090041110031) and the National Natural Science Foundation of China (Grant No. 50575033).
Abstract: Gap debris, as a discharge product, is closely related to the machining process in electrical discharge machining (EDM). Much recent research has focused on the relationships among debris size, surface texture, removal rate, and machining stability. The study of the statistical distribution of debris size contributes to this research but currently remains superficial. To obtain the distribution law of debris particle size, a laser particle size analyzer (LPSA) combined with scanning electron microscopy (SEM) is used to analyze EDM debris size. First, a heated drying method is applied to obtain the debris particles. Second, the measuring range of the LPSA is determined as 0.5-100 μm by SEM observation, and the frequency distribution histogram and the cumulative frequency distribution scattergram of debris size are obtained with the LPSA. Third, according to the characteristics of the frequency distribution histogram, the lognormal, exponentially modified Gaussian (EMG), gamma, and Weibull distribution functions are fitted to the histogram. Finally, the distribution law of the debris size is obtained from the fitting results. Experiments with different discharge parameters are carried out on an EDM machine designed by the authors, with a red-copper tool electrode, an ANSI 1045 steel workpiece, and de-ionized water as the working fluid. The experimental results indicate that the debris sizes of all experimental samples obey the Weibull distribution. The obtained distribution law is important for all models established on the basis of debris particle size.
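The Weibull fit and goodness-of-fit check described above can be sketched with scipy; the debris-size sample below is synthetic (drawn from a known Weibull), standing in for the LPSA measurements, and the location is fixed at zero since sizes are positive.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Hypothetical debris-size sample (μm) drawn from a Weibull distribution.
sizes = stats.weibull_min.rvs(c=1.8, scale=12.0, size=500, random_state=rng)

# Maximum-likelihood fit of shape c and scale, holding the location at 0.
c, loc, scale = stats.weibull_min.fit(sizes, floc=0)

# Goodness of fit via Kolmogorov-Smirnov against the fitted CDF.
ks = stats.kstest(sizes, "weibull_min", args=(c, loc, scale))
```

The same `fit`/`kstest` pattern applies to the lognormal, gamma, and EMG candidates, letting the best-fitting family be selected by comparing the test statistics.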
Funding: Supported by the Shanghai Key Innovative Team of Cultural Heritage Conservation, the National Natural Science Foundation of China (Grant Nos. 41977214 and 41572284), and the Open Research Fund of the State Key Laboratory of Geomechanics and Geotechnical Engineering, Institute of Rock and Soil Mechanics, Chinese Academy of Sciences (Grant No. Z013008).
Abstract: It is well known that a close link exists between the soil-water retention curve (SWRC) and the pore size distribution (PSD). Theoretically, a mercury intrusion porosimetry (MIP) test simulates a soil drying path, and the test results can be used to deduce the SWRC (termed the SWRC_(MIP)). However, the SWRC_(MIP) does not include the effect of volume change, unlike the conventional SWRC that is directly determined by suction measurement or suction control techniques; for deformable soils, there is a significant difference between the two. In this study, a drying test was carried out on a reconstituted silty soil, and the volume change, suction, and PSD were measured on samples with different water contents. The change in the deduced SWRC_(MIP) and its relationship with the conventional SWRC were analyzed. The results showed that the volume change of the soil is the main reason for the difference between the conventional SWRC and the SWRC_(MIP). Based on the test results, a transformation model between the conventional SWRC and the SWRC_(MIP) was proposed, taking the soil state with no volume change as a reference. Comparison between the experimental and predicted SWRCs showed that the proposed model accounts well for the influence of soil volume change on water retention behavior.
Funding: We are grateful for financial support from the National Key Research and Development Program (Grant No. 2017YFA0701200) and the National Natural Science Foundation of China (No. 11704030). The author Z. X. Feng thanks Xu-Jia Wang and Rengmao Wu for valuable discussions.
Abstract: Current freeform illumination optical designs mostly focus on producing prescribed irradiance distributions on planar targets. Here, we aim to design freeform optics that generate a desired illumination on a curved target from a point source, which remains a challenge. We reduce the difficulties arising from the curved target by incorporating its varying z-coordinates into the iterative wavefront tailoring (IWT) procedure. The new IWT-based method is developed in the stereographic coordinate system with a special mesh transformation of the source domain, which suits light sources that emit into a half space, such as LEDs. The first example demonstrates that a rectangular flat-top illumination can be generated on an undulating surface by a spherical-freeform lens for a Lambertian source. The second example shows that our method can also produce a non-uniform irradiance distribution in a circular region of the undulating surface.
Funding: Supported by the fund of the Yunnan Education Committee (No. 9941072).
Abstract: The objective of this paper is to present a Bayesian approach based on the Kullback-Leibler divergence for assessing local influence in a growth curve model with a general covariance structure. Under certain prior distribution assumptions, the Kullback-Leibler divergence is used to measure the influence of a minor perturbation on the posterior distribution of the unknown parameter. This leads to a diagnostic statistic for detecting which responses are locally influential. As an application, the common covariance-weighted perturbation scheme is considered in detail.
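The influence measure above rests on the KL divergence between an unperturbed and a perturbed posterior. For two univariate normal posteriors it has a closed form, sketched below; the perturbed values are hypothetical and the paper's growth-curve setting is of course multivariate.

```python
import numpy as np

def kl_normal(mu0, s0, mu1, s1):
    """Closed-form KL(N(mu0, s0^2) || N(mu1, s1^2))."""
    return np.log(s1 / s0) + (s0**2 + (mu0 - mu1) ** 2) / (2 * s1**2) - 0.5

# Divergence between an unperturbed posterior N(0, 1) and a posterior
# shifted by a small perturbation of one response (hypothetical numbers);
# a large value flags that response as locally influential.
kl = kl_normal(0.0, 1.0, 0.2, 1.1)
```

The diagnostic statistic is obtained by evaluating this divergence (or its curvature) as the perturbation of each response tends to zero.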
Abstract: A regional analysis of design storms, defined as the expected rainfall intensity for a given storm duration and return period, is conducted to determine rainfall Intensity-Duration-Frequency (IDF) relationships. The ultimate purpose was to determine IDF curves for homogeneous regions identified in Botswana. Three homogeneous regions were identified based on topographic and rainfall characteristics using the K-means clustering algorithm. Using the mean annual rainfall and the 24 h annual maximum rainfall as indicators of rainfall intensity for each homogeneous region, IDF curves and maps of rainfall intensities for durations of 1 to 24 h and beyond were produced. The gamma and lognormal probability distribution functions provided estimates of rainfall depths for low and medium return periods (up to 100 years) at any location in each homogeneous region of Botswana.
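The frequency-analysis step behind an IDF curve can be sketched as follows: fit a lognormal distribution to an annual-maximum series and read the T-year design depth off the fitted quantile function. The 40-year series below is synthetic, standing in for the Botswana records, and the gamma case is analogous.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Hypothetical 24 h annual-maximum rainfall series (mm), 40 years.
annmax = stats.lognorm.rvs(s=0.4, scale=60.0, size=40, random_state=rng)

# Fit a lognormal (location fixed at 0) and read off the T-year design
# depth: the quantile with non-exceedance probability 1 - 1/T.
s, loc, scale = stats.lognorm.fit(annmax, floc=0)

def design_depth(T):
    return stats.lognorm.ppf(1.0 - 1.0 / T, s, loc, scale)

d10, d100 = design_depth(10), design_depth(100)
```

Repeating the fit for each duration and dividing depth by duration yields the intensity values that make up one region's IDF curves.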
Funding: A. R. A. Alanzi would like to thank the Deanship of Scientific Research at Majmaah University for financial support and encouragement.
Abstract: Recently, the Darna distribution has been introduced as a new lifetime distribution. The two-parameter Darna distribution is a mixture of the well-known gamma and exponential distributions. A manufacturer or an engineer conducts life testing to examine whether the quality level of products meets customer requirements, such as reliability or minimum lifetime. In this article, an attribute modified chain sampling inspection plan based on a time-truncated life test is proposed for items whose lifetime follows the Darna distribution. The plan parameters, including the sample size, the acceptance number, and the past lot results of the proposed sampling plan, are determined with a two-point approach considering the acceptable quality level (AQL) and the limiting quality level (LQL). The plan parameters and the corresponding operating characteristic functions of the new plan are provided in tabular form for various Darna distribution parameters, and several illustrative examples are presented. The usefulness of the proposed attribute modified chain sampling plan is investigated using two real failure-time datasets. The results indicate that the proposed sampling plan can reduce the sample size as the termination ratio increases for fixed values of the producer's risk and the acceptance number. The proposed attribute modified chain sampling inspection plan is therefore recommended to practitioners in the field.
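The two-point (AQL/LQL) design logic can be sketched with the operating characteristic of a plain single sampling plan under a truncated life test; this is a simplified baseline, not the paper's modified chain plan, and the n, c, and quality levels below are hypothetical.

```python
from scipy.stats import binom

def oc_single(p, n, c):
    """Operating characteristic of a single sampling plan: probability of
    accepting the lot when each item fails the truncated life test with
    probability p (accept if at most c of n items fail)."""
    return binom.cdf(c, n, p)

# A usable plan accepts good lots (failure probability at AQL) with high
# probability and bad lots (at LQL) with low probability.
pa_aql = oc_single(0.01, 50, 2)   # producer's side
pa_lql = oc_single(0.10, 50, 2)   # consumer's side
```

The two-point approach searches over (n, c) for the smallest plan with `pa_aql >= 1 - producer_risk` and `pa_lql <= consumer_risk`; under the Darna lifetime model, p is the probability that an item fails before the truncation time.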
Funding: Supported by the State Key Laboratory of Coal Resources and Safe Mining (No. SKLCRSM10X04), the National Natural Science Foundation of China (No. 21243006), the Foundation of the Ministry of Education of China (No. 02019), and the Priority Academic Program Development of Jiangsu Higher Education Institutions (No. SZBF2011-6-B35).
Abstract: In order to study the temperature field distribution in burnt surrounding rock and to determine the extents of burnt surrounding rock, the coal-wall coking cycle, and the heat influence zone in the underground coal gasification (UCG) stope, we derived, based on the Laplace transform and its inversion formula, the analytical temperature solution of one-dimensional unsteady heat conduction in multi-layer overlying strata under the first and fourth kinds of boundary conditions, and we also carried out a numerical simulation of two-dimensional unsteady heat conduction with COMSOL Multiphysics. The results show that when the boundary temperature of the surrounding rock decreases linearly because of the directional movement of the heat source in the UCG flame working face, the temperature in the surrounding rock first increases and then decreases with time; the peak of the temperature curve decreases gradually, and its position moves from the boundary into the surrounding rock. In the surrounding rock of the UCG stope, there is an envelope of the family of temperature curves. We analyzed the influence of the thermophysical parameters on these envelopes and propose taking the envelope as the basis for calculating the extents of burnt surrounding rock, the coal-wall coking cycle, and the heat influence zone. Finally, concrete numerical values are given by specifying these judgement standards and temperature thresholds, and they agree well with the field geophysical prospecting results.
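The rise-then-fall behavior of the interior temperature under a linearly cooling boundary can be reproduced with a minimal explicit finite-difference scheme; this is an illustrative single-layer sketch with hypothetical values, not the paper's multi-layer Laplace-transform solution.

```python
import numpy as np

# 1D explicit finite-difference sketch of unsteady heat conduction with a
# boundary temperature that cools linearly in time (hypothetical values).
alpha, L, nx = 1e-6, 1.0, 101          # diffusivity (m^2/s), depth (m), grid
dx = L / (nx - 1)
dt = 0.4 * dx**2 / alpha               # stable: r = alpha*dt/dx^2 <= 0.5
T = np.zeros(nx)                       # initial excess temperature = 0
probe = []                             # record temperature at x = 0.1 m
for n in range(4000):
    T[0] = max(0.0, 100.0 - 0.05 * n)  # boundary cools linearly to zero
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    probe.append(float(T[10]))

peak_step = int(np.argmax(probe))      # interior peak: rises, then falls
```

The recorded interior history shows exactly the behavior described in the abstract: the temperature at a fixed depth rises while the boundary is hot, peaks, and then decays; sweeping the probe depth traces out the envelope of the temperature-curve family.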
Funding: Supported by the National Natural Science Foundation of China (Nos. 41874001 and 41664001), the Support Program for Outstanding Youth Talents in Jiangxi Province (No. 20162BCB23050), and the National Key Research and Development Program (No. 2016YFB0501405).
Abstract: For determining the smoothing factor (also known as the regularization parameter) in co-seismic slip distribution inversion, the trade-off curve between model roughness and data-fitting residual is generally used (to distinguish it from the method proposed in this paper, this method is called the "L-curve" according to its shape). Based on the L-curve, the Eclectic Intersection curve is proposed in this paper as a new method for determining the smoothing factor. Simulated experiments show that the inversion accuracy of the seismic slip distribution parameters is better when the smoothing factor is determined by the Eclectic Intersection curve method than by the L-curve method. Moreover, the two methods are used to determine the smoothing factor in the slip distribution inversions of the L'Aquila earthquake and the Taiwan Meinong earthquake, respectively, and the inversion results are compared and analyzed. The analysis shows that the resulting slip distributions for the L'Aquila and Taiwan Meinong earthquakes are within the range reported by other scholars at home and abroad, and that, compared with the L-curve method, the Eclectic Intersection curve method offers high computational efficiency, no dependence on the degree of data fitting, and a more appropriate smoothing factor.
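The baseline L-curve that this paper builds on can be sketched with a generic Tikhonov-regularized linear inversion: sweep the smoothing factor and plot the residual norm against the model norm. The ill-posed system below is synthetic, standing in for the slip-inversion Green's function matrix, and this sketch shows the L-curve only, not the Eclectic Intersection method.

```python
import numpy as np

rng = np.random.default_rng(4)
# Hypothetical ill-posed problem G x = d: a random matrix with rapidly
# decaying column scales plays the role of the Green's function matrix.
G = rng.normal(size=(60, 40)) @ np.diag(1.0 / np.arange(1, 41) ** 2)
x_true = rng.normal(size=40)
d = G @ x_true + rng.normal(scale=1e-3, size=60)

# Sweep the smoothing factor and record both norms of the L-curve.
lams = np.logspace(-8, 2, 60)
res, mod = [], []
for lam in lams:
    # Tikhonov solution of (G^T G + lam^2 I) x = G^T d
    x = np.linalg.solve(G.T @ G + lam**2 * np.eye(40), G.T @ d)
    res.append(np.linalg.norm(G @ x - d))
    mod.append(np.linalg.norm(x))
# Plotting log(res) vs log(mod) gives the L shape; the corner (e.g. the
# point of maximum curvature) is the conventional smoothing-factor choice.
```

As the smoothing factor grows, the residual norm rises monotonically while the model norm falls, which is what produces the characteristic L shape whose corner the conventional method selects.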