The neutron spectrum unfolding by Bonner sphere spectrometer (BSS) is considered a complex multidimensional model, which requires complex mathematical methods to solve the first kind of Fredholm integral equation. To address the problems that the maximum likelihood expectation maximization (MLEM) algorithm is prone to local optima and that the particle swarm optimization (PSO) algorithm tends to produce unreasonable flight directions and step lengths for particles, leading to invalid iterations and reduced efficiency and accuracy, an improved PSO-MLEM algorithm, combining the PSO and MLEM algorithms, is proposed for neutron spectrum unfolding. A dynamic acceleration factor is used to balance the global and local search capabilities and improves the convergence speed and accuracy of the algorithm. First, the Monte Carlo method was used to simulate the BSS to obtain its response function and count rates. In the simulation of count rates, four reference spectra from IAEA Technical Report Series No. 403 were used as input parameters of the Monte Carlo method. The PSO-MLEM algorithm was used to unfold the neutron spectrum of the simulated data and was verified by comparing the unfolded spectrum with the reference spectrum. Finally, a 252Cf neutron source was measured by the BSS, and the PSO-MLEM algorithm was used to unfold the experimental neutron spectrum. Compared with maximum entropy deconvolution (MAXED), PSO, and MLEM, the PSO-MLEM algorithm has fewer parameters and automatically adjusts the dynamic acceleration factor to avoid local optima. The convergence speed of the PSO-MLEM algorithm is 1.4 times and 3.1 times that of the MLEM and PSO algorithms, respectively. Compared with PSO, MLEM, and MAXED, the correlation coefficients of the PSO-MLEM algorithm are increased by 33.1%, 33.5%, and 1.9%, and the relative mean errors are decreased by 98.2%, 97.8%, and 67.4%.
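The core MLEM update used inside such a hybrid can be sketched as follows. This is the generic MLEM iteration for a discretized detector response; the 3-sphere, 2-bin response matrix and iteration count are made up for illustration and are not the authors' implementation:

```python
import numpy as np

def mlem_unfold(R, counts, n_iter=200):
    """Generic MLEM iteration for count-rate unfolding.

    R      : (n_detectors, n_bins) response matrix
    counts : (n_detectors,) measured count rates
    """
    phi = np.ones(R.shape[1])           # flat initial spectrum guess
    sens = R.sum(axis=0)                # total sensitivity of each energy bin
    for _ in range(n_iter):
        pred = R @ phi                  # count rates predicted by current spectrum
        phi *= (R.T @ (counts / pred)) / sens   # multiplicative MLEM correction
    return phi

# tiny synthetic example: 3 "spheres", 2 energy bins (illustrative numbers)
R = np.array([[0.8, 0.2], [0.5, 0.5], [0.1, 0.9]])
true_phi = np.array([2.0, 3.0])
counts = R @ true_phi
phi = mlem_unfold(R, counts)
```

Because the update is multiplicative, a positive initial guess stays positive, which is one reason MLEM is popular for spectra; the local-optimum issue the abstract mentions arises for noisy, under-determined systems, which this consistent toy problem does not exhibit.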
A passive and multi-channel microwave sounder onboard the Chang'e-2 orbiter has successfully acquired microwave observations of the lunar surface and subsurface structure. Compared with the Chang'e-1 orbiter, the Chang'e-2 orbiter obtained more accurate and comprehensive microwave brightness temperature data, which are helpful for further research. Since there is a close relationship between microwave brightness temperature data and some related properties of the lunar regolith, such as the thickness, temperature and dielectric constant, precise and high-resolution brightness temperature data are necessary for such research. However, owing to the detection mechanism of the microwave sounder, the brightness temperature data acquired from the microwave sounder are weighted by the antenna radiation pattern, so the data are the convolution of the antenna radiation pattern with the lunar brightness temperature. In order to obtain the real lunar brightness temperature, a deconvolution method is needed. The aim of this paper is to solve the problem associated with performing deconvolution of the lunar brightness temperature. In this study, we introduce the maximum entropy method (MEM) to process the brightness temperature data and achieve excellent results. The paper mainly includes the following aspects: first, we introduce the principle of the MEM; second, through a series of simulations, the MEM is verified as an efficient deconvolution method; and third, the MEM is used to process the Chang'e-2 microwave data and the results are significant.
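The forward problem described here (observed temperatures are the true brightness temperatures convolved with the antenna pattern) and an entropy-regularized inversion can be sketched in a toy 1-D setting. The penalty formulation below is a simplification of the constrained MEM used in the paper, and the Gaussian pattern, grid size, and penalty weight are all illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize

# toy 1-D "antenna smoothing": observed data are the true brightness
# temperatures convolved with a normalized Gaussian antenna pattern
n = 40
x = np.arange(n)
true_T = 200.0 + 50.0 * (np.abs(x - 20) < 5)          # warm patch on a cold background
pattern = np.exp(-0.5 * ((x[:, None] - x[None, :]) / 2.0) ** 2)
K = pattern / pattern.sum(axis=1, keepdims=True)       # row-normalized convolution matrix
obs = K @ true_T

def neg_entropy_penalized(T, lam=1e3):
    """Negative Shannon entropy of the normalized profile plus a data-misfit penalty."""
    p = T / T.sum()
    return (p * np.log(p)).sum() + lam * np.sum((K @ T - obs) ** 2)

res = minimize(neg_entropy_penalized, obs.copy(),
               bounds=[(1e-6, None)] * n, method="L-BFGS-B")
T_mem = res.x
```

Among all positive profiles consistent with the data, this picks the one of (approximately) maximum entropy; the positivity bounds play the same role as in image-style MEM deconvolution.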
Channel avulsion is a natural phenomenon that occurs abruptly on alluvial river deltas and can affect channel stability. The causes of avulsion can generally be categorized as topography-driven and flood-driven factors. However, previous studies on avulsion thresholds usually focused on topography-driven factors, owing to the centurial or millennial avulsion timescales of most of the world's deltas, and neglected the impacts of flood-driven factors. In the current study, a novel demarcation equation including both driving factors was proposed, with the decadal timescale of avulsion in the Yellow River Estuary (YRE) being considered. In order to quantify the contributions of the different factors in each category, an entropy-based methodology was used to calculate their contributing weights. The factor with the highest weight in each category was then used to construct the demarcation equation, based on avulsion datasets associated with the YRE. An avulsion threshold was deduced from the demarcation equation and applied to the risk assessment of avulsion in the YRE. The results show that the two dominant factors are the geomorphic coefficient, representing the topography-driven factor, and the fluvial erosion intensity, representing the flood-driven factor; these were employed to define a two-dimensional mathematical space in which the demarcation equation can be obtained. The avulsion threshold derived from the equation was applied in the risk assessment of avulsion, and the threshold proposed in this study is more accurate than the existing thresholds.
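The entropy-based weighting step can be illustrated directly. The sketch below computes standard entropy weights for a small made-up indicator matrix (rows are observations, columns are candidate factors); a constant indicator carries no information and receives zero weight, which is exactly how the method ranks factors:

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: columns = indicators, rows = observations (X > 0)."""
    P = X / X.sum(axis=0)                      # share of each observation per indicator
    n = X.shape[0]
    # Shannon entropy of each indicator, normalized by log(n); 0*log(0) treated as 0
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(P > 0, np.log(P), 0.0)
    e = -(P * logs).sum(axis=0) / np.log(n)
    d = 1.0 - e                                # degree of diversification
    return d / d.sum()                         # weights summing to 1

# illustrative data: indicator 1 varies, indicator 2 is constant
X = np.array([[1.0, 10.0], [2.0, 10.0], [3.0, 10.0]])
w = entropy_weights(X)
```

Here the constant second column attains maximal entropy, so its diversification degree is zero and the whole weight goes to the first column.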
The entropy split method is based on the physical entropies of the thermally perfect gas Euler equations. The Euler flux derivatives are approximated as a sum of a conservative portion and a non-conservative portion, in conjunction with the summation-by-parts (SBP) difference boundary closure of (Gerritsen and Olsson in J Comput Phys 129:245-262, 1996; Olsson and Oliger in RIACS Tech Rep 94.01, 1994; Yee et al. in J Comput Phys 162:33-81, 2000). Sjögreen and Yee (J Sci Comput) recently proved that the entropy split method is entropy conservative and stable. Standard high-order spatial central differencing as well as high-order central spatial dispersion relation preserving (DRP) spatial differencing is part of the entropy stable split methodology framework. The current work is our first attempt to derive a high-order conservative numerical flux for the non-conservative portion of the entropy splitting of the Euler flux derivatives. Due to the construction, this conservative numerical flux requires a higher operations count and is less stable than the original semi-conservative split method. However, the Tadmor entropy conservative (EC) method (Tadmor in Acta Numerica 12:451-512, 2003) of the same order requires a higher operations count than the new construction. Since the entropy split method is a semi-conservative skew-symmetric splitting of the Euler flux derivative, a modified nonlinear filter approach of (Yee et al. in J Comput Phys 150:199-238, 1999, J Comput Phys 162:33-81, 2000; Yee and Sjögreen in J Comput Phys 225:910-934, 2007, High Order Filter Methods for Wide Range of Compressible Flow Speeds, Proceedings of ICOSAHOM09, June 22-26, Trondheim, Norway, 2009) is proposed in conjunction with the entropy split method as the base method for problems containing shock waves. Long-time integration of 2D and 3D test cases is included to show the comparison of these new approaches.
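The SBP property referred to above can be checked concretely for the standard second-order central operator: with norm matrix H and difference matrix Q, the boundary closure satisfies Q + Qᵀ = B = diag(−1, 0, …, 0, 1), which is what enables discrete integration-by-parts (and hence entropy stability) arguments. A minimal sketch, independent of the paper's high-order operators:

```python
import numpy as np

n, h = 8, 1.0
# second-order SBP first-derivative operator D = H^{-1} Q
H = h * np.diag([0.5] + [1.0] * (n - 2) + [0.5])        # diagonal norm (quadrature) matrix
Q = 0.5 * (np.diag(np.ones(n - 1), 1) - np.diag(np.ones(n - 1), -1))
Q[0, 0], Q[-1, -1] = -0.5, 0.5                           # boundary closure
D = np.linalg.inv(H) @ Q

B = np.zeros((n, n))
B[0, 0], B[-1, -1] = -1.0, 1.0                           # boundary "surface" matrix
```

The operator is exact for linear functions and mimics integration by parts: uᵀH(Dv) + (Du)ᵀHv = uᵀBv for all grid functions u, v.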
According to the World Health Organization (WHO), cancer is the leading cause of death for children in low- and middle-income countries. Around 400,000 children are diagnosed with this illness each year, and their survival rate depends on the country in which they live. In this article, we present a Pythagorean fuzzy model that may help doctors identify the most likely type of cancer in children at an early stage by taking into account the symptoms of different types of cancer. The Pythagorean fuzzy decision-making techniques that we utilize are Pythagorean Fuzzy TOPSIS, Pythagorean Fuzzy Entropy (PF-Entropy), and Pythagorean Fuzzy Power Weighted Geometric (PFPWG). Our model is fed with nineteen symptoms, and it diagnoses the risk of eight types of cancer in children. We develop an algorithm for each method and calculate its complexity. Additionally, we work through an example to give a clear understanding of our model. We also compare the final results of various tests, which supports the validity of this study.
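The basic object in all three techniques is the Pythagorean fuzzy number, a pair (μ, ν) of membership and non-membership degrees constrained by μ² + ν² ≤ 1, compared via the standard score and accuracy functions. A minimal sketch (the example values are illustrative, not taken from the paper):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PFN:
    """Pythagorean fuzzy number: membership mu and non-membership nu
    with mu**2 + nu**2 <= 1 (a weaker constraint than mu + nu <= 1)."""
    mu: float
    nu: float

    def __post_init__(self):
        if self.mu ** 2 + self.nu ** 2 > 1.0 + 1e-12:
            raise ValueError("not a valid Pythagorean fuzzy number")

    def score(self) -> float:       # standard score function mu^2 - nu^2
        return self.mu ** 2 - self.nu ** 2

    def accuracy(self) -> float:    # standard accuracy function mu^2 + nu^2
        return self.mu ** 2 + self.nu ** 2

# (0.8, 0.5) is Pythagorean-valid even though 0.8 + 0.5 > 1 rules it out
# as an intuitionistic fuzzy number — this extra room is the point of PFNs
a = PFN(0.8, 0.5)
```

Rankings in PF-TOPSIS and PFPWG-style aggregation ultimately reduce to comparisons of such score (and, on ties, accuracy) values.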
To investigate the spatiotemporal variations in the mixed layer depth (MLD) in the Arctic basins, a new criterion to determine the MLD, called the improved maximum angle method (IMAM), was developed. A total of 45123 potential density profiles collected using Ice-Tethered Profilers (ITPs) in the Arctic basins during 2005-2021 were used to demonstrate the method's effectiveness. By comparing the results obtained by the fixed threshold method (FTM), the percentage threshold method (PTM), and the maximum gradient method (MGM) for profiles in the Canada Basin, Makarov Basin, and Eurasian Basin, we determined that the quality index of the IMAM (1.0 for perfect identification of the MLD) reached 0.94, which is much greater than those of the other criteria. Moreover, two types of density profiles were identified based on the mixed layer development stage. The MLDs of the typical profiles determined using the IMAM were found to be more consistent with the original definition. Using the new mixed layer criterion, the seasonal variations and regional differences in the MLD in the Arctic basins were analyzed. Spatially, the summer and winter MLDs in the Canada Basin were shallower (13.55 m in summer, 26.76 m in winter) than those in the Makarov (29.51 m in summer, 49.08 m in winter) and Eurasian (20.36 m in summer, 46.81 m in winter) basins, owing to the stable stratification in the upper ocean and the consequently small effects of dynamic and thermodynamic processes (wind-driven stirring and brine rejection) in the Canada Basin. Seasonally, in the three Arctic basins, the average MLD was shallowest (22.77 m) in summer; it deepened through autumn and reached a winter maximum (41.12 m).
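For context, the simplest of the baseline criteria, the fixed threshold method (FTM), is easy to state in code: the MLD is the depth where potential density first exceeds a near-surface reference value by a fixed increment. The threshold (0.03 kg/m³) and reference depth (10 m) below are common choices in the literature, not necessarily those used in this paper, and the profile is synthetic:

```python
import numpy as np

def mld_fixed_threshold(depth, sigma, d_sigma=0.03, ref_depth=10.0):
    """Fixed threshold method (FTM): MLD = first depth where potential
    density exceeds the near-surface reference value by d_sigma (kg/m^3)."""
    ref = np.interp(ref_depth, depth, sigma)        # reference density at ref_depth
    exceed = (depth > ref_depth) & (sigma > ref + d_sigma)
    if not exceed.any():
        return depth[-1]                            # mixed to the bottom of the profile
    i = np.argmax(exceed)                           # index of first exceedance
    # linear interpolation between the two bracketing samples
    return np.interp(ref + d_sigma, sigma[i - 1:i + 1], depth[i - 1:i + 1])

# synthetic profile: uniform layer to 20 m, then a pycnocline
depth = np.array([0.0, 5.0, 10.0, 15.0, 20.0, 25.0, 30.0])
sigma = np.array([24.0, 24.0, 24.0, 24.0, 24.0, 24.5, 25.0])
mld = mld_fixed_threshold(depth, sigma)
```

Angle-based criteria like the IMAM instead look at the geometry of the profile (the angle between fitted segments above and below a candidate depth), which is why they are less sensitive to the arbitrary threshold value.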
Excellent results are obtained in structure analysis with few phases of structure factors by the maximum-entropy method (MEM) for CaGaN, PbCO3, and ReBe22 single crystals. The computation time and memory space are minimized by symmetry operations so that structure analysis by the MEM can be carried out with a personal computer.
A method which is especially suitable for microcomputer calculation of the true orientation distribution function (ODF) according to the maximum-entropy estimate is proposed for hexagonal-system polycrystalline materials with physical symmetry. The resultant computational software system has also been designed and first implemented on a PANAFACOM-U1200 microcomputer on line with the X-ray diffractometer D/max-3A. The simulated calculation shows that the method is concise, practical, and easily popularized, and the results obtained are trustworthy.
Iteration methods for the maximum likelihood estimator and their convergence are discussed in this paper. We study the Gauss-Newton method and give a set of sufficient conditions for the convergence of asymptotic numerical stability. The modified Gauss-Newton method is also studied, and sufficient conditions for its convergence are presented. Two numerical examples are given to illustrate our results.
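For a single-parameter nonlinear least-squares model, one Gauss-Newton iteration solves the linearized problem Jδ = r for the parameter update, where J is the Jacobian of the model and r the residual vector. A minimal sketch on a noise-free exponential model (illustrative, not one of the paper's examples):

```python
import numpy as np

# Gauss-Newton for nonlinear least squares: fit y = exp(a * t)
t = np.linspace(0.0, 1.0, 20)
a_true = 1.5
y = np.exp(a_true * t)          # noise-free data, so the residual vanishes at a_true

a = 0.5                          # deliberately poor starting guess
for _ in range(20):
    r = y - np.exp(a * t)                # current residuals
    J = (t * np.exp(a * t))[:, None]     # Jacobian of the model w.r.t. a
    a += np.linalg.lstsq(J, r, rcond=None)[0].item()   # solve J*delta ~= r
```

On zero-residual problems like this one, Gauss-Newton converges quadratically near the solution; with noisy data the neglected second-order residual terms are what the convergence conditions in the paper control.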
It is important to know the maximum solid solubility (Cmax) of various transition metals in a metal when one designs multicomponent alloys. There have been several semi-empirical approaches to qualitatively predict Cmax, such as the Darken-Gurry (D-G) theorem, the Miedema-Chelikowsky (M-C) theorem, the electron concentration rule, and the bond parameter rule. However, they are not particularly valid for the prediction of Cmax. A new method to predict the Cmax of different transition metals in metallic Ti was developed on the basis of the energetics of alloys; it can be described as a semi-empirical equation using the atomic parameters, i.e., electronegativity difference, atomic diameter, and electron concentration. It is shown that the present method can be used to explain and deduce the D-G theorem, the M-C theorem, and the electron concentration rule.
In this paper, survival data analysis is realized by applying Generalized Entropy Optimization Methods (GEOM). It is known that all statistical distributions can be obtained by choosing corresponding moment functions. However, Generalized Entropy Optimization Distributions (GEOD), which are obtained on the basis of the Shannon measure and supplementary optimization with respect to characterizing moment functions, represent the given statistical data more exactly. For this reason, survival data analysis by GEOD acquires a new significance. In this research, the data of the life table for engine failure data (1980) are examined. The performances of the GEOD are established by the Chi-Square criterion, the Root Mean Square Error (RMSE) criterion, the Shannon entropy measure, and the Kullback-Leibler measure. Comparison of the GEOD with each other in these different senses shows that among these distributions (MinMaxEnt)4 is better in the senses of the Shannon measure and the Kullback-Leibler measure. It is shown that (MinMaxEnt)3 ((MaxMaxEnt)4) is more suitable for the statistical data among (MinMaxEnt)m, m = 1, 2, 3, 4 ((MaxMaxEnt)m, m = 1, 2, 3, 4). Moreover, (MinMaxEnt)3 is better for the statistical data than (MaxMaxEnt)4 in the sense of the RMSE criterion. According to the obtained distribution (MinMaxEnt)3 ((MaxMaxEnt)4), estimators of the probability density function f̂(t), the cumulative distribution function F̂(t), the survival function Ŝ(t), and the hazard rate ĥ(t) are evaluated and graphically illustrated. The results are obtained using the statistical software MATLAB.
Based on the maximum-entropy (ME) principle, a new power spectral estimator for random waves is derived in the form S(ω) = (a/8)H̄²(2π/ω)^(d+2) exp[−b(2π/ω)^n], by solving a variational problem subject to some quite general constraints. This robust method is comprehensive enough to describe wave spectra even in extreme wave conditions and is superior to the periodogram method, which is not suitable for processing comparatively short or intensively unsteady signals because of its tremendous boundary effect and some inherent defects of the FFT. The newly derived method for spectral estimation works fairly well even when the sample data sets are very short and unsteady, and the reliability and efficiency of this spectral estimator have been preliminarily proved.
The maximum-entropy method (MEM) for determining the complete ODF (orientation distribution function), accompanied by the equal-volume partitioning technique for quantitative texture analysis, was first tested in analysing the texture of a commercial-purity titanium strip. The experimentally measured results indicate that the rolling planes of most grains in this sample are parallel to the {1010} and the {1210} with about ±10° spread, while the rolling directions are distributed nearly uniformly; their volume fractions are 19.46% and 18.70% respectively. Besides, there are still two weaker texture components, (7526)[1544] and (1105)[2311], with 3.24% and 4.17% respectively.
Compositional data, such as relative information, is a crucial aspect of machine learning and other related fields. It is typically recorded as closed data, i.e., data summing to a constant such as 100%. The statistical linear model is the most used technique for identifying hidden relationships between underlying random variables of interest. However, data quality is a significant challenge in machine learning, especially when missing data are present. The linear regression model is a commonly used statistical modeling technique for finding relationships between variables of interest in various applications. When estimating linear regression parameters, which are useful for tasks such as future prediction and partial-effects analysis of independent variables, maximum likelihood estimation (MLE) is the method of choice. However, many datasets contain missing observations, which can lead to costly and time-consuming data recovery. To address this issue, the expectation-maximization (EM) algorithm has been suggested for situations involving missing data. The EM algorithm iteratively finds the best estimates of parameters in statistical models that depend on unobserved variables or data, via maximum likelihood or maximum a posteriori (MAP) estimation. Using the current estimate as input, the expectation (E) step constructs the expected log-likelihood function; the maximization (M) step then finds the parameters that maximize this expected log-likelihood. This study examined how well the EM algorithm performed on a simulated compositional dataset with missing observations, using both robust least squares and ordinary least squares regression techniques. The efficacy of the EM algorithm was compared with two alternative imputation techniques, k-Nearest Neighbor (k-NN) and mean imputation, in terms of Aitchison distances and covariance.
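The E/M alternation described above reduces, for a single regression with missing responses, to: fill the missing responses with the current predictions (E-step), then refit by least squares on the completed data (M-step). A sketch on simulated data (the model, noise level, and missingness pattern are all illustrative, not the study's dataset):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
y = 2.0 + 3.0 * x + rng.normal(scale=0.1, size=n)   # true model: y = 2 + 3x + noise
miss = np.zeros(n, dtype=bool)
miss[:40] = True                                     # 20% of responses missing

# EM-style iteration: E-step fills missing y with current predictions,
# M-step refits the regression by least squares on the completed data
y_work = np.where(miss, y[~miss].mean(), y)          # initialize with the observed mean
A = np.column_stack([np.ones(n), x])
for _ in range(50):
    beta, *_ = np.linalg.lstsq(A, y_work, rcond=None)   # M-step
    y_work[miss] = A[miss] @ beta                        # E-step
```

At the fixed point, the filled-in rows contribute zero residual, so the estimate coincides with the fit on the observed rows alone; with covariates missing as well (the multivariate case the study considers), the E-step must impute conditional expectations of the covariates too.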
The current theory in NF EN 1995-1-1/NA of Eurocode 5, which is based on maximum deflection, has been validated on softwoods. Therefore, this theory is not adapted for slender glulam beam columns made of tropical hardwood species from the Congo Basin. This maximum deflection is caused by a set of loads applied to the structure. However, Eurocode 5 does not specify how to predict this deflection in the case of long-term load for such structures. This can be done by studying the load-displacement (P-Δ) behaviour of these structures while taking second-order effects into account. To reach this goal, a nonlinear analysis has been performed on a three-dimensional beam column fixed at both ends. Since conducting experimental investigations on large-span structural products is time-consuming and expensive, especially in developing countries, a numerical model has been implemented using the Newton-Raphson method to predict the load-displacement (P-Δ) curve of a slender glulam beam column made of tropical hardwood species. On one hand, the beam has been analyzed without a wood connection; on the other hand, it has been analyzed with a bolted wood connection and a slotted-in steel plate. The load cases considered include self-weight and a uniformly applied long-term load. Combinations of serviceability limit states (SLS) and ultimate limit states (ULS) have also been considered, among other factors. The finite-element software RFEM 5 has been used to implement the model. The results showed that the use of steel can reduce displacement by 20.96%. Additionally, compared with the maximum deflection provided by Eurocode 5 for softwoods, hardwoods can exhibit an increase of 85.63%. By harnessing the plastic resistance of steel, the bending resistance of wood can be increased by 32.94%.
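The Newton-Raphson iteration underlying such a nonlinear analysis can be shown on a one-degree-of-freedom stand-in: a hardening load-displacement law F(u) = k₁u + k₃u³ solved for the displacement under a load P. The stiffness and load values are illustrative, not from the RFEM 5 model:

```python
# Newton-Raphson for a one-DOF nonlinear load-displacement relation
# F(u) = k1*u + k3*u**3 = P  (a hardening spring as a stand-in for P-Delta behaviour)
k1, k3, P = 10.0, 4.0, 30.0   # illustrative stiffnesses and load

u = 0.0
for _ in range(50):
    f = k1 * u + k3 * u ** 3 - P      # residual (out-of-balance force)
    df = k1 + 3 * k3 * u ** 2         # tangent stiffness
    u -= f / df                       # Newton update
```

In a finite-element setting the scalars become the global residual vector and tangent stiffness matrix, but the iteration (residual, tangent, update) is the same.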
High-entropy materials (HEMs) have better mechanical, thermal, and electrical properties than traditional materials due to their special "high-entropy effect". The performance of high-entropy ceramics can also be adjusted by tuning the proportions of the raw materials, and such materials have broad application prospects in many fields. This article provides a review of the high-entropy effect, preparation methods, and main applications of high-entropy ceramic materials, with a particular focus on relevant research on high-entropy perovskite ceramics. It is expected to serve as a reference for promoting scientific research and the further large-scale application of high-entropy ceramic materials.
Considering the difficulty of the fuzzy synthetic evaluation method in handling multiple factors and its ignorance of the relationships among evaluated objects, a new weighting process using the entropy method was introduced. This improved method for determining the weights of the evaluation indicators was applied to water quality assessment in the Three Gorges reservoir area. The results showed that this method is favorable for fuzzy synthetic evaluation when there is more than one evaluated object: one calculation is enough for every monitoring point. Compared with the original evaluation method, the method greatly simplifies the fuzzy synthetic evaluation process, and the evaluation results are more reasonable.
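Once weights are available (e.g. from the entropy method), the fuzzy synthetic evaluation itself is a weighted composition of the membership matrix followed by the maximum-membership principle. A sketch with a made-up weight vector and membership matrix (three indicators, three quality grades):

```python
import numpy as np

# weighted-average fuzzy synthetic evaluation:
# w = weights over indicators, R = membership matrix (indicators x grades)
w = np.array([0.5, 0.3, 0.2])
R = np.array([[0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2],
              [0.7, 0.2, 0.1]])

b = w @ R                   # membership of the evaluated object in each grade
grade = int(np.argmax(b))   # maximum-membership principle picks the grade
```

Here b = [0.31, 0.46, 0.23], so the object is assigned to the middle grade; with entropy-derived weights the same composition is computed once per monitoring point.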
A new method based on the maximum entropy principle for reconstructing the parton distribution function (PDF) from its moments is proposed. Unlike traditional methods, the new method does not require any artificial assumptions. For the case of moments with errors, we introduce Gaussian functions to soften the moment constraints. Through a series of tests, the effectiveness and reconstruction efficiency of this new method are evaluated comprehensively, demonstrating that the method is reasonable and can achieve high-quality reconstruction with at least the first six moments as input. Finally, we select a set of lattice quantum chromodynamics (QCD) results for the moments as input and provide reasonable reconstruction results for the pion.
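The reconstruction idea can be sketched in miniature: posit a maximum-entropy (exponential-family) form for the density and fit its multipliers so the computed moments match the input moments. The toy below recovers the uniform density on [0, 1] from its first two moments; the exponential-family ansatz is standard maximum-entropy practice, but the fitting details are illustrative, not the authors' algorithm:

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.optimize import minimize

x = np.linspace(0.0, 1.0, 400)
target_moments = np.array([0.5, 1.0 / 3.0])   # <x> and <x^2> of the uniform density

def reconstruct(moments, n_lam):
    """Fit f(x) ∝ exp(sum_k lam_k x^k) so its moments match the targets."""
    def unnorm(lams):
        return np.exp(sum(l * x ** (k + 1) for k, l in enumerate(lams)))
    def objective(lams):
        f = unnorm(lams)
        f = f / trapezoid(f, x)                               # normalize to a density
        mom = np.array([trapezoid(f * x ** (k + 1), x) for k in range(n_lam)])
        return np.sum((mom - moments) ** 2)                   # moment mismatch
    res = minimize(objective, np.zeros(n_lam), method="Nelder-Mead")
    f = unnorm(res.x)
    return f / trapezoid(f, x)

f = reconstruct(target_moments, 2)
```

For moments with errors, the hard mismatch above would be replaced by Gaussian likelihood terms, which is the "softening" the abstract refers to; realistic PDF reconstructions also need endpoint behaviour built into the ansatz.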
Funding (PSO-MLEM neutron spectrum unfolding): supported by the National Natural Science Foundation of China (No. 42127807), the Sichuan Science and Technology Program (No. 2020YJ0334), and the Sichuan Science and Technology Breeding Program (No. 2022041).
Funding (Chang'e-2 microwave brightness temperature deconvolution): supported by the National Natural Science Foundation of China.
Funding (channel avulsion threshold in the Yellow River Estuary): financially supported by the National Key Research and Development Program of China (Grant No. 2023YFC3200026) and the National Natural Science Foundation of China (Grant No. U2243238).
Funding (entropy split method): support from the NASA TTT/RCA program for the second author is gratefully acknowledged.
Funding: This work was funded through the General Research Project under Grant No. R.G.P.2/48/43.
Abstract: According to the World Health Organization (WHO), cancer is the leading cause of death for children in low- and middle-income countries. Around 400,000 children are diagnosed with this illness each year, and their survival rate depends on the country in which they live. In this article, we present a Pythagorean fuzzy model that may help doctors identify the most likely type of cancer in children at an early stage by taking into account the symptoms of different types of cancer. The Pythagorean fuzzy decision-making techniques that we utilize are Pythagorean Fuzzy TOPSIS, Pythagorean Fuzzy Entropy (PF-Entropy), and Pythagorean Fuzzy Power Weighted Geometric (PFPWG). Our model is fed with nineteen symptoms, and it diagnoses the risk of eight types of cancer in children. We develop an algorithm for each method and calculate its complexity. Additionally, we work through an example to give a clear understanding of our model. We also compare the final results of the various tests, which supports the validity of this study.
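The ranking step in such models can be illustrated with classical (crisp) TOPSIS; the paper's Pythagorean fuzzy variant replaces each crisp entry with a membership/non-membership pair, which this minimal sketch omits. The decision matrix, weights, and criterion types below are invented for illustration.

```python
import numpy as np

def topsis(X, weights, benefit):
    """Classical TOPSIS ranking: vector-normalize the decision matrix,
    weight it, measure each alternative's distance to the ideal and
    anti-ideal solutions, and score by relative closeness."""
    V = X / np.sqrt((X ** 2).sum(axis=0)) * weights
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.sqrt(((V - ideal) ** 2).sum(axis=1))
    d_neg = np.sqrt(((V - anti) ** 2).sum(axis=1))
    return d_neg / (d_pos + d_neg)  # higher score = better alternative

# 3 alternatives x 2 criteria; both criteria are "benefit" (higher is better)
X = np.array([[7.0, 9.0],
              [8.0, 7.0],
              [9.0, 6.0]])
scores = topsis(X, weights=np.array([0.5, 0.5]), benefit=np.array([True, True]))
print(np.argmax(scores))  # index of the best-ranked alternative
```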
Funding: Supported by the National Key R&D Program of China (Nos. 2018YFA0605903, 2019YFC1509101), the National Natural Science Foundation of China (No. 41976218), and the Fundamental Research Funds for the Central Universities (No. 202165005).
Abstract: To investigate the spatiotemporal variations in the mixed layer depth (MLD) in the Arctic basins, a new criterion for determining the MLD, called the improved maximum angle method (IMAM), was developed. A total of 45123 potential density profiles collected using Ice-Tethered Profilers (ITPs) in the Arctic basins during 2005-2021 were used to demonstrate the method's effectiveness. By comparing the results obtained by the fixed threshold method (FTM), the percentage threshold method (PTM), and the maximum gradient method (MGM) for profiles in the Canada Basin, Makarov Basin, and Eurasian Basin, we determined that the quality index of the IMAM (1.0 for perfect identification of the MLD) reached 0.94, which is much greater than those of the other criteria. Moreover, two types of density profiles were identified based on the development stage of the mixed layer, and the MLDs of the typical profiles determined using the IMAM were found to be more consistent with the original definition. Using the new mixed layer criterion, the seasonal variations and regional differences in the MLD in the Arctic basins were analyzed. Spatially, the summer and winter MLDs in the Canada Basin were shallower (13.55 m in summer, 26.76 m in winter) than those in the Makarov (29.51 m in summer, 49.08 m in winter) and Eurasian (20.36 m in summer, 46.81 m in winter) basins, owing to the stable stratification in the upper ocean and the consequently small effects of dynamic and thermodynamic processes (wind-driven stirring and brine rejection) in the Canada Basin. Seasonally, in the three Arctic basins, the average MLD was shallowest (22.77 m) in summer; it deepened through autumn and reached a winter maximum (41.12 m).
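The "maximum angle" idea underlying IMAM can be sketched in a simplified form (the paper's improvements are not reproduced here): at each candidate depth, fit short line segments to the density profile above and below, and take the depth where the angle between the two fitted segments peaks. The profile and window size below are synthetic assumptions.

```python
import numpy as np

def mld_maximum_angle(depth, density, window=5):
    """Simplified maximum angle method for mixed layer depth (MLD):
    at each depth, fit a line to the density points in a window above
    and a window below, and return the depth where the angle between
    the two fitted segments is largest."""
    n = len(depth)
    best_angle, best_k = -np.inf, window
    for k in range(window, n - window):
        # slope of density vs. depth just above and just below depth k
        s_above = np.polyfit(depth[k - window:k + 1], density[k - window:k + 1], 1)[0]
        s_below = np.polyfit(depth[k:k + window + 1], density[k:k + window + 1], 1)[0]
        angle = abs(np.arctan(s_below) - np.arctan(s_above))
        if angle > best_angle:
            best_angle, best_k = angle, k
    return depth[best_k]

# synthetic profile: uniform density down to 25 m, then a linear pycnocline
z = np.arange(0.0, 60.0, 1.0)
rho = np.where(z <= 25.0, 1025.0, 1025.0 + 0.05 * (z - 25.0))
print(mld_maximum_angle(z, rho))  # near 25 m
```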
Abstract: Excellent results are obtained in structure analysis with few phases of the structure factors by the maximum-entropy method (MEM) for CaGaN, PbCO3, and ReBe22 single crystals. The computation time and memory space are minimized by symmetry operations, so that structure analysis by the MEM can be carried out on a personal computer.
Abstract: A method especially suitable for microcomputer calculation of the true orientation distribution function (ODF) according to the maximum-entropy estimate is proposed for hexagonal-system polycrystalline materials with physical symmetry. The corresponding computational software system has also been designed and first implemented on a PANAFACOM-U1200 microcomputer connected online to the D/max-3A X-ray diffractometer. Simulated calculations show that the method is concise, practical, and easily popularized, and that the results obtained are trustworthy.
Abstract: Iteration methods for the maximum likelihood estimator and their convergence are discussed in this paper. We study the Gauss-Newton method and give a set of sufficient conditions for the convergence of asymptotic numerical stability. The modified Gauss-Newton method is also studied, and sufficient conditions for its convergence are presented. Two numerical examples are given to illustrate our results.
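The plain Gauss-Newton iteration discussed above can be sketched as follows; this is a generic one-parameter nonlinear least-squares example, not the paper's modified variant or its convergence analysis.

```python
import numpy as np

def gauss_newton(residual, jacobian, theta0, tol=1e-10, max_iter=50):
    """Basic Gauss-Newton iteration for nonlinear least squares / ML
    estimation: repeatedly solve the linearized normal equations
    (J^T J) step = -J^T r and update the parameter vector."""
    theta = np.asarray(theta0, dtype=float)
    for _ in range(max_iter):
        r = residual(theta)
        J = jacobian(theta)
        step = np.linalg.solve(J.T @ J, -J.T @ r)
        theta = theta + step
        if np.linalg.norm(step) < tol:
            break
    return theta

# toy example: fit y = exp(b * x) to noiseless data with true b = 0.3
x = np.linspace(0.0, 2.0, 20)
y = np.exp(0.3 * x)
res = lambda th: np.exp(th[0] * x) - y
jac = lambda th: (x * np.exp(th[0] * x)).reshape(-1, 1)
b_hat = gauss_newton(res, jac, [0.0])
print(b_hat)  # converges to approximately 0.3
```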
Abstract: It is important to know the maximum solid solubility (Cmax) of various transition metals in a metal when designing multicomponent alloys. There have been several semi-empirical approaches to qualitatively predict Cmax, such as the Darken-Gurry (D-G) theorem, the Miedema-Chelikowsky (M-C) theorem, the electron concentration rule, and the bond parameter rule. However, they are not particularly reliable for the prediction of Cmax. A new method, developed on the basis of the energetics of alloys, is presented to predict the Cmax of different transition metals in Ti; it can be described as a semi-empirical equation using the atomic parameters, i.e., electronegativity difference, atomic diameter, and electron concentration. It is shown that the present method can be used to explain and deduce the D-G theorem, the M-C theorem, and the electron concentration rule.
Abstract: In this paper, survival data analysis is carried out by applying Generalized Entropy Optimization Methods (GEOM). It is known that all statistical distributions can be obtained by choosing corresponding moment functions. However, Generalized Entropy Optimization Distributions (GEOD), which are obtained on the basis of the Shannon measure and supplementary optimization with respect to the characterizing moment functions, represent the given statistical data more exactly. For this reason, survival data analysis by GEOD acquires a new significance. In this research, the data of the life table for engine failure data (1980) are examined. The performances of the GEOD are assessed by the Chi-Square criterion, the Root Mean Square Error (RMSE) criterion, the Shannon entropy measure, and the Kullback-Leibler measure. Comparison of the GEOD with each other in these different senses shows that, among the distributions, (MinMaxEnt)4 is better in the sense of the Shannon and Kullback-Leibler measures. It is shown that (MinMaxEnt)3 ((MaxMaxEnt)4) is the most suitable for the statistical data among (MinMaxEnt)m and (MaxMaxEnt)m, m=1,2,3,4. Moreover, (MinMaxEnt)3 is better for the statistical data than (MaxMaxEnt)4 in the sense of the RMSE criterion. From the obtained distributions (MinMaxEnt)3 and (MaxMaxEnt)4, estimators of the probability density function f̂(t), the cumulative distribution function F̂(t), the survival function Ŝ(t), and the hazard rate ĥ(t) are evaluated and graphically illustrated. The results were obtained using the statistical software MATLAB.
Funding: This research was financially supported by the National Natural Science Foundation of China (Grant No. 50479028) and a Research Fund for Doctoral Programs of Higher Education of China (Grant No. 20060423009).
Abstract: Based on the maximum-entropy (ME) principle, a new power spectral estimator for random waves is derived in the form S(ω) = (a/8)H̄²(2π/ω)^(d+2) exp[−b(2π/ω)^n] by solving a variational problem subject to some quite general constraints. This robust method is comprehensive enough to describe wave spectra even in extreme wave conditions, and is superior to the periodogram method, which is not suitable for processing comparatively short or intensely unsteady signals because of its tremendous boundary effect and some inherent defects of the FFT. The newly derived method for spectral estimation works fairly well even when the sample data sets are very short and unsteady, and the reliability and efficiency of this spectral estimator have been preliminarily proved.
Abstract: The maximum-entropy method (MEM) for determining the complete ODF (orientation distribution function), together with the equal-volume partitioning technique for quantitative texture analysis, was first tested in analysing the texture of a commercial-purity titanium strip. The experimentally measured results indicate that the rolling planes of most grains in this sample are parallel to the {1010} and {1210} planes, with about ±10° spread, while the rolling directions are distributed nearly uniformly; their volume fractions are 19.46% and 18.70%, respectively. In addition, there are two weaker texture components, (7526)[1544] and (1105)[2311], at 3.24% and 4.17%, respectively.
Abstract: Compositional data, such as relative information, is a crucial aspect of machine learning and other related fields. It is typically recorded as closed data, summing to a constant such as 100%. The statistical linear model is the most widely used technique for identifying hidden relationships between underlying random variables of interest. However, data quality is a significant challenge in machine learning, especially when missing data are present. The linear regression model is a commonly used statistical modeling technique, applied in various settings to find relationships between variables of interest. When estimating linear regression parameters, which are useful for tasks such as future prediction and partial-effects analysis of independent variables, maximum likelihood estimation (MLE) is the method of choice. However, many datasets contain missing observations, which can lead to costly and time-consuming data recovery. To address this issue, the expectation-maximization (EM) algorithm has been suggested for situations involving missing data. The EM algorithm iteratively finds the best estimates of parameters in statistical models that depend on unobserved variables or data, in the sense of maximum likelihood or maximum a posteriori (MAP) estimation. Using the current estimate as input, the expectation (E) step constructs the expected log-likelihood function; the maximization (M) step then finds the parameters that maximize the expected log-likelihood determined in the E step. This study examined how well the EM algorithm performs on a synthetic compositional dataset with missing observations, using both robust least squares and ordinary least squares regression. The efficacy of the EM algorithm was compared with two alternative imputation techniques, k-Nearest Neighbor (k-NN) and mean imputation, in terms of Aitchison distances and covariance.
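The E/M alternation described above can be sketched on a simpler stand-in problem: a bivariate Gaussian with some values of one variable missing. The E-step replaces each missing value by its conditional mean given the observed variable (tracking the conditional variance), and the M-step re-estimates the mean and covariance from the completed sufficient statistics. The data below are simulated; this is not the paper's compositional-regression setting.

```python
import numpy as np

def em_bivariate_missing(x, y, n_iter=200):
    """EM for a bivariate Gaussian (x, y) where some x values are np.nan.
    E-step: conditional mean/variance of missing x given y under the
    current parameters; M-step: update the mean vector and covariance
    matrix from the completed sufficient statistics."""
    miss = np.isnan(x)
    xf = np.where(miss, np.nanmean(x), x)       # initial mean fill
    mu = np.array([xf.mean(), y.mean()])
    S = np.cov(np.column_stack([xf, y]).T, bias=True)
    for _ in range(n_iter):
        # E-step: conditional moments of the missing x's given y
        cmean = mu[0] + S[0, 1] / S[1, 1] * (y - mu[1])
        cvar = S[0, 0] - S[0, 1] ** 2 / S[1, 1]
        xf = np.where(miss, cmean, x)
        ex2 = np.where(miss, cmean ** 2 + cvar, x ** 2)
        # M-step: update parameters from the expected sufficient statistics
        mu = np.array([xf.mean(), y.mean()])
        S[0, 0] = ex2.mean() - mu[0] ** 2
        S[0, 1] = S[1, 0] = (xf * y).mean() - mu[0] * mu[1]
        S[1, 1] = ((y - mu[1]) ** 2).mean()
    return mu, S

rng = np.random.default_rng(0)
y = rng.normal(0.0, 1.0, 500)
x = 2.0 * y + rng.normal(0.0, 0.5, 500)  # true regression slope is 2.0
x[::5] = np.nan                          # 20% of x missing
mu, S = em_bivariate_missing(x, y)
print(S[0, 1] / S[1, 1])  # estimated slope, close to 2.0
```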
Abstract: The current theory in NF EN 1995-1-1/NA of Eurocode 5, which is based on maximum deflection, has been validated on softwoods and is therefore not adapted to slender glulam beam columns made of tropical hardwood species from the Congo Basin. This maximum deflection is caused by a set of loads applied to the structure. However, Eurocode 5 does not specify how to predict this deflection under long-term loading for such structures. This can be done by studying the load-displacement (P-Δ) behaviour of these structures while taking second-order effects into account. To this end, a nonlinear analysis has been performed on a three-dimensional beam column fixed at both ends. Since conducting experimental investigations on large-span structural products is time-consuming and expensive, especially in developing countries, a numerical model has been implemented using the Newton-Raphson method to predict the load-displacement (P-Δ) curve of a slender glulam beam column made of tropical hardwood species. The beam has been analyzed both without a wood connection and with a bolted wood connection and a slotted-in steel plate. The load cases considered include self-weight and a uniformly applied long-term load; combinations of serviceability limit states (SLS) and ultimate limit states (ULS) have also been considered, among other factors. The finite-element software RFEM 5 was used to implement the model. The results showed that the use of steel can reduce the displacement by 20.96%. Additionally, compared with the maximum deflection provided by Eurocode 5 for softwoods, hardwoods can exhibit an increase of 85.63%. By harnessing the plastic resistance of steel, the bending resistance of the wood can be increased by 32.94%.
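The Newton-Raphson iteration used for such nonlinear (P-Δ) analyses can be sketched on a one-degree-of-freedom analogue; the cubic hardening-spring load-displacement law below is an illustrative stand-in, not the paper's beam model.

```python
def newton_raphson(f, df, x0, tol=1e-12, max_iter=100):
    """Newton-Raphson root finding: iterate x <- x - f(x)/f'(x) until the
    update is smaller than tol."""
    x = x0
    for _ in range(max_iter):
        dx = f(x) / df(x)
        x -= dx
        if abs(dx) < tol:
            break
    return x

# toy nonlinear load-displacement law P(d) = k1*d + k3*d**3 (hardening spring);
# solve for the displacement d at applied load P = 12.0
k1, k3, P = 10.0, 2.0, 12.0
f = lambda d: k1 * d + k3 * d ** 3 - P
df = lambda d: k1 + 3 * k3 * d ** 2
d = newton_raphson(f, df, 0.5)
print(d)  # the exact root is d = 1.0, since 10*1 + 2*1 = 12
```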
Abstract: High-entropy materials (HEMs) have better mechanical, thermal, and electrical properties than traditional materials due to their special "high-entropy effect". The performance of high-entropy ceramics can also be adjusted by varying the proportions of the raw materials, and these materials have broad application prospects in many fields. This article reviews the high-entropy effect, preparation methods, and main applications of high-entropy ceramic materials, with particular attention to research on high-entropy perovskite ceramics. It is expected to serve as a reference for the promotion of scientific research and the development of further large-scale applications of high-entropy ceramic materials.
Funding: The National Natural Science Foundation of China (No. 50378008).
Abstract: Considering the difficulty of the fuzzy synthetic evaluation method in handling multiple factors and its neglect of the relationships among the evaluated objects, a new weighting process based on the entropy method was introduced. This improved method for determining the weights of the evaluation indicators was applied to water quality assessment in the Three Gorges reservoir area. The results showed that this method is favorable for fuzzy synthetic evaluation when there is more than one object to evaluate: a single calculation suffices for every monitoring point. Compared with the original evaluation method, this method greatly simplifies the fuzzy synthetic evaluation process, and the evaluation results are more reasonable.
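The entropy weighting step can be sketched as follows; the indicator matrix is invented for illustration. The idea is that an indicator that varies little across the monitoring points carries little discriminating information and therefore receives a near-zero weight.

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: rows of X are samples (e.g. monitoring
    points), columns are indicators. Each column is normalized to a
    probability vector, its Shannon entropy is computed, and weights are
    proportional to 1 - entropy (the degree of divergence)."""
    P = X / X.sum(axis=0)                 # normalize each indicator column
    n = X.shape[0]
    with np.errstate(divide="ignore", invalid="ignore"):
        # entropy per indicator, with 0*log(0) treated as 0
        E = -np.nansum(P * np.log(P), axis=0) / np.log(n)
    d = 1.0 - E                           # degree of divergence
    return d / d.sum()

# 4 monitoring points x 3 water-quality indicators (illustrative numbers)
X = np.array([[1.0, 5.0, 9.0],
              [1.1, 5.0, 2.0],
              [0.9, 5.0, 7.0],
              [1.0, 5.0, 1.0]])
w = entropy_weights(X)
print(w)  # the constant second indicator gets (near) zero weight
```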
Funding: Supported by the Key Project for Undergraduate Teaching Reform and Quality Enhancement Research Plan in Ordinary Colleges and Universities in Tianjin (A231005505).
Abstract: A new method based on the maximum entropy principle is proposed for reconstructing the parton distribution function (PDF) from its moments. Unlike traditional methods, the new method does not require any artificial assumptions. For the case of moments with errors, we introduce Gaussian functions to soften the moment constraints. Through a series of tests, the effectiveness and reconstruction efficiency of the new method are evaluated comprehensively, demonstrating that it is reasonable and can achieve high-quality reconstruction with at least the first six moments as input. Finally, we select a set of lattice quantum chromodynamics (QCD) results for the moments as input and provide reasonable reconstruction results for the pion.