Spatial variability of soil properties imposes a challenge for practical analysis and design in geotechnical engineering. This is particularly true for slope stability assessment, where the effects of uncertainty are synthesized in the so-called probability of failure. This probability quantifies the reliability of a slope, and its calculation is usually quite involved from a numerical viewpoint. In view of this issue, this paper proposes an approach for failure probability assessment based on Latinized partially stratified sampling and the maximum entropy distribution with fractional moments. The spatial variability of geotechnical properties is represented by means of random fields and the Karhunen-Loève expansion. Then, failure probabilities are estimated employing the maximum entropy distribution with fractional moments. The application of the proposed approach is examined with two examples: a case study of an undrained slope and a case study of a slope with cross-correlated random fields of strength parameters under drained conditions. The results show that the proposed approach has excellent accuracy and high efficiency, and it can be applied straightforwardly to similar geotechnical engineering problems.
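As a sketch of the random-field step described above, the following hedged example builds a discrete Karhunen-Loève expansion of a one-dimensional Gaussian field with an exponential covariance; the domain length, correlation length, and truncation order are illustrative assumptions, not the paper's slope models.

```python
import numpy as np

# Discretise the domain and build an exponential covariance matrix
# (sigma, correlation length, and domain length are assumed values).
n, L, ell, sigma = 200, 10.0, 2.0, 1.0
x = np.linspace(0.0, L, n)
C = sigma ** 2 * np.exp(-np.abs(x[:, None] - x[None, :]) / ell)

# Solve the discrete eigenproblem; eigh returns ascending order, so flip.
eigval, eigvec = np.linalg.eigh(C)
eigval, eigvec = eigval[::-1], eigvec[:, ::-1]

# Truncated Karhunen-Loeve expansion: m dominant modes weighted by
# independent standard normal variables give one field realisation.
m = 20
xi = np.random.default_rng(0).standard_normal(m)
field = eigvec[:, :m] @ (np.sqrt(eigval[:m]) * xi)

# Fraction of the total variance captured by the m leading modes.
ratio = eigval[:m].sum() / eigval.sum()
```

Truncating at the m dominant modes is what makes the expansion economical: a handful of independent standard normal variables reproduces most of the field's variance.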
This paper takes the assessment and evaluation of a computational mechanics course as its background and constructs a diversified course evaluation system that is student-centered and integrates both quantitative and qualitative evaluation methods. The system not only pays attention to students' practical operation and theoretical knowledge mastery but also puts special emphasis on the cultivation of students' innovative abilities. In order to realize a comprehensive and objective evaluation, an assessment method combining TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) multi-attribute decision analysis with entropy weight theory is adopted, and its validity and practicability are verified through example analysis. This method can not only comprehensively and objectively evaluate students' learning outcomes, but also provide a scientific decision-making basis for curriculum teaching reform. The implementation of this diversified course evaluation system can better reflect the comprehensive ability of students and promote the continuous improvement of teaching quality.
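A minimal sketch of the entropy-weight TOPSIS scoring described above, with an invented students-by-criteria score matrix (all criteria treated as larger-is-better); it is illustrative only, not the paper's evaluation data.

```python
import numpy as np

# Invented decision matrix: 4 students x 3 criteria (all larger-is-better).
X = np.array([[85.0, 70.0, 90.0],
              [78.0, 88.0, 65.0],
              [92.0, 60.0, 75.0],
              [70.0, 95.0, 80.0]])

# Entropy weights: criteria whose scores vary more across students get
# larger weight (1 - entropy of the normalised column).
P = X / X.sum(axis=0)
k = 1.0 / np.log(X.shape[0])
E = -k * (P * np.log(P)).sum(axis=0)
w = (1.0 - E) / (1.0 - E).sum()

# TOPSIS: vector-normalise, apply weights, and rank by relative closeness
# to the ideal (column max) versus the anti-ideal (column min) solution.
V = w * X / np.sqrt((X ** 2).sum(axis=0))
d_best = np.sqrt(((V - V.max(axis=0)) ** 2).sum(axis=1))
d_worst = np.sqrt(((V - V.min(axis=0)) ** 2).sum(axis=1))
closeness = d_worst / (d_best + d_worst)   # in [0, 1]; higher = better overall
ranking = np.argsort(-closeness)           # student indices, best first
```

The entropy step supplies objective weights from the data itself, which is the role entropy weight theory plays alongside TOPSIS in the evaluation system.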
Laser-induced fluorescence (LIF) spectroscopy is employed for plasma diagnosis, necessitating the use of deconvolution algorithms to isolate the Doppler effect from the raw spectral signal. However, direct deconvolution becomes invalid in the presence of noise, as it leads to infinite amplification of high-frequency noise components. To address this issue, we propose a deconvolution algorithm based on the maximum entropy principle. We validate the effectiveness of the proposed algorithm using simulated LIF spectra at various noise levels (signal-to-noise ratio, SNR = 20–80 dB) and measured LIF spectra with Xe as the working fluid. In the typical measured spectrum (SNR = 26.23 dB) experiment, compared with the Gaussian filter and the Richardson–Lucy (R-L) algorithm, the proposed algorithm demonstrates an increase in SNR of 1.39 dB and 4.66 dB, respectively, along with a reduction in the root-mean-square error (RMSE) of 35% and 64%, respectively. Additionally, there is a decrease in the spectral angle (SA) of 0.05 and 0.11, respectively. In the high-quality spectrum (SNR = 43.96 dB) experiment, the results show that the running time of the proposed algorithm is reduced by about 98% compared with the R-L iterative algorithm. Moreover, the maximum entropy algorithm avoids parameter optimization settings and is more suitable for automatic implementation. In conclusion, the proposed algorithm can accurately resolve Doppler spectrum details while effectively suppressing noise, highlighting its advantage in LIF spectral deconvolution applications.
In this paper we study optimal advertising problems that model the introduction of a new product into the market in the presence of carryover effects of the advertisement and with memory effects in the level of goodwill. In particular, we let the dynamics of the product goodwill depend on the past, and also on past advertising efforts. We treat the problem by means of the stochastic Pontryagin maximum principle, which here is considered for a class of problems where in the state equation either the state or the control depends on the past. Moreover, the control acts on the martingale term, and the space of controls U can be chosen to be non-convex. The maximum principle is thus formulated using first-order adjoint Backward Stochastic Differential Equations (BSDEs), which can be explicitly computed due to the specific characteristics of the model, and a second-order adjoint relation.
Compositional data, such as relative information, is a crucial aspect of machine learning and other related fields. It is typically recorded as closed data, i.e., data that sums to a constant, like 100%. The linear regression model is the most widely used statistical technique for identifying hidden relationships between underlying random variables of interest. When estimating linear regression parameters, which are useful for tasks such as future prediction and partial-effects analysis of independent variables, maximum likelihood estimation (MLE) is the method of choice. However, data quality is a significant challenge in machine learning, especially when missing data is present: many datasets contain missing observations, and data recovery can be costly and time-consuming. To address this issue, the expectation-maximization (EM) algorithm has been suggested for situations involving missing data. The EM algorithm iteratively finds maximum likelihood or maximum a posteriori (MAP) estimates of parameters in statistical models that depend on unobserved variables or data. Using the present estimate as input, the expectation (E) step constructs the expected log-likelihood function; the maximization (M) step then finds the parameters that maximize the expected log-likelihood determined in the E step. This study examined how well the EM algorithm worked on a simulated compositional dataset with missing observations, using both robust least squares and ordinary least squares regression techniques.
The efficacy of the EM algorithm was compared with two alternative imputation techniques, k-Nearest Neighbor (k-NN) and mean imputation, in terms of Aitchison distances and covariance.
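A hedged sketch of the EM-style imputation idea for a multivariate-normal data matrix with entries missing at random: it iterates conditional-mean imputation with re-estimation of the mean and covariance (a simplification that omits the conditional-covariance correction of full EM), and does not reproduce the paper's compositional transforms or the robust variant. All data here are simulated.

```python
import numpy as np

rng = np.random.default_rng(1)
true_cov = np.array([[1.0, 0.6, 0.3],
                     [0.6, 1.0, 0.4],
                     [0.3, 0.4, 1.0]])
X = rng.multivariate_normal(np.zeros(3), true_cov, size=300)  # complete data
X_miss = X.copy()
X_miss[rng.random(X.shape) < 0.1] = np.nan                    # ~10% missing at random

# Initialise missing entries with column means, then iterate.
Z = np.where(np.isnan(X_miss), np.nanmean(X_miss, axis=0), X_miss)
for _ in range(50):
    mu, S = Z.mean(axis=0), np.cov(Z, rowvar=False)           # M-step-like update
    for i, row in enumerate(X_miss):
        m = np.isnan(row)
        o = ~m
        if m.any() and o.any():
            # E-step-like update: conditional mean of the missing entries
            # given the observed ones under the current Gaussian fit.
            Z[i, m] = mu[m] + S[np.ix_(m, o)] @ np.linalg.solve(
                S[np.ix_(o, o)], row[o] - mu[o])

# Imputation error on the deleted entries (below the marginal std of 1,
# because the conditional mean exploits the between-variable correlation).
rmse = np.sqrt(np.mean((Z - X)[np.isnan(X_miss)] ** 2))
```

Plain mean imputation ignores the correlations entirely, which is why iterated conditional-mean updates typically beat it on correlated data.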
Taking the Dapingzhang copper-polymetallic deposit in Yunnan Province, China as the research object, the maximum entropy model was used to extract mining information, and a mineral resource prediction model was established using exploration data from the deposit and related regions, so as to delineate prospecting prospect areas in the study area. In this paper, the Jackknife analysis module of the maximum entropy model is used to quantitatively rank the importance of 39 geochemical element variables, finally obtaining a prospecting prospect map of the study area. The results show that the Dapingzhang mining area has the potential for finding hidden ore at depth and in the surrounding areas, and that the northern and southern ends and western side of the ore-controlling structural belt in the eastern part of the mining area have good prospecting prospects. The results provide an important basis for the deployment of follow-up exploration work in the study area, and the maximum entropy model shows a good application effect in mineral resources exploration.
A new compound distribution model for extreme wave heights of typhoon-affected sea areas is proposed on the basis of the maximum-entropy principle. The new model is formed by nesting a discrete distribution in a continuous one, and has eight parameters which can be determined in terms of observed data of typhoon occurrence frequency and extreme wave heights by numerically solving two sets of equations derived in this paper. The model is examined by using it to predict the N-year return-period wave height at two hydrology stations in the Yellow Sea, and the predicted results are compared with those predicted by some other compound distribution models. The examinations and comparisons show that the model has some advantages for predicting the N-year return-period wave height in typhoon-affected sea areas.
The maximum entropy principle (MEP) is one of the first methods used to predict droplet size and velocity distributions of liquid sprays. This method needs a mean droplet diameter as an input to predict the droplet size distribution. This paper presents a new sub-model, based on the deterministic aspects of the liquid atomization process and independent of experimental data, to provide the mean droplet diameter for use in the maximum entropy formulation (MEF). For this purpose, a theoretical model based on the energy conservation law, entitled the energy-based model (EBM), is presented. In this approach, atomization occurs due to kinetic energy loss. Predictions of the combined model (MEF/EBM) are in good agreement with the available experimental data. The energy-based model can be used as a fast and sufficiently reliable model to obtain a good estimate of the mean droplet diameter of a spray, and the combined model (MEF/EBM) can predict the droplet size distribution at the primary breakup well.
Detecting objects of interest from a video sequence is a fundamental and critical task in automated visual surveillance. Most current approaches focus only on discriminating moving objects by background subtraction, even though the objects of interest may be either moving or stationary. In this paper, we propose layers segmentation to detect both moving and stationary target objects from surveillance video. We extend the Maximum Entropy (ME) statistical model to segment layers with features, which are collected by constructing a codebook with a set of codewords for each pixel. We also indicate how the training models are used for the discrimination of target objects in surveillance video. Our experimental results are presented in terms of the success rate and the segmenting precision.
To solve the complicated feature extraction and long-distance dependency problems in Word Segmentation Disambiguation (WSD), this paper proposes to apply rough sets in WSD based on the Maximum Entropy model. Firstly, rough set theory is applied to extract complicated features and long-distance features, even from noisy or inconsistent corpora. Secondly, these features are added into the Maximum Entropy model, so that the feature weights can be assigned according to the performance of the whole disambiguation model. Finally, a semantic lexicon is adopted to build class-based rough set features to overcome data sparseness. Experiments indicated that our method performed better than previous models, which had obtained top rank in WSD in the 863 Evaluation in 2003. This system ranked first and second respectively in the MSR and PKU open tests of the Second International Chinese Word Segmentation Bakeoff held in 2005.
The soil freezing and thawing process affects soil physical properties, such as heat conductivity, heat capacity, and hydraulic conductivity in frozen ground regions, and further affects the processes of soil energy, hydrology, and carbon and nitrogen cycles. In this study, a parameterization of the freezing and thawing front calculation was implemented into the earth system model of the Chinese Academy of Sciences (CAS-ESM) and its land component, the Common Land Model (CoLM), to investigate the dynamic change of freezing and thawing fronts and their effects. Our results showed that the developed models could reproduce the soil freezing and thawing process and the dynamic change of freezing and thawing fronts. The regionally averaged active layer thickness in the permafrost regions was 1.92 m, with a regionally averaged trend of 0.35 cm yr–1. The regionally averaged maximum freezing depth in the seasonally frozen ground regions was 2.15 m, with a regionally averaged trend of –0.48 cm yr–1. The active layer thickness increased while the maximum freezing depth decreased year by year. These results contribute to a better understanding of the freezing and thawing cycle process.
The spatial interaction model is an effective way to explore the geographical disparities inherent in the Belt and Road Initiative (BRI) by simulating spatial flows. The traditional gravity model implies the hypothesis of equilibrium points without any reference to when or how equilibrium is achieved. In this paper, a dynamic gravity model was established based on Maximum Entropy (MaxEnt) theory to estimate and monitor the interconnection intensity and dynamic character of bilateral relations. To detect the determinants of interconnection intensity, a Geodetector method was applied to identify and evaluate the determinants of spatial networks in five dimensions. The empirical study clearly demonstrates a heterogeneous and non-circular spatial structure. The main driving forces of spatial-temporal evolution are foreign direct investment, tourism and railway infrastructure construction, while determinants in different sub-regions show obvious spatial differentiation. Southeast Asian countries are typically multi-island areas where aviation infrastructure plays a more important role. North and Central Asian countries regard oil as a pillar industry, and power and port facilities have a greater impact on their interconnection. Western Asian countries are mostly influenced by railway infrastructure, while Eastern European countries already have relatively robust infrastructure and tariff policies there provide a greater impetus.
A new method for estimating the n (50 or 100)-year return-period wave height, namely, the extreme wave height expected to occur once in n years, is presented on the basis of the maximum entropy principle. The main points of the method are as follows: (1) based on the Hamiltonian principle, a maximum entropy probability density function for the extreme wave height H, f(H) = αH^γ exp(−βH^4), is derived from a Lagrangian function subject to some necessary and rational constraints; (2) the parameters α, β, and γ in the function are expressed in terms of the mean E[H], the variance V = E[(H − E[H])^2], and the bias B = E[(H − E[H])^3]; and (3) with E[H], V, and B estimated from observed data, the n-year return-period wave height H_n is computed in accordance with the formula 1/(1 − F(H_n)) = n, where F is the cumulative distribution function, F(H_n) = ∫_0^{H_n} f(H) dH. Examples of estimating the 50- and 100-year return-period wave heights by the present method and by some currently used methods from observed data acquired at two hydrographic stations are given. A comparison of the estimated results shows that the present method is superior to the others.
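The return-period formula in point (3) can be evaluated numerically once the parameters are fixed; the sketch below uses illustrative values of γ and β (not fitted station data), normalises f(H) = αH^γ exp(−βH^4) on a grid, and inverts F.

```python
import numpy as np

g, b = 2.0, 0.05                      # gamma and beta (illustrative values)
H = np.linspace(0.0, 10.0, 20001)
dx = H[1] - H[0]
u = H ** g * np.exp(-b * H ** 4)
a = 1.0 / (u.sum() * dx)              # alpha normalises f to unit area
F = np.cumsum(a * u) * dx             # cumulative distribution F(H)

def return_height(n_years):
    """Smallest H with F(H) >= 1 - 1/n, i.e. exceeded once per n years."""
    return H[np.searchsorted(F, 1.0 - 1.0 / n_years)]

h50, h100 = return_height(50), return_height(100)   # 50- and 100-year heights
```

Solving 1/(1 − F(H_n)) = n is the same as reading off the (1 − 1/n) quantile of the fitted density, which is all the grid search above does.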
The interaction between a two-level atom and a single-mode field in the k-photon Jaynes-Cummings model (JCM) in the presence of the Stark shift and a Kerr medium is studied. All terms in the Hamiltonian, such as the single-mode field, its interaction with the atom, the contribution of the Stark shift and the Kerr medium effects, are considered to be f-deformed. In particular, the effect of the initial state of the radiation field on the dynamical evolution of some physical properties, such as atomic inversion and entropy squeezing, is investigated by considering different initial field states (coherent, squeezed and thermal states).
This paper investigates the maximum entropy restoration of blurred binary images. Concerning the binary property of the image, a new maximum entropy restoration method with a binary constraint is proposed. The existence and uniqueness of the solution are discussed. The problem of maximizing entropy under two constraints is solved and the corresponding algorithm is given. In this paper, the maximum bounded entropy principle is employed to incorporate prior knowledge of binary images, and a maximum bounded entropy restoration method with a binary constraint is put forward. The proposed methods, the Wiener filter (WF) restoration method, and the maximum entropy restoration method are compared. The experimental results show that the maximum entropy restoration method and the maximum bounded entropy restoration method with a binary constraint can improve the quality of the restored image.
We examine the single-atom entropy squeezing and the atom-field entanglement in a system of two moving two-level atoms interacting with a single-mode coherent field in a lossless resonant cavity. Our numerical calculations indicate that the squeezing period, the squeezing time and the maximal squeezing can be controlled by appropriately choosing the atomic motion and the field-mode structure. The atomic motion leads to a periodic time evolution of the entanglement between the two atoms and the field. Moreover, there exists a corresponding relation between the time evolution of the atomic entropy squeezing and that of the entanglement between the two atoms and the field.
A crowdsourcing experiment in which viewers (the "crowd") of a British Broadcasting Corporation (BBC) television show submitted estimates of the number of coins in a tumbler was shown in an antecedent paper (Part 1) to follow a log-normal distribution Λ(m, s²). The coin-estimation experiment is an archetype of a broad class of image analysis and object counting problems suitable for solution by crowdsourcing. The objective of the current paper (Part 2) is to determine the location and scale parameters (m, s) of Λ(m, s²) by both Bayesian and maximum likelihood (ML) methods and to compare the results. One outcome of the analysis is the resolution, by means of Jeffreys' rule, of questions regarding the appropriate Bayesian prior. It is shown that Bayesian and ML analyses lead to the same expression for the location parameter, but different expressions for the scale parameter, which become identical in the limit of an infinite sample size. A second outcome of the analysis concerns use of the sample mean as the measure of information of the crowd in applications where the distribution of responses is not sought or known. In the coin-estimation experiment, the sample mean was found to differ widely from the mean number of coins calculated from Λ(m, s²). This discordance raises critical questions concerning whether, and under what conditions, the sample mean provides a reliable measure of the information of the crowd. This paper resolves that problem by use of the principle of maximum entropy (PME). The PME yields a set of equations for finding the most probable distribution consistent with given prior information and only that information.
If there is no solution to the PME equations for a specified sample mean and sample variance, then the sample mean is an unreliable statistic, since no measure can be assigned to its uncertainty. Parts 1 and 2 together demonstrate that the information content of crowdsourcing resides in the distribution of responses (very often log-normal in form), which can be obtained empirically or by appropriate modeling.
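A small hedged illustration of the log-normal fit discussed above: the ML location and scale parameters come from the log-responses, and the mean implied by Λ(m, s²) exceeds the median by the factor exp(s²/2), which is why a heavy right tail of large guesses pulls naive averages upward. The parameters below are invented, not the BBC data.

```python
import numpy as np

rng = np.random.default_rng(3)
m_true, s_true = 7.0, 0.8                   # assumed parameters, not the BBC data
guesses = rng.lognormal(m_true, s_true, 2000)

logs = np.log(guesses)
m_hat, s_hat = logs.mean(), logs.std()      # ML estimates of location and scale
median = np.exp(m_hat)                      # median of the fitted Lambda(m, s^2)
dist_mean = np.exp(m_hat + 0.5 * s_hat**2)  # mean implied by Lambda(m, s^2)
sample_mean = guesses.mean()                # naive crowd average
skew_gap = dist_mean / median               # > 1: right tail inflates the mean
```

Because the distribution is skewed, summarising the crowd by its median (or by the fitted parameters) is far more robust than quoting the raw sample mean.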
In machine-vision-based systems for detecting foreign fibers, the cotton-layer background dominates the whole image while the foreign fibers account for only a very small part; moreover, the brightness and contrast of the image are poor. With traditional image segmentation methods, the segmentation results are very poor. This paper adopts maximum entropy combined with a genetic algorithm, using the maximum entropy function as the fitness function of the genetic algorithm. Through continuous optimization, the optimal segmentation threshold is determined. Experimental results prove that the image segmentation method of this paper is not only fast and accurate, but also strongly adaptable.
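A hedged sketch of maximum-entropy (Kapur-style) thresholding on a synthetic cotton-versus-fiber intensity sample; for brevity the genetic-algorithm search used in the paper is replaced by an exhaustive scan over grey levels, which maximises the same entropy criterion.

```python
import numpy as np

# Synthetic pixel sample: many dark background pixels, few bright fibre pixels.
rng = np.random.default_rng(2)
pixels = np.concatenate([rng.normal(60, 10, 5000),    # cotton-layer background
                         rng.normal(180, 15, 500)])   # foreign-fibre pixels
pixels = np.clip(pixels, 0, 255).astype(np.uint8)

hist = np.bincount(pixels, minlength=256).astype(float)
p = hist / hist.sum()                                 # grey-level probabilities

def kapur_entropy(t):
    """Sum of the Shannon entropies of the two classes split at threshold t."""
    w0 = p[:t].sum()
    w1 = 1.0 - w0
    if w0 <= 0.0 or w1 <= 0.0:
        return -np.inf
    p0, p1 = p[:t] / w0, p[t:] / w1
    h0 = -np.sum(p0[p0 > 0] * np.log(p0[p0 > 0]))
    h1 = -np.sum(p1[p1 > 0] * np.log(p1[p1 > 0]))
    return h0 + h1

# Exhaustive scan stands in for the paper's genetic-algorithm search.
best_t = max(range(1, 256), key=kapur_entropy)
```

A genetic algorithm earns its keep when the fitness evaluation is expensive or the search space is large; for a 256-level histogram the exhaustive scan above already finds the global maximiser.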
Funding: Supported by the China Scholarship Council (CSC).
Funding: 2024 Key Project of Teaching Reform Research and Practice in Higher Education in Henan Province "Exploration and Practice of Training Model for Outstanding Students in Basic Mechanics Discipline" (2024SJGLX094); Henan Province "Mechanics+X" Basic Discipline Outstanding Student Training Base; 2024 Research and Practice Project of Higher Education Teaching Reform in Henan University of Science and Technology "Optimization and Practice of Ability-Oriented Teaching Mode for Computational Mechanics Course: A New Exploration in Cultivating Practical Simulation Engineers" (2024BK074).
Funding: supported by the Open Fund of the Key Laboratory of Research on Marine Hazards Forecasting (Grant No. LOMF1101), the Shanghai Typhoon Research Fund (Grant No. 2009ST05), and the National Natural Science Foundation of China (Grant No. 40776006).
Abstract: A new compound distribution model for extreme wave heights in typhoon-affected sea areas is proposed on the basis of the maximum-entropy principle. The model is formed by nesting a discrete distribution within a continuous one, and has eight parameters, which can be determined from observed data on typhoon occurrence frequency and extreme wave heights by numerically solving two sets of equations derived in this paper. The model is examined by using it to predict the N-year return-period wave height at two hydrological stations in the Yellow Sea, and the predictions are compared with those of several other compound distribution models. The examinations and comparisons show that the model has advantages for predicting the N-year return-period wave height in typhoon-affected sea areas.
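The abstract does not state the eight-parameter form, but the nesting idea itself can be sketched under simple stand-in assumptions: Poisson-distributed typhoon counts and a Weibull distribution for each typhoon's extreme wave height. The compound CDF is then P(H ≤ h) = Σ_k P(N = k) F(h)^k, and the N-year height solves F_comp(h) = 1 − 1/N.

```python
import math

def compound_cdf(h, lam, scale, shape, kmax=100):
    """CDF of the annual extreme wave height when typhoon counts are
    Poisson(lam) and each typhoon's extreme height is Weibull(scale, shape):
    P(H <= h) = sum_k P(N = k) * F(h)**k."""
    F = 1.0 - math.exp(-(h / scale) ** shape)
    pmf = math.exp(-lam)          # P(N = 0), updated recursively below
    total = pmf                   # k = 0 term (F**0 = 1)
    Fk = 1.0
    for k in range(1, kmax):
        pmf *= lam / k
        Fk *= F
        total += pmf * Fk
    return total

def return_height(n_years, lam, scale, shape):
    """Solve F_comp(h) = 1 - 1/n for the n-year return-period height."""
    target = 1.0 - 1.0 / n_years
    lo, hi = 0.0, 100.0
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if compound_cdf(mid, lam, scale, shape) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For Poisson counts the nested sum closes to exp(−λ(1 − F(h))), which the test uses as a cross-check; all numerical parameter values here are invented.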
Abstract: The maximum entropy principle (MEP) is one of the first methods used to predict the droplet size and velocity distributions of liquid sprays. The method needs a mean droplet diameter as an input to predict the droplet size distribution. This paper presents a new sub-model, based on the deterministic aspects of the liquid atomization process and independent of experimental data, that provides the mean droplet diameter for use in the maximum entropy formulation (MEF). For this purpose, a theoretical model based on the energy conservation law, called the energy-based model (EBM), is presented; in this approach, atomization occurs through kinetic energy loss. Predictions of the combined model (MEF/EBM) are in good agreement with the available experimental data. The energy-based model is fast and reliable enough to give a good estimate of the mean droplet diameter of a spray, and the combined model (MEF/EBM) predicts the droplet size distribution at primary breakup well.
Funding: Project supported by the National Natural Science Foundation of China (No. 60272031), the Technology Plan Program of Zhejiang Province (No. 2003C21010), and the Zhejiang Provincial Natural Science Foundation of China (No. M603202).
Abstract: Detecting objects of interest in a video sequence is a fundamental and critical task in automated visual surveillance, yet most current approaches focus only on discriminating moving objects by background subtraction, regardless of whether the objects of interest are moving or stationary. In this paper, we propose layer segmentation to detect both moving and stationary target objects in surveillance video. We extend the Maximum Entropy (ME) statistical model to segment layers using features collected by constructing a codebook with a set of codewords for each pixel, and we show how the trained models are used to discriminate target objects in surveillance video. Our experimental results are presented in terms of success rate and segmentation precision.
Abstract: To address the complicated feature extraction and long-distance dependency problems in Word Segmentation Disambiguation (WSD), this paper proposes applying rough sets to WSD based on the Maximum Entropy model. First, rough set theory is applied to extract complicated features and long-distance features, even from noisy or inconsistent corpora. Second, these features are added to the Maximum Entropy model, so that the feature weights can be assigned according to the performance of the whole disambiguation model. Finally, a semantic lexicon is adopted to build class-based rough set features to overcome data sparseness. Experiments indicated that our method performed better than previous models; it achieved top rank in the WSD task of the 863 Evaluation in 2003, and the system ranked first and second, respectively, in the MSR and PKU open tests of the Second International Chinese Word Segmentation Bakeoff held in 2005.
Funding: This work was jointly funded by the National Natural Science Foundation of China (Grant Nos. 42205168, 41830967, and 42175163), the Youth Innovation Promotion Association CAS (2021073), and the National Key Scientific and Technological Infrastructure project "Earth System Science Numerical Simulator Facility" (EarthLab).
Abstract: The soil freezing and thawing process affects soil physical properties such as heat conductivity, heat capacity, and hydraulic conductivity in frozen ground regions, and further affects soil energy, hydrology, and carbon and nitrogen cycle processes. In this study, a parameterization of the freezing and thawing fronts was implemented into the earth system model of the Chinese Academy of Sciences (CAS-ESM) and its land component, the Common Land Model (CoLM), to investigate the dynamic change of the freezing and thawing fronts and their effects. Our results showed that the developed models could reproduce the soil freezing and thawing process and the dynamic change of the freezing and thawing fronts. The regionally averaged active layer thickness in the permafrost regions was 1.92 m, with a regionally averaged trend of 0.35 cm yr^-1. The regionally averaged maximum freezing depth in the seasonally frozen ground regions was 2.15 m, with a regionally averaged trend of -0.48 cm yr^-1. The active layer thickness increased while the maximum freezing depth decreased year by year. These results contribute to a better understanding of the freezing and thawing cycle process.
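The CoLM parameterization is considerably more involved, but the order of magnitude of a freezing-front depth can be sketched with the classical Stefan solution, in which the front advances as the square root of time. All parameter values below (thermal conductivity, water content, surface temperature deficit) are illustrative assumptions, not values from the study.

```python
import math

def stefan_depth(k, dT, t_seconds, L=3.34e5, rho=1000.0, theta=0.3):
    """Simplified Stefan solution for the freezing-front depth (m):
    z = sqrt(2 * k * dT * t / (L * rho * theta)), where k is the soil
    thermal conductivity (W m^-1 K^-1), dT the surface temperature
    below freezing (K), L the latent heat of fusion of water (J kg^-1),
    rho the density of water (kg m^-3), and theta the volumetric soil
    water content."""
    return math.sqrt(2.0 * k * dT * t_seconds / (L * rho * theta))

# Example: 90 days of a surface held 10 K below freezing, k = 1.5 W/m/K.
depth = stefan_depth(1.5, 10.0, 90 * 86400)
```

With these assumed values the front reaches roughly 1.5 m, the same order as the 1.92 m regional active layer thickness quoted in the abstract; the square-root-of-time growth is the key qualitative behavior.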
Funding: Under the auspices of the Category A Strategic Priority Research Program of the Chinese Academy of Sciences (No. XDA20010101).
Abstract: The spatial interaction model is an effective way to explore the geographical disparities inherent in the Belt and Road Initiative (BRI) by simulating spatial flows. The traditional gravity model implies the hypothesis of equilibrium points without any reference to when or how equilibrium is achieved. In this paper, a dynamic gravity model was established based on Maximum Entropy (MaxEnt) theory to estimate and monitor the interconnection intensity and dynamic character of bilateral relations. To detect the determinants of interconnection intensity, a Geodetector method was applied to identify and evaluate the determinants of the spatial networks along five dimensions. The empirical study clearly demonstrates a heterogeneous and non-circular spatial structure. The main driving forces of spatial-temporal evolution are foreign direct investment, tourism, and railway infrastructure construction, while the determinants in different sub-regions show obvious spatial differentiation. Southeast Asian countries are typically multi-island areas where aviation infrastructure plays a more important role; North and Central Asian countries regard oil as a pillar industry, and power and port facilities have a greater impact on their interconnection; Western Asian countries are mostly influenced by railway infrastructure; and Eastern European countries already have relatively robust infrastructure, so tariff policies provide a greater impetus there.
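The traditional gravity model the paper starts from scores the interaction between regions i and j as T_ij = k · M_i · M_j / d_ij^β. A minimal sketch of that baseline (the masses, distances, and the values of k and β are invented; the paper's dynamic MaxEnt extension is not reproduced here) is:

```python
import numpy as np

def gravity_intensity(mass, dist, k=1.0, beta=2.0):
    """Pairwise interconnection intensity T_ij = k * M_i * M_j / d_ij**beta.
    The diagonal (self-interaction) is set to zero."""
    M = np.outer(mass, mass)
    with np.errstate(divide="ignore", invalid="ignore"):
        T = k * M / dist**beta
    np.fill_diagonal(T, 0.0)
    return T
```

For β = 2, halving a distance quadruples the predicted intensity, which is why the interconnection maps in such studies are dominated by near neighbors unless other determinants intervene.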
Funding: This work is financially supported by the Ph.D. Foundation of the Ministry of Education of China (No. 200042308).
Abstract: A new method for estimating the n-year (50- or 100-year) return-period wave height, namely the extreme wave height expected to occur once in n years, is presented on the basis of the maximum entropy principle. The main points of the method are as follows: (1) based on the Hamiltonian principle, a maximum entropy probability density function for the extreme wave height H, f(H) = αH^γ exp(−βH⁴), is derived from a Lagrangian function subject to some necessary and rational constraints; (2) the parameters α, β, and γ in the function are expressed in terms of the mean H̄, the variance V = E[(H − H̄)²], and the bias B = E[(H − H̄)³]; and (3) with H̄, V, and B estimated from observed data, the n-year return-period wave height H_n is computed from the formula 1/(1 − F(H_n)) = n, where F(H_n) = ∫₀^{H_n} f(H) dH. Examples of estimating the 50- and 100-year return-period wave heights by the present method and by some currently used methods, from observed data acquired at two hydrographic stations, are given. A comparison of the estimated results shows that the present method is superior to the others.
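Step (3) above is a quantile computation: normalize f(H), accumulate it into F, and invert F at 1 − 1/n. A numerical sketch (the values of γ and β are invented for illustration; the paper fits them from H̄, V, and B) is:

```python
import numpy as np

def n_year_height(gamma, beta, n, hmax=30.0, npts=20001):
    """n-year return-period height for the maximum-entropy density
    f(H) = alpha * H**gamma * exp(-beta * H**4). The normalisation
    constant alpha is absorbed by normalising the numerically
    integrated CDF, and H_n solves 1 / (1 - F(H_n)) = n."""
    H = np.linspace(0.0, hmax, npts)
    f = H**gamma * np.exp(-beta * H**4)
    # Trapezoidal cumulative integral of f, then normalise so F(hmax) = 1.
    F = np.concatenate(([0.0],
                        np.cumsum((f[1:] + f[:-1]) * np.diff(H) / 2.0)))
    F /= F[-1]
    return np.interp(1.0 - 1.0 / n, F, H)
```

Because the upper tail decays like exp(−βH⁴), the 100-year height exceeds the 50-year height by only a modest margin, which is characteristic of this family of maximum-entropy densities.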
Abstract: The interaction between a two-level atom and a single-mode field in the k-photon Jaynes-Cummings model (JCM) in the presence of the Stark shift and a Kerr medium is studied. All terms in the Hamiltonian, namely the single-mode field, its interaction with the atom, and the contributions of the Stark shift and the Kerr medium, are considered to be f-deformed. In particular, the effect of the initial state of the radiation field on the dynamical evolution of physical properties such as atomic inversion and entropy squeezing is investigated by considering different initial field states (coherent, squeezed, and thermal states).
Abstract: This paper investigates the maximum entropy restoration of blurred binary images. Taking the binary property of the image into account, a new maximum entropy restoration method with a binary constraint is proposed, and the existence and uniqueness of its solution are discussed. The problem of maximizing the entropy under two constraints is solved and the corresponding algorithm is given. The maximum bounded entropy principle is then employed to incorporate prior knowledge of the binary image, and a maximum bounded entropy restoration method with a binary constraint is put forward. The proposed methods are compared with the Wiener filter (WF) restoration method and the standard maximum entropy restoration method. The experimental results show that the maximum entropy restoration method and the maximum bounded entropy restoration method with a binary constraint improve the quality of the restored image.
基金supported by the Science and Technology Program of Dezhou,Shandong Province,China (Grant No. 20080153)the Scientific Research Fund of Dezhou University,China (Grant No. 07024)
Abstract: We examine single-atom entropy squeezing and atom-field entanglement in a system of two moving two-level atoms interacting with a single-mode coherent field in a lossless resonant cavity. Our numerical calculations indicate that the squeezing period, the squeezing time, and the maximal squeezing can be controlled by appropriately choosing the atomic motion and the field-mode structure. The atomic motion leads to a periodic time evolution of the entanglement between the two atoms and the field. Moreover, there is a correspondence between the time evolution of the atomic entropy squeezing and that of the entanglement between the two atoms and the field.
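The two-atom moving system in the abstract requires a fuller numerical treatment, but the baseline dynamics it builds on can be sketched with the standard resonant single-atom JCM: for an initially excited atom and a coherent field with mean photon number n̄, the atomic inversion is W(t) = Σ_n P_n cos(2gt√(n+1)) with Poissonian P_n, which exhibits the familiar collapse and revival. The values of g, n̄, and the truncation are illustrative assumptions.

```python
import numpy as np

def atomic_inversion(t, nbar, g=1.0, nmax=120):
    """Atomic inversion W(t) for the resonant one-photon JCM with the
    atom initially excited and the field in a coherent state of mean
    photon number nbar: W(t) = sum_n P_n * cos(2*g*t*sqrt(n + 1))."""
    n = np.arange(nmax)
    # Poisson photon statistics, computed in log space for stability.
    log_fact = np.concatenate(([0.0], np.cumsum(np.log(np.arange(1, nmax)))))
    P = np.exp(n * np.log(nbar) - nbar - log_fact)
    t = np.atleast_1d(np.asarray(t, dtype=float))
    phases = 2.0 * g * np.outer(np.sqrt(n + 1.0), t)
    return P @ np.cos(phases)
```

The incommensurate Rabi frequencies √(n+1) dephase (collapse) and partially rephase (revival) near t ≈ 2π√n̄/g, the mechanism behind the periodic evolution the abstract reports.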
Abstract: A crowdsourcing experiment in which viewers (the "crowd") of a British Broadcasting Corporation (BBC) television show submitted estimates of the number of coins in a tumbler was shown in an antecedent paper (Part 1) to follow a log-normal distribution Λ(m, s²). The coin-estimation experiment is an archetype of a broad class of image analysis and object counting problems suitable for solution by crowdsourcing. The objective of the current paper (Part 2) is to determine the location and scale parameters (m, s) of Λ(m, s²) by both Bayesian and maximum likelihood (ML) methods and to compare the results. One outcome of the analysis is the resolution, by means of Jeffreys' rule, of questions regarding the appropriate Bayesian prior. It is shown that the Bayesian and ML analyses lead to the same expression for the location parameter, but to different expressions for the scale parameter, which become identical in the limit of an infinite sample size. A second outcome concerns use of the sample mean as the measure of the information of the crowd in applications where the distribution of responses is not sought or known. In the coin-estimation experiment, the sample mean was found to differ widely from the mean number of coins calculated from Λ(m, s²). This discordance raises critical questions about whether, and under what conditions, the sample mean provides a reliable measure of the information of the crowd. This paper resolves that problem by use of the principle of maximum entropy (PME). The PME yields a set of equations for finding the most probable distribution consistent with the given prior information and only that information. If there is no solution to the PME equations for a specified sample mean and sample variance, then the sample mean is an unreliable statistic, since no measure can be assigned to its uncertainty. Parts 1 and 2 together demonstrate that the information content of crowdsourcing resides in the distribution of responses (very often log-normal in form), which can be obtained empirically or by appropriate modeling.
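The PME equations described above can be sketched on a finite support: maximizing entropy subject to a given mean and variance yields p_i ∝ exp(λ₁x_i + λ₂x_i²), with the multipliers fixed by the two moment conditions. The Newton solver and the support grid below are illustrative assumptions, not the paper's procedure; a failure to converge plays the role of "no solution exists," the paper's criterion for an unreliable sample mean.

```python
import numpy as np

def maxent_pmf(x, mean, var, iters=200):
    """Most probable distribution on support x consistent with the given
    sample mean and variance, by the principle of maximum entropy:
    p_i ∝ exp(l1*x_i + l2*x_i**2). Solved by Newton's method on the two
    moment conditions; returns None when the iteration fails, signalling
    that no maximum-entropy solution matches these constraints."""
    f = np.vstack([x, x**2])                   # feature functions
    target = np.array([mean, var + mean**2])   # required first two moments
    lam = np.zeros(2)
    for _ in range(iters):
        a = lam @ f
        w = np.exp(a - a.max())                # stabilised exponential family
        p = w / w.sum()
        m = f @ p                              # current moments
        g = m - target                         # dual gradient
        if np.abs(g).max() < 1e-9:
            return p
        cov = (f * p) @ f.T - np.outer(m, m)   # dual Hessian = Cov(f)
        lam -= np.linalg.solve(cov, g)
    return None
```

When a solution exists, its moments reproduce the specified mean and variance exactly; when it does not, the sample mean carries no assignable uncertainty, which is the paper's test of reliability.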
Abstract: In machine-vision-based systems for detecting foreign fibers in cotton, the cotton-layer background dominates the image while the foreign fibers occupy only a very small part, and the brightness and contrast of the image are both poor, so traditional image segmentation methods give very poor results. This paper combines maximum entropy with a genetic algorithm, using the maximum entropy function as the fitness function of the genetic algorithm; through continuous optimization, the optimal segmentation threshold is determined. Experimental results prove that the image segmentation method of this paper is not only fast and accurate but also strongly adaptive.
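The fitness function being optimized is the classical maximum-entropy (Kapur) thresholding criterion: the threshold t that maximizes the sum of the entropies of the two grey-level classes it creates. The sketch below evaluates that criterion by exhaustive scan on a small synthetic histogram; the paper instead searches it with a genetic algorithm, and the histogram values here are invented.

```python
import numpy as np

def kapur_threshold(hist):
    """Kapur's maximum-entropy threshold: choose the bin t that maximises
    the entropy of the class below t plus the entropy of the class at or
    above t (each renormalised to a probability distribution)."""
    p = hist / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, len(p)):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        p0, p1 = p[:t] / w0, p[t:] / w1
        h = -sum(q * np.log(q) for q in p0 if q > 0) \
            - sum(q * np.log(q) for q in p1 if q > 0)
        if h > best_h:
            best_t, best_h = t, h
    return best_t
```

On a bimodal histogram the criterion places the threshold in the valley between the modes, which is the behavior the genetic algorithm is searching for on real foreign-fiber images.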