Cross entropy is a measure in machine learning and deep learning that assesses the difference between predicted and actual probability distributions. In this study, we propose cross entropy as a performance evaluation metric for image classifier models and apply it to the CT image classification of lung cancer. A convolutional neural network is employed as the deep neural network (DNN) image classifier, with the residual network (ResNet) 50 chosen as the DNN architecture. The image data used comprise a lung CT image set. Two classification models are built from datasets with varying amounts of data, and lung cancer is categorized into four classes using 10-fold cross-validation. Furthermore, we employ t-distributed stochastic neighbor embedding to visually explain the data distribution after classification. Experimental results demonstrate that cross entropy is a highly useful metric for evaluating the reliability of image classifier models. It is noted that, for a more comprehensive evaluation of model performance, combining cross entropy with other evaluation metrics is essential.
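As an illustrative sketch (not taken from the paper), the cross entropy metric described above can be computed directly from a classifier's softmax outputs; the array shapes and the four-class toy example below are assumptions:

```python
import numpy as np

def cross_entropy_metric(probs, labels, eps=1e-12):
    """Average cross entropy between predicted class probabilities and true labels.

    probs  : (n_samples, n_classes) softmax outputs of the classifier
    labels : (n_samples,) integer class indices
    Lower values mean the model puts more probability on the correct class.
    """
    probs = np.clip(probs, eps, 1.0)                         # avoid log(0)
    per_sample = -np.log(probs[np.arange(len(labels)), labels])
    return per_sample.mean()

# Hypothetical example with 4 lung-CT classes (e.g. LUAD, LUSC, LULC, normal)
probs = np.array([[0.90, 0.05, 0.03, 0.02],
                  [0.20, 0.60, 0.10, 0.10]])
labels = np.array([0, 1])
print(cross_entropy_metric(probs, labels))                   # ~0.31
```

In this sense, a lower average cross entropy indicates more confident, more reliable predictions, which is how the metric is used as a reliability measure in the study.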
This study evaluates the performance and reliability of a vision transformer (ViT) compared to convolutional neural networks (CNNs) using the ResNet50 model in classifying lung cancer from CT images into four categories: lung adenocarcinoma (LUAD), lung squamous cell carcinoma (LUSC), large cell carcinoma (LULC), and normal. Although CNNs have made significant advancements in medical imaging, their limited capacity to capture long-range dependencies has led to the exploration of ViTs, which leverage self-attention mechanisms for a more comprehensive global understanding of images. The study utilized a dataset of 748 lung CT images to train both models with standardized input sizes, assessing their performance through conventional metrics—accuracy, precision, recall, F1 score, specificity, and AUC—as well as cross entropy, a novel metric for evaluating prediction uncertainty. Both models achieved similar accuracy rates (95%), with ViT demonstrating a slight edge over ResNet50 in precision and F1 scores for specific classes. However, ResNet50 exhibited higher recall for LULC, indicating fewer missed cases. Cross entropy analysis showed that the ViT model had lower average uncertainty, particularly in the LUAD, Normal, and LUSC classes, compared to ResNet50. This finding suggests that ViT predictions are generally more reliable, though ResNet50 performed better for LULC. The study underscores that accuracy alone is insufficient for model comparison, as cross entropy offers deeper insights into the reliability and confidence of model predictions. The results highlight the importance of incorporating cross entropy alongside traditional metrics for a more comprehensive evaluation of deep learning models in medical image classification, providing a nuanced understanding of their performance and reliability. While the ViT outperformed the CNN-based ResNet50 in lung cancer classification based on cross-entropy values, the performance differences were minor and may not hold clinical significance. Therefore, it may be premature to consider replacing CNNs with ViTs in this specific application.
In recent years, medium entropy alloys have become a research hotspot due to their excellent physical and chemical properties. By controlling elemental composition and processing parameters appropriately, medium entropy alloys can exhibit properties similar to those of high entropy alloys at lower cost. In this paper, FeCoNi medium entropy alloy precursors were prepared via sol-gel and coprecipitation methods, and FeCoNi medium entropy alloys were then prepared by carbothermal and hydrogen reduction. The phases and magnetic properties of the FeCoNi medium entropy alloys were investigated. Results showed that the FeCoNi medium entropy alloy was produced by carbothermal and hydrogen reduction at 1500 °C. Some carbon was detected in the alloy prepared by carbothermal reduction, whereas the alloy prepared by hydrogen reduction was uniform and showed relatively high purity. Moreover, the hydrogen reduction product exhibited better saturation magnetization and lower coercivity.
In order to increase productivity and reduce energy consumption of the steelmaking-continuous casting (SCC) production process, especially with complicated technological routes, the cross entropy (CE) method was adopted to optimize the SCC production scheduling (SCCPS) problem. Based on the CE method, a matrix encoding scheme was proposed and a backward decoding method was used to generate a reasonable schedule. To describe the distribution of the solution space, a probability distribution model was built and used to generate individuals. In addition, a probability updating mechanism for the probability distribution model was proposed, which helps to find the optimal individual gradually. Because of the poor stability and premature convergence of the standard cross entropy (SCE) algorithm, an improved cross entropy (ICE) algorithm was proposed with the following improvements: an individual generation mechanism combined with heuristic rules, a retention mechanism for the optimal individual, a local search mechanism, and dynamic algorithm parameters. Simulation experiments validate that the CE method is effective in solving the SCCPS problem with complicated technological routes and that the proposed ICE algorithm has superior performance to the SCE algorithm and the genetic algorithm (GA).
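For orientation, the generic cross-entropy method alternates between sampling candidate solutions from a parameterized probability model and re-fitting that model to the elite samples. The sketch below is a minimal continuous-variable version under assumed parameters, not the paper's matrix encoding, backward decoding, or ICE enhancements:

```python
import numpy as np

def cross_entropy_minimize(objective, dim, n_samples=100, n_elite=10, iters=50, seed=0):
    """Minimal cross-entropy method with a Gaussian sampling model (illustrative only)."""
    rng = np.random.default_rng(seed)
    mu, sigma = np.zeros(dim), np.ones(dim) * 5.0
    for _ in range(iters):
        samples = rng.normal(mu, sigma, size=(n_samples, dim))     # generate individuals
        scores = np.array([objective(s) for s in samples])
        elite = samples[np.argsort(scores)[:n_elite]]              # keep the best samples
        mu, sigma = elite.mean(axis=0), elite.std(axis=0) + 1e-6   # update the probability model
    return mu

# Toy usage: minimize a shifted sphere function
best = cross_entropy_minimize(lambda x: np.sum((x - 3.0) ** 2), dim=4)
print(best)   # close to [3, 3, 3, 3]
```

In the scheduling setting, the Gaussian model is replaced by a discrete probability distribution over encoded schedules, but the sample/select/update loop is the same.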
This paper shows the Fokker-Planck equation of a dynamical system driven by coloured cross-correlated white noises in the absence and presence of a small external force. Based on the Fokker-Planck equation and the definition of Shannon's information entropy, the time dependence of entropy flux and entropy production can be calculated. The present results can be used to explain the extremal behaviour of time dependence of entropy flux and entropy production in view of the dissipative parameter γ of the system, coloured cross-correlation time τ and coloured cross-correlation strength λ.
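For readers unfamiliar with the decomposition, with a one-dimensional Fokker-Planck equation written in continuity form and a constant diffusion coefficient D, the time derivative of the Shannon entropy splits schematically into a non-negative entropy production term and an entropy flux term (a standard textbook form, not reproduced from the paper):

```latex
S(t) = -\int p(x,t)\,\ln p(x,t)\,dx,\qquad
\frac{\partial p}{\partial t} = -\frac{\partial J}{\partial x},\qquad
J = F(x)\,p - D\,\frac{\partial p}{\partial x},

\frac{dS}{dt}
= \underbrace{\int \frac{J^{2}}{D\,p}\,dx}_{\text{entropy production}\ \ge\ 0}
\;-\; \underbrace{\int \frac{F(x)\,J}{D}\,dx}_{\text{entropy flux}} .
```

The noise correlation parameters enter through the effective drift and diffusion coefficients, which is how they shape the time dependence of both terms.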
No-wait job-shop scheduling (NWJSS) is one of the classical scheduling problems and arises in many industries with a no-wait constraint, such as the metal working, plastic, chemical, and food industries. Several methods have been proposed to solve this problem, both exact (e.g. integer programming) and metaheuristic. Cross entropy (CE), as a relatively new metaheuristic, can be an alternative method for the NWJSS problem. It has been used in combinatorial optimization, multi-extremal optimization, and rare-event simulation, where CE implementations obtain optimal values with less computational time on average. However, using the original CE to solve large-scale NWJSS requires high computational time. Considering this shortcoming, this paper proposes a hybrid of cross entropy with a genetic algorithm (GA), called CEGA, for the m-machine NWJSS problem. The results are compared with other metaheuristics: Genetic Algorithm-Simulated Annealing (GASA) and hybrid tabu search. The results showed that CEGA provides better or at least equal makespans in comparison with the other two methods.
It is shown how the cross-correlation time and strength of coloured cross-correlated white noises can set an upper bound for the time derivative of entropy in a nonequilibrium system. The value of the upper bound can be calculated directly from the Schwarz inequality and the Fokker-Planck equation of the dynamical system driven by coloured cross-correlated white noises. The present calculations can be used to interpret the effects of the dissipative constant and of the cross-correlation time and strength of the coloured cross-correlated white noises on the upper bound.
In this paper, a new method for Principal Component Analysis in intuitionistic fuzzy situations is proposed. This approach is based on cross entropy as an information index. It is useful for data reduction in situations in which data are not exact. The inexactness assumed here is due to fuzziness and missing data information, so that we have two functions (membership and non-membership). Thus, the method proposed here is suitable for Atanassov's Intuitionistic Fuzzy Sets (A-IFSs), in which uncertainty arises from a mixture of fuzziness and missing information. The application of the method is demonstrated with an example, and conclusions are presented.
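For background (one widely cited form, attributed to Vlachos and Sergiadis, rather than the paper's own definition), the cross entropy between two A-IFSs A and B on a universe {x_1, ..., x_n} uses both the membership and non-membership functions:

```latex
CE(A,B)=\sum_{i=1}^{n}\left[
\mu_A(x_i)\ln\frac{\mu_A(x_i)}{\tfrac{1}{2}\bigl(\mu_A(x_i)+\mu_B(x_i)\bigr)}
+\nu_A(x_i)\ln\frac{\nu_A(x_i)}{\tfrac{1}{2}\bigl(\nu_A(x_i)+\nu_B(x_i)\bigr)}
\right].
```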
In order to improve the global search ability of the biogeography-based optimization (BBO) algorithm in multi-threshold image segmentation, a multi-threshold image segmentation method based on an improved BBO algorithm is proposed. When using the BBO algorithm to optimize thresholds, firstly, an elitist selection operator is used to retain the optimal set of solutions. Secondly, a migration strategy based on the fusion of good and pending solutions is introduced to reduce the premature convergence and invalid migration of traditional migration operations. Thirdly, to reduce the blindness of traditional mutation operations, a mutation operation based on binary computation is created. The algorithm is then applied to multi-threshold image segmentation based on two-dimensional cross entropy. Finally, the method is used to segment typical images and is compared with two-dimensional multi-threshold segmentation based on particle swarm optimization and on the standard BBO algorithm. The experimental results show that the method has good convergence stability, effectively shortens iteration time, and outperforms the standard BBO algorithm.
Effect of temperature-dependent viscosity on fully developed forced convection in a duct of rectangular cross-section occupied by a fluid-saturated porous medium is investigated analytically. The Darcy flow model is applied and the viscosity-temperature relation is assumed to be inverse-linear. The case of uniform heat flux on the walls, i.e. the H boundary condition in the terminology of Kays and Crawford [12], is treated. For a fluid whose viscosity decreases with temperature, it is found that the effect of the variation is to increase the Nusselt number for heated walls. Having found the velocity and temperature distributions, the second law of thermodynamics is invoked to find the local and average entropy generation rates. Expressions for the entropy generation rate, the Bejan number, the heat transfer irreversibility, and the fluid flow irreversibility are presented in terms of the Brinkman number, the Péclet number, the viscosity variation number, the dimensionless wall heat flux, and the aspect ratio (width to height ratio). These expressions allow a parametric study of the problem, based on which it is observed that the entropy generated due to flow in a duct of square cross-section is greater than that in rectangular counterparts, while increasing the aspect ratio decreases the entropy generation rate, similar to what was previously reported for the clear flow case by Ratts and Rant [14].
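For reference, the Bejan number mentioned above is conventionally defined as the heat-transfer share of the total entropy generation:

```latex
Be=\frac{\dot S_{gen,\,heat}}{\dot S_{gen,\,heat}+\dot S_{gen,\,fluid}},
```

so that Be tends to 1 when heat transfer irreversibility dominates and to 0 when fluid flow irreversibility dominates.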
Configurational information entropy (CIE) analysis has been shown to be applicable for determining the neutron skin thickness (δnp) of neutron-rich nuclei from fragment production in projectile fragmentation reactions. The BNN+FRACS machine learning model was adopted to predict the fragment mass cross-sections (σ_A) of projectile fragmentation reactions induced by calcium isotopes from ^36Ca to ^56Ca on a ^9Be target at 140 MeV/u. The fast Fourier transform was adopted to decompose the possible information compositions in the σ_A distributions and determine the quantity of CIE (S_A[f]). It was found that the range of fragments significantly influences the quantity of S_A[f], which results in different trends of the S_A[f]–δnp correlation. The linear S_A[f]–δnp correlation in a previous study [Nucl. Sci. Tech. 33, 6 (2022)] could be reproduced using fragments with relatively large masses, which verifies that S_A[f] determined from fragment σ_A is sensitive to the neutron skin thickness of neutron-rich isotopes.
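As a schematic reminder of how such a quantity is usually constructed (conventions differ between works, and this is not taken from the paper itself), the CIE of the fragment mass distribution is built from the normalized modal fraction of its Fourier transform F(k):

```latex
f(k)=\frac{|F(k)|^{2}}{\sum_{k}|F(k)|^{2}},\qquad
S_{A}[f]=-\sum_{k} f(k)\,\ln f(k).
```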
In the present study we formulate a Minimum Cross Fuzzy Entropy Problem (Minx(F)EntP) and propose sufficient conditions for the existence of its solution. The problem can be formulated as follows: in the set of membership functions satisfying given moment constraints generated by given moment functions, it is required to choose the membership function that is closest to an a priori membership function in the sense of a cross fuzzy entropy measure. The existence of a solution of the formulated problem is proved by virtue of the concavity property of the cross fuzzy entropy measure, the implicit function theorem, and the Lagrange multipliers method. Moreover, generalized cross fuzzy entropy optimization methods, in the form of MinMinx(F)EntM and MaxMinx(F)EntM, are suggested on the basis of a primary phase of minimizing the cross fuzzy entropy measure for a fixed moment vector function and on the definition of a special functional with Minx(F)Ent values of the cross fuzzy entropy measure. The next phase for obtaining the mentioned distributions consists of optimizing the defined functional with respect to the moment vector functions. Distributions obtained by these methods are defined as (MinMinx(F)Ent)m and (MaxMinx(F)Ent)m distributions.
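For orientation (not necessarily the exact measure used in the paper), a standard cross fuzzy entropy between a membership function μ and an a priori membership function μ₀, in the spirit of Bhandari and Pal, is

```latex
D(\mu,\mu_{0})=\sum_{i}\left[
\mu(x_i)\ln\frac{\mu(x_i)}{\mu_{0}(x_i)}
+\bigl(1-\mu(x_i)\bigr)\ln\frac{1-\mu(x_i)}{1-\mu_{0}(x_i)}
\right],
```

and Minx(F)EntP then minimizes such a measure over the membership functions satisfying the moment constraints.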
The segmentation effect of the Tsallis entropy method is superior to that of the Shannon entropy method, and the computation speed of the two-dimensional Shannon cross entropy method can be further improved by optimization. The existing two-dimensional Tsallis cross entropy method is not a strict two-dimensional extension. Thus, two new methods of image thresholding using two-dimensional Tsallis cross entropy, based on either Chaotic Particle Swarm Optimization (CPSO) or decomposition, are proposed. The former uses CPSO to find the optimal threshold; a recursive algorithm is adopted to avoid repetitive computation of the fitness function in the iterative procedure, so the computing speed is improved greatly. The latter converts the two-dimensional computation into two one-dimensional spaces, which further reduces the computational complexity from O(L²) to O(L). The experimental results show that, compared with the recently proposed two-dimensional Shannon or Tsallis cross entropy methods, the two new methods achieve superior segmentation results and greatly reduce running time.
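To make the one-dimensional building block concrete, the sketch below performs a minimum cross entropy threshold search in the style of Li and Lee over a gray-level histogram, using cumulative sums so that class means are not recomputed from scratch at every candidate threshold; it is an illustrative stand-in, not the paper's two-dimensional Tsallis formulation:

```python
import numpy as np

def min_cross_entropy_threshold(hist):
    """1-D minimum cross entropy threshold (Li & Lee style) over a gray-level histogram.

    hist : (L,) pixel counts per gray level 0..L-1
    Cumulative sums make the scan O(L) instead of recomputing class means each time.
    """
    levels = np.arange(len(hist), dtype=float)
    p = np.cumsum(hist, dtype=float)              # pixel count of the below-threshold class
    m = np.cumsum(hist * levels, dtype=float)     # gray-level mass of the below-threshold class
    p_tot, m_tot = p[-1], m[-1]

    best_t, best_val = 1, np.inf
    for t in range(1, len(hist) - 1):
        p1, m1 = p[t], m[t]
        p2, m2 = p_tot - p1, m_tot - m1
        if p1 == 0 or p2 == 0 or m1 == 0 or m2 == 0:
            continue
        mu1, mu2 = m1 / p1, m2 / p2               # class means below / above threshold
        val = -(m1 * np.log(mu1) + m2 * np.log(mu2))   # cross entropy criterion up to a constant
        if val < best_val:
            best_t, best_val = t, val
    return best_t

# Toy bimodal histogram
hist = np.zeros(256)
hist[40:60] = 100      # dark mode
hist[180:200] = 80     # bright mode
print(min_cross_entropy_threshold(hist))   # a threshold separating the two modes
```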
A matrix encoding scheme for the steelmaking continuous casting (SCC) production scheduling (SCCPS) problem and the corresponding decoding method are proposed. Based on it, a cross entropy (CE) method is adopted and an improved cross entropy (ICE) algorithm is proposed to solve the SCCPS problem to minimize total power consumption. To describe the distribution of the solution space of the CE method, a probability model is built and used to generate individuals by sampling, and a probability updating mechanism is introduced to trace the promising samples. For the ICE algorithm, some samples are generated by heuristic rules for the shortest makespan, due to the relation between the makespan and the total power consumption, which greatly reduces the search space. The optimal sample in each iteration is retained through a retention mechanism to ensure that the historical optimal sample is not lost, so as to improve efficiency and global convergence. A local search procedure is carried out on a part of the better samples so as to improve the local exploitation capability of the ICE algorithm and obtain a better result. The parameter setting is investigated by the Taguchi method of design-of-experiment. A number of simulation experiments are implemented to validate the effectiveness of the ICE algorithm in solving the SCCPS problem, and the superiority of the ICE algorithm is verified through comparison with the standard cross entropy (SCE) algorithm.
Emergence of drug-resistant bacteria is one of the serious problems in public health today. However, the relationship between genomic mutations of bacteria and their phenotypic differences is still unclear. In this paper, based on the mutation information in the whole genome sequences of 96 MRSA strains, two kinds of phenotypes (pathogenicity and drug resistance) were learnt and predicted by machine learning algorithms. As a result of effective feature selection by cross entropy based sparse logistic regression, these phenotypes could be predicted with sufficiently high accuracy (100% and 97.87%, respectively) using fewer than 10 features. This means that a novel rapid test method for checking MRSA phenotypes could be developed in the future.
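A minimal sketch of cross entropy based sparse logistic regression for feature selection, using scikit-learn's L1-penalized LogisticRegression (the logistic loss is the binary cross entropy); the mutation matrix, labels, and regularization strength below are placeholders, not the study's data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical data: rows = strains, columns = genomic mutation indicators (0/1)
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(96, 500)).astype(float)
y = (X[:, 3] + X[:, 42] > 1).astype(int)           # toy phenotype depending on two loci

# Logistic regression minimizes the (binary) cross entropy; the L1 penalty
# drives most coefficients to exactly zero, which performs the feature selection.
clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
clf.fit(X, y)

selected = np.flatnonzero(clf.coef_[0])            # mutations with non-zero weight
print(len(selected), "features selected:", selected[:10])
```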
Deep learning models trained for network intrusion detection tend to overfit, reducing test-set accuracy, because of convergence problems with the traditional loss function. Firstly, we utilize a network model architecture combining the GELU activation function with a deep neural network; secondly, the cross-entropy loss function is improved to a weighted cross entropy loss function; finally, it is applied to intrusion detection to improve detection accuracy. To compare experimental results, the KDDcup99 dataset, which is commonly used in intrusion detection, is selected as the experimental data, and accuracy, precision, recall, and F1-score are used as evaluation metrics. The experimental results show that the model using the weighted cross-entropy loss function combined with the GELU activation function under the deep neural network architecture improves the evaluation metrics by about 2% compared with the ordinary cross-entropy loss function model. The experiments prove that the weighted cross-entropy loss function can enhance the model's ability to discriminate between samples.
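As a minimal sketch (the paper's exact weighting scheme and network sizes are not given here, so the per-class counts, layer widths, and 41-feature input are assumptions), a weighted cross entropy loss in PyTorch is obtained by passing per-class weights, e.g. inverse class frequencies, to nn.CrossEntropyLoss, with GELU activations in the network:

```python
import torch
import torch.nn as nn

# Assumed per-class sample counts for five traffic classes; weights are inverse frequencies
class_counts = torch.tensor([97278., 391458., 4107., 52., 1126.])
weights = class_counts.sum() / (len(class_counts) * class_counts)

model = nn.Sequential(                 # small DNN with GELU activations
    nn.Linear(41, 64), nn.GELU(),
    nn.Linear(64, 32), nn.GELU(),
    nn.Linear(32, 5),                  # logits for 5 classes
)
criterion = nn.CrossEntropyLoss(weight=weights)    # weighted cross entropy

x = torch.randn(8, 41)                 # a toy batch of 41 preprocessed features
y = torch.randint(0, 5, (8,))
loss = criterion(model(x), y)
loss.backward()
print(loss.item())
```

Up-weighting the rare classes makes their misclassifications contribute more to the loss, which is what improves the model's ability to discriminate minority-class samples.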