Fund: Supported by the National Natural Science Foundation of China (No. 11661009), the High Level Innovation Teams and Excellent Scholars Program in Guangxi Institutions of Higher Education (No. [2019]52), the Guangxi Natural Science Key Fund (No. 2017GXNSFDA198046), the Special Funds for Local Science and Technology Development Guided by the Central Government (No. ZY20198003), and the Special Foundation for Guangxi Ba Gui Scholars.
Abstract: The conjugate gradient method is well known as an efficient approach to large-scale optimization problems. However, most conjugate gradient methods do not possess the sufficient descent property. The method presented in this paper generates sufficient descent directions and satisfies a trust region property without any line search. Under suitable conditions, the global convergence of the method with the Armijo line search is established. Moreover, we study the proposed method for solving nonsmooth problems and establish its global convergence. Numerical experiments show that the presented method successfully solves smooth and nonsmooth unconstrained problems, image restoration problems, and the Muskingum model.
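The ingredients named in the abstract, a conjugate gradient direction forced to satisfy the sufficient descent condition g<sup>T</sup>d ≤ −c‖g‖², combined with an Armijo backtracking line search, can be sketched as follows. This is a minimal illustration, not the authors' exact method: the PRP+ update, the restart rule used to enforce descent, and the quadratic test problem are all assumptions.

```python
import numpy as np

def armijo(f, x, d, g, sigma=1e-4, rho=0.5, max_back=50):
    """Backtracking Armijo search: largest step rho**m achieving
    f(x + a d) <= f(x) + sigma * a * g^T d."""
    alpha, fx, gtd = 1.0, f(x), g @ d
    for _ in range(max_back):
        if f(x + alpha * d) <= fx + sigma * alpha * gtd:
            break
        alpha *= rho
    return alpha

def cg_sufficient_descent(f, grad, x0, c=0.1, tol=1e-6, max_iter=500):
    """PRP+ conjugate gradient; a restart to the steepest-descent
    direction enforces the sufficient descent condition
    g^T d <= -c ||g||^2 at every iterate (illustrative scheme)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = armijo(f, x, d, g)
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))  # PRP+ parameter
        d = -g_new + beta * d
        if g_new @ d > -c * (g_new @ g_new):  # descent lost: restart
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage on a strictly convex quadratic (hypothetical test problem):
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = cg_sufficient_descent(f, grad, np.zeros(2))
```

On this problem the minimizer solves Ax = b, so the iterates should approach `np.linalg.solve(A, b)`.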
Fund: Supported by the National Natural Science Foundation of China (No. 19701018).
Abstract: This paper studies Γ-convergence for a sequence of parabolic functionals whose integrand f is nonconvex and periodic in its first variable. The author obtains a representation formula for the Γ-limit. The results support a conclusion relating the Γ-convergence of parabolic functionals to the associated gradient flows, and partially confirm one of De Giorgi's conjectures.
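For reference, the standard (sequential) definition of Γ-convergence underlying the abstract is the following; it is the textbook notion, stated here under the assumption that the functionals act on a metric space X:

```latex
% F_n \xrightarrow{\Gamma} F \text{ on a metric space } X \text{ means:}
%
% (liminf inequality) for every x \in X and every x_n \to x,
\[
F(x) \;\le\; \liminf_{n\to\infty} F_n(x_n);
\]
% (recovery sequence) for every x \in X there exist x_n \to x with
\[
\limsup_{n\to\infty} F_n(x_n) \;\le\; F(x).
\]
```

The representation formula mentioned in the abstract identifies the Γ-limit F explicitly for the periodic, nonconvex parabolic functionals considered there.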
Fund: Project supported by the National Natural Science Foundation of China (No. 10071063).
Abstract: This paper presents a geometric characterization of convex sets in locally convex spaces on which a strong optimization theorem of Stegall type holds, and gives Collier's theorem on w*-Asplund spaces a localized setting.