Funding: supported by the China NSF Project (No. 11971122).
Abstract: The implicit determinant method is effective for certain linear eigenvalue optimization problems because it solves linear systems of equations rather than computing eigenpairs. In this paper, we generalize the implicit determinant method to solve a Hermitian eigenvalue optimization problem in both the smooth and the non-smooth case. We prove that the implicit determinant method converges locally and quadratically. Numerical experiments confirm our theoretical results and illustrate the efficiency of the method.
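The core idea the abstract describes, replacing eigenpair computations with linear solves, can be illustrated on the basic building block: locating an eigenvalue of a fixed Hermitian matrix by Newton's method on a scalar function f(λ) defined through a bordered linear system. This is a minimal sketch under assumed details (the bordering vector c, the starting guess, and the stopping rule are illustrative choices, not the paper's algorithm):

```python
import numpy as np

def implicit_det_eig(A, lam0, c=None, tol=1e-12, maxit=50):
    """Locate an eigenvalue of a Hermitian matrix A near lam0 by Newton's
    method on f(lam) = 0, where f is defined through a bordered linear
    system -- the implicit determinant idea: every step is a linear solve,
    never an eigenpair computation."""
    n = A.shape[0]
    if c is None:
        # bordering vector; assumed not orthogonal to the target eigenvector
        c = np.ones(n) / np.sqrt(n)
    lam = lam0
    for _ in range(maxit):
        M = np.block([[A - lam * np.eye(n), c[:, None]],
                      [c.conj()[None, :], np.zeros((1, 1))]])
        # f(lam): last component of the solution of M [x; f] = [0; 1]
        sol = np.linalg.solve(M, np.r_[np.zeros(n), 1.0])
        x, f = sol[:n], sol[n].real
        # f'(lam): differentiating the bordered system gives M [x'; f'] = [x; 0]
        dsol = np.linalg.solve(M, np.r_[x, 0.0])
        fp = dsol[n].real
        step = f / fp
        lam = lam - step
        if abs(step) < tol * max(1.0, abs(lam)):
            break
    return lam
```

At a simple eigenvalue one can show f'(λ) = ‖x‖² > 0, which is what makes the Newton iteration well defined and locally quadratically convergent, mirroring the convergence behaviour the abstract claims for the generalized problem.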
Funding: supported by NSF grant DMS-1853701 and in part by NSF grant DMS-2208373.
Abstract: In this paper, we review computational approaches to optimization problems for inhomogeneous rods and plates. We consider both the optimization of eigenvalues and the localization of eigenfunctions. These problems are motivated by physical questions, including determining the extremum of the fundamental vibration frequency and localizing the vibration displacement. We demonstrate how an iterative rearrangement approach and a gradient descent approach with projection successfully solve these optimization problems under different boundary conditions and for different given densities.
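The rearrangement idea for rods can be sketched in one dimension. The sketch below assumes details not in the review (a finite-difference grid, a two-phase density model, and a fixed volume fraction) and is only a toy illustration of the iteration: solve the eigenproblem for the current density, then move the heavy material to where the eigenfunction is largest.

```python
import numpy as np

def min_fundamental_freq_rod(n=200, rho_lo=1.0, rho_hi=2.0, frac=0.5, iters=50):
    """Toy rearrangement iteration for a clamped rod on (0, 1):
    minimize the first eigenvalue of -u'' = lam * rho(x) * u over
    two-phase densities rho in {rho_lo, rho_hi} with a fixed fraction
    of heavy material. Heuristic: placing the heavy phase where the
    current eigenfunction is largest lowers the Rayleigh quotient."""
    h = 1.0 / (n + 1)
    # standard second-difference stiffness matrix, Dirichlet BCs
    K = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
    k_hi = int(frac * n)                    # number of heavy cells (kept fixed)
    rho = np.full(n, rho_lo)
    rho[:k_hi] = rho_hi                     # arbitrary initial layout
    lam = np.inf
    for _ in range(iters):
        # generalized problem K u = lam * diag(rho) * u, symmetrized so
        # that eigh applies: (D^{-1/2} K D^{-1/2}) y = lam y, u = D^{-1/2} y
        d = 1.0 / np.sqrt(rho)
        w, V = np.linalg.eigh(d[:, None] * K * d[None, :])
        lam, u = w[0], d * V[:, 0]
        # rearrangement step: move heavy material to where u^2 is largest
        new_rho = np.full(n, rho_lo)
        new_rho[np.argsort(-u**2)[:k_hi]] = rho_hi
        if np.array_equal(new_rho, rho):    # fixed point reached
            break
        rho = new_rho
    return lam, rho
```

The gradient descent approach with projection mentioned in the abstract would instead update the density along a derivative direction and project back onto the volume constraint; the rearrangement step above replaces that projection with a direct sort.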
Funding: supported by the National Natural Science Foundation of China (Nos. 11701063, 11901075), the China Postdoctoral Science Foundation (Nos. 2019M651091, 2019M661073), the Fundamental Research Funds for the Central Universities (Nos. 3132021193, 3132021199), the Natural Science Foundation of Liaoning Province (Doctoral Startup Foundation of Liaoning Province, No. 2020-BS-074), Dalian Youth Science and Technology Star (No. 2020RQ047), the Huzhou Science and Technology Plan (No. 2016GY03), Key Research and Development Projects of Shandong Province (No. 2019GGX104089), and the Natural Science Foundation of Shandong Province (No. ZR2019BA014).
Abstract: In this paper, we study optimization problems involving convex nonlinear semidefinite programming (CSDP). We convert the CSDP into an eigenvalue problem via an exact penalty function and apply the U-Lagrangian theory to the largest-eigenvalue function with matrix-convex valued mappings. We give the first- and second-order derivatives of the U-Lagrangian in the space of decision variables R^m when the transversality condition holds. Moreover, an algorithm framework with superlinear convergence is presented. Finally, we give one application, bilinear matrix inequality (BMI) optimization, and list its UV-decomposition results.
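The nonsmoothness here comes from the largest-eigenvalue function. As a crude sketch of the exact-penalty reformulation only (a plain subgradient method, not the paper's U-Lagrangian/UV-decomposition algorithm and without its superlinear convergence), one can minimize f(x) + μ·max(0, λ_max(A(x))) using the classical fact that v·vᵀ is a subgradient of λ_max at a unit eigenvector v:

```python
import numpy as np

def lam_max_and_subgrad(A):
    """Largest eigenvalue of a symmetric matrix A and the subgradient
    v v^T of lam_max at A, where v is a unit eigenvector."""
    w, V = np.linalg.eigh(A)
    v = V[:, -1]
    return w[-1], np.outer(v, v)

def penalized_subgradient(f_grad, A_map, A_grads, x0, mu=10.0, step=1e-2, iters=500):
    """Subgradient sketch for  min_x f(x) + mu * max(0, lam_max(A(x))),
    an exact-penalty reformulation of the constraint A(x) <= 0 in the
    semidefinite order. A_map(x) returns the symmetric constraint matrix;
    A_grads[i] is dA/dx_i (constant matrices, i.e. A affine in x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = f_grad(x)
        lam, G = lam_max_and_subgrad(A_map(x))
        if lam > 0:   # constraint violated: add the penalty subgradient
            # chain rule through lam_max: component i is <v v^T, dA/dx_i>
            g = g + mu * np.array([np.sum(G * Ai) for Ai in A_grads])
        x = x - step * g
    return x
```

The penalty is exact only for μ larger than the relevant multiplier, and a fixed-step subgradient method merely oscillates near the solution; the paper's U-Lagrangian machinery is precisely what upgrades this kind of first-order scheme to superlinear convergence along the smooth U-subspace.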