Abstract
Three smooth approximating functions of the absolute value function are presented and their properties are analyzed. The smoothing function with the best properties is then selected to handle the absolute value equation, yielding a differentiable unconstrained optimization problem, which is solved by a gradient-descent neural network model. Numerical results on absolute value equations with a unique solution and with multiple solutions show that the method does not depend on the initial point and converges quickly. Finally, the method is applied to solving the linear complementarity problem.
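The pipeline described in the abstract (smooth |x|, form a differentiable residual objective, follow gradient-descent dynamics) can be illustrated with a minimal sketch. The Python code below is an assumption-laden illustration, not the paper's implementation: it uses one common smoothing of |x|, namely sqrt(x^2 + eps^2), which is not necessarily among the paper's three smoothing functions; it replaces the continuous-time gradient-descent neural network dx/dt = -grad f(x) with plain discrete gradient descent; and the names smooth_abs and solve_ave are hypothetical.

# Hypothetical sketch (not the paper's exact formulation): solve the absolute
# value equation A x - |x| = b by smoothing |x| and running gradient descent
# on the least-squares residual f(x) = 0.5 * ||A x - s_eps(x) - b||^2, as a
# discrete-time stand-in for the gradient-descent neural network dynamics.

import numpy as np

def smooth_abs(x, eps=1e-3):
    """One common smooth approximation of |x|: sqrt(x^2 + eps^2)."""
    return np.sqrt(x * x + eps * eps)

def smooth_abs_grad(x, eps=1e-3):
    """Derivative of the smoothed absolute value: x / sqrt(x^2 + eps^2)."""
    return x / np.sqrt(x * x + eps * eps)

def solve_ave(A, b, x0=None, eps=1e-3, lr=1e-2, tol=1e-8, max_iter=100_000):
    """Minimize f(x) by gradient descent; x0 is the (arbitrary) starting point."""
    x = np.zeros(b.size) if x0 is None else x0.astype(float).copy()
    for _ in range(max_iter):
        r = A @ x - smooth_abs(x, eps) - b                   # residual
        grad = (A - np.diag(smooth_abs_grad(x, eps))).T @ r  # Jacobian^T * residual
        x -= lr * grad
        if np.linalg.norm(grad) < tol:
            break
    return x

# Toy example with a unique solution: for A = 4*I and b >= 0, x = b/3 solves
# A x - |x| = b, since 4x - x = 3x = b.
A = 4.0 * np.eye(3)
b = np.array([3.0, 6.0, 9.0])
print(solve_ave(A, b))   # approximately [1, 2, 3]

In this sketch the smoothing parameter eps controls how closely sqrt(x^2 + eps^2) tracks |x|, so the computed point is an approximate solution whose accuracy improves as eps decreases.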
Author
YONG Long-quan (雍龙泉), School of Mathematics and Computer Science, Shaanxi University of Technology, Hanzhong 723000, China; Shaanxi Key Laboratory of Industrial Automation, Hanzhong 723000, China
Source
Journal of Shaanxi University of Technology: Natural Science Edition (《陕西理工大学学报(自然科学版)》), 2020, No. 5, pp. 72-81 (10 pages)
Funding
National Natural Science Foundation of China (11401357)
Shaanxi Province Youth Science and Technology New Star Project (2016KJXX-95)
Scientific Research Program Funded by Shaanxi Provincial Education Department (20JS021)
Research Foundation of Shaanxi University of Technology (SLGYQZX2002)
Keywords
absolute value function
smooth approximation function
absolute value equation
unconstrained optimization
gradient-descent neural network model
linear complementarity problem