Journal Articles
3 articles found
1. Fluidized roasting reduction kinetics of low-grade pyrolusite coupling with pretreatment of stone coal (cited 8 times)
Authors: Ya-li Feng, Zhen-lei Cai, Hao-ran Li, Zhu-wei Du, Xin-wei Liu. International Journal of Minerals, Metallurgy and Materials (SCIE, EI, CAS, CSCD), 2013, No. 3, pp. 221-227 (7 pages)
Based on the fluidized roasting reduction technology of low-grade pyrolusite coupled with pretreatment of stone coal, the manganese reduction efficiency was investigated and the technical conditions were optimized. The optimum manganese reduction efficiency reaches 98.97% under the following conditions: a stone coal to pyrolusite mass ratio of 3:1, a stone coal roasting temperature of 1000°C, a pyrolusite roasting temperature of 800°C, and a roasting time of 2 h. Other low-grade pyrolusite ores in China, from Guangxi, Hunan, and Guizhou Provinces, were also tested; all responded well, giving roughly 99% manganese reduction efficiency. A reduction kinetic model was established, confirming that the reduction process is controlled by the interface chemical reaction, with an apparent activation energy of 36.397 kJ/mol.
Keywords: pyrolusite, ore roasting, ore reduction, fluidized beds, kinetics, stone coal
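As background on the kinetic claim in this abstract, a shrinking-core rate law controlled by the interface chemical reaction is conventionally written as below; the specific rate-law form is an illustrative assumption on our part, while the control mechanism and the value E_a = 36.397 kJ/mol come from the abstract:

$$1 - (1 - x)^{1/3} = k t, \qquad k = A \exp\!\left(-\frac{E_a}{R T}\right),$$

where $x$ is the manganese reduction fraction at roasting time $t$. Fitting $k$ at several roasting temperatures and plotting $\ln k$ against $1/T$ yields the apparent activation energy from the slope $-E_a/R$.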
2. A Mini-Batch Proximal Stochastic Recursive Gradient Algorithm with Diagonal Barzilai–Borwein Stepsize
Authors: Teng-Teng Yu, Xin-wei Liu, Yu-Hong Dai, Jie Sun. Journal of the Operations Research Society of China (EI, CSCD), 2023, No. 2, pp. 277-307 (31 pages)
Many machine learning problems can be formulated as minimizing the sum of a function and a non-smooth regularization term. Proximal stochastic gradient methods are popular for solving such composite optimization problems. We propose a mini-batch proximal stochastic recursive gradient algorithm, SRG-DBB, which incorporates the diagonal Barzilai–Borwein (DBB) stepsize strategy to capture the local geometry of the problem. The linear convergence and complexity of SRG-DBB are analyzed for strongly convex functions. We further establish the linear convergence of SRG-DBB under the non-strong convexity condition. Moreover, it is proved that SRG-DBB converges sublinearly in the convex case. Numerical experiments on standard data sets indicate that the performance of SRG-DBB is better than or comparable to the proximal stochastic recursive gradient algorithm with best-tuned scalar stepsizes or BB stepsizes. Furthermore, SRG-DBB is superior to some advanced mini-batch proximal stochastic gradient methods.
Keywords: stochastic recursive gradient, proximal gradient algorithm, Barzilai–Borwein method, composite optimization
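To make the algorithmic idea concrete, here is a minimal Python sketch of a mini-batch proximal stochastic recursive gradient loop with a per-coordinate (diagonal) Barzilai–Borwein stepsize, applied to an L1-regularized least-squares toy problem. The inner-loop length, stepsize safeguards, and the soft-thresholding prox are illustrative assumptions, not the paper's exact SRG-DBB scheme.

```python
import numpy as np

def soft_threshold(z, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def srg_dbb_sketch(A, b, lam, epochs=20, batch=16, seed=0):
    """Sketch of a mini-batch proximal stochastic recursive gradient
    method with a diagonal BB stepsize for (1/2n)||Ax-b||^2 + lam*||x||_1.
    Safeguard bounds and loop lengths are illustrative guesses."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    eta = np.full(d, 1e-2)            # diagonal stepsizes, one per coordinate
    x_prev, v_prev = x.copy(), None
    for _ in range(epochs):
        v = A.T @ (A @ x - b) / n     # full gradient at the epoch start
        if v_prev is not None:
            # diagonal BB rule: per-coordinate alpha_i = s_i^2 / (s_i * y_i),
            # clipped to a safe range (a guessed safeguard)
            s, y = x - x_prev, v - v_prev
            sy = s * y
            ok = sy > 1e-12
            eta[ok] = np.clip(s[ok] ** 2 / sy[ok], 1e-4, 1.0)
        x_prev, v_prev = x.copy(), v.copy()
        x_old = x.copy()
        x = soft_threshold(x - eta * v, eta * lam)
        for _ in range(n // batch):   # inner recursive-gradient (SARAH-type) loop
            idx = rng.integers(0, n, batch)
            Ai, bi = A[idx], b[idx]
            v = (Ai.T @ (Ai @ x - bi) / batch
                 - Ai.T @ (Ai @ x_old - bi) / batch + v)
            x_old = x.copy()
            x = soft_threshold(x - eta * v, eta * lam)
    return x

# toy usage: recover a sparse vector from noiseless measurements
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 50))
x_true = np.zeros(50); x_true[:5] = 1.0
x_hat = srg_dbb_sketch(A, A @ x_true, lam=0.01)
```

The diagonal rule generalizes the scalar BB stepsize s'ᵀs / s'ᵀy by fitting one stepsize per coordinate, which is what lets the method adapt to the local geometry coordinate-wise.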
3. A Robust Trust Region Algorithm for Solving General Nonlinear Programming
Authors: Xin-wei Liu, Ya-xiang Yuan. Journal of Computational Mathematics (SCIE, EI, CSCD), 2001, No. 3, pp. 309-322 (14 pages)
Presents a trust region approach for solving nonlinear constrained optimization, covering the trust region algorithm itself, the global convergence of the algorithm, and numerical results of the study.
Keywords: trust region algorithm, nonlinear programming
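For readers unfamiliar with the framework, the sketch below shows the core trust-region mechanics (a model step within the radius, the actual-versus-predicted reduction ratio, and the radius update) on an unconstrained toy problem using a Cauchy-point step. The paper itself treats general constrained nonlinear programming, so this is background on the framework only, not the authors' algorithm.

```python
import numpy as np

def trust_region(f, grad, hess, x0, delta0=1.0, delta_max=10.0,
                 eta=0.1, tol=1e-8, max_iter=500):
    """Basic trust-region loop with a Cauchy-point step (textbook form,
    not the cited paper's constrained algorithm)."""
    x, delta = np.asarray(x0, dtype=float), delta0
    for _ in range(max_iter):
        g, B = grad(x), hess(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        # Cauchy point: minimize the quadratic model along -g within the radius
        gBg = g @ B @ g
        tau = 1.0 if gBg <= 0 else min(1.0, gnorm ** 3 / (delta * gBg))
        p = -(tau * delta / gnorm) * g
        # ratio of actual to model-predicted reduction decides acceptance
        pred = -(g @ p + 0.5 * (p @ B @ p))
        rho = (f(x) - f(x + p)) / pred if pred > 0 else -1.0
        if rho < 0.25:
            delta *= 0.25                      # poor model fit: shrink radius
        elif rho > 0.75 and np.isclose(np.linalg.norm(p), delta):
            delta = min(2 * delta, delta_max)  # good fit at the boundary: expand
        if rho > eta:
            x = x + p                          # accept the step
    return x

# toy usage: the Rosenbrock function
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
hess = lambda x: np.array([[2 - 400 * x[1] + 1200 * x[0]**2, -400 * x[0]],
                           [-400 * x[0], 200]])
x_star = trust_region(f, grad, hess, np.array([-1.2, 1.0]))
```

A constrained method such as the one in this paper replaces the simple quadratic model step with a subproblem that also handles the constraints, but the acceptance and radius-update logic above is the shared skeleton.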