Funding: financially supported by the National Natural Science Foundation of China (Nos. 21176026 and 21176242), the National High Technology Research and Development Program of China (No. 2012AA062401), the National Key Technology R&D Program of China (Nos. 2012BAB07B05 and 2012BAB14B05), the China Ocean Mineral Resources R&D Association (No. DY125-15-T-08), and the Fundamental Research Funds for the Central Universities of China (No. FRT-TP-09-002B).
Abstract: Based on a fluidized roasting reduction technology for low-grade pyrolusite coupled with stone coal pretreatment, the manganese reduction efficiency was investigated and the technical conditions were optimized. The optimum manganese reduction efficiency reaches 98.97% under the following conditions: a stone coal to pyrolusite mass ratio of 3:1, a stone coal roasting temperature of 1000℃, a pyrolusite roasting temperature of 800℃, and a roasting time of 2 h. Other low-grade pyrolusite ores from Guangxi, Hunan, and Guizhou Provinces in China were also tested; all of these minerals responded well, giving ~99% manganese reduction efficiency. A reduction kinetics model was also established, confirming that the reduction process is controlled by the interfacial chemical reaction, with an apparent activation energy of 36.397 kJ/mol.
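For an interface-chemical-reaction-controlled process, the standard shrinking-core rate law is 1 - (1 - X)^(1/3) = kt with an Arrhenius rate constant. A minimal sketch of that model, using the abstract's apparent activation energy; the pre-exponential factor `A` is a placeholder, not a value fitted from the paper's data:

```python
import math

R = 8.314      # J/(mol*K), universal gas constant
EA = 36.397e3  # J/mol, apparent activation energy reported in the abstract

def rate_constant(T_kelvin, A=1.0):
    """Arrhenius rate constant k = A * exp(-Ea / (R*T)); A is a placeholder."""
    return A * math.exp(-EA / (R * T_kelvin))

def conversion(t_hours, T_kelvin, A=1.0):
    """Conversion X from the shrinking-core law 1 - (1 - X)^(1/3) = k*t."""
    kt = rate_constant(T_kelvin, A) * t_hours
    return 1.0 - max(0.0, 1.0 - kt) ** 3

# Ratio of rate constants at 800 and 700 C: the low activation energy
# implies only mild temperature sensitivity, consistent with chemical-
# reaction rather than diffusion control being the useful distinction here.
ratio = rate_constant(1073.15) / rate_constant(973.15)
print(round(ratio, 2))  # → 1.52
```

Note that the absolute conversion values depend on the unstated factor `A`; only the temperature dependence is determined by the reported activation energy.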
Funding: the National Natural Science Foundation of China (Nos. 11671116, 11701137, 12071108, 11991020, 11991021 and 12021001), the Major Research Plan of the NSFC (No. 91630202), the Strategic Priority Research Program of the Chinese Academy of Sciences (No. XDA27000000), and the Natural Science Foundation of Hebei Province (No. A2021202010).
Abstract: Many machine learning problems can be formulated as minimizing the sum of a function and a non-smooth regularization term. Proximal stochastic gradient methods are popular for solving such composite optimization problems. We propose a mini-batch proximal stochastic recursive gradient algorithm, SRG-DBB, which incorporates the diagonal Barzilai–Borwein (DBB) stepsize strategy to capture the local geometry of the problem. The linear convergence and complexity of SRG-DBB are analyzed for strongly convex functions. We further establish the linear convergence of SRG-DBB under a non-strong-convexity condition. Moreover, SRG-DBB is proved to converge sublinearly in the convex case. Numerical experiments on standard data sets indicate that SRG-DBB performs better than or comparably to the proximal stochastic recursive gradient algorithm with best-tuned scalar or BB stepsizes. Furthermore, SRG-DBB is superior to some advanced mini-batch proximal stochastic gradient methods.
Funding: Chinese NSF grants 19525101 and 19731001, and State Key Project 96-221-04-02-02. It is also partially supported by Hebei provi
Abstract: Provides information on a study that presented a trust region approach for solving nonlinear constrained optimization, covering the algorithm of the trust region approach, the global convergence of the algorithm, and numerical results of the study.
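The core trust-region mechanism can be sketched as follows: build a quadratic model at the current point, minimize it within a radius, accept or reject the step by the ratio of actual to predicted reduction, and adjust the radius. This is a minimal unconstrained sketch using the Cauchy-point step; the paper's method treats constraints and differs in its subproblem and globalization details:

```python
import numpy as np

def cauchy_point(g, H, delta):
    """Minimize g@p + 0.5*p@H@p along -g subject to ||p|| <= delta."""
    gHg = g @ H @ g
    gn = np.linalg.norm(g)
    tau = 1.0 if gHg <= 0 else min(1.0, gn**3 / (delta * gHg))
    return -tau * (delta / gn) * g

def trust_region(f, grad, hess, x, delta=1.0, tol=1e-8, max_iter=500):
    for _ in range(max_iter):
        g, H = grad(x), hess(x)
        if np.linalg.norm(g) < tol:
            break
        p = cauchy_point(g, H, delta)
        pred = -(g @ p + 0.5 * p @ H @ p)    # model (predicted) reduction
        actual = f(x) - f(x + p)             # actual reduction
        rho = actual / pred if pred > 0 else -1.0
        if rho > 0.75:
            delta = min(2 * delta, 100.0)    # good agreement: expand radius
        elif rho < 0.25:
            delta *= 0.25                    # poor agreement: shrink radius
        if rho > 0.1:                        # accept the step
            x = x + p
    return x

# Example: ill-conditioned quadratic bowl f(x) = x0^2 + 5*x1^2.
f = lambda x: x[0]**2 + 5 * x[1]**2
grad = lambda x: np.array([2 * x[0], 10 * x[1]])
hess = lambda x: np.diag([2.0, 10.0])
x = trust_region(f, grad, hess, np.array([3.0, -2.0]))
```

The ratio test is what gives trust-region methods their global convergence behavior: steps are only taken when the quadratic model proves trustworthy, and the radius shrinks automatically when it does not.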