Journal literature: 5 articles found
1. Feasibility of stochastic gradient boosting approach for predicting rockburst damage in burst-prone mines (Cited by: 4)
Authors: 周健, 史秀志, 黄仁东, 邱贤阳, 陈冲. Transactions of Nonferrous Metals Society of China (SCIE/EI/CAS/CSCD), 2016, Issue 7, pp. 1938-1945 (8 pages)
Abstract: A database of 254 rockburst events was examined for rockburst damage classification using stochastic gradient boosting (SGB) methods. Five potentially relevant indicators were analyzed: the stress condition factor, the ground support system capacity, the excavation span, the geological structure, and the peak particle velocity of the rockburst sites. The performance of the model was evaluated using a 10-fold cross-validation (CV) procedure on 80% of the original data during modeling, and an external testing set (the remaining 20%) was employed to validate the prediction performance of the SGB model. Two accuracy measures for multi-class problems were employed: the classification accuracy rate and Cohen's Kappa. The accuracy analysis, together with the Kappa statistic, reveals that the SGB model yields acceptable predictions of rockburst damage.
Keywords: burst-prone mine; rockburst damage; stochastic gradient boosting method
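The evaluation workflow in this abstract maps onto a standard scikit-learn pipeline. Below is a hedged sketch of it; since the 254-event database is not reproduced here, a synthetic stand-in replaces it, and all hyperparameters are illustrative guesses rather than the paper's settings.

```python
# Hedged sketch of the SGB evaluation pipeline described in the abstract.
# The rockburst database is replaced by a synthetic stand-in with five
# features (stress condition factor, support system capacity, excavation
# span, geological structure, peak particle velocity); hyperparameters
# are illustrative, not the paper's.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score, cohen_kappa_score
from sklearn.model_selection import cross_val_score, train_test_split

# Placeholder for the 254 x 5 indicator matrix and damage-class labels.
X, y = make_classification(n_samples=254, n_features=5, n_informative=5,
                           n_redundant=0, n_classes=3,
                           n_clusters_per_class=1, random_state=0)

# 80% for modeling (10-fold CV), 20% held out as the external testing set.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.20, random_state=0, stratify=y)

# subsample < 1.0 is what makes gradient boosting "stochastic" (SGB).
sgb = GradientBoostingClassifier(n_estimators=300, learning_rate=0.05,
                                 subsample=0.75, max_depth=3, random_state=0)

cv_acc = cross_val_score(sgb, X_train, y_train, cv=10, scoring="accuracy")
print(f"10-fold CV accuracy: {cv_acc.mean():.3f} +/- {cv_acc.std():.3f}")

sgb.fit(X_train, y_train)
y_pred = sgb.predict(X_test)
print(f"external test accuracy: {accuracy_score(y_test, y_pred):.3f}")
print(f"Cohen's Kappa:          {cohen_kappa_score(y_test, y_pred):.3f}")
```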
2. Least-Squares Seismic Inversion with Stochastic Conjugate Gradient Method (Cited by: 2)
Authors: Wei Huang, Hua-Wei Zhou. Journal of Earth Science (SCIE/CAS/CSCD), 2015, Issue 4, pp. 463-470 (8 pages)
Abstract: With the development of computational power, there has been an increased focus on data-fitting seismic inversion techniques for high-fidelity seismic velocity models and images, such as full-waveform inversion and least-squares migration. However, though more advanced than conventional methods, these data-fitting methods can be very expensive in terms of computational cost. Recently, various techniques for optimizing these data-fitting seismic inversion problems have been implemented to meet the industrial need for much-improved efficiency. In this study, we propose a general stochastic conjugate gradient method for these data-fitting inverse problems. We first present the basic theory of our method and then give synthetic examples. Our numerical experiments illustrate the potential of this method for large-scale seismic inversion applications.
Keywords: least-squares seismic inversion; stochastic conjugate gradient method; data fitting; Kirchhoff migration
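To make the stochastic conjugate gradient idea concrete, here is a minimal NumPy sketch on a generic linear least-squares problem standing in for the seismic data-fitting objective. The Fletcher-Reeves update, the mini-batch size, and the exact per-batch line search are illustrative assumptions, not necessarily the paper's choices.

```python
# Minimal sketch of a stochastic conjugate gradient loop for
# min_x ||A x - b||^2. Each iteration estimates the gradient from a
# random subset of rows (analogous to a random subset of shots).
import numpy as np

rng = np.random.default_rng(0)
m, n = 2000, 200
A = rng.standard_normal((m, n))
x_true = rng.standard_normal(n)
b = A @ x_true + 0.01 * rng.standard_normal(m)

x = np.zeros(n)
d = np.zeros(n)          # current search direction
g_prev_norm2 = None
batch = 200              # rows (shots) sampled per iteration

for k in range(300):
    idx = rng.choice(m, size=batch, replace=False)
    Ab, bb = A[idx], b[idx]
    g = 2.0 * Ab.T @ (Ab @ x - bb) / batch        # stochastic gradient
    if g_prev_norm2 is None:
        d = -g                                    # first step: steepest descent
    else:
        beta = (g @ g) / g_prev_norm2             # Fletcher-Reeves coefficient
        d = -g + beta * d
    g_prev_norm2 = g @ g
    # Exact line search for the quadratic mini-batch objective.
    Ad = Ab @ d
    alpha = -(g @ d) / (2.0 * (Ad @ Ad) / batch + 1e-12)
    x = x + alpha * d

print("relative model error:",
      np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```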
3. A Framework of Convergence Analysis of Mini-batch Stochastic Projected Gradient Methods (Cited by: 1)
Authors: Jian Gu, Xian-Tao Xiao. Journal of the Operations Research Society of China (EI/CSCD), 2023, Issue 2, pp. 347-369 (23 pages)
Abstract: In this paper, we establish a unified framework to study the almost sure global convergence and the expected convergence rates of a class of mini-batch stochastic (projected) gradient (SG) methods, including two popular types of SG: stepsize-diminished SG and batch-size-increased SG. We also show that the standard assumption of uniformly bounded variance, which is frequently used in the literature to investigate the convergence of SG, is actually not required when the gradient of the objective function is Lipschitz continuous. Finally, we show that our framework can also be used to analyze the convergence of a mini-batch stochastic extragradient method for stochastic variational inequalities.
Keywords: stochastic projected gradient method; variance uniformly bounded; convergence analysis
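The two SG variants named in the abstract can be sketched on a toy constrained problem. In the sketch below, the 1/k stepsize and the linearly growing batch size are representative schedules assumed for illustration, and project_ball is a hypothetical helper for the projection step.

```python
# Hedged sketch of the two mini-batch stochastic projected gradient
# variants on min_{x in C} E[(a^T x - b)^2]/2 with C a Euclidean ball;
# schedules are illustrative, not the paper's exact rules.
import numpy as np

rng = np.random.default_rng(1)
n, radius = 50, 1.0
x_star = rng.standard_normal(n)

def sample_gradient(x, batch):
    """Unbiased mini-batch gradient of E[(a^T x - b)^2]/2, a ~ N(0, I)."""
    A = rng.standard_normal((batch, n))
    b = A @ x_star + 0.1 * rng.standard_normal(batch)
    return A.T @ (A @ x - b) / batch

def project_ball(x, r):
    """Euclidean projection onto the ball {x : ||x|| <= r}."""
    nrm = np.linalg.norm(x)
    return x if nrm <= r else x * (r / nrm)

def run(schedule, iters=1000):
    x = np.zeros(n)
    for k in range(1, iters + 1):
        if schedule == "diminishing_stepsize":
            step, batch = 1.0 / k, 8              # alpha_k -> 0, fixed batch
        else:                                     # "increasing_batch"
            step, batch = 0.05, min(8 * k, 512)   # fixed alpha, batch grows
        x = project_ball(x - step * sample_gradient(x, batch), radius)
    return x

# For this objective the constrained minimizer is the projection of x_star.
for sched in ("diminishing_stepsize", "increasing_batch"):
    x = run(sched)
    print(sched, "distance to minimizer:",
          np.linalg.norm(x - project_ball(x_star, radius)))
```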
4. A Stochastic Gradient Descent Method for Computational Design of Random Rough Surfaces in Solar Cells
Authors: Qiang Li, Gang Bao, Yanzhao Cao, Junshan Lin. Communications in Computational Physics (SCIE), 2023, Issue 10, pp. 1361-1390 (30 pages)
Abstract: In this work, we develop a stochastic gradient descent method for the computational optimal design of random rough surfaces in thin-film solar cells. We formulate the design problems as random PDE-constrained optimization problems and seek the optimal statistical parameters for the random surfaces. Optimization at a fixed frequency as well as at multiple frequencies and multiple incident angles is investigated. To evaluate the gradient of the objective function, we derive the shape derivatives for the interfaces and apply the adjoint state method to perform the computation. The stochastic gradient descent method evaluates the gradient of the objective function at only a few samples per iteration, which reduces the computational cost significantly. Various numerical experiments are conducted to illustrate the efficiency of the method and the significant increase in absorptance for the optimal random structures. We also examine the convergence of the stochastic gradient descent algorithm theoretically and prove that the numerical method is convergent under certain assumptions on the random interfaces.
Keywords: optimal design; random rough surface; solar cell; Helmholtz equation; stochastic gradient descent method
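Since a full Helmholtz solver is beyond a short example, the sketch below keeps only the outer stochastic gradient loop from the abstract: a few Monte Carlo samples per iteration estimate the gradient of the expected absorptance. The function objective_and_grad is a hypothetical analytic surrogate standing in for surface generation, the Helmholtz solve, and the adjoint computation.

```python
# Schematic stochastic gradient ascent on E_xi[ J(theta, xi) ] over
# statistical surface parameters theta (e.g., correlation length, RMS
# height). The PDE and adjoint solves are collapsed into a cheap
# surrogate so the sketch runs end to end.
import numpy as np

rng = np.random.default_rng(2)
theta_opt = np.array([1.5, 0.7])   # the surrogate's "best" parameters (toy)

def objective_and_grad(theta, xi):
    """Hypothetical surrogate for the absorptance J(theta, xi) and its
    gradient in theta. Stands in for: draw a surface realization from
    (theta, xi), solve the Helmholtz problem, evaluate the absorptance,
    solve the adjoint problem, and assemble the shape derivative."""
    diff = theta - theta_opt
    J = -np.sum(diff ** 2) + 0.1 * xi @ diff     # concave toy absorptance
    grad = -2.0 * diff + 0.1 * xi
    return J, grad

theta = np.zeros(2)
n_samples = 4            # "a few samples" per iteration, as in the abstract
for k in range(1, 501):
    grads = [objective_and_grad(theta, rng.standard_normal(2))[1]
             for _ in range(n_samples)]
    step = 0.5 / np.sqrt(k)                      # diminishing step (illustrative)
    theta = theta + step * np.mean(grads, axis=0)  # ascent on E[J]

print("recovered parameters:", theta, "target:", theta_opt)
```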
5. Performance Enhancement of Adaptive Neural Networks Based on Learning Rate
Authors: Swaleha Zubair, Anjani Kumar Singha, Nitish Pathak, Neelam Sharma, Shabana Urooj, Samia Rabeh Larguech. Computers, Materials & Continua (SCIE/EI), 2023, Issue 1, pp. 2005-2019 (15 pages)
Abstract: Deep learning is the process of determining the parameters that minimize the cost function derived from the dataset; the parameters found by this optimization are known as the optimal parameters. The parameters are initialized at the start of the optimization process, and at the global minimum the cost function should no longer vary with the parameters. The momentum technique is one parameter-optimization approach; however, it has difficulty stopping the parameter updates once the cost function reaches the global minimum (the non-stop problem). Moreover, existing approaches reduce the learning rate monotonically at a steady rate over the iterations; our goal is instead to make the learning rate adaptive. We present a method for determining the best parameters that adjusts the learning rate in response to the value of the cost function, so that the rate schedule completes once the cost function has been optimized. This approach is shown to ensure convergence to the optimal parameters, which indicates that our strategy minimizes the cost function (i.e., achieves effective learning). The proposed method builds on the momentum approach and uses the value of the cost function to resolve the non-stop problem: the influence of the cost function shrinks the parameter updates. To verify that the learning strategy works, we provide a proof of convergence and empirical tests against current methods; the results are obtained using Python.
Keywords: deep learning; optimization; convergence; stochastic gradient methods
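One plausible reading of the proposed scheme, sketched below under stated assumptions: a momentum update whose learning rate is scaled by the current cost value, so the steps vanish as the cost approaches its minimum (addressing the non-stop problem) instead of following a fixed monotone decay. The quadratic cost, the specific scaling rule, and the assumption that the minimal cost is near zero are all illustrative.

```python
# Hedged sketch: momentum descent with a cost-responsive learning rate.
# As the cost c approaches its minimum (assumed ~0 here), the rate
# base_lr * c / (1 + c) tends to zero, so the updates stop on their own.
import numpy as np

def cost(w):
    return 0.5 * np.sum((w - 3.0) ** 2)   # toy quadratic, minimum 0 at w = 3

def grad(w):
    return w - 3.0

w = np.zeros(4)
v = np.zeros(4)                 # momentum buffer
base_lr, mu = 0.2, 0.9

for step in range(2000):
    c = cost(w)
    lr = base_lr * c / (1.0 + c)   # rate shrinks with the cost value
    v = mu * v - lr * grad(w)
    w = w + v

c = cost(w)
print(f"final cost {c:.2e}, final lr {base_lr * c / (1.0 + c):.2e}, "
      f"w ~ {np.round(w, 3)}")
```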