Journal Articles
3 articles found
1. A Study on Refining and Decomposing the Reduction Target for Construction Land per Unit of GDP: The Case of Liaoyuan City
Authors: 寻爱武, 毕云志, 杨继红. 《农业与技术》 (Agriculture and Technology), 2015, Issue 23, pp. 181-183 (3 pages)
Research objective: Fully implementing the target of a 30% reduction in construction land per unit of GDP is a binding indicator of the strictest economical and intensive land-use regime adopted by the state for the 12th Five-Year Plan. Refining the indicator and decomposing it down to the county level is essential for vigorously promoting economical and intensive land use in small towns and less-developed regions. Research method: Taking Liaoyuan City as an example, the study starts from a mathematical analysis of baseline data, takes into account regional development differences and land-use trends, combines the equal-rate method with an index-weighting method to further decompose the indicator to the county level, and proposes a control range for the average annual net increment of construction land. Research results: Implementing the reduction target through a level-by-level state-province-city-county decomposition down to counties and cities, together with supporting policies and measures, is an effective way to achieve the national 30% target, build an economical and intensive land-use system, and raise the level of economical and intensive land use.
Keywords: gross domestic product; construction land; reduction target; Liaoyuan City
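As a rough illustration of how an equal-rate allocation can be blended with an index-weighted allocation, the Python sketch below splits a city-wide reduction rate among counties. The county names, composite weights, 0.5 blend factor, and baseline intensities are invented for the example; the paper's actual weighting scheme and its derivation of the annual net-increment control range are not reproduced here.

```python
# Hypothetical sketch: blend an equal-rate allocation with an index-weighted
# allocation to decompose a city-wide reduction target among counties.
# County names, weights, the 0.5 blend factor, and baseline intensities are
# illustrative assumptions, not values from the paper.

def decompose_target(city_rate, baseline, weights, blend=0.5):
    """Return each county's target reduction rate and target-year intensity.

    city_rate: city-wide reduction rate for construction land per unit of GDP
               (e.g. 0.30 for the national 30% target)
    baseline:  county -> base-year construction land per unit of GDP
    weights:   county -> composite index weight (development level, land-use
               trend, etc.); assumed to sum to 1
    blend:     share of the target assigned by the equal-rate method; the rest
               is assigned in proportion to the index weights
    """
    n = len(baseline)
    result = {}
    for county, intensity in baseline.items():
        equal_rate = city_rate                              # identical rate for every county
        weighted_rate = city_rate * weights[county] * n     # rate scaled by the county's index
        rate = blend * equal_rate + (1 - blend) * weighted_rate
        result[county] = (rate, intensity * (1 - rate))
    return result

if __name__ == "__main__":
    baseline = {"County A": 120.0, "County B": 95.0, "County C": 140.0}  # e.g. m^2 per 10^4 yuan of GDP
    weights = {"County A": 0.30, "County B": 0.25, "County C": 0.45}
    for county, (rate, target_intensity) in decompose_target(0.30, baseline, weights).items():
        print(f"{county}: reduce {rate:.1%}, target intensity {target_intensity:.1f}")
```

Because the index weights sum to one, the blended county rates average back to the city-wide rate, which is the property one would want from any such decomposition.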
2. Full waveform inversion with spectral conjugate gradient method
Authors: LIU Xiao, LIU Mingchen, SUN Hui, WANG Qianlong. 《Global Geology》, 2017, Issue 1, pp. 40-45 (6 pages)
The spectral conjugate gradient method is an algorithm obtained by combining the spectral gradient method and the conjugate gradient method. It inherits the global convergence and simplicity of the spectral gradient method and the small storage requirement of the conjugate gradient method. Moreover, its search direction has been proved to be a descent direction of the objective function at every iteration, even without relying on any line search. The method is applied to full waveform inversion in numerical tests on the Marmousi model. The authors compare numerical results obtained with the steepest descent method, the conjugate gradient method, and the spectral conjugate gradient method, showing that the spectral conjugate gradient method is superior to the other two.
Keywords: full waveform inversion; spectral conjugate gradient method; conjugate gradient method; steepest descent method
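For readers unfamiliar with the update, the sketch below applies a spectral conjugate gradient iteration to the 2-D Rosenbrock test function rather than to full waveform inversion. The Birgin-Martinez choices of the spectral parameter and of beta, and the Armijo backtracking line search, are assumptions on my part; the variant used in the paper may differ.

```python
# Minimal sketch of a spectral conjugate gradient iteration on a small test
# problem (2-D Rosenbrock), not on a full-waveform-inversion objective.
import numpy as np

def rosenbrock(x):
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def rosenbrock_grad(x):
    return np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                     200 * (x[1] - x[0]**2)])

def backtracking(f, x, d, g, alpha=1.0, rho=0.5, c=1e-4):
    # Armijo backtracking line search; d is assumed to be a descent direction.
    while f(x + alpha * d) > f(x) + c * alpha * g.dot(d) and alpha > 1e-12:
        alpha *= rho
    return alpha

def spectral_cg(f, grad, x0, tol=1e-6, max_iter=500):
    x = x0.copy()
    g = grad(x)
    d = -g                                   # first step: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = backtracking(f, x, d, g)
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s.dot(y)
        if sy <= 1e-12:                      # safeguard: restart with steepest descent
            d_new = -g_new
        else:
            theta = s.dot(s) / sy                         # spectral parameter
            beta = (theta * y - s).dot(g_new) / sy        # one common beta choice
            d_new = -theta * g_new + beta * s
            if d_new.dot(g_new) >= 0:        # keep the descent property
                d_new = -g_new
        x, g, d = x_new, g_new, d_new
    return x

print(spectral_cg(rosenbrock, rosenbrock_grad, np.array([-1.2, 1.0])))
```

The restart safeguards mirror the descent property highlighted in the abstract: whenever the combined direction risks losing descent, the iteration falls back to the negative gradient.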
3. A NEW DESCENT MEMORY GRADIENT METHOD AND ITS GLOBAL CONVERGENCE (Cited by: 3)
Authors: Min SUN, Qingguo BAI. 《Journal of Systems Science & Complexity》 (SCIE, EI, CSCD), 2011, Issue 4, pp. 784-794 (11 pages)
In this article, a new descent memory gradient method without restarts is proposed for solving large-scale unconstrained optimization problems. The method has the following attractive properties: 1) the search direction is a sufficient descent direction at every iteration, regardless of the line search used; 2) the search direction always satisfies the angle property, independently of the convexity of the objective function. Under mild conditions, the authors prove that the proposed method is globally convergent, and its convergence rate is also investigated. Numerical results show that the new descent memory gradient method is efficient on the given test problems.
Keywords: global convergence; memory gradient method; sufficient descent
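The two properties listed in the abstract correspond to the standard sufficient descent condition and angle property. The display below is a generic statement of them for a memory gradient direction built from the m most recent directions; the constants c and τ and the actual choice of the coefficients β_{k,i} are not taken from the paper and are shown only for orientation.

\[
d_k \;=\; -g_k + \sum_{i=1}^{m} \beta_{k,i}\, d_{k-i},
\qquad
g_k^{T} d_k \;\le\; -c\,\lVert g_k\rVert^{2}\ \ (c>0),
\qquad
\cos\theta_k \;=\; \frac{-\,g_k^{T} d_k}{\lVert g_k\rVert\,\lVert d_k\rVert} \;\ge\; \tau \;>\; 0,
\]

where g_k is the gradient at the k-th iterate. The first inequality guarantees descent independently of the line search, and the second keeps the search direction bounded away from orthogonality to the negative gradient, which is what allows a global convergence argument without convexity assumptions.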