Abstract
This paper points out the problems in traditional quasi-linear regression and in modern nonlinear least-squares regression, as well as the difference between linear regression with an intercept and linear regression without an intercept. It proposes an idea and method for improving nonlinear least-squares regression by adding the constraint that the sum of the residuals equals zero, referred to as zero-residual-sum nonlinear regression, and demonstrates its advantages both theoretically and with experimental data. The main conclusions are as follows: for nonlinear functions such as the power, exponential, hyperbolic, logarithmic and S-shaped (sigmoid) functions, regression performed after linearization by substitution (quasi-linear regression for short) suffers from heteroscedasticity; the sum of the residuals is zero in linear regression with an intercept, whereas it is usually nonzero in linear regression without an intercept and in nonlinear regression; the parameter estimates from nonlinear least-squares regression constrained to a zero residual sum are more precise than those from ordinary nonlinear least-squares regression; and for linear regression without an intercept, adding the zero-residual-sum constraint likewise improves the precision of the parameter estimates.
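To make the idea concrete, the following is a minimal sketch, not the paper's own implementation or data: it fits a power function y = a·x^b to synthetic data by (1) quasi-linear regression (log-log linear fit), (2) ordinary nonlinear least squares, and (3) nonlinear least squares with the zero-sum-of-residuals constraint, handled here as an equality constraint in a generic constrained optimizer. For a linear model with an intercept, the normal equation with respect to the intercept already forces the residuals to sum to zero; the constraint below imposes the same property on the nonlinear fit.

```python
# Illustrative sketch only: synthetic data and a generic constrained optimizer,
# not the authors' tree-volume data or their exact estimation procedure.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
x = np.linspace(1.0, 10.0, 30)
a_true, b_true = 2.0, 1.5
y = a_true * x**b_true + rng.normal(scale=0.5, size=x.size)  # additive noise

def residuals(params):
    a, b = params
    return y - a * x**b

def sse(params):
    r = residuals(params)
    return float(r @ r)  # sum of squared residuals

# (1) Quasi-linear regression: regress ln(y) on ln(x). The log transform turns
# additive errors into multiplicative ones, which is the heteroscedasticity
# problem the abstract points out.
b_lin, ln_a = np.polyfit(np.log(x), np.log(y), 1)
a_lin = np.exp(ln_a)

# (2) Ordinary nonlinear least squares: minimize the SSE directly.
fit_nls = minimize(sse, x0=[1.0, 1.0], method="Nelder-Mead")

# (3) Zero-residual-sum nonlinear least squares: same objective, plus the
# equality constraint sum(residuals) == 0.
con = {"type": "eq", "fun": lambda p: residuals(p).sum()}
fit_zero = minimize(sse, x0=[1.0, 1.0], method="SLSQP", constraints=[con])

print("quasi-linear :", a_lin, b_lin)
print("ordinary NLS :", fit_nls.x, "residual sum =", residuals(fit_nls.x).sum())
print("zero-sum NLS :", fit_zero.x, "residual sum =", residuals(fit_zero.x).sum())
```

Running the sketch shows that the ordinary nonlinear fit generally leaves a small nonzero residual sum, while the constrained fit drives it to (numerically) zero, which is the property the paper exploits to improve parameter precision.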
Source
Journal of Northeast Forestry University (《东北林业大学学报》)
Indexed in: CAS, CSCD, Peking University Core (北大核心)
2011, No. 2, pp. 125-127, 130 (4 pages)
Keywords
Regression analysis
Sum of residuals
Nonlinear least squares method
Heteroscedasticity
Quasi-linear regression
Tree volume