Fund: Supported by Guangdong Natural Science Foundation Project (No. S2011010002144), Province and Ministry Production and Research Projects (Nos. 2012B091100497, 2012B091100191, 2012B091100383), Guangdong Province Enterprise Laboratory Project (No. 2011A091000046), and Guangdong Province Science and Technology Major Project (No. 2012A080103010)
Abstract: Regression analysis is often formulated as an optimization problem with a squared loss function. To address the challenge of selecting a proper function class when polynomial smoothing techniques are applied to support vector regression models, this study uses cubic spline interpolation to generate a new polynomial smooth function for |x|_ε² in ε-insensitive support vector regression. Theoretical analysis shows that the proposed S_ε²-function has better properties than the classical p_ε²-function, and its approximation accuracy is two orders higher than that of the p_ε²-function. Experimental results demonstrate the effectiveness of the new approach.
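The abstract does not reproduce the paper's S_ε² spline construction, but the objects it compares against can be sketched. Below is a minimal illustration, assuming the standard definitions from the smooth-SVR literature: the exact ε-insensitive squared loss |x|_ε² = (max(0, |x| − ε))², and the classical p_ε² smoothing built from the sigmoid-integral approximation p(x, k) = x + (1/k)·log(1 + e^(−kx)) of the plus function. The function names and the grid check are illustrative, not from the paper.

```python
import math

def eps_insensitive_sq(x, eps):
    """Exact epsilon-insensitive squared loss |x|_eps^2."""
    return max(0.0, abs(x) - eps) ** 2

def p(x, k):
    """Smooth approximation of the plus function max(0, x):
    p(x, k) = x + (1/k) * log(1 + exp(-k*x)),
    written in a numerically stable form as (1/k) * log(1 + exp(k*x))."""
    if x >= 0:
        return x + math.log1p(math.exp(-k * x)) / k
    return math.log1p(math.exp(k * x)) / k

def p_eps_sq(x, eps, k):
    """Classical p_eps^2 smoothing of |x|_eps^2:
    p_eps^2(x, k) = p(x - eps, k)^2 + p(-x - eps, k)^2."""
    return p(x - eps, k) ** 2 + p(-x - eps, k) ** 2

def max_gap(eps, k, lo=-3.0, hi=3.0, n=601):
    """Largest approximation error of p_eps^2 over a grid; the error
    concentrates at the kinks x = +/-eps and shrinks as k grows."""
    xs = [lo + i * (hi - lo) / (n - 1) for i in range(n)]
    return max(abs(p_eps_sq(x, eps, k) - eps_insensitive_sq(x, eps)) for x in xs)
```

The smoothing parameter k controls the trade-off: larger k tightens the fit at the kinks of the exact loss while keeping the surrogate infinitely differentiable, which is what makes Newton-type solvers applicable. The paper's claim is that its spline-based S_ε² achieves an approximation error two orders better than this p_ε² baseline at comparable smoothness.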