Fund: The National Basic Research Program of China (973 Program) (No. 2006CB705501); the National High Technology Research and Development Program of China (863 Program) (No. 2007AA12Z228)
Abstract: Two traditional methods for compensating function model errors, the method of adding systematic parameters and the least-squares collocation method, are introduced. A new method based on a BP neural network (called the H-BP algorithm) for compensating function model errors is then put forward. The function model is assumed to be y = f(x1, x2, ..., xn), and the structure of the H-BP network is set to (n + 1) × p × 1, where (n + 1) is the number of elements in the input layer, the elements being x1, x2, ..., xn and y' (y' is the value calculated by the function model); p is the number of elements in the hidden layer, usually determined after many trials; and 1 is the number of elements in the output layer, the element being Δy = y0 − y' (y0 is the known value of the sample). The calculation steps of the H-BP algorithm are described in detail. The results of the three methods for compensating function model errors are then compared on data from one engineering project. After compensation, the accuracy of the traditional methods is about ±19 mm, while the accuracy of the H-BP algorithm is ±4.3 mm. This shows that the proposed neural-network-based method is more effective than the traditional methods for compensating function model errors.
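As a rough illustration of the (n + 1) × p × 1 compensation structure described above, the following sketch feeds x1, ..., xn together with y' into a single hidden layer and regresses Δy = y0 − y', then adds the predicted correction back onto y'. The synthetic data, the hypothetical function model, the tanh hidden units, and the plain gradient-descent training are all assumptions made here for illustration; they are not taken from the paper.

```python
import numpy as np

# Minimal sketch of an H-BP-style error-compensation network.
# Inputs: x1..xn plus y' (the function-model prediction); output: dy = y0 - y'.

rng = np.random.default_rng(0)

def true_system(X):
    # Hypothetical "true" observations y0 (stand-in for sample data).
    return X @ np.array([1.0, -2.0, 0.5]) + 0.3 * np.sin(X[:, 0])

def function_model(X):
    # Hypothetical approximate function model y' = f(x1, ..., xn).
    return X @ np.array([1.0, -2.0, 0.5])

n, p = 3, 8                            # n inputs, p hidden units (p chosen by trial)
X = rng.uniform(-1, 1, size=(200, n))
y0 = true_system(X)                    # known sample values
y_prime = function_model(X)            # values computed by the function model
dy = y0 - y_prime                      # target: the function model error

A = np.hstack([X, y_prime[:, None]])   # (n + 1)-element input layer

# Weights of the (n + 1) x p x 1 network.
W1 = rng.normal(scale=0.5, size=(n + 1, p)); b1 = np.zeros(p)
W2 = rng.normal(scale=0.5, size=(p, 1));     b2 = np.zeros(1)

lr = 0.05
for epoch in range(2000):
    H = np.tanh(A @ W1 + b1)           # hidden layer
    pred = (H @ W2 + b2).ravel()       # predicted dy
    err = pred - dy
    # Backpropagation of the mean-squared-error loss.
    g_pred = 2 * err[:, None] / len(A)
    gW2 = H.T @ g_pred;  gb2 = g_pred.sum(0)
    gH = (g_pred @ W2.T) * (1 - H ** 2)
    gW1 = A.T @ gH;      gb1 = gH.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Compensated prediction: y = y' + estimated dy.
dy_hat = (np.tanh(A @ W1 + b1) @ W2 + b2).ravel()
print("RMS before compensation:", np.sqrt(np.mean((y0 - y_prime) ** 2)))
print("RMS after  compensation:", np.sqrt(np.mean((y0 - (y_prime + dy_hat)) ** 2)))
```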
Abstract: To address the high computational complexity of the optimal brain surgeon (OBS) procedure, a pruning algorithm with a penalty OBS process is presented. Compared with sensitivity-based and regularization-based methods, the penalty OBS algorithm not only avoids the time-consuming nature and low pruning efficiency of the standard OBS process, but also achieves higher generalization and pruning accuracy than the Levenberg-Marquardt method.
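For context, the sketch below shows the standard OBS saliency computation and weight update whose cost the penalty variant aims to reduce; it does not reproduce the paper's penalty formulation. The Gauss-Newton Hessian approximation, the problem sizes, and the random Jacobian are illustrative assumptions.

```python
import numpy as np

# Standard OBS step: saliency of weight q is L_q = w_q^2 / (2 [H^-1]_qq);
# deleting w_q adjusts the remaining weights by dw = -(w_q / [H^-1]_qq) H^-1 e_q.

rng = np.random.default_rng(1)
m, k = 50, 10                        # m samples, k weights (illustrative sizes)
J = rng.normal(size=(m, k))          # Jacobian of network outputs w.r.t. weights
w = rng.normal(size=k)               # current (trained) weight vector

H = J.T @ J / m                                 # Gauss-Newton Hessian approximation
H_inv = np.linalg.inv(H + 1e-6 * np.eye(k))     # regularize before inversion

saliency = w ** 2 / (2.0 * np.diag(H_inv))
q = int(np.argmin(saliency))         # least-salient weight is pruned first

delta_w = -(w[q] / H_inv[q, q]) * H_inv[:, q]   # OBS update of the other weights
w_pruned = w + delta_w
w_pruned[q] = 0.0                    # the pruned weight is exactly zeroed
print("pruned weight index:", q, "saliency:", saliency[q])
```

Maintaining and inverting the full Hessian at every pruning step is what makes plain OBS expensive for larger networks, which is the bottleneck the abstract's penalty OBS process targets.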