Funding: Supported by the Fundamental Research Funds for University of Science and Technology Beijing (FRF-BR-12-021)
Abstract: A semi-supervised support vector machine is a relatively new learning method that uses both labeled and unlabeled data in classification. Since the objective function of the unconstrained semi-supervised support vector machine model is not smooth, many fast optimization algorithms cannot be applied to solve it. To overcome the difficulty of dealing with non-smooth objective functions, new methods that can solve the semi-supervised support vector machine with the desired classification accuracy are in great demand. A quintic spline function with three-times differentiability at the origin is constructed by a general three-moment method and can be used to approximate the symmetric hinge loss function. The approximation accuracy of the quintic spline function is estimated. Moreover, a quintic spline smooth semi-supervised support vector machine is obtained, and the convergence accuracy of the smooth model to the non-smooth one is analyzed. Three experiments are performed to test the efficiency of the model. The experimental results show that the new model outperforms other smooth models in terms of classification performance. Furthermore, the new model is not sensitive to an increasing number of labeled samples, which means that it is more efficient.
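For context, the non-smooth objective referred to above is commonly written as below (a standard unconstrained semi-supervised SVM formulation, not necessarily the paper's exact notation). The third term is the symmetric hinge loss on the unlabeled points; its kink at the origin is what the quintic spline surrogate smooths out.

```latex
\min_{w,b}\;\; \frac{1}{2}\lVert w\rVert^{2}
  + C_{1}\sum_{i=1}^{\ell}\max\bigl(0,\,1-y_{i}\,(w^{\top}x_{i}+b)\bigr)
  + C_{2}\sum_{j=\ell+1}^{\ell+u}\max\bigl(0,\,1-\bigl|w^{\top}x_{j}+b\bigr|\bigr)
```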
Funding: Supported by the National Natural Science Foundation of China (No. 60972126, 60921061) and the State Key Program of National Natural Science of China (No. 61032007)
Abstract: The acquired hyperspectral images (HSIs) are inherently affected by noise with band-varying level, which cannot be removed easily by current approaches. In this study, a new denoising method is proposed for removing such noise by smoothing spectral signals in the transformed multi-scale domain. Specifically, the proposed method includes three procedures: 1) applying a discrete wavelet transform (DWT) to each band; 2) performing cubic spline smoothing on each noisy coefficient vector along the spectral axis; 3) reconstructing each band by an inverse DWT. In order to adapt to the band-varying noise statistics of HSIs, the noise covariance is estimated to control the smoothing degree at different spectral positions. Generalized cross validation (GCV) is employed to choose the smoothing parameter during the optimization. The experimental results on simulated and real HSIs demonstrate that the proposed method adapts well to the band-varying noise statistics of noisy HSIs and preserves both spectral and spatial features.
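The three-step procedure lends itself to a compact sketch. The Python code below is only an illustration under assumed choices: PyWavelets for the per-band DWT and SciPy's UnivariateSpline with a fixed smoothing factor in place of the noise-covariance and GCV selection described above.

```python
# Minimal sketch: per-band 2-D DWT, cubic smoothing spline along the spectral
# axis for each coefficient position, inverse DWT. Library choices and the
# fixed smoothing factor are assumptions; loops are kept simple for clarity.
import numpy as np
import pywt
from scipy.interpolate import UnivariateSpline

def denoise_hsi(cube, wavelet="db4", level=2, smooth=1.0):
    """cube: array of shape (rows, cols, bands)."""
    rows, cols, bands = cube.shape
    # 1) 2-D DWT of every band
    coeffs = [pywt.wavedec2(cube[:, :, b], wavelet, level=level) for b in range(bands)]
    arrays, slices = zip(*[pywt.coeffs_to_array(c) for c in coeffs])
    flat = np.stack([a.ravel() for a in arrays], axis=1).astype(float)  # (n_coeff, bands)
    # 2) cubic smoothing spline along the spectral axis, one per coefficient position
    axis = np.arange(bands, dtype=float)
    for i in range(flat.shape[0]):
        flat[i] = UnivariateSpline(axis, flat[i], k=3, s=smooth)(axis)
    # 3) inverse DWT band by band
    out = np.empty_like(cube, dtype=float)
    shape2d = arrays[0].shape
    for b in range(bands):
        band_coeffs = pywt.array_to_coeffs(flat[:, b].reshape(shape2d), slices[b],
                                           output_format="wavedec2")
        out[:, :, b] = pywt.waverec2(band_coeffs, wavelet)[:rows, :cols]
    return out
```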
Funding: Supported by the Natural Science Foundation of China (10771017, 10971015, 10231030) and a Key Project of the Ministry of Education of the People's Republic of China (309007)
Abstract: In this article, we use penalized splines to estimate the hazard function from a set of censored failure time data. A new approach to estimating the amount of smoothing is provided. Under regularity conditions, we establish the consistency and asymptotic normality of the penalized likelihood estimators. Numerical studies and an example are conducted to evaluate the performance of the new procedure.
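One common penalized-spline formulation of this problem, given here for orientation rather than as the article's exact setup, expands the log-hazard in a B-spline basis and penalizes the censored-data log-likelihood with a quadratic form in the coefficients:

```latex
\log h_{\theta}(t)=\sum_{k}\theta_{k}B_{k}(t),\qquad
\ell_{p}(\theta)=\sum_{i=1}^{n}\Bigl[\delta_{i}\log h_{\theta}(t_{i})
  -\int_{0}^{t_{i}} h_{\theta}(s)\,ds\Bigr]
  -\frac{\lambda}{2}\,\theta^{\top}D\,\theta
```

Here δ_i is the censoring indicator, the B_k are B-spline basis functions, D is a penalty matrix (often built from finite differences of adjacent coefficients), and λ is the smoothing parameter whose data-driven choice is the focus of the article.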
Funding: Supported by the New Technology Integration Project of Anhui Meteorological Bureau (AHXJ201704) and the Project for Masters and Doctors of Anhui Meteorological Bureau (RC201701)
Abstract: In order to obtain refined precipitation grid data with high accuracy and high spatial resolution, an hourly precipitation grid dataset with 1 km spatial resolution in Anhui Province, covering May to September (the rainy season) of 2016 and August 2017 to July 2018, was established based on thin plate smoothing splines (TPS); such data meet the needs of climate change research and meteorological disaster risk assessment. The interpolation errors were then analyzed, and the grid precipitation products obtained from thin plate smoothing splines, CLDAS-FAST and CLDAS-FRT were evaluated. The results show that the hourly precipitation values interpolated by thin plate smoothing splines are close to the observed values in the rainy season of 2016. The errors are generally below 0.9 mm/h; however, the errors in the mountainous areas of eastern Huaibei and western Anhui exceed 1.2 mm/h. On the monthly scale, the errors in June and July are the largest, with the proportion of absolute errors ≥2 mm/h reaching 2.0% in June and 2.2% in July. The errors in September are the smallest, with the proportion of absolute errors ≥2 mm/h only 0.6%; the root mean square error (RMSE) is only 0.37 mm/h and the correlation coefficient (COR) is 0.93. The interpolation accuracy of CLDAS-FRT is the highest, with the smallest RMSE (0.65 mm/h) and mean error (ME = 0.01 mm/h) and the largest COR (0.81). The accuracy of the precipitation product obtained by thin plate smoothing spline interpolation is close to that of CLDAS-FAST: its RMSE is up to 0.80 mm/h, its ME is only -0.01 mm/h, its COR is 0.73, but its bias (BIAS) is up to 1.06 mm/h.
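As an illustration of the gridding-and-verification workflow, the sketch below fits a thin plate spline to synthetic gauge data using SciPy's RBFInterpolator (standing in for the TPS software actually used, which is an assumption on our part) and computes the RMSE, ME and COR scores quoted above.

```python
# Minimal sketch: thin plate spline gridding of station rainfall plus
# verification against withheld gauges. Coordinates and rainfall values
# are synthetic; the smoothing level is an arbitrary placeholder.
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)
xy_obs = rng.uniform(0, 100, size=(200, 2))          # gauge locations (km)
rain = rng.gamma(1.5, 2.0, 200)                      # hourly rainfall (mm/h)

# fit the TPS on 80% of the gauges, hold out 20% for verification
idx = rng.permutation(200)
train, test = idx[:160], idx[160:]
tps = RBFInterpolator(xy_obs[train], rain[train],
                      kernel="thin_plate_spline", smoothing=1.0)
pred = tps(xy_obs[test])

# verification scores used in the text
err = pred - rain[test]
rmse = np.sqrt(np.mean(err ** 2))                    # root mean square error
me = np.mean(err)                                    # mean error
cor = np.corrcoef(pred, rain[test])[0, 1]            # correlation coefficient
print(f"RMSE={rmse:.2f} mm/h  ME={me:.2f} mm/h  COR={cor:.2f}")
```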
Funding: Supported by Science and Technology Development Fund of Macao (China) grants (No. 042/2007/A3, No. 003/2008/A1), and partly supported by NSFC Project (No. 10631080) and National Key Basic Research Project of China grant (No. 2004CB318000)
Abstract: In the present paper, a new criterion is derived to obtain the optimum fitting curve when using cubic B-spline basis functions to remove statistical noise from spectroscopic data. Under this criterion, smoothed fitting curves using cubic B-spline basis functions are first computed with an increasing number of knots. The best fitting curve is then selected according to the minimum residual sum of squares (RSS) between two adjacent fitting curves. If more than one best fitting curve results, the authors use Reinsch's first condition to find a better one. The minimum RSS of the fitting curve against the noisy data is not recommended as the criterion for determining the best fitting curve, because this value decreases to zero as the number of selected channels increases, and the minimum value gives no smoothing effect. Compared with Reinsch's method, the derived criterion is simple and enables the smoothing conditions to be determined automatically without any initial input parameter. With the derived criterion, satisfactory results were obtained when removing statistical noise from experimental spectroscopic data using cubic B-spline basis functions.
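A minimal sketch of the knot-increasing procedure follows. The stopping rule used here (picking the fit where the RSS change between two adjacent fits is smallest) is one plausible reading of the criterion, not the authors' exact formulation, and the data are synthetic.

```python
# Sketch: cubic B-spline fits with an increasing number of equally spaced
# interior knots, tracking the RSS of consecutive fits.
import numpy as np
from scipy.interpolate import LSQUnivariateSpline

def fit_with_rss_criterion(x, y, max_knots=40):
    best, prev_rss, best_delta = None, None, np.inf
    for n in range(1, max_knots + 1):
        t = np.linspace(x[0], x[-1], n + 2)[1:-1]     # n interior knots
        spl = LSQUnivariateSpline(x, y, t, k=3)        # cubic B-spline fit
        rss = float(np.sum((spl(x) - y) ** 2))
        if prev_rss is not None:
            delta = abs(prev_rss - rss)                # change between adjacent fits
            if delta < best_delta:
                best_delta, best = delta, spl
        prev_rss = rss
    return best

# synthetic noisy "spectrum"
x = np.linspace(0, 10, 500)
y = np.exp(-(x - 5) ** 2) + 0.05 * np.random.default_rng(1).normal(size=x.size)
smooth_curve = fit_with_rss_criterion(x, y)
```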
Abstract: In this paper, the kernel of cubic spline interpolation is given. An optimal error bound for the cubic spline interpolation of functions of lower smoothness is obtained.
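For orientation, the classical sharp bound for complete cubic spline interpolation of a smooth function is recalled below; the paper's contribution is the corresponding optimal bound when f has lower smoothness, which is not reproduced here.

```latex
\lVert f-s\rVert_{\infty}\;\le\;\frac{5}{384}\,h^{4}\,\bigl\lVert f^{(4)}\bigr\rVert_{\infty},
\qquad h=\max_{i}\,(x_{i+1}-x_{i}),\quad f\in C^{4}[a,b]
```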
Abstract: In this paper, a fast approach to generating a time-optimal and smooth trajectory is developed and tested. Minimum time is critical for productivity in industrial applications. Meanwhile, smooth trajectories based on cubic splines are desirable for their ability to limit vibrations and to ensure continuity of position, velocity and acceleration during robot movement. The main feature of the approach is that a satisfactory solution can be obtained by a local modification process within each interval between two consecutive via-points. An analytical formulation simplifies the approach, and few iterations are enough to determine the correct values. The approach can be applied to many robot manipulators that require high performance in terms of time and smoothness. Simulations and an application of the approach on a palletizer robot were performed, and the experimental results provide evidence that the approach enables robot manipulators to run more efficiently with highly smooth motion.
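The cubic-spline via-point representation underlying the approach can be sketched as follows. The uniform time allocation is a placeholder, since the paper's contribution is precisely the fast local adjustment of the time intervals, which is not reproduced here.

```python
# Sketch: one joint's trajectory through via-points as a cubic spline, with
# continuous position, velocity and acceleration. Via-points and the total
# duration are made-up values.
import numpy as np
from scipy.interpolate import CubicSpline

via_points = np.array([0.0, 0.3, 1.1, 0.8, 1.5])     # joint positions (rad)
t = np.linspace(0.0, 4.0, via_points.size)            # uniform time allocation (s)

q = CubicSpline(t, via_points, bc_type="clamped")     # zero start/end velocity
qd, qdd = q.derivative(1), q.derivative(2)             # velocity, acceleration splines

ts = np.linspace(0.0, 4.0, 200)
print("max |velocity|    :", np.max(np.abs(qd(ts))))
print("max |acceleration|:", np.max(np.abs(qdd(ts))))
```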
Abstract: In the investigation of disease dynamics, the effect of covariates on the hazard function is a major topic. Some recent smoothed estimation methods have been proposed, both frequentist and Bayesian, based on the relationship between penalized splines and mixed-model theory. These approaches are also motivated by the possibility of using automatic procedures for determining the optimal amount of smoothing. However, the estimation algorithms involve an analytically intractable hazard function and thus require ad-hoc software routines. We propose a more user-friendly alternative, consisting of regularized estimation of piecewise exponential (PE) models by Bayesian P-splines. A further facilitation is that widespread Bayesian software, such as WinBUGS, can be used. The aim is to assess the robustness of this approach with respect to different prior functions and penalties. A large dataset of breast cancer patients, for which results from validated clinical studies are available, is used as a benchmark to evaluate the reliability of the estimates. A second dataset, from a small case series of sarcoma patients, is used to evaluate the performance of the PE model as a tool for exploratory analysis. Concerning the breast cancer data, the estimates are robust with respect to priors and penalties and consistent with clinical knowledge. Concerning the soft tissue sarcoma data, the estimates of the hazard function are sensitive to the prior for the smoothing parameter, whereas the estimates of the regression coefficients are robust. In conclusion, Gibbs sampling proves an efficient computational strategy. The issue of sensitivity to the priors concerns only the estimates of the hazard function, and seems more likely to occur when small case series are investigated, calling for tailored solutions.
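To make the piecewise exponential building block concrete, the sketch below evaluates its censored-data log-likelihood on an assumed interval grid with assumed hazard levels; the Bayesian P-spline prior and the WinBUGS implementation discussed above are not reproduced.

```python
# Sketch: step hazard on fixed intervals and the censored-data log-likelihood
# sum(delta_i * log h(t_i)) - sum(H(t_i)). Interval cuts, log-hazard levels and
# the example data are made up; event times are assumed to fall within the grid.
import numpy as np

cuts = np.array([0.0, 1.0, 2.0, 4.0, 8.0])            # interval boundaries (years)
log_h = np.array([-2.0, -1.5, -1.8, -2.2])            # log-hazard per interval

def pe_loglik(times, events, cuts, log_h):
    h = np.exp(log_h)
    widths = np.diff(cuts)
    ll = 0.0
    for t, d in zip(times, events):
        k = min(np.searchsorted(cuts, t, side="right") - 1, len(h) - 1)
        exposure = np.minimum(widths, np.maximum(t - cuts[:-1], 0.0))  # time spent in each interval
        ll += d * log_h[k] - np.sum(h * exposure)
    return ll

print(pe_loglik(np.array([0.5, 3.0, 6.0]), np.array([1, 0, 1]), cuts, log_h))
```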
Abstract: We present an analysis of three independent and widely used image smoothing techniques on a new fractional-based convolution edge detector originally constructed by the same authors for image edge analysis. The original implementation used only a Gaussian function as its smoothing function, based on predefined assumptions, and therefore did not scale well for some types of edges and noise. Experiments conducted on this mask using known images with realistic geometry suggested the need to adapt the image smoothing in order to obtain more nearly optimal performance. In this paper, we use the structural similarity index measure and show that adaptively choosing the smoothing function has significant advantages over a single-function implementation. The new adaptive fractional-based convolution mask can find edges of various types in considerable detail. The method can now trap local discontinuities in intensity and its derivatives, as well as locate Dirac edges.
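As a small illustration of scoring smoothing choices with the structural similarity index, the sketch below compares two stand-in smoothers (Gaussian and median, assumptions on our part) on a synthetic step edge; the fractional convolution mask itself is not reproduced.

```python
# Sketch: rank candidate smoothing functions by SSIM against a clean reference.
import numpy as np
from scipy.ndimage import gaussian_filter, median_filter
from skimage.metrics import structural_similarity

rng = np.random.default_rng(0)
clean = np.zeros((128, 128))
clean[:, 64:] = 1.0                                   # a simple vertical step edge
noisy = clean + 0.1 * rng.normal(size=clean.shape)

candidates = {
    "gaussian": gaussian_filter(noisy, sigma=1.5),
    "median": median_filter(noisy, size=3),
}
for name, smoothed in candidates.items():
    score = structural_similarity(clean, smoothed, data_range=1.0)
    print(name, round(score, 3))                      # higher SSIM = closer to the clean edge
```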
Abstract: To address the problems of the rapidly-exploring random tree (RRT) algorithm in UAV path planning, such as the large number of samples required and the tortuous paths it generates, a path planning algorithm combining a path re-planning strategy with smoothness optimization is proposed. First, the number of RRT samples is reduced by reconstructing the sampling region, and a goal-biased optimization strategy is used to give the RRT algorithm directional guidance. Second, UAV performance constraints are introduced while screening the initial track points. Then, the re-planned path is smoothed with a B-spline. Finally, the proposed algorithm is validated through simulation experiments in Matlab. The experimental results, an average of 386 samples, an average running time of 0.43 s, and an average track length of 1392.16 (dimensionless), show that the algorithm can effectively reduce the number of samples and improve path smoothness.
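The B-spline smoothing step can be sketched as follows; the waypoints are made up, and the RRT re-planning, goal-biased sampling and UAV constraints described above are not reproduced.

```python
# Sketch: smooth a planned waypoint path with a parametric cubic B-spline.
import numpy as np
from scipy.interpolate import splprep, splev

waypoints = np.array([[0, 0], [2, 1], [3, 4], [6, 5], [8, 8], [10, 9]], dtype=float)

# cubic B-spline through the waypoints with a mild smoothing factor (placeholder value)
tck, _ = splprep([waypoints[:, 0], waypoints[:, 1]], k=3, s=2.0)
u = np.linspace(0.0, 1.0, 200)
sx, sy = splev(u, tck)                                # densely sampled smoothed flight path
```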