Funding: supported by the Open Foundation Project of the Jiangsu Key Laboratory of Precision and Micro-manufacturing Technology.
Abstract: Existing curve fitting algorithms for NC machining paths mainly focus on the control of fitting error, but ignore the problem that the original discrete cutter-location points are too sparse in high-curvature areas of the tool path. This may cause a sudden change in the drive force of the feed axis, resulting in a large fluctuation of the feed speed. This paper proposes a new non-uniform rational B-spline (NURBS) curve fitting optimization method based on curvature-smoothing preset point constraints. First, the short line segments generated by the CAM software are optimally divided into different segment regions, and the curvature of the short line segments in each region is adjusted to make it smoother. Second, a set of characteristic points reflecting the curvature change of the fitted curve is constructed as the control vertices of the fitted curve, and the curve is fitted using the NURBS curve fitting optimization method based on the curvature-smoothing preset point constraint. Finally, the curve fitting error and curve fluctuation are analyzed with an example, which verifies that the method can significantly improve the curvature smoothness of high-curvature tool paths, reduce the fitting error, and improve the feed speed.
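Below is a minimal Python sketch of the general idea: fit a smooth parametric spline through discrete cutter-location points and inspect the curvature of the fit. It uses SciPy's smoothing B-spline, which is non-rational and therefore only a simplified stand-in for the paper's NURBS model with preset-point constraints; the sample points and smoothing factor are illustrative assumptions.

```python
import numpy as np
from scipy.interpolate import splprep, splev

# Hypothetical discrete cutter-location points along a tool path
t = np.linspace(0.0, np.pi, 60)
pts = np.vstack([t, np.sin(3 * t) * np.exp(-t)])

# Smoothing cubic B-spline fit (non-rational simplification of a NURBS fit)
tck, u = splprep(pts, s=1e-3, k=3)

# Curvature along the fitted curve: kappa = |x'y'' - y'x''| / (x'^2 + y'^2)^(3/2)
uu = np.linspace(0.0, 1.0, 500)
dx, dy = splev(uu, tck, der=1)
ddx, ddy = splev(uu, tck, der=2)
curvature = np.abs(dx * ddy - dy * ddx) / (dx**2 + dy**2) ** 1.5

# Chordal fitting error at the original data points
fx, fy = splev(u, tck)
fit_error = np.max(np.hypot(fx - pts[0], fy - pts[1]))
```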
Abstract: Melanoma is the most lethal malignant tumour, and its prevalence is increasing. Early detection and diagnosis of skin cancer can alert patients to take precautions and dramatically improve people's lives. Recently, deep learning has grown increasingly popular for extracting and categorizing skin cancer features for effective prediction. A deep learning model learns and co-adapts representations and features from the training data to the point where it fails to perform well on test data; as a result, overfitting and poor performance occur. To deal with this issue, we propose a novel Consecutive Layerwise weight Constraint MaxNorm model (CLCM-net) that constrains the norm of each weight vector, rescaling it whenever it exceeds a preset limit. The method uses deep convolutional neural networks with custom layer-wise weight constraints applied directly to the whole weight matrix to learn features efficiently. In this research, a detailed analysis of these weight norms is performed on two distinct datasets, the International Skin Imaging Collaboration (ISIC) archives of 2018 and 2019, which are challenging for convolutional networks to handle. According to the findings of this work, CLCM-net improves the model's performance by learning features efficiently within the weight-size limit under appropriate weight constraint settings. The results show that the proposed technique achieved 94.42% accuracy on ISIC 2018, 91.73% accuracy on ISIC 2019, and 93% accuracy on the combined dataset.
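As an illustration of layer-wise max-norm weight constraints, the general mechanism CLCM-net builds on, the following Keras sketch applies a per-layer MaxNorm constraint to the kernels of a small CNN. The architecture, per-layer limits, and class count are assumptions for illustration, not the configuration reported in the paper.

```python
import tensorflow as tf
from tensorflow.keras import layers, constraints

# Small CNN with layer-wise max-norm kernel constraints (illustrative values)
model = tf.keras.Sequential([
    layers.Conv2D(32, 3, activation="relu", input_shape=(224, 224, 3),
                  kernel_constraint=constraints.MaxNorm(3.0)),  # clip each kernel's L2 norm to 3
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu",
                  kernel_constraint=constraints.MaxNorm(2.0)),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu",
                 kernel_constraint=constraints.MaxNorm(2.0)),
    layers.Dense(7, activation="softmax"),  # e.g. the seven lesion classes of ISIC 2018
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
```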
Funding: financially supported by the CAS/CAFEA International Partnership Program for creative research teams (No. KZZD-EW-TZ-19) and the National Natural Science Foundation of China (Nos. 41331066 and 41174063).
Abstract: This study investigates data-processing methods and examines the precipitation effect on gravity measurements at the Dali gravity network, established in 2005. High-quality gravity data were collected during four measurement campaigns. To use the gravity data validly, some geophysical corrections must be considered carefully. We first discuss data-processing methods using weighted least-squares adjustment with the constraint of the absolute gravity datum. Results indicate that the gravity precision can be improved if all absolute gravity data are used as constraints and if calibration functions of the relative gravimeters are modeled within the observation function. Using this data-processing scheme, the mean point gravity precision is better than 12 μGal. After determining the best data-processing scheme, we process the gravity data obtained in the four measurement campaigns and obtain gravity changes over three time periods. Results show that the gravity has a remarkable change of more than 50 μGal in the first time period, from Apr-May 2005 to Aug-Sept 2007. To interpret the large gravity change, a mean water mass change (0.6 m in height) is assumed in the ETOPO1 topographic model. Calculations of the precipitation effect on gravity show that it can reach the same order as the observed gravity change. It is regarded as the main source of the remarkable gravity change in the Dali gravity network, suggesting that the precipitation effect on gravity measurements must be considered carefully.
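A minimal sketch of a weighted least-squares adjustment in which absolute-gravity observations act as datum constraints alongside relative ties is shown below. The station count, observation values, and weights are hypothetical, and the calibration-function modeling described in the paper is omitted.

```python
import numpy as np

def wls_adjust(A, l, w):
    """Weighted least-squares solution of A x = l with weights w = 1/sigma^2."""
    W = np.diag(w)
    return np.linalg.solve(A.T @ W @ A, A.T @ W @ l)

# Three stations, unknowns x = [g1, g2, g3] (values in microGal, hypothetical)
A = np.array([[-1.0,  1.0, 0.0],   # relative tie: g2 - g1
              [ 0.0, -1.0, 1.0],   # relative tie: g3 - g2
              [ 1.0,  0.0, 0.0]])  # absolute observation at station 1 (datum constraint)
l = np.array([120.0, -80.0, 979_500_000.0])
w = np.array([1 / 25.0, 1 / 25.0, 1 / 4.0])  # absolute datum weighted more strongly
g_adjusted = wls_adjust(A, l, w)
```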
Funding: supported by the National Key Research and Development Program of China (Grant No. 2018YFC0603502).
Abstract: Gravity inversion requires much computation, and inversion results are often non-unique. The first problem is largely due to the large number of grid cells. An edge detection method, the tilt angle of the analytic signal amplitude (TAS), helps identify the boundaries of underground geological anomalies at different depths and can be used to optimize the grid and reduce the number of grid cells. Smooth inversion requires that the boundaries of the meshing area be continuous rather than jagged. In this paper, the optimized meshing strategy is improved: the optimized meshing region obtained by the TAS is changed to a regular region to facilitate smooth inversion. For the second problem, certain constraints can be used to improve the accuracy of inversion. The results of the analytic signal amplitude (ASA) are used to delineate the central distribution of geological bodies, and we propose a new method that uses the ASA results to impose local constraints and reduce the non-uniqueness of the inversion. A guided fuzzy c-means (FCM) clustering algorithm combined with a priori petrophysical information is also used to reduce the non-uniqueness of the gravity inversion. OpenACC is used to parallelize the serial program on the GPU and speed up the computation. In summary, the TAS is used to reduce the number of grid cells; local weighting and the a priori petrophysical constraint are used in conjunction with the FCM algorithm during the inversion, which improves the accuracy of the inversion; and the inversion is accelerated with OpenACC on the GPU. The proposed method is validated using synthetic data, and the results show that the efficiency and accuracy of gravity inversion are greatly improved.
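The following sketch shows one way to compute a tilt angle of the analytic signal amplitude for a gridded gravity anomaly, using wavenumber-domain vertical derivatives. Grid spacing, sign conventions, and the exact TAS definition used in the paper may differ, so treat this as an assumption-laden illustration rather than the paper's implementation.

```python
import numpy as np

def vertical_derivative(f, dx, dy):
    """Wavenumber-domain vertical derivative of a gridded potential-field map."""
    ny, nx = f.shape
    kx = 2.0 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2.0 * np.pi * np.fft.fftfreq(ny, d=dy)
    KX, KY = np.meshgrid(kx, ky)
    k = np.sqrt(KX**2 + KY**2)
    return np.real(np.fft.ifft2(np.fft.fft2(f) * k))

def tilt_angle_of_asa(grav, dx, dy):
    """Tilt angle of the analytic signal amplitude (TAS) of a gridded gravity anomaly."""
    gy, gx = np.gradient(grav, dy, dx)            # horizontal derivatives
    gz = vertical_derivative(grav, dx, dy)
    asa = np.sqrt(gx**2 + gy**2 + gz**2)          # analytic signal amplitude (ASA)
    ay, ax = np.gradient(asa, dy, dx)
    az = vertical_derivative(asa, dx, dy)
    return np.arctan2(az, np.hypot(ax, ay))       # boundaries where the TAS changes sign
```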
Funding: supported by the European Community's Seventh Framework Programme FP7/2007-2013 (No. 223854) and by COLCIENCIAS - Departamento Administrativo de Ciencia, Tecnología e Innovación de Colombia.
Abstract: This paper addresses the state estimation problem for linear systems with additive uncertainties in both the state and output equations using a moving horizon approach. Based on the full information estimation setting and the game-theoretic approach to H∞ filtering, a new optimization-based estimation scheme for uncertain linear systems is proposed, namely the H∞ full information estimator (H∞-FIE). In this formulation, the set of processed data grows with time as more measurements are received, preventing recursive formulations as in Kalman filtering. To overcome this problem, a moving horizon approximation to the H∞-FIE is also presented, the H∞-MHE for short. This moving horizon approximation is made possible by a suitably defined arrival cost. Sufficient conditions for the stability of the H∞-MHE are derived. Simulation results show the benefits of the proposed scheme when compared with two H∞ filters and the well-known Kalman filter.
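To make the moving-horizon idea concrete, here is a toy least-squares MHE for a linear system with a quadratic arrival cost. The paper's H∞ (game-theoretic) weighting and stability conditions are not reproduced, and the system matrices, horizon length, and weights are hypothetical.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy system: x_{k+1} = A x_k + w_k,  y_k = C x_k + v_k (matrices are illustrative)
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])
C = np.array([[1.0, 0.0]])
N = 10  # horizon length

def mhe_step(y_window, x_prior, sqrt_P, sqrt_Q, sqrt_R):
    """Estimate the state at the end of the horizon from the last N measurements."""
    n = A.shape[0]

    def residuals(z):
        x = z.reshape(N, n)
        r = [sqrt_P @ (x[0] - x_prior)]                   # arrival-cost term
        for k in range(N - 1):
            r.append(sqrt_Q @ (x[k + 1] - A @ x[k]))      # process residuals
        for k in range(N):
            r.append(sqrt_R @ (y_window[k] - C @ x[k]))   # output residuals
        return np.concatenate(r)

    sol = least_squares(residuals, np.tile(x_prior, N))
    return sol.x.reshape(N, n)[-1]

# Usage with synthetic data
rng = np.random.default_rng(0)
x_true, ys = np.zeros(2), []
for _ in range(N):
    x_true = A @ x_true + 0.01 * rng.standard_normal(2)
    ys.append(C @ x_true + 0.05 * rng.standard_normal(1))
x_hat = mhe_step(np.array(ys), np.zeros(2), np.eye(2), 10 * np.eye(2), 5 * np.eye(1))
```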
Funding: supported by the Science and Technology Innovation Project of the Ministry of Culture of China (No. 2014KJCXXM08), the National Key Technology Research and Development Program of the Ministry of Science and Technology of China (No. 2012BAH37F02), and the National High Technology Research and Development Program (863) of China (No. 2011AA01A107).
Abstract: In this paper, an approach is proposed for the problem of consistency in depth map estimation from binocular stereo video sequences. The consistency method includes temporal consistency and spatial consistency to eliminate flickering artifacts and to smooth inaccuracies in depth recovery. To this end, an improved global stereo matching method based on graph cuts and energy optimization is implemented. In the temporal domain, a penalty function with a coherence factor is introduced for temporal consistency, and the factor is determined by a Lucas-Kanade optical-flow weighted histogram similarity constraint (LKWHSC). In the spatial domain, a joint bilateral truncated absolute difference (JBTAD) is proposed for segmentation smoothing. The method smooths naturally and uniformly in low-gradient regions and avoids over-smoothing while keeping edge sharpness at high-gradient discontinuities, realizing spatial consistency. Experimental results show that the algorithm obtains depth maps with better spatial and temporal consistency than existing algorithms.
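As a rough illustration of guided depth smoothing, the sketch below implements a generic joint (cross) bilateral filter that smooths a depth map under the guidance of the corresponding grayscale image. It is only loosely related to the paper's JBTAD data term inside the graph-cut energy, and all parameters are illustrative.

```python
import numpy as np

def joint_bilateral(depth, guide, radius=3, sigma_s=2.0, sigma_r=10.0):
    """Cross/joint bilateral smoothing of a depth map guided by a grayscale image."""
    h, w = depth.shape
    out = np.zeros((h, w), dtype=float)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2.0 * sigma_s**2))      # spatial kernel
    pad_d = np.pad(depth.astype(float), radius, mode="edge")
    pad_g = np.pad(guide.astype(float), radius, mode="edge")
    for i in range(h):
        for j in range(w):
            d_patch = pad_d[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            g_patch = pad_g[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # Range kernel from the guide image keeps depth edges aligned with image edges
            range_w = np.exp(-(g_patch - pad_g[i + radius, j + radius])**2
                             / (2.0 * sigma_r**2))
            weights = spatial * range_w
            out[i, j] = np.sum(weights * d_patch) / np.sum(weights)
    return out
```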