Journal Articles
Found 6 articles
1. A NURBS Fitting Optimization Method for High-Speed Five-Axis NC Machining Path Based on Curvature Smoothing Preset Point Constraint (cited by 1)
Authors: YANG Gaojie, XU Xiang, SHI Zhongquan, YE Wenhua. Transactions of Nanjing University of Aeronautics and Astronautics, EI CSCD, 2021, No. 3, pp. 404-414 (11 pages).
Existing curve fitting algorithms for NC machining paths mainly focus on controlling the fitting error, but ignore the problem that the original discrete cutter-position points are too sparse in high-curvature areas of the tool path. This may cause a sudden change in the drive force of the feed axis, resulting in large fluctuations in the feed speed. This paper proposes a new non-uniform rational B-spline (NURBS) curve fitting optimization method based on curvature-smoothing preset point constraints. First, the short line segments generated by the CAM software are optimally divided into segment regions, and the curvature of the short line segments in each region is adjusted to make it smoother. Second, a set of characteristic points reflecting the curvature change of the fitted curve is constructed as the control vertices of the fitted curve, and the curve is fitted using the NURBS curve fitting optimization method based on the curvature-smoothing preset point constraint. Finally, the curve fitting error and curve volatility are analyzed with an example, which verifies that the method can significantly improve the curvature smoothness of high-curvature tool paths, reduce the fitting error, and improve the feed speed.
Keywords: curvature smoothing, NC machining path, NURBS curve fitting, weighted constraint
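The abstract fits tool paths with NURBS curves. As background only (not the paper's fitting algorithm), here is a minimal sketch of rational B-spline evaluation via the Cox-de Boor recursion; the clamped cubic knot vector and control points in the test are illustrative assumptions:

```python
import numpy as np

def bspline_basis(i, p, u, knots):
    """Cox-de Boor recursion for the B-spline basis function N_{i,p}(u)."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    den = knots[i + p] - knots[i]
    if den > 0:
        left = (u - knots[i]) / den * bspline_basis(i, p - 1, u, knots)
    den = knots[i + p + 1] - knots[i + 1]
    if den > 0:
        right = (knots[i + p + 1] - u) / den * bspline_basis(i + 1, p - 1, u, knots)
    return left + right

def nurbs_point(u, ctrl, weights, knots, degree=3):
    """Evaluate one point on a NURBS curve: weighted rational
    combination of the control points."""
    u = min(u, knots[-1] - 1e-12)  # keep u inside the half-open basis support
    N = np.array([bspline_basis(i, degree, u, knots) for i in range(len(ctrl))])
    w = N * weights
    return (w @ ctrl) / w.sum()
```

With a clamped knot vector (first and last knots repeated degree+1 times), the evaluated curve interpolates its end control points, which is a convenient sanity check.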
2. Regularised Layerwise Weight Norm Based Skin Lesion Features Extraction and Classification
Authors: S. Gopikha, M. Balamurugan. Computer Systems Science & Engineering, SCIE EI, 2023, No. 3, pp. 2727-2742 (16 pages).
Melanoma is the most lethal malignant tumour, and its prevalence is increasing. Early detection and diagnosis of skin cancer can alert patients to take precautions and dramatically improve their lives. Recently, deep learning has grown increasingly popular for extracting and categorizing skin cancer features for effective prediction. A deep learning model learns and co-adapts representations and features from the training data to the point where it fails to perform well on test data; as a result, overfitting and poor performance occur. To deal with this issue, we propose a novel Consecutive Layerwise weight Constraint MaxNorm model (CLCM-net) that constrains the norm of the weight vector, rescaling it each time it exceeds a fixed bound. This method uses deep convolutional neural networks with custom layer-wise weight constraints applied directly to the whole weight matrix to learn features efficiently. In this research, a detailed analysis of these weight norms is performed on two distinct datasets, the International Skin Imaging Collaboration (ISIC) 2018 and 2019 challenges, which are difficult for convolutional networks to handle. According to the findings of this work, CLCM-net improved the model's performance by learning features efficiently within the weight size limit under appropriate weight-constraint settings. The results show that the proposed techniques achieved 94.42% accuracy on ISIC 2018, 91.73% on ISIC 2019, and 93% on the combined dataset.
Keywords: norm, overfitting, regularization, melanoma, weight constraints
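The max-norm constraint the abstract describes, rescaling weight vectors whenever their norm exceeds a bound, can be sketched in a few lines of NumPy. This is a generic illustration, not the CLCM-net code; the axis convention (one incoming-weight vector per column) is an assumption:

```python
import numpy as np

def apply_maxnorm(W, max_norm=2.0, axis=0):
    """Rescale weight vectors whose L2 norm exceeds max_norm;
    vectors already inside the bound are left untouched."""
    norms = np.linalg.norm(W, axis=axis, keepdims=True)
    scale = np.minimum(1.0, max_norm / np.maximum(norms, 1e-12))
    return W * scale
```

In training, such a projection is typically applied after each gradient update, so the weights always stay inside the norm ball rather than being merely penalized toward it.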
3. Gravity change observed in a local gravity network and its implication to seasonal precipitation in Dali county, Yunnan province, China (cited by 1)
Authors: Xin Zhou, Wenke Sun, Hui Li, Shuhei Okubo, Shaoan Sun, Lelin Xing, Dongzhi Liu, Chongyang Shen. Earthquake Science, 2014, No. 1, pp. 79-88 (10 pages).
This study investigates data-processing methods and examines the precipitation effect on gravity measurements at the Dali gravity network, established in 2005. High-quality gravity data were collected during four measurement campaigns. To use the gravity data validly, some geophysical corrections must be considered carefully. We first discuss data-processing methods using weighted least-squares adjustment with the constraint of the absolute gravity datum. Results indicate that the gravity precision can be improved if all absolute gravity data are used as constraints and if calibration functions of relative gravimeters are modeled within the observation function. Using this data-processing scheme, the mean point gravity precision is better than 12 μgal. After determining the best data-processing scheme, we process the gravity data obtained in the four measurement campaigns and obtain gravity changes in three time periods. Results show a remarkable gravity change of more than 50 μgal in the first period, from April-May 2005 to August-September 2007. To interpret this large change, a mean water mass change (0.6 m in height) is assumed in the ETOPO1 topographic model. Calculations of the precipitation effect on gravity show that it can reach the same order as the observed gravity change. It is regarded as the main source of the remarkable gravity change in the Dali gravity network, suggesting that the precipitation effect on gravity measurements must be considered carefully.
Keywords: gravity network, gravity change, gravity datum, weighted constraint, precipitation effect
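The weighted least-squares adjustment the abstract relies on reduces to solving the normal equations with a per-observation weight matrix. A minimal sketch follows; treating the absolute gravity datum as a heavily weighted pseudo-observation is one common way to impose such a constraint and is shown here as an assumption, not as this paper's exact scheme:

```python
import numpy as np

def weighted_lsq(A, b, w):
    """Weighted least-squares adjustment: minimize sum_i w_i * (A x - b)_i^2
    by solving the normal equations A^T W A x = A^T W b."""
    W = np.diag(w)
    N = A.T @ W @ A          # normal matrix
    return np.linalg.solve(N, A.T @ W @ b)
```

Giving one row a very large weight effectively pins the solution to that observation, which is how a datum constraint can be folded into the same solver.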
4. Three-dimensional gravity inversion based on optimization processing from edge detection
Authors: Sheng Liu, Shuanggen Jin, Qiang Chen. Geodesy and Geodynamics, CSCD, 2022, No. 5, pp. 503-524 (22 pages).
Gravity inversion requires much computation, and inversion results are often non-unique. The first problem is mainly due to the large number of grid cells. An edge detection method, the tilt angle of the analytical signal amplitude (TAS), helps identify the boundaries of underground geological anomalies at different depths, which can be used to optimize the grid and reduce the number of grid cells. Smooth inversion requires that the boundaries of the meshing area be continuous rather than jagged, so this paper improves the optimized meshing strategy: the meshing region obtained by the TAS is changed to a regular region to facilitate smooth inversion. For the second problem, certain constraints can be used to improve the accuracy of inversion. The results of the analytic signal amplitude (ASA) are used to delineate the central distribution of geological bodies, and we propose a new method that applies local constraints from the ASA results to reduce the non-uniqueness of inversion. A guided fuzzy c-means (FCM) clustering algorithm combined with a priori petrophysical information is also used to reduce the non-uniqueness of gravity inversion. OpenACC is used to parallelize the serial program on the GPU and speed up the computation. In summary, the TAS reduces the number of grid cells; the local weighting and a priori petrophysical constraints are used in conjunction with the FCM algorithm during the inversion, which improves its accuracy; and the inversion is accelerated with OpenACC on the GPU. The proposed method is validated using synthetic data, and the results show that the efficiency and accuracy of gravity inversion are greatly improved.
Keywords: gravity inversion, locally weighted constraint, petrophysical constraint, fuzzy c-means clustering algorithm, OpenACC technology
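The FCM algorithm the abstract builds on alternates two closed-form updates: fuzzy-weighted centroids, then memberships from inverse distances. Below is a plain (unguided) FCM sketch; the paper's guided variant with petrophysical priors is not reproduced, and the two-blob test data are an assumption for illustration:

```python
import numpy as np

def fcm(X, c=2, m=2.0, iters=100, seed=0):
    """Plain fuzzy c-means: alternate centroid and membership updates."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)           # fuzzy memberships, rows sum to 1
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))           # standard FCM membership update
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U
```

The fuzzifier m controls how soft the memberships are; m → 1 recovers hard k-means-like assignments.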
5. A New Approach to State Estimation for Uncertain Linear Systems in a Moving Horizon Estimation Setting (cited by 2)
Authors: J. Garcia-Tirado, H. Botero, F. Angulo. International Journal of Automation and Computing, EI CSCD, 2016, No. 6, pp. 653-664 (12 pages).
This paper addresses the state estimation problem for linear systems with additive uncertainties in both the state and output equations using a moving horizon approach. Based on the full information estimation setting and the game-theoretic approach to H∞ filtering, a new optimization-based estimation scheme for uncertain linear systems is proposed, namely the H∞ full information estimator (H∞-FIE). In this formulation, the set of processed data grows with time as more measurements are received, preventing recursive formulations as in Kalman filtering. To overcome this problem, a moving horizon approximation to the H∞-FIE is also presented, the H∞-MHE in short. This moving horizon approximation is possible because the arrival cost is suitably defined for the proposed scheme. Sufficient conditions for the stability of the H∞-MHE are derived. Simulation results show the benefits of the proposed scheme when compared with two H∞ filters and the well-known Kalman filter.
Keywords: uncertain, processed, overcome, estimator, latter, horizon, filtering, recursive, weighting constraints
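The abstract contrasts full-information estimation, where the processed data set grows without bound, with a moving-horizon approximation over a fixed window. A toy sketch of the underlying idea follows: least-squares state reconstruction from a window of measurements of a noise-free linear system, without the H∞ game-theoretic weighting or the arrival cost that make the paper's scheme work in general:

```python
import numpy as np

def mhe_estimate(A, C, ys):
    """Estimate the state at the end of a measurement window by stacking
    y_k ≈ C A^k x_0 over the horizon, solving for x_0 by least squares,
    and propagating forward through the model."""
    rows, Ak = [], np.eye(A.shape[0])
    for _ in ys:
        rows.append(C @ Ak)   # observation of x_0 after k model steps
        Ak = A @ Ak
    H = np.vstack(rows)
    x0, *_ = np.linalg.lstsq(H, np.concatenate(ys), rcond=None)
    return np.linalg.matrix_power(A, len(ys) - 1) @ x0
```

Sliding this window forward one sample at a time, and summarizing discarded data in an arrival cost, is what turns the idea into a genuine moving horizon estimator.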
6. Consistent Depth Maps Estimation from Binocular Stereo Video Sequence
Author: DUAN Fengfeng. Journal of Shanghai Jiaotong University (Science), EI, 2016, No. 2, pp. 184-191 (8 pages).
In this paper, an approach is proposed for the problem of consistency in depth map estimation from binocular stereo video sequences. The consistency method covers both temporal and spatial consistency, eliminating flickering artifacts and smoothing inaccuracies in depth recovery, and is implemented as an improved global stereo matching based on graph cuts and energy optimization. In the temporal domain, a penalty function with a coherence factor is introduced for temporal consistency, where the factor is determined by a Lucas-Kanade optical flow weighted histogram similarity constraint (LKWHSC). In the spatial domain, a joint bilateral truncated absolute difference (JBTAD) is proposed for segmentation smoothing. The method smooths naturally and uniformly in low-gradient regions and avoids over-smoothing while keeping edges sharp at high-gradient discontinuities, realizing spatial consistency. Experimental results show that the algorithm obtains better spatially and temporally consistent depth maps than existing algorithms.
Keywords: consistent depth maps, binocular stereo video sequence, Lucas-Kanade optical flow weighted histogram similarity constraint (LKWHSC), joint bilateral truncated absolute difference (JBTAD)
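The truncated absolute difference at the core of the JBTAD cost can be illustrated with a bare winner-take-all stereo block. This sketch omits the joint bilateral weighting, graph-cut optimization, and temporal terms that the paper actually uses; image size, `max_disp`, and `tau` below are illustrative assumptions:

```python
import numpy as np

def tad_disparity(left, right, max_disp=4, tau=20.0):
    """Winner-take-all stereo on a truncated absolute difference (TAD) cost:
    cost(x, d) = min(|left(x) - right(x - d)|, tau), pick the cheapest d."""
    h, w = left.shape
    cost = np.full((h, w, max_disp + 1), tau)   # out-of-range shifts stay at tau
    for d in range(max_disp + 1):
        diff = np.abs(left[:, d:] - right[:, :w - d])
        cost[:, d:, d] = np.minimum(diff, tau)
    return cost.argmin(axis=2)
```

Truncating the per-pixel difference at tau is what keeps single outlier pixels from dominating the match, the same robustness idea the JBTAD cost builds on.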