Funding: Project 60772089 supported by the National Natural Science Foundation of China; Project 20080440939 supported by the China Postdoctoral Science Foundation.
Abstract: In numerical control (NC) machining simulation, the workpiece surface is usually described with a uniform triangular mesh model. To ease the trade-off between simulation speed and accuracy in this model, two improved methods are presented: a local-refinement triangular mesh modeling method and an adaptive triangular mesh modeling method. The simulation results show that when the final shape of the workpiece is known and its mathematical representation is simple, the local-refinement method is preferred; when the final shape is unknown and its mathematical description is complicated, the adaptive method is more suitable. The experimental results show that both methods are well targeted and practical, and meet the real-time and precision requirements of simulation.
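The local-refinement idea can be sketched as a midpoint-subdivision pass over the mesh. Here `error_fn` is a stand-in for whatever surface-deviation estimate the modeler uses (a hypothetical placeholder, not the paper's metric); triangles whose estimate exceeds the tolerance are split into four congruent sub-triangles:

```python
def midpoint(p, q):
    """Midpoint of two 3-D points."""
    return tuple((a + b) / 2 for a, b in zip(p, q))

def refine(triangles, error_fn, tol):
    """One pass of local refinement: split every triangle whose
    estimated deviation from the true surface exceeds `tol` into
    four sub-triangles via midpoint subdivision; keep the rest."""
    out = []
    for a, b, c in triangles:
        if error_fn(a, b, c) <= tol:
            out.append((a, b, c))  # accurate enough: keep coarse
        else:
            ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
            out += [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]
    return out
```

Calling `refine` repeatedly until no triangle is split gives the locally refined mesh: dense only where the deviation estimate demands it, coarse elsewhere.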
Funding: Supported by the Collaborative Innovation Center of Major Machine Manufacturing in Liaoning.
Abstract: In the machining of large-scale complex curved surfaces, workers encounter problems such as empty strokes of the tool, collision interference, and overcut or undercut of the workpiece. This paper presents a method for generating optimized tool paths and for compiling and checking the numerical control (NC) program. Taking a bogie frame as an example, the tool paths of all machining surfaces are optimized by a dynamic programming algorithm; Creo software is utilized to compile the optimized computerized numerical control (CNC) machining program; and VERICUT software is employed to simulate the machining process, optimize the cutting parameters, and inspect the machining quality. The method saves machining time, guarantees the correctness of the NC program, and improves the overall machining efficiency. It lays a good theoretical and practical foundation for the integration of similar platforms.
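As a rough illustration of the dynamic-programming idea (the paper's exact formulation is not given here), ordering the machining regions so as to minimise empty-stroke travel can be cast as a Held-Karp-style DP over subsets of visited regions; `dist` is an assumed matrix of rapid-traverse distances between regions:

```python
def shortest_order(dist):
    """Held-Karp dynamic program: visit every machining region exactly
    once, starting from region 0, minimising total empty-stroke travel.
    dist[i][j] is the rapid-traverse distance between regions i and j."""
    n = len(dist)
    # best[(mask, j)] = cheapest cost to start at 0, visit the set `mask`, end at j
    best = {(1, 0): 0.0}
    parent = {}
    for mask in range(1, 1 << n):
        if not mask & 1:
            continue  # every partial path must contain the start region 0
        for j in range(n):
            if not mask >> j & 1 or (mask, j) not in best:
                continue
            for k in range(n):
                if mask >> k & 1:
                    continue  # region k already visited
                nm = mask | 1 << k
                cand = best[(mask, j)] + dist[j][k]
                if cand < best.get((nm, k), float("inf")):
                    best[(nm, k)] = cand
                    parent[(nm, k)] = j
    full = (1 << n) - 1
    end = min(range(1, n), key=lambda j: best[(full, j)])
    # walk the parent pointers backwards to recover the visiting order
    order, mask, j = [], full, end
    while j != 0:
        order.append(j)
        mask, j = mask ^ 1 << j, parent[(mask, j)]
    return [0] + order[::-1], best[(full, end)]
```

The subset DP is exponential in the number of regions, so it suits a moderate number of machining zones (as on a bogie frame) rather than thousands of individual cuts.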
Funding: Supported by the National Natural Science Foundation of China (No. 51205054) and the National Key Technology Research and Development Program during the Twelfth Five-Year Plan (Nos. 2012BAF10B11 and 2012BAF12B08).
Abstract: With the development of manufacturing, numerical control (NC) machining simulation has become a modern tool for obtaining safe and reliable machining operations. Although some research prototypes and commercial packages for NC machining simulation are available, most of them are oriented toward G&M code. G&M code is a low-level data model for computer numerical control (CNC) with inherent drawbacks such as incomplete data and a lack of accuracy, and these limitations hinder the development of a realistic simulation system. In contrast, the standard for the exchange of product model data-compliant numerical control (STEP-NC) is a new, high-level data model for CNC. It provides rich information about CNC machine tools, which creates the conditions for an informative and realistic simulation. Therefore, this paper proposes a STEP-NC-based high-level NC machining simulation solution integrated with computer-aided design/computer-aided process planning/computer-aided manufacturing (CAD/CAPP/CAM). The research provides a better-informed simulation environment and promotes the development of modern manufacturing.
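To illustrate the informational gap the abstract describes, the following is a deliberately simplified, hypothetical mirror of STEP-NC concepts (not the actual ISO 14649 schema; all class and field names are invented for illustration). Unlike a G&M-code line, which only says where the tool moves, a workingstep keeps the feature, tool, and technology data together, so a simulator knows what is being machined:

```python
from dataclasses import dataclass

@dataclass
class Tool:
    ident: str
    diameter_mm: float

@dataclass
class ManufacturingFeature:
    name: str        # e.g. "planar_face", "closed_pocket"
    depth_mm: float

@dataclass
class Workingstep:
    """Simplified stand-in for a STEP-NC machining workingstep:
    feature + tool + technology parameters bundled together."""
    feature: ManufacturingFeature
    tool: Tool
    spindle_rpm: int
    feed_mm_per_min: float

    def removal_context(self):
        """Context a G-code-only simulator would lack: which feature
        is being cut, with what tool size, to what depth."""
        return (self.feature.name, self.tool.diameter_mm, self.feature.depth_mm)
```

With this context available per step, a simulator can check the result against the intended feature geometry rather than merely replaying tool motions.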
Abstract: To improve the accuracy of computer numerical control (CNC) machine tools, a thermal error compensation method based on the simulated annealing algorithm is designed. Transformation matrices are established for the translational and rotational motions of the machine's internal components along the X-, Y-, and Z-axes, and the heat generated by the motors and by the bearings is calculated; their sum gives the heat generated by the machine tool under high-speed motion. A thermal error offset compensation model is then built with the simulated annealing algorithm: the state parameters of the system temperature are obtained, the transfer function of the summing unit after the temperature drop is derived, and the positions on the X-, Y-, and Z-axes within the offset compensation model are computed after multiple iterations. A thermal error compensation algorithm for the CNC machine tool is then designed to obtain the compensation results. The experimental results show that the thermal error of this machine tool is small on the Y-axis but larger on the X- and Z-axes; after compensation, these errors are reduced to 1-2 mm and 0-1 mm respectively, indicating that the thermal error compensation method is effective.
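A minimal sketch of the simulated-annealing search over a compensation offset, assuming a geometric cooling schedule and a Gaussian proposal step (both hypothetical choices; the paper's actual thermal model and schedule are not specified here). `residual` stands in for the measured thermal error remaining after an (x, y, z) offset is applied:

```python
import math
import random

def anneal(residual, x0, step=0.05, t0=1.0, t_min=1e-4, alpha=0.95, seed=0):
    """Simulated annealing over an (x, y, z) compensation offset.
    residual(x) returns the remaining thermal error after applying
    offset vector x; smaller is better."""
    rng = random.Random(seed)
    x, fx, t = list(x0), residual(x0), t0
    best, fbest = list(x), fx
    while t > t_min:
        # Gaussian perturbation of the current offset vector
        cand = [xi + rng.gauss(0, step) for xi in x]
        fc = residual(cand)
        # accept downhill moves always, uphill moves with Boltzmann probability
        if fc < fx or rng.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = list(x), fx
        t *= alpha  # geometric cooling
    return best, fbest
```

The uphill acceptances at high temperature let the search escape local minima of the error surface; as the temperature drops, the iterates settle onto the best offset found.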
Funding: Funded by the National Key Research and Development Program of China (Grant No. 2019YFA0705302) and the National Science and Technology Major Project "Aeroengine and Gas Turbine" of China (Grant No. 2017-VII-0008-0102).
Abstract: Wax pattern fabrication in the investment casting of hollow turbine blades directly determines the dimensional accuracy of the subsequent casting and therefore significantly affects the quality of the final product. In this work, we develop a machine learning-based multi-objective optimization framework for improving the dimensional accuracy of the wax pattern by optimizing its process parameters. We consider two optimization objectives on the dimensions of the wax pattern: the surface warpage and the core offset. Active learning with Bayesian optimization is employed in data sampling to determine process parameters, and a validated numerical model of injection molding is used to compute the dimensional objectives under varied process parameters. The collected dataset is then used to train different machine learning models; the Gaussian process regression model performs best in prediction accuracy and is therefore used as the surrogate model in the optimization framework. A genetic algorithm is employed to produce a non-dominated Pareto front using the surrogate model in the search, followed by an entropy weight method to select the best solution from the Pareto front. The optimized set of process parameters is then compared with empirical parameters obtained from previous trial-and-error experiments: the maximum and average warpage of the optimized solution decrease by 26.0% and 20.2%, and the maximum and average wall-thickness errors relative to the standard part decrease from 0.22 mm and 0.0517 mm with the empirical parameters to 0.10 mm and 0.0356 mm with the optimized parameters, respectively. The framework is shown to be capable of addressing the dimension-control challenge arising in wax pattern production, and it can be reliably deployed for various types of turbine blades to significantly reduce their manufacturing cost.
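The final selection step can be sketched with the textbook entropy weight method; this is a generic formulation assuming the objective columns are already on comparable minimisation scales (the usual min-max normalisation is omitted for brevity, and the scoring rule here is a plain weighted sum, an assumption rather than the paper's exact procedure):

```python
import math

def entropy_weights(matrix):
    """Entropy weight method for an m x n decision matrix
    (m Pareto solutions, n objectives): the more an objective's
    values vary across solutions, the lower its entropy and the
    higher its weight."""
    m = len(matrix)
    n = len(matrix[0])
    degrees = []
    for j in range(n):
        col = [row[j] for row in matrix]
        s = sum(col)
        p = [c / s for c in col]                     # column share of each solution
        e = -sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(m)
        degrees.append(1 - e)                        # divergence degree
    total = sum(degrees)
    return [d / total for d in degrees]

def pick_from_pareto(front, weights):
    """Score each Pareto solution by the weighted sum of its
    (minimisation) objectives; return the index of the best one."""
    scores = [sum(w * f for w, f in zip(weights, sol)) for sol in front]
    return scores.index(min(scores))
```

An objective that is constant across the front gets weight zero, which matches the intuition that it cannot discriminate between the candidate parameter sets.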