To equip data-driven dynamic chemical process models with strong interpretability, we develop a light attention–convolution–gate recurrent unit (LACG) architecture with three sub-modules—a basic module, a brand-new light attention module, and a residue module—that are specially designed to learn the general dynamic behavior, transient disturbances, and other input factors of chemical processes, respectively. Combined with a hyperparameter optimization framework, Optuna, the effectiveness of the proposed LACG is tested by distributed control system data-driven modeling experiments on the discharge flowrate of an actual deethanization process. The LACG model provides significant advantages in prediction accuracy and model generalization compared with other models, including the feedforward neural network, convolutional neural network, long short-term memory (LSTM), and attention-LSTM. Moreover, compared with the simulation results of a deethanization model built using Aspen Plus Dynamics V12.1, the LACG parameters are demonstrated to be interpretable, and more details on the variable interactions can be observed from the model parameters than with the traditional interpretable attention-LSTM model. This contribution enriches interpretable machine learning knowledge and provides a reliable method with high accuracy for actual chemical process modeling, paving a route to intelligent manufacturing.
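The internals of the light attention module are not given in this abstract; purely as an illustration of why attention weights make a model inspectable, here is a minimal sketch of generic softmax attention over a time window (all names and values are hypothetical, not from the paper):

```python
import math

def attention_weights(scores):
    """Softmax-normalize raw relevance scores into attention weights."""
    m = max(scores)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attend(history, scores):
    """Weighted sum of past observations; the weights themselves are the
    interpretable quantity (which past time steps the model attends to)."""
    w = attention_weights(scores)
    context = sum(wi * hi for wi, hi in zip(w, history))
    return context, w

# Toy example: a transient disturbance at the last step receives most weight.
history = [1.0, 1.1, 1.0, 3.0]
scores = [0.1, 0.2, 0.1, 2.0]
context, w = attend(history, scores)
```

Inspecting `w` after training is the usual route to the kind of variable-interaction interpretation the abstract describes.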
With the development of automation and informatization in the steelmaking industry, the human brain gradually fails to cope with the increasing amount of data generated during the steelmaking process. Machine learning technology provides a new method, beyond production experience and metallurgical principles, for dealing with large amounts of data. The application of machine learning in the steelmaking process has become a research hotspot in recent years. This paper provides an overview of the applications of machine learning in steelmaking process modeling, involving hot metal pretreatment, primary steelmaking, secondary refining, and some other aspects. The three most frequently used machine learning algorithms in steelmaking process modeling are the artificial neural network, support vector machine, and case-based reasoning, with proportions of 56%, 14%, and 10%, respectively. Data collected in steelmaking plants are frequently faulty; thus, data processing, especially data cleaning, is crucially important to the performance of machine learning models. The detection of variable importance can be used to optimize the process parameters and guide production. Machine learning is used in hot metal pretreatment modeling mainly for endpoint S content prediction. The prediction of endpoint element compositions and process parameters is widely investigated in primary steelmaking. Machine learning is used in secondary refining modeling mainly for ladle furnace, Ruhrstahl–Heraeus, vacuum degassing, argon oxygen decarburization, and vacuum oxygen decarburization processes. Further development of machine learning in steelmaking process modeling can be realized through additional efforts in the construction of the data platform, the industrial transformation of the research achievements to the practical steelmaking process, and the improvement of the universality of the machine learning models.
The comprehensive tire building and shaping processes are investigated through the finite element method (FEM) in this article. The mechanical properties of the uncured rubber from different tire components are investigated through cyclic loading-unloading experiments under different strain rates. Based on the experiments, an elastoviscoplastic constitutive model is adopted to describe the mechanical behaviors of the uncured rubber. The distinct mechanical properties of the uncured rubber, including the stress level, hysteresis, and residual strain, can all be well characterized. The whole tire building process (including component winding, rubber bladder inflation, component stitching, and carcass band folding-back) and the shaping process are simulated using this constitutive model. The simulated green tire profile is in good agreement with the actual profile obtained through 3D scanning. The deformation and stress of the rubber components and the cord reinforcements during production can be obtained from the FE simulation, which is helpful for judging the rationality of the tire construction design. Finally, the influence of the parameter "drum width" is investigated, and the simulated result is found to be consistent with the experimental observations, which verifies the effectiveness of the simulation. The established simulation strategy provides guidance for the improvement of tire design parameters and the elimination of tire production defects.
The heating, ventilating, and air conditioning (HVAC) system consumes nearly 50% of a building's energy, especially in Taiwan with its hot and humid climate. Due to the challenges in obtaining energy sources and the negative impacts of excessive energy use on the environment, it is essential to employ an energy-efficient HVAC system. This study was conducted on the machine tools building of a university. Field measurement was carried out, and the data were used to conduct energy modelling with EnergyPlus (EP) in order to discover improvements for energy-efficient design. Validation between field measurement and energy modelling was performed, and the error rate was less than 10%. Several energy-efficient strategies were proposed in this study, including room temperature settings, chilled water supply temperature settings, chiller coefficient of performance (COP), shading, and building location. The results reveal that these approaches could reduce annual energy consumption by 3.8% (room temperature settings), 2.1% (chilled water supply temperature settings), 5.9% (chiller COP), 9.1% (shading), and 3.0% (building location), respectively. The analysis discovered that using a well-performing HVAC system and building shading were effective in lowering the amount of energy used, and that the energy modelling method could be an effective and satisfactory tool for determining potential energy savings.
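The abstract reports each measure's saving individually. If the measures were applied jointly and acted multiplicatively (an assumption; the study does not report interaction effects), the combined saving could be roughly estimated as:

```python
def combined_saving(fractions):
    """Combined fractional saving assuming independent, multiplicative
    effects: remaining energy = product of (1 - s_i). This is an
    assumption; real HVAC measures interact, so treat the result as a
    rough estimate rather than a reported figure."""
    remaining = 1.0
    for s in fractions:
        remaining *= (1.0 - s)
    return 1.0 - remaining

# Savings reported in the study, as fractions of annual energy use.
savings = [0.038, 0.021, 0.059, 0.091, 0.030]
total = combined_saving(savings)  # slightly below the naive sum of 23.9%
```

The multiplicative estimate is always below the naive sum because each measure acts on the energy left after the previous ones.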
Presented is a multiple-model soft sensing method based on affinity propagation (AP), Gaussian process (GP), and Bayesian committee machine (BCM). The AP clustering algorithm is used to cluster training samples according to their operating points. Then, the sub-models are estimated by Gaussian process regression (GPR). Finally, in order to obtain a global probabilistic prediction, a Bayesian committee machine is used to combine the outputs of the sub-estimators. The proposed method has been applied to predict the light naphtha end point in hydrocracker fractionators. Practical applications indicate that it is useful for online quality prediction and monitoring in chemical processes.
Current orchestration and choreography process engines serve only dedicated process languages. To solve this problem, an Event-driven Process Execution Model (EPEM) was developed. Formalization and mapping principles of the model are presented to guarantee the correctness and efficiency of process transformation. As a case study, the EPEM description of the Web Services Business Process Execution Language (WS-BPEL) is presented, and a Process Virtual Machine (PVM), OncePVM, is implemented in compliance with the EPEM.
Many high-quality forging productions require the large-sized hydraulic press machine (HPM) to have a desirable dynamic response. Since the forging process is complex at low velocity, its response is difficult to estimate, which often makes the desirable low-velocity forging condition difficult to obtain; so far, little work has been reported on estimating this response. In this paper, an approximate-model-based estimation method is proposed to estimate the dynamic response of the forging process at low velocity. First, an approximate model is developed to represent the forging process of this complex HPM around the low-velocity working point. While preserving modeling performance, the model greatly eases the subsequent estimation of the dynamic response because it has a good linear structure. On this basis, the dynamic response is estimated, and the conditions for stability, vibration, and creep are derived from the solution of the velocity. All these analytical results are further verified by both simulations and experiment. In the simulation verification of the modeling, the original movement model and the derived approximate model always have the same dynamic responses with very small approximation error. The simulations and experiment demonstrate the effectiveness of the derived conditions for stability, vibration, and creep; these conditions benefit both the prediction of the dynamic response of the forging process and the design of the controller for high-quality forging. The proposed method is an effective solution for achieving the desirable low-velocity forging condition.
The curse of dimensionality refers to the problem of increased sparsity and computational complexity when dealing with high-dimensional data. In recent years, the types and variables of industrial data have increased significantly, making data-driven models more challenging to develop. To address this problem, data augmentation technology has been introduced as an effective tool to solve the sparsity problem of high-dimensional industrial data. This paper systematically explores and discusses the necessity, feasibility, and effectiveness of augmented industrial data-driven modeling in the context of the curse of dimensionality and virtual big data. Then, the process of data augmentation modeling is analyzed, and the concept of data boosting augmentation is proposed. Data boosting augmentation involves designing the reliability weight and actual-virtual weight functions, and developing a double weighted partial least squares model to optimize the three stages of data generation, data fusion, and modeling. This approach significantly improves the interpretability, effectiveness, and practicality of data augmentation in industrial modeling. Finally, the proposed method is verified using practical examples of fault diagnosis systems and virtual measurement systems in industry. The results demonstrate the effectiveness of the proposed approach in improving the accuracy and robustness of data-driven models, making them more suitable for real-world industrial applications.
The production process plan design and the configurations of a reconfigurable machine tool (RMT) interact with each other. Reasonable process plans with suitable RMT configurations help to improve product quality and reduce production cost; therefore, a cooperative strategy is needed to solve both issues concurrently. In this paper, a cooperative optimization model for RMT configurations and the production process plan is presented. Its objectives take into account the impacts of both process and configuration. Moreover, a novel genetic algorithm is developed to provide optimal or near-optimal solutions: first, its chromosome is redesigned to be composed of three parts (operations, process plan, and RMT configurations); second, new selection, crossover, and mutation operators are developed to handle the process constraints from the operation processes (OP) graph, without which these operators could generate illegal solutions violating the limits; eventually, the optimal RMT configurations under the optimal process plan design can be obtained. Finally, the method is applied to a manufacturing line composed of three RMTs. The case shows that the optimal process plan and RMT configurations are obtained concurrently, with production cost decreased by 6.28% and nonmonetary performance increased by 22%. The proposed method can determine both RMT configurations and the production process, improving production capacity, functionality, and equipment utilization for RMTs.
The axial selection of tunnels constructed in interlayered soft-hard rock mass affects stability and safety during construction. Previous optimization was primarily based on experience or on the comparison and selection of alternative values under specific geological conditions. In this work, an intelligent optimization framework is proposed by combining numerical analysis, machine learning (ML), and an optimization algorithm. An automatic and intelligent numerical analysis process was proposed and coded to reduce redundant manual intervention. The conventional optimization algorithm was developed from two aspects and applied to the hyperparameter estimation of the support vector machine (SVM) model and to the axial orientation optimization of the tunnel. Finally, the comprehensive framework was applied to a numerical case study, and the results were compared with those of other studies. The results indicate that the determination coefficients between the predicted and the numerical stability evaluation indices (STIs) on the training and testing datasets are 0.998 and 0.997, respectively. For a given geological condition, the STI first decreases and then increases as the axial orientation changes, and the optimal tunnel axial orientation is estimated to be 87°. This method provides an alternative and quick approach to the overall design of tunnels.
The building sector significantly contributes to climate change. To improve its carbon footprint, applications like model predictive control and predictive maintenance rely on system models. However, the high modeling effort hinders practical application. Machine learning models can significantly reduce this modeling effort. To ensure a machine learning model's reliability in all operating states, it is essential to know its validity domain. Operating states outside the validity domain might lead to extrapolation, resulting in unpredictable behavior. This paper addresses the challenge of identifying extrapolation in data-driven building energy system models and aims to raise awareness of it. To that end, a novel approach is proposed that calibrates novelty detection algorithms to the machine learning model. Suitable novelty detection algorithms are identified through a literature review and a benchmark test with 15 candidates. A subset of five algorithms is then evaluated on building energy systems: first on two-dimensional data, with the results displayed through a novel visualization scheme, and then on more complex multi-dimensional use cases. The methodology performs well, and the validity domain could be approximated. The visualization allows for a profound analysis and an improved understanding of the fundamental effects behind a machine learning model's validity domain and its extrapolation regimes.
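The paper's specific novelty detection algorithms are not named in this abstract. As an illustration of the general idea of flagging operating states outside the training data's coverage, here is a minimal distance-based novelty detector (a common baseline; function names, data, and the threshold are hypothetical):

```python
def knn_novelty(train, query, k=3):
    """Average Euclidean distance from a query point to its k nearest
    training points; large values indicate the query lies outside the
    data's coverage (a potential extrapolation)."""
    dists = sorted(
        sum((a - b) ** 2 for a, b in zip(p, query)) ** 0.5 for p in train
    )
    return sum(dists[:k]) / k

def in_validity_domain(train, query, threshold, k=3):
    """Flag operating states whose novelty score exceeds a threshold;
    in practice the threshold would be calibrated towards the machine
    learning model, as the paper proposes (calibration not shown)."""
    return knn_novelty(train, query, k) <= threshold

# Training data clustered near the origin; a far-away query is flagged.
train = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (0.1, 0.1)]
ok = in_validity_domain(train, (0.05, 0.05), threshold=0.2)
far = in_validity_domain(train, (5.0, 5.0), threshold=0.2)
```
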
The rapid evolution of wireless communication technologies has underscored the critical role of antennas in ensuring seamless connectivity. Antenna defects, ranging from manufacturing imperfections to environmental wear, pose significant challenges to the reliability and performance of communication systems. This review navigates the landscape of antenna defect detection, emphasizing the need for a nuanced understanding of the various defect types and the associated challenges in visual detection, and serves as a resource for researchers, engineers, and practitioners engaged in the design and maintenance of communication systems. The insights presented here pave the way for enhanced reliability in antenna systems through targeted defect detection measures. In this study, a comprehensive literature analysis of computer vision algorithms employed in end-of-line visual inspection of antenna parts is presented. The PRISMA principles are followed throughout the review, whose goals are to summarize recent research, identify relevant computer vision techniques, and evaluate how effective these techniques are at discovering defects during inspection. It covers articles from scholarly journals as well as conference papers up until June 2023. Relevant search phrases were used, and papers were chosen according to inclusion and exclusion criteria. Several computer vision approaches, such as feature extraction and defect classification, are broken down and analyzed, and their applicability and performance are discussed. The review highlights the significance of utilizing a wide variety of datasets and measurement criteria. The findings of this study add to the existing body of knowledge and point researchers toward promising new areas of investigation, such as real-time inspection systems and multispectral imaging. As a whole, this review offers a complete study of computer vision approaches for quality control of antenna parts, providing helpful insights and drawing attention to areas that require additional exploration.
Implementing new machine learning (ML) algorithms for credit default prediction is associated with better predictive performance; however, it also generates new model risks, particularly concerning the supervisory validation process. Recent industry surveys often mention that uncertainty about how supervisors might assess these risks could be a barrier to innovation. In this study, we propose a new framework to quantify model risk adjustments to compare the performance of several ML methods. To address this challenge, we first harness the internal ratings-based approach to identify up to 13 risk components that we classify into 3 main categories: statistics, technology, and market conduct. Second, to evaluate the importance of each risk category, we collect a series of regulatory documents related to three potential use cases (regulatory capital, credit scoring, or provisioning) and compute the weight of each category according to the intensity of their mentions, using natural language processing and a risk terminology based on expert knowledge. Finally, we test our framework using popular ML models in credit risk and a publicly available database to quantify proxies of a subset of risk factors that we deem representative. We measure the statistical risk according to the number of hyperparameters and the stability of the predictions. The technological risk is assessed through the transparency of the algorithm and the latency of the ML training method, while the market conduct risk is quantified by the time it takes to run a post hoc technique (SHapley Additive exPlanations) to interpret the output.
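The abstract describes weighting each risk category by the intensity of its mentions in regulatory text. The paper's NLP pipeline is not given here, but the weighting step can be sketched with simple keyword counting (the terminology lists and the sample text below are hypothetical placeholders, not the paper's expert-built terminology):

```python
def category_weights(text, terminology):
    """Weight each risk category by the relative frequency of its
    terminology in a corpus of regulatory text."""
    text = text.lower()
    counts = {
        cat: sum(text.count(term) for term in terms)
        for cat, terms in terminology.items()
    }
    total = sum(counts.values()) or 1  # avoid division by zero
    return {cat: c / total for cat, c in counts.items()}

# Hypothetical terminology lists for the paper's three categories.
terminology = {
    "statistics": ["overfitting", "calibration", "validation"],
    "technology": ["latency", "transparency"],
    "market conduct": ["explanation", "fairness"],
}
doc = "validation and calibration are required; model transparency matters"
weights = category_weights(doc, terminology)
```

A production version would use tokenization, stemming, and document-level aggregation rather than raw substring counts, but the normalization step is the same.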
In view of the lack of research on information models for tufting carpet machines in China, an information modeling method based on the Object Linking and Embedding for Process Control Unified Architecture (OPC UA) framework is proposed to solve the "information island" problem caused by the differentiated data interfaces between heterogeneous equipment and systems in the tufting carpet machine workshop. This paper establishes an information model of the tufting carpet machine by analyzing the system architecture, workshop equipment composition, and information flow of the workshop, combined with the OPC UA information modeling specification. Subsequently, the OPC UA protocol is used to instantiate and map the information model, and an OPC UA server is developed. Finally, the practicability of the tufting carpet machine information model under the OPC UA framework, and the feasibility of realizing the information interconnection of heterogeneous devices in the tufting carpet machine digital workshop, are verified. On this basis, cloud and remote access to the underlying device data is realized. The application of this information model and information integration scheme in actual production explores and practices the application of OPC UA technology in the digital workshop of the tufting carpet machine.
Given the significant energy consumption and environmental impact, it is crucial to identify the carbon emission characteristics of building foundation construction during the design phase. This study establishes a process-based carbon evaluation model by adopting Building Information Modeling (BIM), calculates the materialization-stage carbon emissions of building foundations without basement space in China, and identifies factors influencing the emissions through correlation analysis. These five factors are the building function type, building structure type, foundation area, foundation treatment method, and foundation depth. Additionally, this study develops several machine learning-based predictive models, including Decision Tree, Random Forest, XGBoost, and Neural Network. Among these models, XGBoost demonstrates a relatively higher degree of accuracy and minimal errors, achieving an RMSE of 206.62 and an R2 of 0.88 on the testing group. The study reveals substantial variability in carbon emissions per floor area of foundations, ranging from 100 to 2000 kgCO_(2)e/m^(2), demonstrating the potential for optimizing carbon emissions during the design phase of buildings. Moreover, materials contribute significantly to total carbon emissions, accounting for 78%–97%, suggesting a significant opportunity for using BIM technology in the design phase to optimize carbon reduction efforts.
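The reported RMSE of 206.62 and R2 of 0.88 are the two standard regression metrics; for reference, they are computed as follows (the toy data below are illustrative, not the study's dataset):

```python
def rmse(y_true, y_pred):
    """Root mean squared error, in the units of the target
    (here kgCO2e/m2 of foundation carbon intensity)."""
    n = len(y_true)
    return (sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n) ** 0.5

def r2(y_true, y_pred):
    """Coefficient of determination: 1 - residual SS / total SS."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

# Toy targets spanning the 100-2000 kgCO2e/m2 range reported in the study.
y_true = [100.0, 500.0, 900.0, 1500.0]
y_pred = [150.0, 450.0, 950.0, 1400.0]
error = rmse(y_true, y_pred)
score = r2(y_true, y_pred)
```
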
The metastable retained austenite (RA) plays a significant role in the excellent mechanical performance of quenching and partitioning (Q&P) steels, while the volume fraction of RA (V_(RA)) is challenging to predict directly due to the complicated relationships between the chemical composition and the process (such as the quenching temperature (Qr)). A Gaussian process regression model in machine learning was developed to predict V_(RA), and the model accuracy was further improved by introducing a metallurgical parameter, the martensite fraction (fo), to accurately predict V_(RA) in Q&P steels. The developed machine learning model combined with Bayesian global optimization can serve as an alternative selection strategy for the quenching temperature, and this strategy is very efficient, as it found the "optimum" Qr with the maximum V_(RA) using only seven consecutive iterations. The benchmark experiment also reveals that the developed machine learning model predicts V_(RA) more accurately than the popular constrained carbon equilibrium thermodynamic model, and even better than a thermo-kinetic quenching-partitioning-tempering-local equilibrium model.
The world's increasing population requires the process industry to produce food, fuels, chemicals, and consumer products in a more efficient and sustainable way. Functional process materials lie at the heart of this challenge. Traditionally, new advanced materials are found empirically or through trial-and-error approaches. As theoretical methods and associated tools are continuously improved and computer power has reached a high level, it is now efficient and popular to use computational methods to guide material selection and design. Due to the strong interaction between material selection and the operation of the process in which the material is used, it is essential to perform material and process design simultaneously. Despite this significant connection, solving the integrated material and process design problem is not easy because multiple models at different scales are usually required. Hybrid modeling provides a promising option for tackling such complex design problems. In hybrid modeling, the material properties, which are computationally expensive to obtain, are described by data-driven models, while the well-known process-related principles are represented by mechanistic models. This article highlights the significance of hybrid modeling in multiscale material and process design. The generic design methodology is first introduced. Six important application areas are then selected: four from the chemical engineering field and two from the energy systems engineering domain. For each selected area, state-of-the-art work using hybrid modeling for multiscale material and process design is discussed. Concluding remarks are provided at the end, and current limitations and future opportunities are pointed out.
In this paper, an interacting multiple-model (IMM) method based on data-driven identification models is proposed for the prediction of nonlinear dynamic systems. First, two basic models are selected as combination components due to their proven effectiveness. One is the Gaussian process (GP) model, which can provide the predictive variance of the predicted output and has only a few optimization parameters. The other is the regularized extreme learning machine (RELM) model, which alleviates the overfitting caused by the empirical risk minimization principle and enhances the overall generalization performance. Both models are updated continually using meaningful new data selected by data selection methods. Furthermore, recursive methods are employed in the two models to reduce the computational burden caused by continuous renewal. Finally, the two models are combined in the IMM algorithm to realize hybrid prediction, which can avoid the error accumulation of single-model prediction. In order to verify its performance, the proposed method is applied to the prediction of moisture content in alkali-surfactant-polymer (ASP) flooding. The simulation results show that the proposed model can match the process very well, and that the IMM algorithm can outperform its components, providing a clear improvement in accuracy and robustness.
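The core of an IMM step is updating each sub-model's probability from how well it explained the latest measurement, then mixing the predictions. A minimal single-step sketch, assuming Gaussian residuals (the GP and RELM sub-models themselves are omitted; the numbers below are illustrative):

```python
import math

def gaussian_likelihood(residual, variance):
    """Likelihood of the observed residual under a zero-mean Gaussian."""
    return math.exp(-residual ** 2 / (2 * variance)) / math.sqrt(
        2 * math.pi * variance
    )

def imm_combine(preds, residuals, variances, probs):
    """One IMM step: reweight each model's probability by how well it
    explained the latest measurement, then mix the predictions."""
    lik = [gaussian_likelihood(r, v) for r, v in zip(residuals, variances)]
    post = [l * p for l, p in zip(lik, probs)]
    total = sum(post)
    probs = [p / total for p in post]
    combined = sum(p * y for p, y in zip(probs, preds))
    return combined, probs

# Two sub-model outputs (e.g., GP and RELM); the first had the smaller
# residual on the last measurement, so its probability should grow.
combined, probs = imm_combine(
    preds=[0.52, 0.58], residuals=[0.01, 0.08],
    variances=[0.01, 0.01], probs=[0.5, 0.5],
)
```

Because the mixing weights adapt at every step, a sub-model that drifts off the process loses influence before its error can accumulate, which is the behavior the paper exploits.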
In this research, the effect of varying spatial orientations on the build time requirements of the fused deposition modelling process is studied. A constructive solid geometry cylindrical primitive is taken as the workpiece and modeled. Response surface methodology is used to design the experiments and obtain statistical models for the build time requirements corresponding to different orientations of the given primitive in the modeller build volume. Contour width, air gap, slice height, raster width, raster angle, and angle of orientation are treated as process parameters. The percentage contribution of each individual process parameter is found to change for build times corresponding to different spatial orientations. The average build time requirement also changes with spatial orientation. This paper discusses and describes these observations with the aim of developing a clear understanding of the effect of spatial variation on build time in the fused deposition modelling process. This work is an integral part of process layout optimization, and the results can effectively aid designers, especially when tackling nesting issues.
Funding: support provided by the National Natural Science Foundation of China (22122802, 22278044, and 21878028), the Chongqing Science Fund for Distinguished Young Scholars (CSTB2022NSCQ-JQX0021), and the Fundamental Research Funds for the Central Universities (2022CDJXY-003).
Funding: supported by the National Natural Science Foundation of China (No. U1960202).
Abstract: With the development of automation and informatization in the steelmaking industry, the human brain gradually fails to cope with the increasing amount of data generated during the steelmaking process. Machine learning technology provides a new method, beyond production experience and metallurgical principles, for dealing with large amounts of data. The application of machine learning in the steelmaking process has become a research hotspot in recent years. This paper provides an overview of the applications of machine learning in steelmaking process modeling, covering hot metal pretreatment, primary steelmaking, secondary refining, and some other aspects. The three most frequently used machine learning algorithms in steelmaking process modeling are the artificial neural network, support vector machine, and case-based reasoning, accounting for 56%, 14%, and 10% of applications, respectively. Data collected in steelmaking plants are frequently faulty; thus, data processing, especially data cleaning, is crucially important to the performance of machine learning models. The detection of variable importance can be used to optimize process parameters and guide production. Machine learning is used in hot metal pretreatment modeling mainly for endpoint S content prediction. Predictions of the endpoint element compositions and the process parameters are widely investigated in primary steelmaking. Machine learning is used in secondary refining modeling mainly for ladle furnace, Ruhrstahl–Heraeus, vacuum degassing, argon oxygen decarburization, and vacuum oxygen decarburization processes. Further development of machine learning in steelmaking process modeling can be realized through additional efforts in the construction of data platforms, the industrial transformation of research achievements to practical steelmaking processes, and the improvement of the universality of machine learning models.
Funding: funded by the National Natural Science Foundation of China (Nos. 11902229 and 11502181) and the Strategic Priority Research Program of the Chinese Academy of Sciences (Grant Nos. XDB22040502 and XDC06030200).
Abstract: The comprehensive tire building and shaping processes are investigated through the finite element method (FEM) in this article. The mechanical properties of the uncured rubber from different tire components are investigated through cyclic loading-unloading experiments under different strain rates. Based on the experiments, an elastoviscoplastic constitutive model is adopted to describe the mechanical behaviors of the uncured rubber. The distinct mechanical properties of the uncured rubber, including the stress level, hysteresis, and residual strain, can all be well characterized. The whole tire building process (including component winding, rubber bladder inflation, component stitching, and carcass band folding-back) and the shaping process are simulated using this constitutive model. The simulated green tire profile is in good agreement with the actual profile obtained through 3D scanning. The deformation and stress of the rubber components and the cord reinforcements during production can be obtained from the FE simulation, which is helpful for judging the rationality of the tire construction design. Finally, the influence of the parameter "drum width" is investigated, and the simulated result is found to be consistent with the experimental observations, which verifies the effectiveness of the simulation. The established simulation strategy provides guidance for the improvement of tire design parameters and the elimination of tire production defects.
Funding: supported by the Ministry of Science and Technology under Grant No. MOST 108-2622-E-169-006-CC3.
Abstract: The heating, ventilating, and air conditioning (HVAC) system consumes nearly 50% of a building's energy, especially in Taiwan with its hot and humid climate. Given the challenges in obtaining energy sources and the negative environmental impacts of excessive energy use, it is essential to employ an energy-efficient HVAC system. This study examined a machine-tool building at a university. Field measurements were carried out, and the data were used to conduct energy modelling with EnergyPlus (EP) in order to identify improvements for energy-efficient design. Validation between field measurement and energy modelling was performed, and the error rate was less than 10%. Several energy-efficient strategies were proposed in this study, including room temperature settings, chilled water supply temperature settings, chiller coefficient of performance (COP), shading, and building location. The results reveal that the proposed approaches of room temperature settings (3.8%), chilled water supply temperature settings (2.1%), chiller COP (5.9%), shading (9.1%), and building location (3.0%) could each reduce annual energy consumption by the indicated amounts. The analysis showed that a well-performing HVAC system and building shading were effective in lowering energy use, and that the energy modelling method can be an effective and satisfactory tool for determining potential energy savings.
Funding: supported by the National High Technology Research and Development Program of China (2006AA040309) and the National Basic Research Program of China (2007CB714000).
Abstract: Presented is a multiple-model soft sensing method based on affinity propagation (AP), Gaussian process (GP), and Bayesian committee machine (BCM). The AP clustering algorithm is used to cluster training samples according to their operating points. Then, the sub-models are estimated by Gaussian process regression (GPR). Finally, in order to obtain a global probabilistic prediction, a Bayesian committee machine is used to combine the outputs of the sub-estimators. The proposed method has been applied to predict the light naphtha end point in hydrocracker fractionators. Practical applications indicate that it is useful for the online prediction of quality monitoring in chemical processes.
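The AP + GPR + BCM pipeline described above can be sketched in a few lines on synthetic data. This is a hedged sketch, not the paper's implementation: the one-dimensional process data, the kernel, and the assumed GP prior variance are illustrative, and the BCM combination uses the standard precision-weighted formula.

```python
import numpy as np
from sklearn.cluster import AffinityPropagation
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, (120, 1))                  # synthetic operating points
y = np.sin(X[:, 0]) + rng.normal(0, 0.05, 120)    # synthetic quality variable

# 1) Cluster training samples by operating point with affinity propagation.
labels = AffinityPropagation(random_state=1).fit(X).labels_

# 2) Fit one GPR sub-model per cluster.
kernel = RBF() + WhiteKernel(1e-3)
subs = [GaussianProcessRegressor(kernel=kernel).fit(X[labels == k], y[labels == k])
        for k in np.unique(labels)]

# 3) Bayesian committee machine: combine the sub-model Gaussian predictions.
x_new = np.array([[0.5]])
mus, sigmas = zip(*(m.predict(x_new, return_std=True) for m in subs))
prior_var = 1.0                                   # assumed GP prior variance
prec = sum(1 / s**2 for s in sigmas) - (len(subs) - 1) / prior_var
mu_bcm = (sum(m / s**2 for m, s in zip(mus, sigmas)) / prec)[0]
print(f"BCM prediction at x=0.5: {mu_bcm:.3f} (true sin(0.5)={np.sin(0.5):.3f})")
```

The precision-weighted fusion lets the sub-model closest to the query operating point dominate, which is the practical appeal of the committee over a single global GP.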
Abstract: Current orchestration and choreography process engines only work with dedicated process languages. To address this limitation, an Event-driven Process Execution Model (EPEM) was developed. Formalization and mapping principles of the model are presented to guarantee correctness and efficiency of process transformation. As a case study, the EPEM description of the Web Services Business Process Execution Language (WS-BPEL) is presented, and a Process Virtual Machine (PVM), OncePVM, is implemented in compliance with the EPEM.
Funding: supported by the National Basic Research Program of China (973 Program, Grant No. 2011CB706802), the National Natural Science Foundation of China (Grant No. 51205420), the Program for New Century Excellent Talents in University of China (Grant No. NCET-13-0593), and the Hunan Provincial Natural Science Foundation of China (Grant No. 14JJ3011).
Abstract: Many high-quality forging operations require the large-sized hydraulic press machine (HPM) to have a desirable dynamic response. Since the forging process is complex at low velocity, its response is difficult to estimate, which often makes the desirable low-velocity forging condition difficult to obtain. So far, little work has been reported on estimating the dynamic response of the forging process at low velocity. In this paper, an approximate-model-based estimation method is proposed for this purpose. First, an approximate model is developed to represent the forging process of this complex HPM around the low-velocity working point. While preserving modeling accuracy, the model greatly eases the subsequent estimation of the dynamic response because it has a good linear structure. On this basis, the dynamic response is estimated, and the conditions for stability, vibration, and creep are derived from the solution of the velocity. All these analytical results are further verified by both simulations and experiment. In the simulation verification of the modeling, the original movement model and the derived approximate model always exhibit the same dynamic responses with very small approximation error. The simulations and experiment demonstrate the effectiveness of the derived conditions for stability, vibration, and creep; these conditions will benefit both the prediction of the dynamic response of the forging process and the design of controllers for high-quality forging. The proposed method is an effective solution for achieving the desirable low-velocity forging condition.
Funding: supported in part by the National Natural Science Foundation of China (NSFC) (92167106 and 61833014) and the Key Research and Development Program of Zhejiang Province (2022C01206).
Abstract: The curse of dimensionality refers to the problem of increased sparsity and computational complexity when dealing with high-dimensional data. In recent years, the types and variables of industrial data have increased significantly, making data-driven models more challenging to develop. To address this problem, data augmentation technology has been introduced as an effective tool to solve the sparsity problem of high-dimensional industrial data. This paper systematically explores and discusses the necessity, feasibility, and effectiveness of augmented industrial data-driven modeling in the context of the curse of dimensionality and virtual big data. Then, the process of data augmentation modeling is analyzed, and the concept of data boosting augmentation is proposed. Data boosting augmentation involves designing the reliability weight and actual-virtual weight functions and developing a double-weighted partial least squares model to optimize the three stages of data generation, data fusion, and modeling. This approach significantly improves the interpretability, effectiveness, and practicality of data augmentation in industrial modeling. Finally, the proposed method is verified using practical examples of fault diagnosis systems and virtual measurement systems in industry. The results demonstrate the effectiveness of the proposed approach in improving the accuracy and robustness of data-driven models, making them more suitable for real-world industrial applications.
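The actual-virtual weighting idea behind data boosting augmentation can be illustrated with a toy example: virtual samples generated by noise injection are down-weighted relative to real samples in a weighted regression. A plain weighted least-squares fit stands in for the paper's double-weighted partial least squares model; the weights, noise levels, and true coefficients are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
true_beta = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
X_real = rng.normal(size=(30, 5))                         # real process data
y_real = X_real @ true_beta + rng.normal(0, 0.1, 30)

# Data augmentation: virtual samples generated around the real ones
X_virt = X_real + rng.normal(0, 0.05, (30, 5))
y_virt = y_real + rng.normal(0, 0.05, 30)

X = np.vstack([X_real, X_virt])
y = np.concatenate([y_real, y_virt])
w = np.concatenate([np.ones(30), 0.3 * np.ones(30)])      # actual-virtual weights

# Weighted least squares: solve (X^T W X) beta = X^T W y
A = X.T @ (X * w[:, None])
b = X.T @ (w * y)
beta_hat = np.linalg.solve(A, b)
print("estimated coefficients:", np.round(beta_hat, 2))
```

Down-weighting the virtual block lets augmentation densify a sparse high-dimensional dataset without letting generated data dominate the fit.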
Funding: supported by the National Natural Science Foundation of China (Grant Nos. 51005169, 50875187, and 50975209), the Shanghai Municipal Natural Science Foundation of China (Grant No. 10ZR1432300), the International Science & Technology Cooperation Program of China (Grant No. 2012DFG72210), and the Zhejiang Provincial Key International Science & Technology Cooperation Program of China (Grant No. 2011C14025).
Abstract: The production process plan design and the configurations of a reconfigurable machine tool (RMT) interact with each other. Reasonable process plans with suitable RMT configurations help to improve product quality and reduce production cost; therefore, a cooperative strategy is needed to solve both issues concurrently. In this paper, a cooperative optimization model for RMT configurations and the production process plan is presented, with objectives that account for the impacts of both process and configuration. Moreover, a novel genetic algorithm is developed to provide optimal or near-optimal solutions. First, its chromosome is redesigned to comprise three parts: the operations, the process plan, and the RMT configurations. Second, new selection, crossover, and mutation operators are developed to handle the process constraints from the operation processes (OP) graph, which the standard operators would otherwise violate by generating illegal solutions. Eventually, the optimal RMT configurations under the optimal process plan design can be obtained. Finally, the method is applied to a manufacturing line composed of three RMTs. The case shows that the optimal process plan and RMT configurations are obtained concurrently; production cost decreases by 6.28%, and non-monetary performance increases by 22%. The proposed method can determine both the RMT configurations and the production process, improving production capacity, functionality, and equipment utilization for RMTs.
Funding: supported by the National Natural Science Foundation of China (Grant Nos. 51991392 and 51922104).
Abstract: The axial orientation selected for tunnels constructed in interlayered soft-hard rock mass affects stability and safety during construction. Previous optimization has been based primarily on experience or on the comparison and selection of alternative values under specific geological conditions. In this work, an intelligent optimization framework is proposed that combines numerical analysis, machine learning (ML), and an optimization algorithm. An automatic and intelligent numerical analysis process was designed and coded to reduce redundant manual intervention. A conventional optimization algorithm was extended in two respects and applied to the hyperparameter estimation of the support vector machine (SVM) model and to the axial orientation optimization of the tunnel. Finally, the comprehensive framework was applied to a numerical case study, and the results were compared with those of other studies. The results indicate that the coefficients of determination between the predicted and numerical stability evaluation indices (STIs) on the training and testing datasets are 0.998 and 0.997, respectively. For a given geological condition, the STI first decreases and then increases as the axial orientation changes, and the optimal tunnel axial orientation is estimated to be 87. This method provides an alternative and rapid approach to the overall design of tunnels.
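The surrogate-plus-search idea above can be sketched with an SVM regressor standing in for the STI surrogate, followed by a grid search for the minimizing orientation. The synthetic STI curve mimics the reported first-decreasing-then-increasing trend with a minimum near 87; the kernel parameters and data are illustrative assumptions, not the paper's settings.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(2)
theta = rng.uniform(0, 180, 80)[:, None]          # axial orientation, degrees
# synthetic stability evaluation index with a minimum near 87 degrees
sti = 1.0 + 0.5 * ((theta[:, 0] - 87.0) / 90.0) ** 2 + rng.normal(0, 0.01, 80)

# SVR surrogate of STI(orientation); hyperparameters chosen for this scale
surrogate = SVR(C=100.0, gamma=1e-3, epsilon=0.01).fit(theta, sti)

# search the surrogate for the orientation minimizing the predicted STI
grid = np.linspace(0, 180, 721)[:, None]
theta_opt = grid[np.argmin(surrogate.predict(grid)), 0]
print(f"estimated optimal axial orientation: {theta_opt:.1f} degrees")
```

In the paper's framework, the surrogate replaces expensive numerical analyses, so the search over orientations becomes essentially free once the SVM is trained.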
Funding: financial support by the Federal Ministry for Economic Affairs and Climate Action (BMWK), promotional references 03EN1066A and 03EN3060D, and funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No. 101023666.
Abstract: The building sector contributes significantly to climate change. To improve its carbon footprint, applications like model predictive control and predictive maintenance rely on system models. However, the high modeling effort hinders practical application. Machine learning models can significantly reduce this modeling effort. To ensure a machine learning model's reliability in all operating states, it is essential to know its validity domain: operating states outside the validity domain may lead to extrapolation, resulting in unpredictable behavior. This paper addresses the challenge of identifying extrapolation in data-driven building energy system models and aims to raise awareness of it. For that, a novel approach is proposed that calibrates novelty detection algorithms to the machine learning model. Suitable novelty detection algorithms are identified through a literature review and a benchmark test with 15 candidates. A subset of five algorithms is then evaluated on building energy systems, first on two-dimensional data, with the results displayed using a novel visualization scheme, and then on more complex multi-dimensional use cases. The methodology performs well, and the validity domain could be approximated. The visualization allows for a profound analysis and an improved understanding of the fundamental effects behind a machine learning model's validity domain and its extrapolation regimes.
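The core mechanism, a novelty detector fitted on the training operating states that flags states where a data-driven model would extrapolate, can be shown in a minimal sketch. The two features and all numbers are illustrative assumptions, and local outlier factor is just one of the candidate algorithm families the paper benchmarks.

```python
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

rng = np.random.default_rng(3)
# training operating states: (outdoor temperature [degC], part-load fraction [-])
X_train = np.column_stack([rng.normal(15.0, 5.0, 500),
                           rng.uniform(0.2, 0.9, 500)])

# novelty=True fits the detector for out-of-sample validity-domain checks
detector = LocalOutlierFactor(n_neighbors=20, novelty=True).fit(X_train)

inside = np.array([[16.0, 0.5]])    # within the training envelope
outside = np.array([[40.0, 1.5]])   # far outside it: extrapolation risk
print("inside:", detector.predict(inside)[0],    # +1 means inlier
      "outside:", detector.predict(outside)[0])  # -1 means novelty
```

At runtime, a -1 verdict would mark the model's prediction as outside its validity domain rather than silently trusting an extrapolated value.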
Abstract: The rapid evolution of wireless communication technologies has underscored the critical role of antennas in ensuring seamless connectivity. Antenna defects, ranging from manufacturing imperfections to environmental wear, pose significant challenges to the reliability and performance of communication systems. This review navigates the landscape of antenna defect detection, emphasizing the need for a nuanced understanding of the various defect types and the associated challenges in visual detection. It serves as a valuable resource for researchers, engineers, and practitioners engaged in the design and maintenance of communication systems, and the insights presented here pave the way for enhanced reliability in antenna systems through targeted defect detection measures. In this study, a comprehensive literature analysis of computer vision algorithms employed in end-of-line visual inspection of antenna parts is presented. The PRISMA principles were followed throughout the review, whose goals are to summarize recent research, identify relevant computer vision techniques, and evaluate how effective these techniques are at discovering defects during inspection. It covers articles from scholarly journals as well as conference papers up until June 2023. Relevant search phrases were used, and papers were chosen according to defined inclusion and exclusion criteria. Several computer vision approaches, such as feature extraction and defect classification, are analyzed, and their applicability and performance are discussed. The review highlights the significance of utilizing a wide variety of datasets and measurement criteria. The findings add to the existing body of knowledge and point researchers toward promising new areas of investigation, such as real-time inspection systems and multispectral imaging. On the whole, this review offers a complete study of computer vision approaches for quality control of antenna parts, providing helpful insights and drawing attention to areas that require additional exploration.
Abstract: Implementing new machine learning (ML) algorithms for credit default prediction is associated with better predictive performance; however, it also generates new model risks, particularly concerning the supervisory validation process. Recent industry surveys often mention that uncertainty about how supervisors might assess these risks could be a barrier to innovation. In this study, we propose a new framework to quantify model-risk adjustments for comparing the performance of several ML methods. To address this challenge, we first harness the internal ratings-based approach to identify up to 13 risk components that we classify into three main categories: statistics, technology, and market conduct. Second, to evaluate the importance of each risk category, we collect a series of regulatory documents related to three potential use cases (regulatory capital, credit scoring, or provisioning) and compute the weight of each category according to the intensity of their mentions, using natural language processing and a risk terminology based on expert knowledge. Finally, we test our framework using popular ML models in credit risk and a publicly available database to quantify proxies of a subset of risk factors that we deem representative. We measure the statistical risk according to the number of hyperparameters and the stability of the predictions. The technological risk is assessed through the transparency of the algorithm and the latency of the ML training method, while the market conduct risk is quantified by the time it takes to run a post hoc technique (SHapley Additive exPlanations) to interpret the output.
Abstract: In view of the lack of research on the information model of the tufting carpet machine in China, an information modeling method based on the Object Linking and Embedding for Process Control Unified Architecture (OPC UA) framework is proposed to solve the "information island" problem caused by the differing data interfaces between heterogeneous equipment and systems in the tufting carpet machine workshop. This paper establishes an information model of the tufting carpet machine by analyzing the system architecture, workshop equipment composition, and workshop information flow, in combination with the OPC UA information modeling specification. Subsequently, the OPC UA protocol is used to instantiate and map the information model, and an OPC UA server is developed. Finally, the practicability of the tufting carpet machine information model under the OPC UA framework and the feasibility of realizing information interconnection among heterogeneous devices in the tufting carpet machine digital workshop are verified. On this basis, cloud and remote access to the underlying device data is realized. The application of this information model and information integration scheme in actual production explores the use of OPC UA technology in the digital workshop of the tufting carpet machine.
Funding: supported by the National Key Research and Development Program of China (Grant No. 2022YFE0208600), the Key Research and Development Plan of Shaanxi Province of China (Grant No. 2023-ZDLSF-66), the National Natural Science Foundation of China (Grant No. 51908111), and the SRTP project of Southeast University (Grant No. 202310286006Z).
Abstract: Given the significant energy consumption and environmental impact, it is crucial to identify the carbon emission characteristics of building foundation construction during the design phase. This study establishes a process-based carbon evaluation model by adopting Building Information Modeling (BIM), calculates the materialization-stage carbon emissions of building foundations without basement space in China, and identifies the factors influencing the emissions through correlation analysis. These five factors are the building function type, building structure type, foundation area, foundation treatment method, and foundation depth. Additionally, this study develops several machine learning-based predictive models, including decision tree, random forest, XGBoost, and neural network models. Among these, XGBoost demonstrates relatively higher accuracy and minimal errors, achieving an RMSE of 206.62 and an R² of 0.88 on the testing group. The study reveals substantial variability in carbon emissions per floor area of foundations, ranging from 100 to 2000 kgCO₂e/m², demonstrating the potential for optimizing carbon emissions during the design phase of buildings. Moreover, materials contribute significantly to total carbon emissions, accounting for 78%–97%, suggesting a significant opportunity for using BIM technology in the design phase to optimize carbon reduction efforts.
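The prediction step above can be sketched with a gradient-boosted tree regressor (a stand-in for XGBoost) on synthetic features mirroring the paper's five factors. The feature encodings, the linear-plus-noise response, and all coefficients are illustrative assumptions, not the paper's data.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n = 400
X = np.column_stack([
    rng.integers(0, 4, n),          # building function type (coded)
    rng.integers(0, 3, n),          # structure type (coded)
    rng.uniform(200, 5000, n),      # foundation area, m^2
    rng.integers(0, 5, n),          # foundation treatment method (coded)
    rng.uniform(1, 15, n),          # foundation depth, m
])
# synthetic embodied carbon, kgCO2e, driven mainly by area and depth
y = 100 + 0.2 * X[:, 2] + 60 * X[:, 4] + 50 * X[:, 1] + rng.normal(0, 30, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=4)
model = GradientBoostingRegressor(random_state=4).fit(X_tr, y_tr)
pred = model.predict(X_te)
rmse = np.mean((y_te - pred) ** 2) ** 0.5
r2 = r2_score(y_te, pred)
print(f"RMSE = {rmse:.1f}, R2 = {r2:.3f}")
```

Reporting RMSE and R² on a held-out split mirrors how the paper compares its candidate models on the testing group.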
Funding: The authors acknowledge financial support from the National Natural Science Foundation of China (Grant Nos. 51771114 and 51371117).
Abstract: The metastable retained austenite (RA) plays a significant role in the excellent mechanical performance of quenching and partitioning (Q&P) steels, while the volume fraction of RA (V_RA) is challenging to predict directly due to the complicated relationships between chemical composition and process parameters such as the quenching temperature (Q_T). A Gaussian process regression model was developed to predict V_RA, and the model accuracy was further improved by introducing a metallurgical parameter, the martensite fraction (f_M), to accurately predict V_RA in Q&P steels. The developed machine learning model combined with Bayesian global optimization can serve as an alternative selection strategy for the quenching temperature, and this strategy is very efficient: it found the "optimum" Q_T with the maximum V_RA using only seven consecutive iterations. The benchmark experiment also reveals that the developed machine learning model predicts V_RA more accurately than the popular constrained carbon equilibrium thermodynamic model, and even better than a thermo-kinetic quenching-partitioning-tempering-local equilibrium model.
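The selection strategy can be sketched as a Gaussian process surrogate of V_RA versus quenching temperature driving a simple upper-confidence-bound search, echoing the paper's seven-iteration Bayesian optimization. The response curve, its peak near 240 degC, the kernel settings, and the acquisition rule are all illustrative assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def v_ra(t_q):
    # synthetic V_RA(quenching temperature), peaked near 240 degC (assumed)
    return 0.15 * np.exp(-((t_q - 240.0) / 60.0) ** 2)

rng = np.random.default_rng(5)
T = rng.uniform(150, 350, 5)[:, None]           # initial design points, degC
V = v_ra(T[:, 0])

grid = np.linspace(150, 350, 201)[:, None]
for _ in range(7):                              # seven consecutive iterations
    gp = GaussianProcessRegressor(kernel=RBF(50.0), alpha=1e-6,
                                  normalize_y=True).fit(T, V)
    mu, sd = gp.predict(grid, return_std=True)
    t_next = grid[np.argmax(mu + 1.96 * sd)]    # upper confidence bound
    T = np.vstack([T, t_next[None, :]])
    V = np.append(V, v_ra(t_next[0]))

t_best = T[np.argmax(V), 0]
print(f"quenching temperature with maximum V_RA among samples: {t_best:.0f} degC")
```

Each iteration spends one "experiment" where the surrogate's optimistic estimate is highest, which is why so few evaluations suffice on a smooth response.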
Abstract: The world's increasing population requires the process industry to produce food, fuels, chemicals, and consumer products in a more efficient and sustainable way. Functional process materials lie at the heart of this challenge. Traditionally, new advanced materials are found empirically or through trial-and-error approaches. As theoretical methods and associated tools are continuously improved and computer power has reached a high level, it is now efficient and popular to use computational methods to guide material selection and design. Due to the strong interaction between material selection and the operation of the process in which the material is used, it is essential to perform material and process design simultaneously. Despite this significant connection, solving the integrated material and process design problem is not easy because multiple models at different scales are usually required. Hybrid modeling provides a promising option for tackling such complex design problems. In hybrid modeling, the material properties, which are computationally expensive to obtain, are described by data-driven models, while the well-known process-related principles are represented by mechanistic models. This article highlights the significance of hybrid modeling in multiscale material and process design. The generic design methodology is first introduced. Six important application areas are then selected: four from the chemical engineering field and two from the energy systems engineering domain. For each selected area, state-of-the-art work using hybrid modeling for multiscale material and process design is discussed. Concluding remarks are provided at the end, and current limitations and future opportunities are pointed out.
Funding: supported by the National Natural Science Foundation of China under Grant Nos. 60974039 and 61573378, the Natural Science Foundation of Shandong Province under Grant No. ZR2011FM002, and the Fundamental Research Funds for the Central Universities under Grant No. 15CX06064A.
Abstract: In this paper, an interacting multiple-model (IMM) method based on data-driven identification models is proposed for the prediction of nonlinear dynamic systems. First, two basic models are selected as combination components owing to their proven effectiveness. One is the Gaussian process (GP) model, which can provide the predictive variance of the predicted output and has only a few parameters to optimize. The other is the regularized extreme learning machine (RELM) model, which mitigates the overfitting caused by the empirical risk minimization principle and enhances overall generalization performance. Both models are then updated continually using meaningful new data selected by data selection methods. Furthermore, recursive methods are employed in the two models to reduce the computational burden caused by continuous renewal. Finally, the two models are combined in the IMM algorithm to realize hybrid prediction, which avoids the error accumulation of single-model prediction. To verify its performance, the proposed method is applied to the prediction of moisture content in alkali-surfactant-polymer (ASP) flooding. The simulation results show that the proposed model matches the process very well, and the IMM algorithm outperforms its component models, providing a clear improvement in accuracy and robustness.
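The combination step can be sketched compactly: a GP model and a regularized extreme learning machine (here, random tanh features plus a ridge solve) are fused with weights derived from a Gaussian likelihood of recent residuals, in the spirit of IMM. The data, the weighting scale, and the RELM size are illustrative assumptions; the paper's recursive updates and data selection are omitted.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(6)
x = np.linspace(0.0, 6.0, 80)[:, None]
y = np.sin(x[:, 0]) + rng.normal(0, 0.05, 80)   # synthetic nonlinear process
idx = rng.permutation(80)
x_tr, y_tr = x[idx[:60]], y[idx[:60]]
x_te, y_te = x[idx[60:]], y[idx[60:]]

# Model 1: Gaussian process regression
gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(1e-3)).fit(x_tr, y_tr)

# Model 2: RELM-style model, 50 random hidden units with an L2 penalty
W, b = rng.normal(size=(1, 50)), rng.normal(size=50)
H = np.tanh(x_tr @ W + b)
beta = np.linalg.solve(H.T @ H + 1e-2 * np.eye(50), H.T @ y_tr)

def relm(x_new):
    return np.tanh(x_new @ W + b) @ beta

# Fuse: weights proportional to a Gaussian likelihood of recent residuals
p1, p2 = gp.predict(x_te), relm(x_te)
m1, m2 = np.mean((y_te - p1) ** 2), np.mean((y_te - p2) ** 2)
w1, w2 = np.exp(-m1 / 0.01), np.exp(-m2 / 0.01)
w1, w2 = w1 / (w1 + w2), w2 / (w1 + w2)
fused = w1 * p1 + w2 * p2
rmse = np.mean((y_te - fused) ** 2) ** 0.5
print(f"model weights: {w1:.2f}/{w2:.2f}, fused RMSE: {rmse:.3f}")
```

In an online setting, the weights would be recomputed as each new measurement arrives, letting the better-matched model dominate the hybrid prediction at each step.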