Efficiency of calculating a dynamic response is an important point of the compliant mechanism for posture adjustment.Dynamic modeling with low orders of a 2R1T compliant parallel mechanism is studied in the paper.The ...Efficiency of calculating a dynamic response is an important point of the compliant mechanism for posture adjustment.Dynamic modeling with low orders of a 2R1T compliant parallel mechanism is studied in the paper.The mechanism with two out-of-plane rotational and one lifting degrees of freedom(DoFs)plays an important role in posture adjustment.Based on elastic beam theory,the stiffness matrix and mass matrix of the beam element are established where the moment of inertia is considered.To improve solving efficiency,a dynamic model with low orders of the mechanism is established based on a modified modal synthesis method.Firstly,each branch of the RPR type mechanism is divided into a substructure.Subsequently,a set of hypothetical modes of each substructure is obtained based on the C-B method.Finally,dynamic equation of the whole mechanism is established by the substructure assembly.A dynamic experiment is conducted to verify the dynamic characteristics of the compliant mechanism.展开更多
Light olefins is the incredibly important materials in chemical industry.Methanol to olefins(MTO),which provides a non-oil route for light olefins production,received considerable attention in the past decades.However...Light olefins is the incredibly important materials in chemical industry.Methanol to olefins(MTO),which provides a non-oil route for light olefins production,received considerable attention in the past decades.However,the catalyst deactivation is an inevitable feature in MTO processes,and regeneration,therefore,is one of the key steps in industrial MTO processes.Traditionally the MTO catalyst is regenerated by removing the deposited coke via air combustion,which unavoidably transforms coke into carbon dioxide and reduces the carbon utilization efficiency.Recent study shows that the coke species over MTO catalyst can be regenerated via steam,which can promote the light olefins yield as the deactivated coke species can be essentially transferred to industrially useful synthesis gas,is a promising pathway for further MTO processes development.In this work,we modelled and analyzed these two MTO regeneration methods in terms of carbon utilization efficiency and technology economics.As shown,the steam regeneration could achieve a carbon utilization efficiency of 84.31%,compared to 74.74%for air combustion regeneration.The MTO processes using steam regeneration can essentially achieve the near-zero carbon emission.In addition,light olefins production of the MTO processes using steam regeneration is 12.81%higher than that using air combustion regeneration.In this regard,steam regeneration could be considered as a potential yet promising regeneration method for further MTO processes,showing not only great environmental benefits but also competitive economic performance.展开更多
To equip data-driven dynamic chemical process models with strong interpretability,we develop a light attention–convolution–gate recurrent unit(LACG)architecture with three sub-modules—a basic module,a brand-new lig...To equip data-driven dynamic chemical process models with strong interpretability,we develop a light attention–convolution–gate recurrent unit(LACG)architecture with three sub-modules—a basic module,a brand-new light attention module,and a residue module—that are specially designed to learn the general dynamic behavior,transient disturbances,and other input factors of chemical processes,respectively.Combined with a hyperparameter optimization framework,Optuna,the effectiveness of the proposed LACG is tested by distributed control system data-driven modeling experiments on the discharge flowrate of an actual deethanization process.The LACG model provides significant advantages in prediction accuracy and model generalization compared with other models,including the feedforward neural network,convolution neural network,long short-term memory(LSTM),and attention-LSTM.Moreover,compared with the simulation results of a deethanization model built using Aspen Plus Dynamics V12.1,the LACG parameters are demonstrated to be interpretable,and more details on the variable interactions can be observed from the model parameters in comparison with the traditional interpretable model attention-LSTM.This contribution enriches interpretable machine learning knowledge and provides a reliable method with high accuracy for actual chemical process modeling,paving a route to intelligent manufacturing.展开更多
The exploration of Mars would heavily rely on Martian rocks mechanics and engineering technology.As the mechanical property of Martian rocks is uncertain,it is of utmost importance to predict the probability distribut...The exploration of Mars would heavily rely on Martian rocks mechanics and engineering technology.As the mechanical property of Martian rocks is uncertain,it is of utmost importance to predict the probability distribution of Martian rocks mechanical property for the success of Mars exploration.In this paper,a fast and accurate probability distribution method for predicting the macroscale elastic modulus of Martian rocks was proposed by integrating the microscale rock mechanical experiments(micro-RME),accurate grain-based modeling(AGBM)and upscaling methods based on reliability principles.Firstly,the microstructure of NWA12564 Martian sample and elastic modulus of each mineral were obtained by micro-RME with TESCAN integrated mineral analyzer(TIMA)and nanoindentation.The best probability distribution function of the minerals was determined by Kolmogorov-Smirnov(K-S)test.Secondly,based on best distribution function of each mineral,the Monte Carlo simulations(MCS)and upscaling methods were implemented to obtain the probability distribution of upscaled elastic modulus.Thirdly,the correlation between the upscaled elastic modulus and macroscale elastic modulus obtained by AGBM was established.The accurate probability distribution of the macroscale elastic modulus was obtained by this correlation relationship.The proposed method can predict the probability distribution of Martian rocks mechanical property with any size and shape samples.展开更多
Geomaterials with inferior hydraulic and strength characteristics often need improvement to enhance their engineering behaviors.Traditional ground improvement techniques require enormous mechanical effort or synthetic...Geomaterials with inferior hydraulic and strength characteristics often need improvement to enhance their engineering behaviors.Traditional ground improvement techniques require enormous mechanical effort or synthetic chemicals.Sustainable stabilization technique such as microbially induced calcite precipitation(MICP)utilizes bacterial metabolic processes to precipitate cementitious calcium carbonate.The reactive transport of biochemical species in the soil mass initiates the precipitation of biocement during the MICP process.The precipitated biocement alters the hydro-mechanical performance of the soil mass.Usually,the flow,deformation,and transport phenomena regulate the biocementation technique via coupled bio-chemo-hydro-mechanical(BCHM)processes.Among all,one crucial phenomenon controlling the precipitation mechanism is the encapsulation of biomass by calcium carbonate.Biomass encapsulation can potentially reduce the biochemical reaction rate and decelerate biocementation.Laboratory examination of the encapsulation process demands a thorough analysis of associated coupled effects.Despite this,a numerical model can assist in capturing the coupled processes influencing encapsulation during the MICP treatment.However,most numerical models did not consider biochemical reaction rate kinetics accounting for the influence of bacterial encapsulation.Given this,the current study developed a coupled BCHM model to evaluate the effect of encapsulation on the precipitated calcite content using a micro-scale semiempirical relationship.Firstly,the developed BCHM model was verified and validated using numerical and experimental observations of soil column tests.Later,the encapsulation phenomenon was investigated in the soil columns of variable maximum calcite crystal sizes.The results depict altered reaction rates due to the encapsulation phenomenon and an observable change in the precipitated calcite content for each maximum crystal size.Furthermore,the permeability and deformation of the soil mass were affected by the simultaneous precipitation of calcium carbonate.Overall,the present study comprehended the influence of the encapsulation of bacteria on cement morphology-induced permeability,biocement-induced stresses and displacements.展开更多
Software Development Life Cycle (SDLC) is one of the major ingredients for the development of efficient software systems within a time frame and low-cost involvement. From the literature, it is evident that there are ...Software Development Life Cycle (SDLC) is one of the major ingredients for the development of efficient software systems within a time frame and low-cost involvement. From the literature, it is evident that there are various kinds of process models that are used by the software industries for the development of small, medium and long-term software projects, but many of them do not cover risk management. It is quite obvious that the improper selection of the software development process model leads to failure of the software products as it is time bound activity. In the present work, a new software development process model is proposed which covers the risks at any stage of the development of the software product. The model is named a Hemant-Vipin (HV) process model and may be helpful for the software industries for development of the efficient software products and timely delivery at the end of the client. The efficiency of the HV process model is observed by considering various kinds of factors like requirement clarity, user feedback, change agility, predictability, risk identification, practical implementation, customer satisfaction, incremental development, use of ready-made components, quick design, resource organization and many more and found through a case study that the presented approach covers many of parameters in comparison of the existing process models. .展开更多
Objective:To explore and analyze the work process-based practical training teaching model for basic nursing skills in vocational colleges and its implementation effects.Methods:A total of 82 nursing students from our ...Objective:To explore and analyze the work process-based practical training teaching model for basic nursing skills in vocational colleges and its implementation effects.Methods:A total of 82 nursing students from our school were selected for the study,which was conducted from April 2023 to April 2024.Using a random number table method,the students were divided into an observation group and a control group,each with 41 students.The control group received conventional practical training teaching,while the observation group followed the work process-based practical training model for basic nursing skills.The assessment scores and teaching satisfaction of the two groups were compared.Results:The comparison of assessment scores showed that the observation group performed significantly better than the control group(P<0.05).The comparison of teaching satisfaction also indicated that the observation group had significantly higher satisfaction than the control group(P<0.05).Conclusion:The work process-based practical training teaching model for basic nursing skills in vocational colleges can improve students’assessment scores and enhance teaching satisfaction,demonstrating its value for wider application.展开更多
The knowledge of the existence,distribution and fate of polycyclic aromatic hydrocarbons(PAHs)and substituted polycyclic aromatic hydrocarbons(SPAHs)in wastewater treatment plants(WWTPs)was vital for reducing their co...The knowledge of the existence,distribution and fate of polycyclic aromatic hydrocarbons(PAHs)and substituted polycyclic aromatic hydrocarbons(SPAHs)in wastewater treatment plants(WWTPs)was vital for reducing their concentrations entering the aquatic environment.The concentrations of 13 SPAHs and 16 PAHs were all determined in a WWTP with styrene butadiene rubber(SBR)in partnership with the moving bed biofilm reactor(MBBR)process.SPAHs presented a higher concentration lever than PAHs in nearly all samples.The total removal efficiencies of PAHs and SPAHs ranged from 64.0%to 71.36%and 78.4%to 79.7%,respectively.The total yearly loads of PAHs(43.0 kg)and SPAHs(73.0 kg)were mainly reduced by the primary and SBR/MBBR biological treatment stages.The tertiary treatment stage had a minor contribution to target compounds removal.According to a synthesis and improvement fate model,we found that the dominant processes changed as the chemical octanol water partition coefficient(K_(ow))increased.But the seasonal variations of experimental removal efficiencies were more obvious than that of predicted data.In the primary sedimentation tank,dissolution in the aqueous phase and sorption to sludge/particulate matter were controlling processes for the removal of PAHs and SPAHs.The sorption to sludge and biodegradation were the principal removal mechanisms during the SBR/MBBR biological treatment process.The contribution of volatilization to removal was always insignificant.Furthermore,the basic physicochemical properties and operating parameters influenced the fate of PAHs and SPAHs in the WWTP.展开更多
To improve the accuracy of node trust evaluation in a distributed network, a trust model based on the experience of individuals is proposed, which establishes a new trust assessment system by introducing the experienc...To improve the accuracy of node trust evaluation in a distributed network, a trust model based on the experience of individuals is proposed, which establishes a new trust assessment system by introducing the experience factor and the comparative experience factor. The new evaluation system considers the differences between individuals and interactive histories between nodes, which solves the problem that nodes have inaccurate assessments due to the asymmetry of nodes to a certain extent. The algorithm analysis indicates that the new model uses different deviating values of tolerance evaluation for different individuals and uses different updating values embodying node individuation when updating feedback credibility of individuals, which evaluates the trust value more reasonably and more accurately. In addition, the proposed algorithm can be used in various trust models and has a good scalability.展开更多
With the development of automation and informatization in the steelmaking industry,the human brain gradually fails to cope with an increasing amount of data generated during the steelmaking process.Machine learning te...With the development of automation and informatization in the steelmaking industry,the human brain gradually fails to cope with an increasing amount of data generated during the steelmaking process.Machine learning technology provides a new method other than production experience and metallurgical principles in dealing with large amounts of data.The application of machine learning in the steelmaking process has become a research hotspot in recent years.This paper provides an overview of the applications of machine learning in the steelmaking process modeling involving hot metal pretreatment,primary steelmaking,secondary refining,and some other aspects.The three most frequently used machine learning algorithms in steelmaking process modeling are the artificial neural network,support vector machine,and case-based reasoning,demonstrating proportions of 56%,14%,and 10%,respectively.Collected data in the steelmaking plants are frequently faulty.Thus,data processing,especially data cleaning,is crucially important to the performance of machine learning models.The detection of variable importance can be used to optimize the process parameters and guide production.Machine learning is used in hot metal pretreatment modeling mainly for endpoint S content prediction.The predictions of the endpoints of element compositions and the process parameters are widely investigated in primary steelmaking.Machine learning is used in secondary refining modeling mainly for ladle furnaces,Ruhrstahl–Heraeus,vacuum degassing,argon oxygen decarburization,and vacuum oxygen decarburization processes.Further development of machine learning in the steelmaking process modeling can be realized through additional efforts in the construction of the data platform,the industrial transformation of the research achievements to the practical steelmaking process,and the improvement of the universality of the machine learning models.展开更多
安全生产事故往往由多组织交互、多因素耦合造成,事故原因涉及多个组织。为预防和遏制多组织生产安全事故的发生,基于系统理论事故建模与过程模型(Systems-Theory Accident Modeling and Process,STAMP)、24Model,构建一种用于多组织事...安全生产事故往往由多组织交互、多因素耦合造成,事故原因涉及多个组织。为预防和遏制多组织生产安全事故的发生,基于系统理论事故建模与过程模型(Systems-Theory Accident Modeling and Process,STAMP)、24Model,构建一种用于多组织事故分析的方法,并以青岛石油爆炸事故为例进行事故原因分析。结果显示:STAMP-24Model可以分组织,分层次且有效、全面、详细地分析涉及多个组织的事故原因,探究多组织之间的交互关系;对事故进行动态演化分析,可得到各组织不安全动作耦合关系与形成的事故失效链及管控失效路径,进而为预防多组织事故提供思路和参考。展开更多
Steam cracking is the dominant technology for producing light olefins,which are believed to be the foundation of the chemical industry.Predictive models of the cracking process can boost production efficiency and prof...Steam cracking is the dominant technology for producing light olefins,which are believed to be the foundation of the chemical industry.Predictive models of the cracking process can boost production efficiency and profit margin.Rapid advancements in machine learning research have recently enabled data-driven solutions to usher in a new era of process modeling.Meanwhile,its practical application to steam cracking is still hindered by the trade-off between prediction accuracy and computational speed.This research presents a framework for data-driven intelligent modeling of the steam cracking process.Industrial data preparation and feature engineering techniques provide computational-ready datasets for the framework,and feedstock similarities are exploited using k-means clustering.We propose LArge-Residuals-Deletion Multivariate Adaptive Regression Spline(LARD-MARS),a modeling approach that explicitly generates output formulas and eliminates potentially outlying instances.The framework is validated further by the presentation of clustering results,the explanation of variable importance,and the testing and comparison of model performance.展开更多
The comprehensive tire building and shaping processes are investigated through the finite element method(FEM)in this article.The mechanical properties of the uncured rubber from different tire components are investiga...The comprehensive tire building and shaping processes are investigated through the finite element method(FEM)in this article.The mechanical properties of the uncured rubber from different tire components are investigated through cyclic loading-unloading experiments under different strain rates.Based on the experiments,an elastoviscoplastic constitutive model is adopted to describe themechanical behaviors of the uncured rubber.The distinct mechanical properties,including the stress level,hysteresis and residual strain,of the uncured rubber can all be well characterized.The whole tire building process(including component winding,rubber bladder inflation,component stitching and carcass band folding-back)and the shaping process are simulated using this constitutive model.The simulated green tire profile is in good agreement with the actual profile obtained through 3D scanning.The deformation and stress of the rubber components and the cord reinforcements during production can be obtained fromthe FE simulation,which is helpful for judging the rationality of the tire construction design.Finally,the influence of the parameter“drum width”is investigated,and the simulated result is found to be consistent with the experimental observations,which verifies the effectiveness of the simulation.The established simulation strategy provides some guiding significance for the improvement of tire design parameters and the elimination of tire production defects.展开更多
Image processing networks have gained great success in many fields,and thus the issue of copyright protection for image processing networks hasbecome a focus of attention. Model watermarking techniques are widely used...Image processing networks have gained great success in many fields,and thus the issue of copyright protection for image processing networks hasbecome a focus of attention. Model watermarking techniques are widely usedin model copyright protection, but there are two challenges: (1) designinguniversal trigger sample watermarking for different network models is stilla challenge;(2) existing methods of copyright protection based on trigger swatermarking are difficult to resist forgery attacks. In this work, we propose adual model watermarking framework for copyright protection in image processingnetworks. The trigger sample watermark is embedded in the trainingprocess of the model, which can effectively verify the model copyright. And wedesign a common method for generating trigger sample watermarks based ongenerative adversarial networks, adaptively generating trigger sample watermarksaccording to different models. The spatial watermark is embedded intothe model output. When an attacker steals model copyright using a forgedtrigger sample watermark, which can be correctly extracted to distinguishbetween the piratical and the protected model. The experiments show that theproposed framework has good performance in different image segmentationnetworks of UNET, UNET++, and FCN (fully convolutional network), andeffectively resists forgery attacks.展开更多
Powered by advanced information technology,more and more complex systems are exhibiting characteristics of the cyber-physical-social systems(CPSS).In this context,computational experiments method has emerged as a nove...Powered by advanced information technology,more and more complex systems are exhibiting characteristics of the cyber-physical-social systems(CPSS).In this context,computational experiments method has emerged as a novel approach for the design,analysis,management,control,and integration of CPSS,which can realize the causal analysis of complex systems by means of“algorithmization”of“counterfactuals”.However,because CPSS involve human and social factors(e.g.,autonomy,initiative,and sociality),it is difficult for traditional design of experiment(DOE)methods to achieve the generative explanation of system emergence.To address this challenge,this paper proposes an integrated approach to the design of computational experiments,incorporating three key modules:1)Descriptive module:Determining the influencing factors and response variables of the system by means of the modeling of an artificial society;2)Interpretative module:Selecting factorial experimental design solution to identify the relationship between influencing factors and macro phenomena;3)Predictive module:Building a meta-model that is equivalent to artificial society to explore its operating laws.Finally,a case study of crowd-sourcing platforms is presented to illustrate the application process and effectiveness of the proposed approach,which can reveal the social impact of algorithmic behavior on“rider race”.展开更多
Coal bed methane has been considered as an important energy resource.One major difficulty of purifying coal bed methane comes from the similar physical properties of CH_4 and N_2.The ZIF-8/water-glycol slurry was used...Coal bed methane has been considered as an important energy resource.One major difficulty of purifying coal bed methane comes from the similar physical properties of CH_4 and N_2.The ZIF-8/water-glycol slurry was used as a medium to separate coal bed methane by fluidifying the solid adsorbent material.The sorption equilibrium experiment of binary mixture(CH_4/N_2)and slurry was conducted.The selectivity of CH_4 to N_2 is within the range of 2-6,which proved the feasibility of the slurry separation method.The modified Langmuir equation was used to describe the gas-slurry phase equilibrium behavior,and the calculated results were in good agreement with the experimental data.A continuous absorption-adsorption and desorption process on the separation of CH_4/N_2 in slurry is proposed and its mathematical model is also developed.Sensitivity analysis is conducted to determine the operation conditions and the energy performance of the proposed process was also evaluated.Feed gas contains 30 mol%of methane and the methane concentration in product gas is 95.46 mol%with the methane recovery ratio of 90.74%.The total energy consumption for per unit volume of product gas is determined as 1.846 kWh Nm^(-3).Experimental results and process simulation provide basic data for the design and operation of pilot and industrial plant.展开更多
The complex sand-casting process combined with the interactions between process parameters makes it difficult to control the casting quality,resulting in a high scrap rate.A strategy based on a data-driven model was p...The complex sand-casting process combined with the interactions between process parameters makes it difficult to control the casting quality,resulting in a high scrap rate.A strategy based on a data-driven model was proposed to reduce casting defects and improve production efficiency,which includes the random forest(RF)classification model,the feature importance analysis,and the process parameters optimization with Monte Carlo simulation.The collected data includes four types of defects and corresponding process parameters were used to construct the RF model.Classification results show a recall rate above 90% for all categories.The Gini Index was used to assess the importance of the process parameters in the formation of various defects in the RF model.Finally,the classification model was applied to different production conditions for quality prediction.In the case of process parameters optimization for gas porosity defects,this model serves as an experimental process in the Monte Carlo method to estimate a better temperature distribution.The prediction model,when applied to the factory,greatly improved the efficiency of defect detection.Results show that the scrap rate decreased from 10.16% to 6.68%.展开更多
Cities are facing challenges of high rise in population number and con-sequently need to be equipped with latest smart services to provide luxuries of life to its residents.Smart integrated solutions are also a need t...Cities are facing challenges of high rise in population number and con-sequently need to be equipped with latest smart services to provide luxuries of life to its residents.Smart integrated solutions are also a need to deal with the social and environmental challenges,caused by increasing urbanization.Currently,the development of smart services’integrated network,within a city,is facing the bar-riers including;less efficient collection and sharing of data,along with inadequate collaboration of software and hardware.Aiming to resolve these issues,this paper recommended a solution for a synchronous functionality in the smart services’integration process through modeling technique.Using this integration modeling solution,atfirst,the service participants,processes and tasks of smart services are identified and then standard illustrations are developed for the better understand-ing of the integrated service group environment.Business process modeling and notation(BPMN)language based models are developed and discussed for a devised case study,to test and experiment i.e.,for remote healthcare from a smart home.The research is concluded with the integration process model application for the required data sharing among different service groups.The outcomes of the modeling are better understanding and attaining maximum automation that can be referenced and replicated.展开更多
The successful execution and management of Offshore Software Maintenance Outsourcing(OSMO)can be very beneficial for OSMO vendors and the OSMO client.Although a lot of research on software outsourcing is going on,most...The successful execution and management of Offshore Software Maintenance Outsourcing(OSMO)can be very beneficial for OSMO vendors and the OSMO client.Although a lot of research on software outsourcing is going on,most of the existing literature on offshore outsourcing deals with the outsourcing of software development only.Several frameworks have been developed focusing on guiding software systemmanagers concerning offshore software outsourcing.However,none of these studies delivered comprehensive guidelines for managing the whole process of OSMO.There is a considerable lack of research working on managing OSMO from a vendor’s perspective.Therefore,to find the best practices for managing an OSMO process,it is necessary to further investigate such complex and multifaceted phenomena from the vendor’s perspective.This study validated the preliminary OSMO process model via a case study research approach.The results showed that the OSMO process model is applicable in an industrial setting with few changes.The industrial data collected during the case study enabled this paper to extend the preliminary OSMO process model.The refined version of the OSMO processmodel has four major phases including(i)Project Assessment,(ii)SLA(iii)Execution,and(iv)Risk.展开更多
Numerical simulation is the most powerful computational and analysis tool for a large variety of engineering and physical problems.For a complex problem relating to multi-field,multi-process and multi-scale,different ...Numerical simulation is the most powerful computational and analysis tool for a large variety of engineering and physical problems.For a complex problem relating to multi-field,multi-process and multi-scale,different computing tools have to be developed so as to solve particular fields at different scales and for different processes.Therefore,the integration of different types of software is inevitable.However,it is difficult to perform the transfer of the meshes and simulated results among software packages because of the lack of shared data formats or encrypted data formats.An image processing based method for three-dimensional model reconstruction for numerical simulation was proposed,which presents a solution to the integration problem by a series of slice or projection images obtained by the post-processing modules of the numerical simulation software.By means of mapping image pixels to meshes of either finite difference or finite element models,the geometry contour can be extracted to export the stereolithography model.The values of results,represented by color,can be deduced and assigned to the meshes.All the models with data can be directly or indirectly integrated into other software as a continued or new numerical simulation.The three-dimensional reconstruction method has been validated in numerical simulation of castings and case studies were provided in this study.展开更多
基金Supported by National Natural Science Foundation of China (Grant No.51975007)。
文摘Efficiency of calculating a dynamic response is an important point of the compliant mechanism for posture adjustment.Dynamic modeling with low orders of a 2R1T compliant parallel mechanism is studied in the paper.The mechanism with two out-of-plane rotational and one lifting degrees of freedom(DoFs)plays an important role in posture adjustment.Based on elastic beam theory,the stiffness matrix and mass matrix of the beam element are established where the moment of inertia is considered.To improve solving efficiency,a dynamic model with low orders of the mechanism is established based on a modified modal synthesis method.Firstly,each branch of the RPR type mechanism is divided into a substructure.Subsequently,a set of hypothetical modes of each substructure is obtained based on the C-B method.Finally,dynamic equation of the whole mechanism is established by the substructure assembly.A dynamic experiment is conducted to verify the dynamic characteristics of the compliant mechanism.
基金the financial support from the Strategic Priority Research Program of Chinese Academy of Sciences(XDA21010100)。
文摘Light olefins is the incredibly important materials in chemical industry.Methanol to olefins(MTO),which provides a non-oil route for light olefins production,received considerable attention in the past decades.However,the catalyst deactivation is an inevitable feature in MTO processes,and regeneration,therefore,is one of the key steps in industrial MTO processes.Traditionally the MTO catalyst is regenerated by removing the deposited coke via air combustion,which unavoidably transforms coke into carbon dioxide and reduces the carbon utilization efficiency.Recent study shows that the coke species over MTO catalyst can be regenerated via steam,which can promote the light olefins yield as the deactivated coke species can be essentially transferred to industrially useful synthesis gas,is a promising pathway for further MTO processes development.In this work,we modelled and analyzed these two MTO regeneration methods in terms of carbon utilization efficiency and technology economics.As shown,the steam regeneration could achieve a carbon utilization efficiency of 84.31%,compared to 74.74%for air combustion regeneration.The MTO processes using steam regeneration can essentially achieve the near-zero carbon emission.In addition,light olefins production of the MTO processes using steam regeneration is 12.81%higher than that using air combustion regeneration.In this regard,steam regeneration could be considered as a potential yet promising regeneration method for further MTO processes,showing not only great environmental benefits but also competitive economic performance.
基金support provided by the National Natural Science Foundation of China(22122802,22278044,and 21878028)the Chongqing Science Fund for Distinguished Young Scholars(CSTB2022NSCQ-JQX0021)the Fundamental Research Funds for the Central Universities(2022CDJXY-003).
文摘To equip data-driven dynamic chemical process models with strong interpretability,we develop a light attention–convolution–gate recurrent unit(LACG)architecture with three sub-modules—a basic module,a brand-new light attention module,and a residue module—that are specially designed to learn the general dynamic behavior,transient disturbances,and other input factors of chemical processes,respectively.Combined with a hyperparameter optimization framework,Optuna,the effectiveness of the proposed LACG is tested by distributed control system data-driven modeling experiments on the discharge flowrate of an actual deethanization process.The LACG model provides significant advantages in prediction accuracy and model generalization compared with other models,including the feedforward neural network,convolution neural network,long short-term memory(LSTM),and attention-LSTM.Moreover,compared with the simulation results of a deethanization model built using Aspen Plus Dynamics V12.1,the LACG parameters are demonstrated to be interpretable,and more details on the variable interactions can be observed from the model parameters in comparison with the traditional interpretable model attention-LSTM.This contribution enriches interpretable machine learning knowledge and provides a reliable method with high accuracy for actual chemical process modeling,paving a route to intelligent manufacturing.
文摘The exploration of Mars would heavily rely on Martian rocks mechanics and engineering technology.As the mechanical property of Martian rocks is uncertain,it is of utmost importance to predict the probability distribution of Martian rocks mechanical property for the success of Mars exploration.In this paper,a fast and accurate probability distribution method for predicting the macroscale elastic modulus of Martian rocks was proposed by integrating the microscale rock mechanical experiments(micro-RME),accurate grain-based modeling(AGBM)and upscaling methods based on reliability principles.Firstly,the microstructure of NWA12564 Martian sample and elastic modulus of each mineral were obtained by micro-RME with TESCAN integrated mineral analyzer(TIMA)and nanoindentation.The best probability distribution function of the minerals was determined by Kolmogorov-Smirnov(K-S)test.Secondly,based on best distribution function of each mineral,the Monte Carlo simulations(MCS)and upscaling methods were implemented to obtain the probability distribution of upscaled elastic modulus.Thirdly,the correlation between the upscaled elastic modulus and macroscale elastic modulus obtained by AGBM was established.The accurate probability distribution of the macroscale elastic modulus was obtained by this correlation relationship.The proposed method can predict the probability distribution of Martian rocks mechanical property with any size and shape samples.
基金the funding support from the Ministry of Education,Government of India,under the Prime Minister Research Fellowship programme(Grant Nos.SB21221901CEPMRF008347 and SB22230217CEPMRF008347).
文摘Geomaterials with inferior hydraulic and strength characteristics often need improvement to enhance their engineering behaviors.Traditional ground improvement techniques require enormous mechanical effort or synthetic chemicals.Sustainable stabilization technique such as microbially induced calcite precipitation(MICP)utilizes bacterial metabolic processes to precipitate cementitious calcium carbonate.The reactive transport of biochemical species in the soil mass initiates the precipitation of biocement during the MICP process.The precipitated biocement alters the hydro-mechanical performance of the soil mass.Usually,the flow,deformation,and transport phenomena regulate the biocementation technique via coupled bio-chemo-hydro-mechanical(BCHM)processes.Among all,one crucial phenomenon controlling the precipitation mechanism is the encapsulation of biomass by calcium carbonate.Biomass encapsulation can potentially reduce the biochemical reaction rate and decelerate biocementation.Laboratory examination of the encapsulation process demands a thorough analysis of associated coupled effects.Despite this,a numerical model can assist in capturing the coupled processes influencing encapsulation during the MICP treatment.However,most numerical models did not consider biochemical reaction rate kinetics accounting for the influence of bacterial encapsulation.Given this,the current study developed a coupled BCHM model to evaluate the effect of encapsulation on the precipitated calcite content using a micro-scale semiempirical relationship.Firstly,the developed BCHM model was verified and validated using numerical and experimental observations of soil column tests.Later,the encapsulation phenomenon was investigated in the soil columns of variable maximum calcite crystal sizes.The results depict altered reaction rates due to the encapsulation phenomenon and an observable change in the precipitated calcite content for each maximum crystal size.Furthermore,the permeability and deformation of the soil mass were affected by the simultaneous precipitation of calcium carbonate.Overall,the present study comprehended the influence of the encapsulation of bacteria on cement morphology-induced permeability,biocement-induced stresses and displacements.
文摘Software Development Life Cycle (SDLC) is one of the major ingredients for the development of efficient software systems within a time frame and low-cost involvement. From the literature, it is evident that there are various kinds of process models that are used by the software industries for the development of small, medium and long-term software projects, but many of them do not cover risk management. It is quite obvious that the improper selection of the software development process model leads to failure of the software products as it is time bound activity. In the present work, a new software development process model is proposed which covers the risks at any stage of the development of the software product. The model is named a Hemant-Vipin (HV) process model and may be helpful for the software industries for development of the efficient software products and timely delivery at the end of the client. The efficiency of the HV process model is observed by considering various kinds of factors like requirement clarity, user feedback, change agility, predictability, risk identification, practical implementation, customer satisfaction, incremental development, use of ready-made components, quick design, resource organization and many more and found through a case study that the presented approach covers many of parameters in comparison of the existing process models. .
文摘Objective:To explore and analyze the work process-based practical training teaching model for basic nursing skills in vocational colleges and its implementation effects.Methods:A total of 82 nursing students from our school were selected for the study,which was conducted from April 2023 to April 2024.Using a random number table method,the students were divided into an observation group and a control group,each with 41 students.The control group received conventional practical training teaching,while the observation group followed the work process-based practical training model for basic nursing skills.The assessment scores and teaching satisfaction of the two groups were compared.Results:The comparison of assessment scores showed that the observation group performed significantly better than the control group(P<0.05).The comparison of teaching satisfaction also indicated that the observation group had significantly higher satisfaction than the control group(P<0.05).Conclusion:The work process-based practical training teaching model for basic nursing skills in vocational colleges can improve students’assessment scores and enhance teaching satisfaction,demonstrating its value for wider application.
基金This work was supported by the National Natural Science Foundation of China(No.51979255).
文摘The knowledge of the existence,distribution and fate of polycyclic aromatic hydrocarbons(PAHs)and substituted polycyclic aromatic hydrocarbons(SPAHs)in wastewater treatment plants(WWTPs)was vital for reducing their concentrations entering the aquatic environment.The concentrations of 13 SPAHs and 16 PAHs were all determined in a WWTP with styrene butadiene rubber(SBR)in partnership with the moving bed biofilm reactor(MBBR)process.SPAHs presented a higher concentration lever than PAHs in nearly all samples.The total removal efficiencies of PAHs and SPAHs ranged from 64.0%to 71.36%and 78.4%to 79.7%,respectively.The total yearly loads of PAHs(43.0 kg)and SPAHs(73.0 kg)were mainly reduced by the primary and SBR/MBBR biological treatment stages.The tertiary treatment stage had a minor contribution to target compounds removal.According to a synthesis and improvement fate model,we found that the dominant processes changed as the chemical octanol water partition coefficient(K_(ow))increased.But the seasonal variations of experimental removal efficiencies were more obvious than that of predicted data.In the primary sedimentation tank,dissolution in the aqueous phase and sorption to sludge/particulate matter were controlling processes for the removal of PAHs and SPAHs.The sorption to sludge and biodegradation were the principal removal mechanisms during the SBR/MBBR biological treatment process.The contribution of volatilization to removal was always insignificant.Furthermore,the basic physicochemical properties and operating parameters influenced the fate of PAHs and SPAHs in the WWTP.
文摘To improve the accuracy of node trust evaluation in a distributed network, a trust model based on the experience of individuals is proposed, which establishes a new trust assessment system by introducing the experience factor and the comparative experience factor. The new evaluation system considers the differences between individuals and interactive histories between nodes, which solves the problem that nodes have inaccurate assessments due to the asymmetry of nodes to a certain extent. The algorithm analysis indicates that the new model uses different deviating values of tolerance evaluation for different individuals and uses different updating values embodying node individuation when updating feedback credibility of individuals, which evaluates the trust value more reasonably and more accurately. In addition, the proposed algorithm can be used in various trust models and has a good scalability.
基金supported by the National Natural Science Foundation of China(No.U1960202)。
文摘With the development of automation and informatization in the steelmaking industry,the human brain gradually fails to cope with an increasing amount of data generated during the steelmaking process.Machine learning technology provides a new method other than production experience and metallurgical principles in dealing with large amounts of data.The application of machine learning in the steelmaking process has become a research hotspot in recent years.This paper provides an overview of the applications of machine learning in the steelmaking process modeling involving hot metal pretreatment,primary steelmaking,secondary refining,and some other aspects.The three most frequently used machine learning algorithms in steelmaking process modeling are the artificial neural network,support vector machine,and case-based reasoning,demonstrating proportions of 56%,14%,and 10%,respectively.Collected data in the steelmaking plants are frequently faulty.Thus,data processing,especially data cleaning,is crucially important to the performance of machine learning models.The detection of variable importance can be used to optimize the process parameters and guide production.Machine learning is used in hot metal pretreatment modeling mainly for endpoint S content prediction.The predictions of the endpoints of element compositions and the process parameters are widely investigated in primary steelmaking.Machine learning is used in secondary refining modeling mainly for ladle furnaces,Ruhrstahl–Heraeus,vacuum degassing,argon oxygen decarburization,and vacuum oxygen decarburization processes.Further development of machine learning in the steelmaking process modeling can be realized through additional efforts in the construction of the data platform,the industrial transformation of the research achievements to the practical steelmaking process,and the improvement of the universality of the machine learning models.
文摘安全生产事故往往由多组织交互、多因素耦合造成,事故原因涉及多个组织。为预防和遏制多组织生产安全事故的发生,基于系统理论事故建模与过程模型(Systems-Theory Accident Modeling and Process,STAMP)、24Model,构建一种用于多组织事故分析的方法,并以青岛石油爆炸事故为例进行事故原因分析。结果显示:STAMP-24Model可以分组织,分层次且有效、全面、详细地分析涉及多个组织的事故原因,探究多组织之间的交互关系;对事故进行动态演化分析,可得到各组织不安全动作耦合关系与形成的事故失效链及管控失效路径,进而为预防多组织事故提供思路和参考。
基金supported by the National Key Research and Development Program of China(2021 YFB 4000500,2021 YFB 4000501,and 2021 YFB 4000502)。
文摘Steam cracking is the dominant technology for producing light olefins,which are believed to be the foundation of the chemical industry.Predictive models of the cracking process can boost production efficiency and profit margin.Rapid advancements in machine learning research have recently enabled data-driven solutions to usher in a new era of process modeling.Meanwhile,its practical application to steam cracking is still hindered by the trade-off between prediction accuracy and computational speed.This research presents a framework for data-driven intelligent modeling of the steam cracking process.Industrial data preparation and feature engineering techniques provide computational-ready datasets for the framework,and feedstock similarities are exploited using k-means clustering.We propose LArge-Residuals-Deletion Multivariate Adaptive Regression Spline(LARD-MARS),a modeling approach that explicitly generates output formulas and eliminates potentially outlying instances.The framework is validated further by the presentation of clustering results,the explanation of variable importance,and the testing and comparison of model performance.
基金funded by the NationalNatural Science Foundation of China (Nos.11902229,11502181)the Strategic Priority Research Program of the Chinese Academy of Sciences (Grant Nos.XDB22040502,XDC06030200).
文摘The comprehensive tire building and shaping processes are investigated through the finite element method(FEM)in this article.The mechanical properties of the uncured rubber from different tire components are investigated through cyclic loading-unloading experiments under different strain rates.Based on the experiments,an elastoviscoplastic constitutive model is adopted to describe themechanical behaviors of the uncured rubber.The distinct mechanical properties,including the stress level,hysteresis and residual strain,of the uncured rubber can all be well characterized.The whole tire building process(including component winding,rubber bladder inflation,component stitching and carcass band folding-back)and the shaping process are simulated using this constitutive model.The simulated green tire profile is in good agreement with the actual profile obtained through 3D scanning.The deformation and stress of the rubber components and the cord reinforcements during production can be obtained fromthe FE simulation,which is helpful for judging the rationality of the tire construction design.Finally,the influence of the parameter“drum width”is investigated,and the simulated result is found to be consistent with the experimental observations,which verifies the effectiveness of the simulation.The established simulation strategy provides some guiding significance for the improvement of tire design parameters and the elimination of tire production defects.
基金supported by the National Natural Science Foundation of China under grants U1836208,by the Priority Academic Program Development of Jiangsu Higher Education Institutions (PAPD)fundby the Collaborative Innovation Center of Atmospheric Environment and Equipment Technology (CICAEET)fund,China.
文摘Image processing networks have gained great success in many fields,and thus the issue of copyright protection for image processing networks hasbecome a focus of attention. Model watermarking techniques are widely usedin model copyright protection, but there are two challenges: (1) designinguniversal trigger sample watermarking for different network models is stilla challenge;(2) existing methods of copyright protection based on trigger swatermarking are difficult to resist forgery attacks. In this work, we propose adual model watermarking framework for copyright protection in image processingnetworks. The trigger sample watermark is embedded in the trainingprocess of the model, which can effectively verify the model copyright. And wedesign a common method for generating trigger sample watermarks based ongenerative adversarial networks, adaptively generating trigger sample watermarksaccording to different models. The spatial watermark is embedded intothe model output. When an attacker steals model copyright using a forgedtrigger sample watermark, which can be correctly extracted to distinguishbetween the piratical and the protected model. The experiments show that theproposed framework has good performance in different image segmentationnetworks of UNET, UNET++, and FCN (fully convolutional network), andeffectively resists forgery attacks.
基金the National Key Research and Development Program of China(2021YFF0900800)the National Natural Science Foundation of China(61972276,62206116,62032016)+2 种基金the New Liberal Arts Reform and Practice Project of National Ministry of Education(2021170002)the Open Research Fund of the State Key Laboratory for Management and Control of Complex Systems(20210101)Tianjin University Talent Innovation Reward Program for Literature and Science Graduate Student(C1-2022-010)。
文摘Powered by advanced information technology,more and more complex systems are exhibiting characteristics of the cyber-physical-social systems(CPSS).In this context,computational experiments method has emerged as a novel approach for the design,analysis,management,control,and integration of CPSS,which can realize the causal analysis of complex systems by means of“algorithmization”of“counterfactuals”.However,because CPSS involve human and social factors(e.g.,autonomy,initiative,and sociality),it is difficult for traditional design of experiment(DOE)methods to achieve the generative explanation of system emergence.To address this challenge,this paper proposes an integrated approach to the design of computational experiments,incorporating three key modules:1)Descriptive module:Determining the influencing factors and response variables of the system by means of the modeling of an artificial society;2)Interpretative module:Selecting factorial experimental design solution to identify the relationship between influencing factors and macro phenomena;3)Predictive module:Building a meta-model that is equivalent to artificial society to explore its operating laws.Finally,a case study of crowd-sourcing platforms is presented to illustrate the application process and effectiveness of the proposed approach,which can reveal the social impact of algorithmic behavior on“rider race”.
基金The financial supports received from the National Natural Science Foundation of China(21522609,21636009 and 21878328)the National Key Research and Development Program of China(Nos.2017YFC0307302,2016YFC0304003)+1 种基金the Science Foundation of China University of Petroleum,Beijing(No.2462018BJC004)Beijing Science and Technology Program,China(No.Z181100005118010)。
文摘Coal bed methane has been considered as an important energy resource.One major difficulty of purifying coal bed methane comes from the similar physical properties of CH_4 and N_2.The ZIF-8/water-glycol slurry was used as a medium to separate coal bed methane by fluidifying the solid adsorbent material.The sorption equilibrium experiment of binary mixture(CH_4/N_2)and slurry was conducted.The selectivity of CH_4 to N_2 is within the range of 2-6,which proved the feasibility of the slurry separation method.The modified Langmuir equation was used to describe the gas-slurry phase equilibrium behavior,and the calculated results were in good agreement with the experimental data.A continuous absorption-adsorption and desorption process on the separation of CH_4/N_2 in slurry is proposed and its mathematical model is also developed.Sensitivity analysis is conducted to determine the operation conditions and the energy performance of the proposed process was also evaluated.Feed gas contains 30 mol%of methane and the methane concentration in product gas is 95.46 mol%with the methane recovery ratio of 90.74%.The total energy consumption for per unit volume of product gas is determined as 1.846 kWh Nm^(-3).Experimental results and process simulation provide basic data for the design and operation of pilot and industrial plant.
基金financially supported by the National Key Research and Development Program of China(2022YFB3706800,2020YFB1710100)the National Natural Science Foundation of China(51821001,52090042,52074183)。
文摘The complex sand-casting process combined with the interactions between process parameters makes it difficult to control the casting quality,resulting in a high scrap rate.A strategy based on a data-driven model was proposed to reduce casting defects and improve production efficiency,which includes the random forest(RF)classification model,the feature importance analysis,and the process parameters optimization with Monte Carlo simulation.The collected data includes four types of defects and corresponding process parameters were used to construct the RF model.Classification results show a recall rate above 90% for all categories.The Gini Index was used to assess the importance of the process parameters in the formation of various defects in the RF model.Finally,the classification model was applied to different production conditions for quality prediction.In the case of process parameters optimization for gas porosity defects,this model serves as an experimental process in the Monte Carlo method to estimate a better temperature distribution.The prediction model,when applied to the factory,greatly improved the efficiency of defect detection.Results show that the scrap rate decreased from 10.16% to 6.68%.
文摘Cities are facing challenges of high rise in population number and con-sequently need to be equipped with latest smart services to provide luxuries of life to its residents.Smart integrated solutions are also a need to deal with the social and environmental challenges,caused by increasing urbanization.Currently,the development of smart services’integrated network,within a city,is facing the bar-riers including;less efficient collection and sharing of data,along with inadequate collaboration of software and hardware.Aiming to resolve these issues,this paper recommended a solution for a synchronous functionality in the smart services’integration process through modeling technique.Using this integration modeling solution,atfirst,the service participants,processes and tasks of smart services are identified and then standard illustrations are developed for the better understand-ing of the integrated service group environment.Business process modeling and notation(BPMN)language based models are developed and discussed for a devised case study,to test and experiment i.e.,for remote healthcare from a smart home.The research is concluded with the integration process model application for the required data sharing among different service groups.The outcomes of the modeling are better understanding and attaining maximum automation that can be referenced and replicated.
基金This research is fully funded byUniversiti Malaysia Terengganu under the research Grant(PGRG).
文摘The successful execution and management of Offshore Software Maintenance Outsourcing(OSMO)can be very beneficial for OSMO vendors and the OSMO client.Although a lot of research on software outsourcing is going on,most of the existing literature on offshore outsourcing deals with the outsourcing of software development only.Several frameworks have been developed focusing on guiding software systemmanagers concerning offshore software outsourcing.However,none of these studies delivered comprehensive guidelines for managing the whole process of OSMO.There is a considerable lack of research working on managing OSMO from a vendor’s perspective.Therefore,to find the best practices for managing an OSMO process,it is necessary to further investigate such complex and multifaceted phenomena from the vendor’s perspective.This study validated the preliminary OSMO process model via a case study research approach.The results showed that the OSMO process model is applicable in an industrial setting with few changes.The industrial data collected during the case study enabled this paper to extend the preliminary OSMO process model.The refined version of the OSMO processmodel has four major phases including(i)Project Assessment,(ii)SLA(iii)Execution,and(iv)Risk.
基金funded by National Key R&D Program of China(No.2021YFB3401200)the National Natural Science Foundation of China(No.51875308)the Beijing Nature Sciences Fund-Haidian Originality Cooperation Project(L212002).
文摘Numerical simulation is the most powerful computational and analysis tool for a large variety of engineering and physical problems.For a complex problem relating to multi-field,multi-process and multi-scale,different computing tools have to be developed so as to solve particular fields at different scales and for different processes.Therefore,the integration of different types of software is inevitable.However,it is difficult to perform the transfer of the meshes and simulated results among software packages because of the lack of shared data formats or encrypted data formats.An image processing based method for three-dimensional model reconstruction for numerical simulation was proposed,which presents a solution to the integration problem by a series of slice or projection images obtained by the post-processing modules of the numerical simulation software.By means of mapping image pixels to meshes of either finite difference or finite element models,the geometry contour can be extracted to export the stereolithography model.The values of results,represented by color,can be deduced and assigned to the meshes.All the models with data can be directly or indirectly integrated into other software as a continued or new numerical simulation.The three-dimensional reconstruction method has been validated in numerical simulation of castings and case studies were provided in this study.