Light olefins is the incredibly important materials in chemical industry.Methanol to olefins(MTO),which provides a non-oil route for light olefins production,received considerable attention in the past decades.However...Light olefins is the incredibly important materials in chemical industry.Methanol to olefins(MTO),which provides a non-oil route for light olefins production,received considerable attention in the past decades.However,the catalyst deactivation is an inevitable feature in MTO processes,and regeneration,therefore,is one of the key steps in industrial MTO processes.Traditionally the MTO catalyst is regenerated by removing the deposited coke via air combustion,which unavoidably transforms coke into carbon dioxide and reduces the carbon utilization efficiency.Recent study shows that the coke species over MTO catalyst can be regenerated via steam,which can promote the light olefins yield as the deactivated coke species can be essentially transferred to industrially useful synthesis gas,is a promising pathway for further MTO processes development.In this work,we modelled and analyzed these two MTO regeneration methods in terms of carbon utilization efficiency and technology economics.As shown,the steam regeneration could achieve a carbon utilization efficiency of 84.31%,compared to 74.74%for air combustion regeneration.The MTO processes using steam regeneration can essentially achieve the near-zero carbon emission.In addition,light olefins production of the MTO processes using steam regeneration is 12.81%higher than that using air combustion regeneration.In this regard,steam regeneration could be considered as a potential yet promising regeneration method for further MTO processes,showing not only great environmental benefits but also competitive economic performance.展开更多
Software Development Life Cycle (SDLC) is one of the major ingredients for the development of efficient software systems within a time frame and low-cost involvement. From the literature, it is evident that there are ...Software Development Life Cycle (SDLC) is one of the major ingredients for the development of efficient software systems within a time frame and low-cost involvement. From the literature, it is evident that there are various kinds of process models that are used by the software industries for the development of small, medium and long-term software projects, but many of them do not cover risk management. It is quite obvious that the improper selection of the software development process model leads to failure of the software products as it is time bound activity. In the present work, a new software development process model is proposed which covers the risks at any stage of the development of the software product. The model is named a Hemant-Vipin (HV) process model and may be helpful for the software industries for development of the efficient software products and timely delivery at the end of the client. The efficiency of the HV process model is observed by considering various kinds of factors like requirement clarity, user feedback, change agility, predictability, risk identification, practical implementation, customer satisfaction, incremental development, use of ready-made components, quick design, resource organization and many more and found through a case study that the presented approach covers many of parameters in comparison of the existing process models. .展开更多
With the development of automation and informatization in the steelmaking industry,the human brain gradually fails to cope with an increasing amount of data generated during the steelmaking process.Machine learning te...With the development of automation and informatization in the steelmaking industry,the human brain gradually fails to cope with an increasing amount of data generated during the steelmaking process.Machine learning technology provides a new method other than production experience and metallurgical principles in dealing with large amounts of data.The application of machine learning in the steelmaking process has become a research hotspot in recent years.This paper provides an overview of the applications of machine learning in the steelmaking process modeling involving hot metal pretreatment,primary steelmaking,secondary refining,and some other aspects.The three most frequently used machine learning algorithms in steelmaking process modeling are the artificial neural network,support vector machine,and case-based reasoning,demonstrating proportions of 56%,14%,and 10%,respectively.Collected data in the steelmaking plants are frequently faulty.Thus,data processing,especially data cleaning,is crucially important to the performance of machine learning models.The detection of variable importance can be used to optimize the process parameters and guide production.Machine learning is used in hot metal pretreatment modeling mainly for endpoint S content prediction.The predictions of the endpoints of element compositions and the process parameters are widely investigated in primary steelmaking.Machine learning is used in secondary refining modeling mainly for ladle furnaces,Ruhrstahl–Heraeus,vacuum degassing,argon oxygen decarburization,and vacuum oxygen decarburization processes.Further development of machine learning in the steelmaking process modeling can be realized through additional efforts in the construction of the data platform,the industrial transformation of the research achievements to the practical steelmaking process,and the improvement of the universality of the machine learning models.展开更多
安全生产事故往往由多组织交互、多因素耦合造成,事故原因涉及多个组织。为预防和遏制多组织生产安全事故的发生,基于系统理论事故建模与过程模型(Systems-Theory Accident Modeling and Process,STAMP)、24Model,构建一种用于多组织事...安全生产事故往往由多组织交互、多因素耦合造成,事故原因涉及多个组织。为预防和遏制多组织生产安全事故的发生,基于系统理论事故建模与过程模型(Systems-Theory Accident Modeling and Process,STAMP)、24Model,构建一种用于多组织事故分析的方法,并以青岛石油爆炸事故为例进行事故原因分析。结果显示:STAMP-24Model可以分组织,分层次且有效、全面、详细地分析涉及多个组织的事故原因,探究多组织之间的交互关系;对事故进行动态演化分析,可得到各组织不安全动作耦合关系与形成的事故失效链及管控失效路径,进而为预防多组织事故提供思路和参考。展开更多
Steam cracking is the dominant technology for producing light olefins,which are believed to be the foundation of the chemical industry.Predictive models of the cracking process can boost production efficiency and prof...Steam cracking is the dominant technology for producing light olefins,which are believed to be the foundation of the chemical industry.Predictive models of the cracking process can boost production efficiency and profit margin.Rapid advancements in machine learning research have recently enabled data-driven solutions to usher in a new era of process modeling.Meanwhile,its practical application to steam cracking is still hindered by the trade-off between prediction accuracy and computational speed.This research presents a framework for data-driven intelligent modeling of the steam cracking process.Industrial data preparation and feature engineering techniques provide computational-ready datasets for the framework,and feedstock similarities are exploited using k-means clustering.We propose LArge-Residuals-Deletion Multivariate Adaptive Regression Spline(LARD-MARS),a modeling approach that explicitly generates output formulas and eliminates potentially outlying instances.The framework is validated further by the presentation of clustering results,the explanation of variable importance,and the testing and comparison of model performance.展开更多
The comprehensive tire building and shaping processes are investigated through the finite element method(FEM)in this article.The mechanical properties of the uncured rubber from different tire components are investiga...The comprehensive tire building and shaping processes are investigated through the finite element method(FEM)in this article.The mechanical properties of the uncured rubber from different tire components are investigated through cyclic loading-unloading experiments under different strain rates.Based on the experiments,an elastoviscoplastic constitutive model is adopted to describe themechanical behaviors of the uncured rubber.The distinct mechanical properties,including the stress level,hysteresis and residual strain,of the uncured rubber can all be well characterized.The whole tire building process(including component winding,rubber bladder inflation,component stitching and carcass band folding-back)and the shaping process are simulated using this constitutive model.The simulated green tire profile is in good agreement with the actual profile obtained through 3D scanning.The deformation and stress of the rubber components and the cord reinforcements during production can be obtained fromthe FE simulation,which is helpful for judging the rationality of the tire construction design.Finally,the influence of the parameter“drum width”is investigated,and the simulated result is found to be consistent with the experimental observations,which verifies the effectiveness of the simulation.The established simulation strategy provides some guiding significance for the improvement of tire design parameters and the elimination of tire production defects.展开更多
Cities are facing challenges of high rise in population number and con-sequently need to be equipped with latest smart services to provide luxuries of life to its residents.Smart integrated solutions are also a need t...Cities are facing challenges of high rise in population number and con-sequently need to be equipped with latest smart services to provide luxuries of life to its residents.Smart integrated solutions are also a need to deal with the social and environmental challenges,caused by increasing urbanization.Currently,the development of smart services’integrated network,within a city,is facing the bar-riers including;less efficient collection and sharing of data,along with inadequate collaboration of software and hardware.Aiming to resolve these issues,this paper recommended a solution for a synchronous functionality in the smart services’integration process through modeling technique.Using this integration modeling solution,atfirst,the service participants,processes and tasks of smart services are identified and then standard illustrations are developed for the better understand-ing of the integrated service group environment.Business process modeling and notation(BPMN)language based models are developed and discussed for a devised case study,to test and experiment i.e.,for remote healthcare from a smart home.The research is concluded with the integration process model application for the required data sharing among different service groups.The outcomes of the modeling are better understanding and attaining maximum automation that can be referenced and replicated.展开更多
The successful execution and management of Offshore Software Maintenance Outsourcing(OSMO)can be very beneficial for OSMO vendors and the OSMO client.Although a lot of research on software outsourcing is going on,most...The successful execution and management of Offshore Software Maintenance Outsourcing(OSMO)can be very beneficial for OSMO vendors and the OSMO client.Although a lot of research on software outsourcing is going on,most of the existing literature on offshore outsourcing deals with the outsourcing of software development only.Several frameworks have been developed focusing on guiding software systemmanagers concerning offshore software outsourcing.However,none of these studies delivered comprehensive guidelines for managing the whole process of OSMO.There is a considerable lack of research working on managing OSMO from a vendor’s perspective.Therefore,to find the best practices for managing an OSMO process,it is necessary to further investigate such complex and multifaceted phenomena from the vendor’s perspective.This study validated the preliminary OSMO process model via a case study research approach.The results showed that the OSMO process model is applicable in an industrial setting with few changes.The industrial data collected during the case study enabled this paper to extend the preliminary OSMO process model.The refined version of the OSMO processmodel has four major phases including(i)Project Assessment,(ii)SLA(iii)Execution,and(iv)Risk.展开更多
Numerical simulation is the most powerful computational and analysis tool for a large variety of engineering and physical problems.For a complex problem relating to multi-field,multi-process and multi-scale,different ...Numerical simulation is the most powerful computational and analysis tool for a large variety of engineering and physical problems.For a complex problem relating to multi-field,multi-process and multi-scale,different computing tools have to be developed so as to solve particular fields at different scales and for different processes.Therefore,the integration of different types of software is inevitable.However,it is difficult to perform the transfer of the meshes and simulated results among software packages because of the lack of shared data formats or encrypted data formats.An image processing based method for three-dimensional model reconstruction for numerical simulation was proposed,which presents a solution to the integration problem by a series of slice or projection images obtained by the post-processing modules of the numerical simulation software.By means of mapping image pixels to meshes of either finite difference or finite element models,the geometry contour can be extracted to export the stereolithography model.The values of results,represented by color,can be deduced and assigned to the meshes.All the models with data can be directly or indirectly integrated into other software as a continued or new numerical simulation.The three-dimensional reconstruction method has been validated in numerical simulation of castings and case studies were provided in this study.展开更多
Image processing networks have gained great success in many fields,and thus the issue of copyright protection for image processing networks hasbecome a focus of attention. Model watermarking techniques are widely used...Image processing networks have gained great success in many fields,and thus the issue of copyright protection for image processing networks hasbecome a focus of attention. Model watermarking techniques are widely usedin model copyright protection, but there are two challenges: (1) designinguniversal trigger sample watermarking for different network models is stilla challenge;(2) existing methods of copyright protection based on trigger swatermarking are difficult to resist forgery attacks. In this work, we propose adual model watermarking framework for copyright protection in image processingnetworks. The trigger sample watermark is embedded in the trainingprocess of the model, which can effectively verify the model copyright. And wedesign a common method for generating trigger sample watermarks based ongenerative adversarial networks, adaptively generating trigger sample watermarksaccording to different models. The spatial watermark is embedded intothe model output. When an attacker steals model copyright using a forgedtrigger sample watermark, which can be correctly extracted to distinguishbetween the piratical and the protected model. The experiments show that theproposed framework has good performance in different image segmentationnetworks of UNET, UNET++, and FCN (fully convolutional network), andeffectively resists forgery attacks.展开更多
The knowledge of the existence,distribution and fate of polycyclic aromatic hydrocarbons(PAHs)and substituted polycyclic aromatic hydrocarbons(SPAHs)in wastewater treatment plants(WWTPs)was vital for reducing their co...The knowledge of the existence,distribution and fate of polycyclic aromatic hydrocarbons(PAHs)and substituted polycyclic aromatic hydrocarbons(SPAHs)in wastewater treatment plants(WWTPs)was vital for reducing their concentrations entering the aquatic environment.The concentrations of 13 SPAHs and 16 PAHs were all determined in a WWTP with styrene butadiene rubber(SBR)in partnership with the moving bed biofilm reactor(MBBR)process.SPAHs presented a higher concentration lever than PAHs in nearly all samples.The total removal efficiencies of PAHs and SPAHs ranged from 64.0%to 71.36%and 78.4%to 79.7%,respectively.The total yearly loads of PAHs(43.0 kg)and SPAHs(73.0 kg)were mainly reduced by the primary and SBR/MBBR biological treatment stages.The tertiary treatment stage had a minor contribution to target compounds removal.According to a synthesis and improvement fate model,we found that the dominant processes changed as the chemical octanol water partition coefficient(K_(ow))increased.But the seasonal variations of experimental removal efficiencies were more obvious than that of predicted data.In the primary sedimentation tank,dissolution in the aqueous phase and sorption to sludge/particulate matter were controlling processes for the removal of PAHs and SPAHs.The sorption to sludge and biodegradation were the principal removal mechanisms during the SBR/MBBR biological treatment process.The contribution of volatilization to removal was always insignificant.Furthermore,the basic physicochemical properties and operating parameters influenced the fate of PAHs and SPAHs in the WWTP.展开更多
The optimization system, which was the subject of our study, is an autonomous chain for the automatic management of cyanide consumption. It is in the phase of industrial automation which made it possible to use the ma...The optimization system, which was the subject of our study, is an autonomous chain for the automatic management of cyanide consumption. It is in the phase of industrial automation which made it possible to use the machines in order to reduce the workload of the worker while keeping a high productivity and a quality in great demand. Furthermore, the use of cyanide in leaching tanks is a necessity in the gold recovery process. This consumption of cyanide must be optimal in these tanks in order to have a good recovery while controlling the concentration of cyanide. Cyanide is one of the most expensive products for mining companies. On a completely different note, we see huge variations during the addition of cyanide. Following a recommendation from the metallurgical and operations teams, the control team carried out an analysis of the problem while proposing a solution to reduce the variability around plus or minus 10% of the addition setpoint through automation. It should be noted that this automatic optimization by monitoring the concentration of cyanide, made use of industrial automation which is a technique which ensures the operation of the ore processing chain without human intervention. In other words, it made it possible to substitute a machine for man. So, this leads us to conduct a study on concentration levels in the real world. The results show that the analysis of the modeling of the cyanide consumption optimization system is an appropriate solution to eradicate failures in the mineral processing chain. The trend curves demonstrate this resolution perfectly.展开更多
Quality traceability plays an essential role in assembling and welding offshore platform blocks.The improvement of the welding quality traceability system is conducive to improving the durability of the offshore platf...Quality traceability plays an essential role in assembling and welding offshore platform blocks.The improvement of the welding quality traceability system is conducive to improving the durability of the offshore platform and the process level of the offshore industry.Currently,qualitymanagement remains in the era of primary information,and there is a lack of effective tracking and recording of welding quality data.When welding defects are encountered,it is difficult to rapidly and accurately determine the root cause of the problem from various complexities and scattered quality data.In this paper,a composite welding quality traceability model for offshore platform block construction process is proposed,it contains the quality early-warning method based on long short-term memory and quality data backtracking query optimization algorithm.By fulfilling the training of the early-warning model and the implementation of the query optimization algorithm,the quality traceability model has the ability to assist enterprises in realizing the rapid identification and positioning of quality problems.Furthermore,the model and the quality traceability algorithm are checked by cases in actual working conditions.Verification analyses suggest that the proposed early-warningmodel for welding quality and the algorithmfor optimizing backtracking requests are effective and can be applied to the actual construction process.展开更多
The spread of an advantageous mutation through a population is of fundamental interest in population genetics. While the classical Moran model is formulated for a well-mixed population, it has long been recognized tha...The spread of an advantageous mutation through a population is of fundamental interest in population genetics. While the classical Moran model is formulated for a well-mixed population, it has long been recognized that in real-world applications, the population usually has an explicit spatial structure which can significantly influence the dynamics. In the context of cancer initiation in epithelial tissue, several recent works have analyzed the dynamics of advantageous mutant spread on integer lattices, using the biased voter model from particle systems theory. In this spatial version of the Moran model, individuals first reproduce according to their fitness and then replace a neighboring individual. From a biological standpoint, the opposite dynamics, where individuals first die and are then replaced by a neighboring individual according to its fitness, are equally relevant. Here, we investigate this death-birth analogue of the biased voter model. We construct the process mathematically, derive the associated dual process, establish bounds on the survival probability of a single mutant, and prove that the process has an asymptotic shape. We also briefly discuss alternative birth-death and death-birth dynamics, depending on how the mutant fitness advantage affects the dynamics. We show that birth-death and death-birth formulations of the biased voter model are equivalent when fitness affects the former event of each update of the model, whereas the birth-death model is fundamentally different from the death-birth model when fitness affects the latter event.展开更多
Effective monitoring of the structural health of combined coal-rock under complex geological conditions by pressure stimulated currents(PSCs)has great potential for the understanding of dynamic disasters in undergroun...Effective monitoring of the structural health of combined coal-rock under complex geological conditions by pressure stimulated currents(PSCs)has great potential for the understanding of dynamic disasters in underground engineering.To reveal the effect of this way,the uniaxial compression experiments with PSC monitoring were conducted on three types of coal-rock combination samples with different strength combinations.The mechanism explanation of PSCs are investigated by resistivity test,atomic force microscopy(AFM)and computed tomography(CT)methods,and a PSC flow model based on progressive failure process is proposed.The influence of strength combinations on PSCs in the progressive failure process are emphasized.The results show the PSC responses between rock part,coal part and the two components are different,which are affected by multi-scale fracture characteristics and electrical properties.As the rock strength decreases,the progressive failure process changes obviously with the influence range of interface constraint effect decreasing,resulting in the different responses of PSC strength and direction in different parts to fracture behaviors.The PSC flow model is initially validated by the relationship between the accumulated charges of different parts.The results are expected to provide a new reference and method for mining design and roadway quality assessment.展开更多
The complex sand-casting process combined with the interactions between process parameters makes it difficult to control the casting quality,resulting in a high scrap rate.A strategy based on a data-driven model was p...The complex sand-casting process combined with the interactions between process parameters makes it difficult to control the casting quality,resulting in a high scrap rate.A strategy based on a data-driven model was proposed to reduce casting defects and improve production efficiency,which includes the random forest(RF)classification model,the feature importance analysis,and the process parameters optimization with Monte Carlo simulation.The collected data includes four types of defects and corresponding process parameters were used to construct the RF model.Classification results show a recall rate above 90% for all categories.The Gini Index was used to assess the importance of the process parameters in the formation of various defects in the RF model.Finally,the classification model was applied to different production conditions for quality prediction.In the case of process parameters optimization for gas porosity defects,this model serves as an experimental process in the Monte Carlo method to estimate a better temperature distribution.The prediction model,when applied to the factory,greatly improved the efficiency of defect detection.Results show that the scrap rate decreased from 10.16% to 6.68%.展开更多
There are more uncertainties with ice hydrometeor representations and related processes than liquid hydrometeors within microphysics parameterization(MP)schemes because of their complicated geometries and physical pro...There are more uncertainties with ice hydrometeor representations and related processes than liquid hydrometeors within microphysics parameterization(MP)schemes because of their complicated geometries and physical properties.Idealized supercell simulations are produced using the WRF model coupled with“full”Hebrew University spectral bin MP(HU-SBM),and NSSL and Thompson bulk MP(BMP)schemes.HU-SBM downdrafts are typically weaker than those of the NSSL and Thompson simulations,accompanied by less rain evaporation.HU-SBM produces more cloud ice(plates),graupel,and hail than the BMPs,yet precipitates less at the surface.The limiting mass bins(and subsequently,particle size)of rimed ice in HU-SBM and slower rimed ice fall speeds lead to smaller melting-level net rimed ice fluxes than those of the BMPs.Aggregation from plates in HU-SBM,together with snow–graupel collisions,leads to a greater snow contribution to rain than those of the BMPs.Replacing HU-SBM’s fall speeds using the formulations of the BMPs after aggregating the discrete bin values to mass mixing ratios and total number concentrations increases net rain and rimed ice fluxes.Still,they are smaller in magnitude than bulk rain,NSSL hail,and Thompson graupel net fluxes near the surface.Conversely,the melting-layer net rimed ice fluxes are reduced when the fall speeds for the NSSL and Thompson simulations are calculated using HU-SBM fall speed formulations after discretizing the bulk particle size distributions(PSDs)into spectral bins.The results highlight precipitation sensitivity to storm dynamics,fall speed,hydrometeor evolution governed by process rates,and MP PSD design.展开更多
In this paper, CiteSpace, a bibliometrics software, was adopted to collect research papers published on the Web of Science, which are relevant to biological model and effluent quality prediction in activated sludge pr...In this paper, CiteSpace, a bibliometrics software, was adopted to collect research papers published on the Web of Science, which are relevant to biological model and effluent quality prediction in activated sludge process in the wastewater treatment. By the way of trend map, keyword knowledge map, and co-cited knowledge map, specific visualization analysis and identification of the authors, institutions and regions were concluded. Furthermore, the topics and hotspots of water quality prediction in activated sludge process through the literature-co-citation-based cluster analysis and literature citation burst analysis were also determined, which not only reflected the historical evolution progress to a certain extent, but also provided the direction and insight of the knowledge structure of water quality prediction and activated sludge process for future research.展开更多
Amid urbanization and the continuous expansion of transportation networks,the necessity for tunnel construction and maintenance has become paramount.Addressing this need requires the investigation of efficient,economi...Amid urbanization and the continuous expansion of transportation networks,the necessity for tunnel construction and maintenance has become paramount.Addressing this need requires the investigation of efficient,economical,and robust tunnel reinforcement techniques.This paper explores fiber reinforced polymer(FRP)and steel fiber reinforced concrete(SFRC)technologies,which have emerged as viable solutions for enhancing tunnel structures.FRP is celebrated for its lightweight and high-strength attributes,effectively augmenting load-bearing capacity and seismic resistance,while SFRC’s notable crack resistance and longevity potentially enhance the performance of tunnel segments.Nonetheless,current research predominantly focuses on experimental analysis,lacking comprehensive theoretical models.To bridge this gap,the cohesive zone model(CZM),which utilizes cohesive elements to characterize the potential fracture surfaces of concrete/SFRC,the rebar-concrete interface,and the FRP-concrete interface,was employed.A modeling approach was subsequently proposed to construct a tunnel segment model reinforced with either SFRC or FRP.Moreover,the corresponding mixed-mode constitutive models,considering interfacial friction,were integrated into the proposed model.Experimental validation and numerical simulations corroborated the accuracy of the proposed model.Additionally,this study examined the reinforcement design of tunnel segments.Through a numerical evaluation,the effectiveness of innovative reinforcement schemes,such as substituting concrete with SFRC and externally bonding FRP sheets,was assessed utilizing a case study from the Fuzhou Metro Shield Tunnel Construction Project.展开更多
The mechanical properties and failure mechanism of lightweight aggregate concrete(LWAC)is a hot topic in the engineering field,and the relationship between its microstructure and macroscopic mechanical properties is a...The mechanical properties and failure mechanism of lightweight aggregate concrete(LWAC)is a hot topic in the engineering field,and the relationship between its microstructure and macroscopic mechanical properties is also a frontier research topic in the academic field.In this study,the image processing technology is used to establish a micro-structure model of lightweight aggregate concrete.Through the information extraction and processing of the section image of actual light aggregate concrete specimens,the mesostructural model of light aggregate concrete with real aggregate characteristics is established.The numerical simulation of uniaxial tensile test,uniaxial compression test and three-point bending test of lightweight aggregate concrete are carried out using a new finite element method-the base force element method respectively.Firstly,the image processing technology is used to produce beam specimens,uniaxial compression specimens and uniaxial tensile specimens of light aggregate concrete,which can better simulate the aggregate shape and random distribution of real light aggregate concrete.Secondly,the three-point bending test is numerically simulated.Thirdly,the uniaxial compression specimen generated by image processing technology is numerically simulated.Fourth,the uniaxial tensile specimen generated by image processing technology is numerically simulated.The mechanical behavior and damage mode of the specimen during loading were analyzed.The results of numerical simulation are compared and analyzed with those of relevant experiments.The feasibility and correctness of the micromodel established in this study for analyzing the micromechanics of lightweight aggregate concrete materials are verified.Image processing technology has a broad application prospect in the field of concrete mesoscopic damage analysis.展开更多
基金the financial support from the Strategic Priority Research Program of Chinese Academy of Sciences(XDA21010100)。
文摘Light olefins is the incredibly important materials in chemical industry.Methanol to olefins(MTO),which provides a non-oil route for light olefins production,received considerable attention in the past decades.However,the catalyst deactivation is an inevitable feature in MTO processes,and regeneration,therefore,is one of the key steps in industrial MTO processes.Traditionally the MTO catalyst is regenerated by removing the deposited coke via air combustion,which unavoidably transforms coke into carbon dioxide and reduces the carbon utilization efficiency.Recent study shows that the coke species over MTO catalyst can be regenerated via steam,which can promote the light olefins yield as the deactivated coke species can be essentially transferred to industrially useful synthesis gas,is a promising pathway for further MTO processes development.In this work,we modelled and analyzed these two MTO regeneration methods in terms of carbon utilization efficiency and technology economics.As shown,the steam regeneration could achieve a carbon utilization efficiency of 84.31%,compared to 74.74%for air combustion regeneration.The MTO processes using steam regeneration can essentially achieve the near-zero carbon emission.In addition,light olefins production of the MTO processes using steam regeneration is 12.81%higher than that using air combustion regeneration.In this regard,steam regeneration could be considered as a potential yet promising regeneration method for further MTO processes,showing not only great environmental benefits but also competitive economic performance.
文摘Software Development Life Cycle (SDLC) is one of the major ingredients for the development of efficient software systems within a time frame and low-cost involvement. From the literature, it is evident that there are various kinds of process models that are used by the software industries for the development of small, medium and long-term software projects, but many of them do not cover risk management. It is quite obvious that the improper selection of the software development process model leads to failure of the software products as it is time bound activity. In the present work, a new software development process model is proposed which covers the risks at any stage of the development of the software product. The model is named a Hemant-Vipin (HV) process model and may be helpful for the software industries for development of the efficient software products and timely delivery at the end of the client. The efficiency of the HV process model is observed by considering various kinds of factors like requirement clarity, user feedback, change agility, predictability, risk identification, practical implementation, customer satisfaction, incremental development, use of ready-made components, quick design, resource organization and many more and found through a case study that the presented approach covers many of parameters in comparison of the existing process models. .
基金supported by the National Natural Science Foundation of China(No.U1960202)。
文摘With the development of automation and informatization in the steelmaking industry,the human brain gradually fails to cope with an increasing amount of data generated during the steelmaking process.Machine learning technology provides a new method other than production experience and metallurgical principles in dealing with large amounts of data.The application of machine learning in the steelmaking process has become a research hotspot in recent years.This paper provides an overview of the applications of machine learning in the steelmaking process modeling involving hot metal pretreatment,primary steelmaking,secondary refining,and some other aspects.The three most frequently used machine learning algorithms in steelmaking process modeling are the artificial neural network,support vector machine,and case-based reasoning,demonstrating proportions of 56%,14%,and 10%,respectively.Collected data in the steelmaking plants are frequently faulty.Thus,data processing,especially data cleaning,is crucially important to the performance of machine learning models.The detection of variable importance can be used to optimize the process parameters and guide production.Machine learning is used in hot metal pretreatment modeling mainly for endpoint S content prediction.The predictions of the endpoints of element compositions and the process parameters are widely investigated in primary steelmaking.Machine learning is used in secondary refining modeling mainly for ladle furnaces,Ruhrstahl–Heraeus,vacuum degassing,argon oxygen decarburization,and vacuum oxygen decarburization processes.Further development of machine learning in the steelmaking process modeling can be realized through additional efforts in the construction of the data platform,the industrial transformation of the research achievements to the practical steelmaking process,and the improvement of the universality of the machine learning models.
文摘安全生产事故往往由多组织交互、多因素耦合造成,事故原因涉及多个组织。为预防和遏制多组织生产安全事故的发生,基于系统理论事故建模与过程模型(Systems-Theory Accident Modeling and Process,STAMP)、24Model,构建一种用于多组织事故分析的方法,并以青岛石油爆炸事故为例进行事故原因分析。结果显示:STAMP-24Model可以分组织,分层次且有效、全面、详细地分析涉及多个组织的事故原因,探究多组织之间的交互关系;对事故进行动态演化分析,可得到各组织不安全动作耦合关系与形成的事故失效链及管控失效路径,进而为预防多组织事故提供思路和参考。
基金supported by the National Key Research and Development Program of China(2021 YFB 4000500,2021 YFB 4000501,and 2021 YFB 4000502)。
文摘Steam cracking is the dominant technology for producing light olefins,which are believed to be the foundation of the chemical industry.Predictive models of the cracking process can boost production efficiency and profit margin.Rapid advancements in machine learning research have recently enabled data-driven solutions to usher in a new era of process modeling.Meanwhile,its practical application to steam cracking is still hindered by the trade-off between prediction accuracy and computational speed.This research presents a framework for data-driven intelligent modeling of the steam cracking process.Industrial data preparation and feature engineering techniques provide computational-ready datasets for the framework,and feedstock similarities are exploited using k-means clustering.We propose LArge-Residuals-Deletion Multivariate Adaptive Regression Spline(LARD-MARS),a modeling approach that explicitly generates output formulas and eliminates potentially outlying instances.The framework is validated further by the presentation of clustering results,the explanation of variable importance,and the testing and comparison of model performance.
基金funded by the NationalNatural Science Foundation of China (Nos.11902229,11502181)the Strategic Priority Research Program of the Chinese Academy of Sciences (Grant Nos.XDB22040502,XDC06030200).
文摘The comprehensive tire building and shaping processes are investigated through the finite element method(FEM)in this article.The mechanical properties of the uncured rubber from different tire components are investigated through cyclic loading-unloading experiments under different strain rates.Based on the experiments,an elastoviscoplastic constitutive model is adopted to describe themechanical behaviors of the uncured rubber.The distinct mechanical properties,including the stress level,hysteresis and residual strain,of the uncured rubber can all be well characterized.The whole tire building process(including component winding,rubber bladder inflation,component stitching and carcass band folding-back)and the shaping process are simulated using this constitutive model.The simulated green tire profile is in good agreement with the actual profile obtained through 3D scanning.The deformation and stress of the rubber components and the cord reinforcements during production can be obtained fromthe FE simulation,which is helpful for judging the rationality of the tire construction design.Finally,the influence of the parameter“drum width”is investigated,and the simulated result is found to be consistent with the experimental observations,which verifies the effectiveness of the simulation.The established simulation strategy provides some guiding significance for the improvement of tire design parameters and the elimination of tire production defects.
文摘Cities are facing challenges of high rise in population number and con-sequently need to be equipped with latest smart services to provide luxuries of life to its residents.Smart integrated solutions are also a need to deal with the social and environmental challenges,caused by increasing urbanization.Currently,the development of smart services’integrated network,within a city,is facing the bar-riers including;less efficient collection and sharing of data,along with inadequate collaboration of software and hardware.Aiming to resolve these issues,this paper recommended a solution for a synchronous functionality in the smart services’integration process through modeling technique.Using this integration modeling solution,atfirst,the service participants,processes and tasks of smart services are identified and then standard illustrations are developed for the better understand-ing of the integrated service group environment.Business process modeling and notation(BPMN)language based models are developed and discussed for a devised case study,to test and experiment i.e.,for remote healthcare from a smart home.The research is concluded with the integration process model application for the required data sharing among different service groups.The outcomes of the modeling are better understanding and attaining maximum automation that can be referenced and replicated.
基金This research is fully funded byUniversiti Malaysia Terengganu under the research Grant(PGRG).
文摘The successful execution and management of Offshore Software Maintenance Outsourcing(OSMO)can be very beneficial for OSMO vendors and the OSMO client.Although a lot of research on software outsourcing is going on,most of the existing literature on offshore outsourcing deals with the outsourcing of software development only.Several frameworks have been developed focusing on guiding software systemmanagers concerning offshore software outsourcing.However,none of these studies delivered comprehensive guidelines for managing the whole process of OSMO.There is a considerable lack of research working on managing OSMO from a vendor’s perspective.Therefore,to find the best practices for managing an OSMO process,it is necessary to further investigate such complex and multifaceted phenomena from the vendor’s perspective.This study validated the preliminary OSMO process model via a case study research approach.The results showed that the OSMO process model is applicable in an industrial setting with few changes.The industrial data collected during the case study enabled this paper to extend the preliminary OSMO process model.The refined version of the OSMO processmodel has four major phases including(i)Project Assessment,(ii)SLA(iii)Execution,and(iv)Risk.
基金funded by National Key R&D Program of China(No.2021YFB3401200)the National Natural Science Foundation of China(No.51875308)the Beijing Nature Sciences Fund-Haidian Originality Cooperation Project(L212002).
文摘Numerical simulation is the most powerful computational and analysis tool for a large variety of engineering and physical problems.For a complex problem relating to multi-field,multi-process and multi-scale,different computing tools have to be developed so as to solve particular fields at different scales and for different processes.Therefore,the integration of different types of software is inevitable.However,it is difficult to perform the transfer of the meshes and simulated results among software packages because of the lack of shared data formats or encrypted data formats.An image processing based method for three-dimensional model reconstruction for numerical simulation was proposed,which presents a solution to the integration problem by a series of slice or projection images obtained by the post-processing modules of the numerical simulation software.By means of mapping image pixels to meshes of either finite difference or finite element models,the geometry contour can be extracted to export the stereolithography model.The values of results,represented by color,can be deduced and assigned to the meshes.All the models with data can be directly or indirectly integrated into other software as a continued or new numerical simulation.The three-dimensional reconstruction method has been validated in numerical simulation of castings and case studies were provided in this study.
基金supported by the National Natural Science Foundation of China under grants U1836208,by the Priority Academic Program Development of Jiangsu Higher Education Institutions (PAPD)fundby the Collaborative Innovation Center of Atmospheric Environment and Equipment Technology (CICAEET)fund,China.
文摘Image processing networks have gained great success in many fields,and thus the issue of copyright protection for image processing networks hasbecome a focus of attention. Model watermarking techniques are widely usedin model copyright protection, but there are two challenges: (1) designinguniversal trigger sample watermarking for different network models is stilla challenge;(2) existing methods of copyright protection based on trigger swatermarking are difficult to resist forgery attacks. In this work, we propose adual model watermarking framework for copyright protection in image processingnetworks. The trigger sample watermark is embedded in the trainingprocess of the model, which can effectively verify the model copyright. And wedesign a common method for generating trigger sample watermarks based ongenerative adversarial networks, adaptively generating trigger sample watermarksaccording to different models. The spatial watermark is embedded intothe model output. When an attacker steals model copyright using a forgedtrigger sample watermark, which can be correctly extracted to distinguishbetween the piratical and the protected model. The experiments show that theproposed framework has good performance in different image segmentationnetworks of UNET, UNET++, and FCN (fully convolutional network), andeffectively resists forgery attacks.
基金This work was supported by the National Natural Science Foundation of China(No.51979255).
文摘The knowledge of the existence,distribution and fate of polycyclic aromatic hydrocarbons(PAHs)and substituted polycyclic aromatic hydrocarbons(SPAHs)in wastewater treatment plants(WWTPs)was vital for reducing their concentrations entering the aquatic environment.The concentrations of 13 SPAHs and 16 PAHs were all determined in a WWTP with styrene butadiene rubber(SBR)in partnership with the moving bed biofilm reactor(MBBR)process.SPAHs presented a higher concentration lever than PAHs in nearly all samples.The total removal efficiencies of PAHs and SPAHs ranged from 64.0%to 71.36%and 78.4%to 79.7%,respectively.The total yearly loads of PAHs(43.0 kg)and SPAHs(73.0 kg)were mainly reduced by the primary and SBR/MBBR biological treatment stages.The tertiary treatment stage had a minor contribution to target compounds removal.According to a synthesis and improvement fate model,we found that the dominant processes changed as the chemical octanol water partition coefficient(K_(ow))increased.But the seasonal variations of experimental removal efficiencies were more obvious than that of predicted data.In the primary sedimentation tank,dissolution in the aqueous phase and sorption to sludge/particulate matter were controlling processes for the removal of PAHs and SPAHs.The sorption to sludge and biodegradation were the principal removal mechanisms during the SBR/MBBR biological treatment process.The contribution of volatilization to removal was always insignificant.Furthermore,the basic physicochemical properties and operating parameters influenced the fate of PAHs and SPAHs in the WWTP.
文摘The optimization system, which was the subject of our study, is an autonomous chain for the automatic management of cyanide consumption. It is in the phase of industrial automation which made it possible to use the machines in order to reduce the workload of the worker while keeping a high productivity and a quality in great demand. Furthermore, the use of cyanide in leaching tanks is a necessity in the gold recovery process. This consumption of cyanide must be optimal in these tanks in order to have a good recovery while controlling the concentration of cyanide. Cyanide is one of the most expensive products for mining companies. On a completely different note, we see huge variations during the addition of cyanide. Following a recommendation from the metallurgical and operations teams, the control team carried out an analysis of the problem while proposing a solution to reduce the variability around plus or minus 10% of the addition setpoint through automation. It should be noted that this automatic optimization by monitoring the concentration of cyanide, made use of industrial automation which is a technique which ensures the operation of the ore processing chain without human intervention. In other words, it made it possible to substitute a machine for man. So, this leads us to conduct a study on concentration levels in the real world. The results show that the analysis of the modeling of the cyanide consumption optimization system is an appropriate solution to eradicate failures in the mineral processing chain. The trend curves demonstrate this resolution perfectly.
基金funded by Ministry of Industry and Information Technology of the People’s Republic of China[Grant No.2018473].
文摘Quality traceability plays an essential role in assembling and welding offshore platform blocks.The improvement of the welding quality traceability system is conducive to improving the durability of the offshore platform and the process level of the offshore industry.Currently,qualitymanagement remains in the era of primary information,and there is a lack of effective tracking and recording of welding quality data.When welding defects are encountered,it is difficult to rapidly and accurately determine the root cause of the problem from various complexities and scattered quality data.In this paper,a composite welding quality traceability model for offshore platform block construction process is proposed,it contains the quality early-warning method based on long short-term memory and quality data backtracking query optimization algorithm.By fulfilling the training of the early-warning model and the implementation of the query optimization algorithm,the quality traceability model has the ability to assist enterprises in realizing the rapid identification and positioning of quality problems.Furthermore,the model and the quality traceability algorithm are checked by cases in actual working conditions.Verification analyses suggest that the proposed early-warningmodel for welding quality and the algorithmfor optimizing backtracking requests are effective and can be applied to the actual construction process.
基金supported in part by the NIH grant R01CA241134supported in part by the NSF grant CMMI-1552764+3 种基金supported in part by the NSF grants DMS-1349724 and DMS-2052465supported in part by the NSF grant CCF-1740761supported in part by the U.S.-Norway Fulbright Foundation and the Research Council of Norway R&D Grant 309273supported in part by the Norwegian Centennial Chair grant and the Doctoral Dissertation Fellowship from the University of Minnesota.
文摘The spread of an advantageous mutation through a population is of fundamental interest in population genetics. While the classical Moran model is formulated for a well-mixed population, it has long been recognized that in real-world applications, the population usually has an explicit spatial structure which can significantly influence the dynamics. In the context of cancer initiation in epithelial tissue, several recent works have analyzed the dynamics of advantageous mutant spread on integer lattices, using the biased voter model from particle systems theory. In this spatial version of the Moran model, individuals first reproduce according to their fitness and then replace a neighboring individual. From a biological standpoint, the opposite dynamics, where individuals first die and are then replaced by a neighboring individual according to its fitness, are equally relevant. Here, we investigate this death-birth analogue of the biased voter model. We construct the process mathematically, derive the associated dual process, establish bounds on the survival probability of a single mutant, and prove that the process has an asymptotic shape. We also briefly discuss alternative birth-death and death-birth dynamics, depending on how the mutant fitness advantage affects the dynamics. We show that birth-death and death-birth formulations of the biased voter model are equivalent when fitness affects the former event of each update of the model, whereas the birth-death model is fundamentally different from the death-birth model when fitness affects the latter event.
基金supported by National Key R&D Program of China(No.2022YFC3004705)the National Natural Science Foundation of China(Nos.52074280,52227901 and 52204249)National Natural Science Foundation of China Youth Fund(No.52104230).
文摘Effective monitoring of the structural health of combined coal-rock under complex geological conditions by pressure stimulated currents(PSCs)has great potential for the understanding of dynamic disasters in underground engineering.To reveal the effect of this way,the uniaxial compression experiments with PSC monitoring were conducted on three types of coal-rock combination samples with different strength combinations.The mechanism explanation of PSCs are investigated by resistivity test,atomic force microscopy(AFM)and computed tomography(CT)methods,and a PSC flow model based on progressive failure process is proposed.The influence of strength combinations on PSCs in the progressive failure process are emphasized.The results show the PSC responses between rock part,coal part and the two components are different,which are affected by multi-scale fracture characteristics and electrical properties.As the rock strength decreases,the progressive failure process changes obviously with the influence range of interface constraint effect decreasing,resulting in the different responses of PSC strength and direction in different parts to fracture behaviors.The PSC flow model is initially validated by the relationship between the accumulated charges of different parts.The results are expected to provide a new reference and method for mining design and roadway quality assessment.
基金financially supported by the National Key Research and Development Program of China(2022YFB3706800,2020YFB1710100)the National Natural Science Foundation of China(51821001,52090042,52074183)。
文摘The complex sand-casting process combined with the interactions between process parameters makes it difficult to control the casting quality,resulting in a high scrap rate.A strategy based on a data-driven model was proposed to reduce casting defects and improve production efficiency,which includes the random forest(RF)classification model,the feature importance analysis,and the process parameters optimization with Monte Carlo simulation.The collected data includes four types of defects and corresponding process parameters were used to construct the RF model.Classification results show a recall rate above 90% for all categories.The Gini Index was used to assess the importance of the process parameters in the formation of various defects in the RF model.Finally,the classification model was applied to different production conditions for quality prediction.In the case of process parameters optimization for gas porosity defects,this model serves as an experimental process in the Monte Carlo method to estimate a better temperature distribution.The prediction model,when applied to the factory,greatly improved the efficiency of defect detection.Results show that the scrap rate decreased from 10.16% to 6.68%.
基金This research was primarily supported by a NOAA Warn-on-Forecast(WoF)grant(Grant No.NA16OAR4320115).
文摘There are more uncertainties with ice hydrometeor representations and related processes than liquid hydrometeors within microphysics parameterization(MP)schemes because of their complicated geometries and physical properties.Idealized supercell simulations are produced using the WRF model coupled with“full”Hebrew University spectral bin MP(HU-SBM),and NSSL and Thompson bulk MP(BMP)schemes.HU-SBM downdrafts are typically weaker than those of the NSSL and Thompson simulations,accompanied by less rain evaporation.HU-SBM produces more cloud ice(plates),graupel,and hail than the BMPs,yet precipitates less at the surface.The limiting mass bins(and subsequently,particle size)of rimed ice in HU-SBM and slower rimed ice fall speeds lead to smaller melting-level net rimed ice fluxes than those of the BMPs.Aggregation from plates in HU-SBM,together with snow–graupel collisions,leads to a greater snow contribution to rain than those of the BMPs.Replacing HU-SBM’s fall speeds using the formulations of the BMPs after aggregating the discrete bin values to mass mixing ratios and total number concentrations increases net rain and rimed ice fluxes.Still,they are smaller in magnitude than bulk rain,NSSL hail,and Thompson graupel net fluxes near the surface.Conversely,the melting-layer net rimed ice fluxes are reduced when the fall speeds for the NSSL and Thompson simulations are calculated using HU-SBM fall speed formulations after discretizing the bulk particle size distributions(PSDs)into spectral bins.The results highlight precipitation sensitivity to storm dynamics,fall speed,hydrometeor evolution governed by process rates,and MP PSD design.
文摘In this paper, CiteSpace, a bibliometrics software, was adopted to collect research papers published on the Web of Science, which are relevant to biological model and effluent quality prediction in activated sludge process in the wastewater treatment. By the way of trend map, keyword knowledge map, and co-cited knowledge map, specific visualization analysis and identification of the authors, institutions and regions were concluded. Furthermore, the topics and hotspots of water quality prediction in activated sludge process through the literature-co-citation-based cluster analysis and literature citation burst analysis were also determined, which not only reflected the historical evolution progress to a certain extent, but also provided the direction and insight of the knowledge structure of water quality prediction and activated sludge process for future research.
基金funded by the Scientific research startup Foundation of Fujian University of Technology(GY-Z21067 and GY-Z21026).
文摘Amid urbanization and the continuous expansion of transportation networks,the necessity for tunnel construction and maintenance has become paramount.Addressing this need requires the investigation of efficient,economical,and robust tunnel reinforcement techniques.This paper explores fiber reinforced polymer(FRP)and steel fiber reinforced concrete(SFRC)technologies,which have emerged as viable solutions for enhancing tunnel structures.FRP is celebrated for its lightweight and high-strength attributes,effectively augmenting load-bearing capacity and seismic resistance,while SFRC’s notable crack resistance and longevity potentially enhance the performance of tunnel segments.Nonetheless,current research predominantly focuses on experimental analysis,lacking comprehensive theoretical models.To bridge this gap,the cohesive zone model(CZM),which utilizes cohesive elements to characterize the potential fracture surfaces of concrete/SFRC,the rebar-concrete interface,and the FRP-concrete interface,was employed.A modeling approach was subsequently proposed to construct a tunnel segment model reinforced with either SFRC or FRP.Moreover,the corresponding mixed-mode constitutive models,considering interfacial friction,were integrated into the proposed model.Experimental validation and numerical simulations corroborated the accuracy of the proposed model.Additionally,this study examined the reinforcement design of tunnel segments.Through a numerical evaluation,the effectiveness of innovative reinforcement schemes,such as substituting concrete with SFRC and externally bonding FRP sheets,was assessed utilizing a case study from the Fuzhou Metro Shield Tunnel Construction Project.
基金supported by the National Science Foundation of China(10972015,11172015)the Beijing Natural Science Foundation(8162008).
文摘The mechanical properties and failure mechanism of lightweight aggregate concrete(LWAC)is a hot topic in the engineering field,and the relationship between its microstructure and macroscopic mechanical properties is also a frontier research topic in the academic field.In this study,the image processing technology is used to establish a micro-structure model of lightweight aggregate concrete.Through the information extraction and processing of the section image of actual light aggregate concrete specimens,the mesostructural model of light aggregate concrete with real aggregate characteristics is established.The numerical simulation of uniaxial tensile test,uniaxial compression test and three-point bending test of lightweight aggregate concrete are carried out using a new finite element method-the base force element method respectively.Firstly,the image processing technology is used to produce beam specimens,uniaxial compression specimens and uniaxial tensile specimens of light aggregate concrete,which can better simulate the aggregate shape and random distribution of real light aggregate concrete.Secondly,the three-point bending test is numerically simulated.Thirdly,the uniaxial compression specimen generated by image processing technology is numerically simulated.Fourth,the uniaxial tensile specimen generated by image processing technology is numerically simulated.The mechanical behavior and damage mode of the specimen during loading were analyzed.The results of numerical simulation are compared and analyzed with those of relevant experiments.The feasibility and correctness of the micromodel established in this study for analyzing the micromechanics of lightweight aggregate concrete materials are verified.Image processing technology has a broad application prospect in the field of concrete mesoscopic damage analysis.