The curse of dimensionality refers to the problem of increased sparsity and computational complexity when dealing with high-dimensional data. In recent years, the types and variables of industrial data have increased significantly, making data-driven models more challenging to develop. To address this problem, data augmentation technology has been introduced as an effective tool to solve the sparsity problem of high-dimensional industrial data. This paper systematically explores and discusses the necessity, feasibility, and effectiveness of augmented industrial data-driven modeling in the context of the curse of dimensionality and virtual big data. Then, the process of data augmentation modeling is analyzed, and the concept of data boosting augmentation is proposed. The data boosting augmentation involves designing the reliability weight and actual-virtual weight functions, and developing a double weighted partial least squares model to optimize the three stages of data generation, data fusion, and modeling. This approach significantly improves the interpretability, effectiveness, and practicality of data augmentation in industrial modeling. Finally, the proposed method is verified using practical examples of fault diagnosis systems and virtual measurement systems in industry. The results demonstrate the effectiveness of the proposed approach in improving the accuracy and robustness of data-driven models, making them more suitable for real-world industrial applications.
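As a rough illustration of the double-weighting idea described above (the abstract does not give the paper's actual weight functions, so the reliability and actual-virtual weights below are placeholder assumptions), a weighted partial least squares fit can be approximated by scaling sample rows with the square root of the combined weight:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X_real = rng.normal(size=(50, 10))
y_real = X_real @ rng.normal(size=10) + 0.1 * rng.normal(size=50)
X_virt = X_real + 0.3 * rng.normal(size=X_real.shape)  # stand-in for generated (virtual) samples
y_virt = y_real + 0.3 * rng.normal(size=50)

X = np.vstack([X_real, X_virt])
y = np.concatenate([y_real, y_virt])
is_virtual = np.r_[np.zeros(50), np.ones(50)]

# Placeholder actual-virtual weight: down-weight virtual samples.
w_av = np.where(is_virtual == 1, 0.5, 1.0)
# Placeholder reliability weight: down-weight samples far from the real-data centroid.
dist = np.linalg.norm(X - X_real.mean(axis=0), axis=1)
w_rel = np.exp(-dist / dist.mean())
w = w_av * w_rel

# sklearn's PLSRegression has no sample_weight argument, so approximate a
# weighted fit by scaling rows with sqrt(w) (exact for OLS, a heuristic for PLS).
sw = np.sqrt(w)[:, None]
pls = PLSRegression(n_components=4).fit(X * sw, y[:, None] * sw)
print(pls.predict(X_real)[:3].ravel())
```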
The dynamical modeling of projectile systems with sufficient accuracy is of great difficulty due to high-dimensional space and various perturbations. With the rapid development of data science and scientific tools of measurement recently, there are numerous data-driven methods devoted to discovering governing laws from data. In this work, a data-driven method is employed to perform the modeling of the projectile based on the Kramers–Moyal formulas. More specifically, the four-dimensional projectile system is assumed to be an Itô stochastic differential equation. Then the least squares method and sparse learning are applied to identify the drift coefficient and diffusion matrix from sample path data, which agree well with the real system. The effectiveness of the data-driven method demonstrates that it will become a powerful tool in extracting governing equations and predicting complex dynamical behaviors of the projectile.
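A one-dimensional toy version of this identification pipeline can be sketched as follows (the paper's system is four-dimensional; the Ornstein–Uhlenbeck process, bin sizes, and threshold here are illustrative assumptions): estimate the drift and diffusion from the first two Kramers–Moyal conditional moments, then sparsify a polynomial least squares fit of the drift by hard thresholding.

```python
import numpy as np

rng = np.random.default_rng(1)
theta, sigma, dt, n = 1.5, 0.5, 1e-3, 200_000
x = np.empty(n)
x[0] = 1.0
for k in range(n - 1):                       # Euler-Maruyama sample path
    x[k + 1] = x[k] - theta * x[k] * dt + sigma * np.sqrt(dt) * rng.normal()

dx = np.diff(x)
xs = x[:-1]

# Kramers-Moyal conditional moments per bin: a(x) ~ E[dX|x]/dt, b^2(x) ~ E[dX^2|x]/dt
bins = np.linspace(-1.5, 1.5, 31)
idx = np.digitize(xs, bins)
centers, drift, diff2 = [], [], []
for i in range(1, len(bins)):
    m = idx == i
    if m.sum() > 100:
        centers.append(xs[m].mean())
        drift.append(dx[m].mean() / dt)
        diff2.append((dx[m] ** 2).mean() / dt)
centers, drift = np.array(centers), np.array(drift)

# Sparse learning: least squares on a cubic basis, then hard-threshold small terms.
basis = np.vander(centers, 4)                # columns: x^3, x^2, x, 1
coef, *_ = np.linalg.lstsq(basis, drift, rcond=None)
coef[np.abs(coef) < 0.1] = 0.0               # thresholding in the spirit of STLSQ
print("drift coefficients (x^3, x^2, x, 1):", np.round(coef, 3))   # x term ~ -1.5
print("diffusion estimate:", np.sqrt(np.mean(diff2)))               # true sigma = 0.5
```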
Conventional automated machine learning (AutoML) technologies fall short in preprocessing low-quality raw data and adapting to varying indoor and outdoor environments, leading to reduced accuracy in forecasting short-term building energy loads. Moreover, their predictions are not transparent because of their black-box nature. Hence, the building field currently lacks an AutoML framework capable of data quality enhancement, environment self-adaptation, and model interpretation. To address this research gap, an improved AutoML-based end-to-end data-driven modeling framework is proposed. Bayesian optimization is applied by this framework to find an optimal data preprocessing process for quality improvement of raw data. It bridges the gap where conventional AutoML technologies cannot automatically handle missing data and outliers. A sliding window-based model retraining strategy is utilized to achieve environment self-adaptation, contributing to the accuracy enhancement of AutoML technologies. Moreover, a local interpretable model-agnostic explanations-based approach is developed to interpret predictions made by the improved framework. It overcomes the poor interpretability of conventional AutoML technologies. The performance of the improved framework in forecasting one-hour-ahead cooling loads is evaluated using two years of operational data from a real building. It is discovered that the accuracy of the improved framework increases by 4.24%–8.79% compared with four conventional frameworks for buildings with not only high-quality but also low-quality operational data. Furthermore, it is demonstrated that the developed model interpretation approach can effectively explain the predictions of the improved framework. The improved framework offers a novel perspective on creating accurate and reliable AutoML frameworks tailored to building energy load prediction tasks and other similar tasks.
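A minimal sketch of the sliding window-based retraining strategy (the model class, window length, and retraining interval below are assumptions, not the paper's settings):

```python
import numpy as np
from sklearn.linear_model import Ridge

def sliding_window_forecast(X, y, window=500, retrain_every=24):
    """Walk forward through hourly data, refitting on the trailing window so the
    model adapts to changing indoor/outdoor conditions."""
    preds = np.full(len(y), np.nan)
    model = None
    for t in range(window, len(y)):
        if model is None or (t - window) % retrain_every == 0:
            model = Ridge().fit(X[t - window:t], y[t - window:t])
        preds[t] = model.predict(X[t:t + 1])[0]  # one-hour-ahead forecast
    return preds

# Synthetic stand-in for hourly building operation data.
rng = np.random.default_rng(2)
X = rng.normal(size=(2000, 5))
y = X @ rng.normal(size=5) + rng.normal(scale=0.1, size=2000)
print("MAE:", np.nanmean(np.abs(sliding_window_forecast(X, y) - y)))
```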
Global images of auroras obtained by cameras on spacecraft are a key tool for studying the near-Earth environment. However, the cameras are sensitive not only to auroral emissions produced by precipitating particles, but also to dayglow emissions produced by photoelectrons induced by sunlight. Nightglow emissions and scattered sunlight can contribute to the background signal. To fully utilize such images in space science, background contamination must be removed to isolate the auroral signal. Here we outline a data-driven approach to modeling the background intensity in multiple images by formulating linear inverse problems based on B-splines and spherical harmonics. The approach is robust, flexible, and iteratively deselects outliers, such as auroral emissions. The final model is smooth across the terminator and accounts for slow temporal variations and large-scale asymmetries in the dayglow. We demonstrate the model by using the three far-ultraviolet cameras on the Imager for Magnetopause-to-Aurora Global Exploration (IMAGE) mission. The method can be applied to historical missions and is relevant for upcoming missions, such as the Solar wind Magnetosphere Ionosphere Link Explorer (SMILE) mission.
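The iterative outlier-deselection step can be illustrated in one dimension (the basis size, threshold, and synthetic "dayglow plus aurora" profile below are assumptions; the actual model combines B-splines in time with spherical harmonics over the sphere):

```python
import numpy as np
from scipy.interpolate import BSpline

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 500)
background = 100 * np.exp(-3 * t)                  # smooth dayglow-like trend
signal = np.where((t > 0.6) & (t < 0.7), 80.0, 0)  # aurora-like contamination
y = background + signal + rng.normal(scale=2, size=t.size)

# Cubic B-spline design matrix for the linear inverse problem.
knots = np.r_[[0] * 4, np.linspace(0.1, 0.9, 9), [1] * 4]
nb = len(knots) - 4
A = np.stack([BSpline.basis_element(knots[i:i + 5], extrapolate=False)(t)
              for i in range(nb)], axis=1)
A = np.nan_to_num(A)

keep = np.ones(t.size, dtype=bool)
for _ in range(10):                                # iterative outlier deselection
    c, *_ = np.linalg.lstsq(A[keep], y[keep], rcond=None)
    r = y - A @ c
    mad = 1.4826 * np.median(np.abs(r[keep] - np.median(r[keep])))
    new_keep = r < 3 * mad                         # one-sided cut: aurora only adds counts
    if np.array_equal(new_keep, keep):
        break
    keep = new_keep
print("deselected samples:", (~keep).sum())        # roughly the contaminated interval
```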
Steam cracking is the dominant technology for producing light olefins, which are believed to be the foundation of the chemical industry. Predictive models of the cracking process can boost production efficiency and profit margin. Rapid advancements in machine learning research have recently enabled data-driven solutions to usher in a new era of process modeling. Meanwhile, its practical application to steam cracking is still hindered by the trade-off between prediction accuracy and computational speed. This research presents a framework for data-driven intelligent modeling of the steam cracking process. Industrial data preparation and feature engineering techniques provide computation-ready datasets for the framework, and feedstock similarities are exploited using k-means clustering. We propose LArge-Residuals-Deletion Multivariate Adaptive Regression Spline (LARD-MARS), a modeling approach that explicitly generates output formulas and eliminates potentially outlying instances. The framework is validated further by the presentation of clustering results, the explanation of variable importance, and the testing and comparison of model performance.
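The large-residuals-deletion loop at the core of LARD-MARS can be sketched generically (a MARS learner such as py-earth would be the natural base model; a linear regressor is substituted here so the sketch stays dependency-free, and the 3-sigma rule is an assumption):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def lard_fit(X, y, n_sigma=3.0, max_rounds=5):
    """Repeatedly fit, delete instances with large residuals, and refit."""
    mask = np.ones(len(y), dtype=bool)
    for _ in range(max_rounds):
        model = LinearRegression().fit(X[mask], y[mask])
        resid = y - model.predict(X)
        sigma = resid[mask].std()
        new_mask = np.abs(resid) < n_sigma * sigma   # delete potentially outlying instances
        if np.array_equal(new_mask, mask):
            break
        mask = new_mask
    return model, mask

rng = np.random.default_rng(4)
X = rng.normal(size=(300, 4))
y = X @ np.array([1.0, -2.0, 0.5, 0.0]) + 0.1 * rng.normal(size=300)
y[:10] += 5.0                                        # inject outlying instances
model, mask = lard_fit(X, y)
print("instances deleted:", (~mask).sum())           # approximately 10
```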
This paper demonstrated the fabrication, characterization, data-driven modeling, and practical application of a 1D SnO_(2) nanofiber-based memristor, in which a 1D SnO_(2) active layer was sandwiched between silver (Ag) and aluminum (Al) electrodes. This device yielded a very high R_(OFF):R_(ON) of ~10^(4) (I_(ON):I_(OFF) of ~10^(5)) with an excellent activation slope of 10 mV/dec, a low set voltage of V_(SET) ~1.14 V, and good repeatability. This paper physically explained the conduction mechanism in the layered SnO_(2) nanofiber-based memristor. The conductive network was composed of nanofibers that play a vital role in the memristive action, since more conductive paths could facilitate the hopping of electron carriers. Energy band structures experimentally extracted with the adoption of ultraviolet photoelectron spectroscopy strongly support the claims reported in this paper. A machine learning (ML)-assisted, data-driven model of the fabricated memristor was also developed employing different popular algorithms such as polynomial regression, support vector regression, k-nearest neighbors, and artificial neural network (ANN) to model the data of the fabricated device. We have proposed two types of ANN model (type I and type II) algorithms, illustrated with a detailed flowchart, to model the fabricated memristor. Benchmarking with standard ML techniques shows that the type II ANN algorithm provides the best mean absolute percentage error of 0.0175 with a 98% R^(2) score. The proposed data-driven model was further validated with the characterization results of similar new memristors fabricated adopting the same fabrication recipe, which gave satisfactory predictions. Lastly, the ANN type II model was applied to design and implement simple AND & OR logic functionalities adopting the fabricated memristors with expected, near-ideal characteristics.
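As a hedged illustration of the ML-assisted device surrogate (the synthetic I-V loop, the state-proxy feature, and the network size below are inventions for the sketch; the paper's type I/II ANN architectures are not specified in the abstract):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(5)
# Triangular voltage sweep producing a toy pinched-hysteresis response.
v = np.concatenate([np.linspace(0, 2, 100), np.linspace(2, -2, 200),
                    np.linspace(-2, 0, 100)])
w = np.clip(np.cumsum(v) / 150.0, 0.0, 1.0)          # toy internal state variable
i = (1e-6 + 1e-4 * w) * v + 1e-7 * rng.normal(size=v.size)

# Current is multivalued in voltage alone (pinched hysteresis), so feed the
# network both the bias and the state proxy.
X = np.column_stack([v, w])
net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                   random_state=0).fit(X, i * 1e6)   # scale to microamps for training
pred = net.predict(X) / 1e6
print("R^2:", r2_score(i, pred))                     # the paper reports MAPE and R^2
```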
Machine learning (ML) provides a new surrogate method for investigating groundwater flow dynamics in unsaturated soils. Traditional pure data-driven methods (e.g. a deep neural network, DNN) can provide rapid predictions, but they require sufficient on-site data for accurate training and lack interpretability with respect to the physical processes within the data. In this paper, we provide a physics- and equality-constrained artificial neural network (PECANN) to derive unsaturated infiltration solutions with a small amount of initial and boundary data. PECANN takes the physics-informed neural network (PINN) as a foundation, encodes the unsaturated infiltration physical laws (i.e. the Richards equation, RE) into the loss function, and uses the augmented Lagrangian method to constrain the learning process of the solutions of the RE by adding a stronger penalty for the initial and boundary conditions. Four unsaturated infiltration cases are designed to test the training performance of PECANN, i.e. one-dimensional (1D) steady-state unsaturated infiltration, 1D transient-state infiltration, two-dimensional (2D) transient-state infiltration, and 1D coupled unsaturated infiltration and deformation. The predicted results of PECANN are compared with finite difference solutions or analytical solutions. The results indicate that PECANN can accurately capture the variations of pressure head during unsaturated infiltration, and presents higher precision and robustness than the DNN and PINN. It is also revealed that PECANN can achieve the same accuracy as the finite difference method with fewer initial and boundary training data. Additionally, we investigate the effect of the hyperparameters of PECANN on solving the RE problem. PECANN provides an effective tool for simulating unsaturated infiltration.
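A compact sketch of the PECANN-style training loop for the 1D steady-state case (Gardner's exponential conductivity model, the network size, the penalty parameter, and the multiplier-update schedule are all illustrative assumptions): the boundary conditions enter as equality constraints handled by an augmented Lagrangian rather than as fixed penalty terms.

```python
import torch

torch.manual_seed(0)
Ks, alpha = 1.0, 1.0                           # Gardner model K(h) = Ks * exp(alpha * h)
net = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(),
                          torch.nn.Linear(32, 32), torch.nn.Tanh(),
                          torch.nn.Linear(32, 1))
z = torch.linspace(0, 1, 101).reshape(-1, 1).requires_grad_(True)
zb = torch.tensor([[0.0], [1.0]])              # boundary collocation points
hb = torch.tensor([[0.0], [-1.0]])             # prescribed pressure heads

lam = torch.zeros(2)                           # Lagrange multipliers
mu = 10.0                                      # penalty parameter
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for epoch in range(3000):
    opt.zero_grad()
    h = net(z)
    dh = torch.autograd.grad(h, z, torch.ones_like(h), create_graph=True)[0]
    # Steady Richards equation: d/dz[K(h)(dh/dz + 1)] = 0.
    flux = -Ks * torch.exp(alpha * h) * (dh + 1.0)
    dflux = torch.autograd.grad(flux, z, torch.ones_like(flux), create_graph=True)[0]
    pde_loss = (dflux ** 2).mean()
    c = (net(zb) - hb).squeeze()               # boundary-condition constraint residuals
    loss = pde_loss + (lam * c).sum() + 0.5 * mu * (c ** 2).sum()
    loss.backward()
    opt.step()
    if epoch % 500 == 499:                     # augmented Lagrangian multiplier update
        with torch.no_grad():
            lam += mu * (net(zb) - hb).squeeze()
print("BC residuals:", (net(zb) - hb).squeeze().tolist())
```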
To equip data-driven dynamic chemical process models with strong interpretability, we develop a light attention–convolution–gate recurrent unit (LACG) architecture with three sub-modules (a basic module, a brand-new light attention module, and a residue module) that are specially designed to learn the general dynamic behavior, transient disturbances, and other input factors of chemical processes, respectively. Combined with a hyperparameter optimization framework, Optuna, the effectiveness of the proposed LACG is tested by distributed control system data-driven modeling experiments on the discharge flowrate of an actual deethanization process. The LACG model provides significant advantages in prediction accuracy and model generalization compared with other models, including the feedforward neural network, convolution neural network, long short-term memory (LSTM), and attention-LSTM. Moreover, compared with the simulation results of a deethanization model built using Aspen Plus Dynamics V12.1, the LACG parameters are demonstrated to be interpretable, and more details on the variable interactions can be observed from the model parameters in comparison with the traditional interpretable model attention-LSTM. This contribution enriches interpretable machine learning knowledge and provides a reliable method with high accuracy for actual chemical process modeling, paving a route to intelligent manufacturing.
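A minimal sketch of how Optuna drives such a hyperparameter search (here for a generic GRU regressor on synthetic sequences; the search space and training budget are assumptions, not the paper's LACG configuration):

```python
import optuna
import torch

torch.manual_seed(0)
X = torch.randn(256, 20, 4)                    # (batch, time, features)
y = X[:, -5:, 0].mean(dim=1, keepdim=True)     # synthetic target

def objective(trial):
    hidden = trial.suggest_int("hidden", 8, 64)
    lr = trial.suggest_float("lr", 1e-4, 1e-2, log=True)
    gru = torch.nn.GRU(4, hidden, batch_first=True)
    head = torch.nn.Linear(hidden, 1)
    opt = torch.optim.Adam([*gru.parameters(), *head.parameters()], lr=lr)
    for _ in range(100):                       # short training budget per trial
        opt.zero_grad()
        out, _ = gru(X)
        loss = torch.nn.functional.mse_loss(head(out[:, -1]), y)
        loss.backward()
        opt.step()
    return loss.item()

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=20)
print(study.best_params)
```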
The complex sand-casting process, combined with the interactions between process parameters, makes it difficult to control casting quality, resulting in a high scrap rate. A strategy based on a data-driven model was proposed to reduce casting defects and improve production efficiency, which includes a random forest (RF) classification model, feature importance analysis, and process parameter optimization with Monte Carlo simulation. The collected data, which include four types of defects and the corresponding process parameters, were used to construct the RF model. Classification results show a recall rate above 90% for all categories. The Gini index was used to assess the importance of the process parameters in the formation of various defects in the RF model. Finally, the classification model was applied to different production conditions for quality prediction. In the case of process parameter optimization for gas porosity defects, this model serves as the experimental process in the Monte Carlo method to estimate a better temperature distribution. The prediction model, when applied to the factory, greatly improved the efficiency of defect detection. Results show that the scrap rate decreased from 10.16% to 6.68%.
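The strategy maps naturally onto a few library calls; in the sketch below the parameter names, ranges, and defect rule are invented stand-ins for the casting dataset:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(7)
names = ["pour_temp", "moisture", "permeability", "compactability"]
X = rng.uniform([1350, 2.5, 80, 30], [1450, 4.5, 160, 50], size=(800, 4))
y = ((X[:, 0] < 1380) & (X[:, 1] > 3.8)).astype(int)    # synthetic gas-porosity label

# Fit the RF classifier and rank parameters by Gini importance.
rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)
for n, imp in sorted(zip(names, rf.feature_importances_), key=lambda t: -t[1]):
    print(f"{n:15s} Gini importance = {imp:.3f}")

# Monte Carlo screening: sample candidate settings, keep the lowest defect risk.
candidates = rng.uniform([1350, 2.5, 80, 30], [1450, 4.5, 160, 50], size=(10_000, 4))
p_defect = rf.predict_proba(candidates)[:, 1]
best = candidates[p_defect.argmin()]
print("suggested setting:", dict(zip(names, np.round(best, 1))))
```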
Macrosegregation is a critical factor that limits the mechanical properties of materials. The impact of equiaxed crystal sedimentation on macrosegregation has been extensively studied, as it plays a significant role in determining the distribution of alloying elements and impurities within a material. To improve macrosegregation in steel connecting shafts, a multiphase solidification model that couples melt flow, heat transfer, microstructure evolution, and solute transport was established based on the volume-averaged Eulerian-Eulerian approach. In this model, the effects of the liquid phase, equiaxed crystals, columnar dendrites, and the columnar-to-equiaxed transition (CET) during solidification and microstructure evolution can be considered simultaneously. The sedimentation of equiaxed crystals contributes to negative macrosegregation, and regions between columnar dendrites and equiaxed crystals undergo significant A-type positive macrosegregation due to the CET. Additionally, noticeable positive macrosegregation occurs in the area of final solidification in the ingot. The improvement in macrosegregation is beneficial for enhancing the mechanical properties of connecting shafts. Reducing the superheat during casting, without employing external fields or altering the ingot mold design, is an effective approach to mitigating the thermal convection of molten steel and controlling macrosegregation.
BACKGROUND: Postoperative delirium, particularly prevalent in elderly patients after abdominal cancer surgery, presents significant challenges in clinical management. AIM: To develop a synthetic minority oversampling technique (SMOTE)-based model for predicting postoperative delirium in elderly abdominal cancer patients. METHODS: In this retrospective cohort study, we analyzed data from 611 elderly patients who underwent abdominal malignant tumor surgery at our hospital between September 2020 and October 2022. The incidence of postoperative delirium was recorded for 7 d post-surgery. Patients were divided into delirium and non-delirium groups according to whether postoperative delirium occurred. A multivariate logistic regression model was used to identify risk factors and develop a predictive model for postoperative delirium. The SMOTE technique was applied to enhance the model by oversampling the delirium cases. The model's predictive accuracy was then validated. RESULTS: In our study involving 611 elderly patients with abdominal malignant tumors, multivariate logistic regression analysis identified significant risk factors for postoperative delirium. These included the Charlson comorbidity index, American Society of Anesthesiologists classification, history of cerebrovascular disease, surgical duration, perioperative blood transfusion, and postoperative pain score. The incidence of postoperative delirium in our study was 22.91%. The original predictive model (P1) exhibited an area under the receiver operating characteristic curve of 0.862. In comparison, the SMOTE-based logistic early warning model (P2), which utilized the SMOTE oversampling algorithm, showed a slightly lower but comparable area under the curve of 0.856, suggesting no significant difference in performance between the two predictive approaches. CONCLUSION: This study confirms that the SMOTE-enhanced predictive model for postoperative delirium in elderly abdominal tumor patients shows performance equivalent to that of traditional methods, effectively addressing data imbalance.
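The P1-versus-P2 comparison reduces to a few lines with scikit-learn and imbalanced-learn (synthetic data stands in for the 611-patient cohort; the ~23% positive rate mirrors the reported incidence):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from imblearn.over_sampling import SMOTE

# Imbalanced synthetic cohort: ~77% non-delirium, ~23% delirium.
X, y = make_classification(n_samples=611, n_features=6, weights=[0.77],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

p1 = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)          # original model
X_sm, y_sm = SMOTE(random_state=0).fit_resample(X_tr, y_tr)     # oversample minority
p2 = LogisticRegression(max_iter=1000).fit(X_sm, y_sm)          # SMOTE-based model

print("AUC P1:", roc_auc_score(y_te, p1.predict_proba(X_te)[:, 1]))
print("AUC P2:", roc_auc_score(y_te, p2.predict_proba(X_te)[:, 1]))
```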
We have proposed a methodology to assess the robustness of underground tunnels against potential failure. This involves developing vulnerability functions for various qualities of rock mass and static loading intensities. To account for these variations, we utilized a Monte Carlo simulation (MCS) technique coupled with the finite difference code FLAC^(3D) to conduct two thousand seven hundred numerical simulations of a horseshoe tunnel located within rock masses with different geological strength index (GSI) values and subjected to different states of static loading. To quantify the severity of damage within the rock mass, we selected one stress-based criterion (the brittle shear ratio, BSR) and one strain-based failure criterion (the plastic damage index, PDI). Based on these criteria, we then developed fragility curves. Additionally, we used mathematical approximation techniques to produce vulnerability functions that relate the probabilities of various damage states to loading intensities for different quality classes of blocky rock mass. The results indicated that the fragility curves we obtained could accurately depict the evolution of the inner and outer shell damage around the tunnel. Therefore, we have provided engineers with a tool that can predict levels of damage associated with different failure mechanisms based on variations in rock mass quality and in situ stress state. Our method is a numerically developed, multi-variate approach that can aid engineers in making informed decisions about the robustness of underground tunnels.
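Fragility curves of this kind are commonly fit as lognormal CDFs by maximum likelihood; a sketch with synthetic damage outcomes (the intensity measure, sample counts, and true parameters are invented):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(8)
im = np.repeat(np.linspace(5, 60, 12), 25)              # loading intensity levels
p_true = norm.cdf((np.log(im) - np.log(25.0)) / 0.4)
damaged = rng.random(im.size) < p_true                  # 1 = damage state exceeded

def neg_loglik(params):
    """Binomial likelihood of the lognormal fragility P(im) = Phi((ln im - ln theta)/beta)."""
    ln_theta, beta = params
    p = norm.cdf((np.log(im) - ln_theta) / beta).clip(1e-9, 1 - 1e-9)
    return -(damaged * np.log(p) + (~damaged) * np.log(1 - p)).sum()

res = minimize(neg_loglik, x0=[np.log(20.0), 0.5], method="Nelder-Mead")
theta, beta = np.exp(res.x[0]), res.x[1]
print(f"median capacity theta = {theta:.1f}, dispersion beta = {beta:.2f}")
```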
Modern medicine is reliant on various medical imaging technologies for non-invasively observing patients' anatomy. However, the interpretation of medical images can be highly subjective and dependent on the expertise of clinicians. Moreover, some potentially useful quantitative information in medical images, especially that which is not visible to the naked eye, is often ignored during clinical practice. In contrast, radiomics performs high-throughput feature extraction from medical images, which enables quantitative analysis of medical images and prediction of various clinical endpoints. Studies have reported that radiomics exhibits promising performance in diagnosis and in predicting treatment responses and prognosis, demonstrating its potential to be a non-invasive auxiliary tool for personalized medicine. However, radiomics remains in a developmental phase, as numerous technical challenges have yet to be solved, especially in feature engineering and statistical modeling. In this review, we introduce the current utility of radiomics by summarizing research on its application in the diagnosis, prognosis, and prediction of treatment responses in patients with cancer. We focus on machine learning approaches: feature extraction and selection during feature engineering, and imbalanced datasets and multi-modality fusion during statistical modeling. Furthermore, we introduce the stability, reproducibility, and interpretability of features, and the generalizability and interpretability of models. Finally, we offer possible solutions to current challenges in radiomics research.
Rock fragmentation plays a critical role in rock avalanches, yet conventional approaches such as classical granular flow models or the bonded particle model have limitations in accurately characterizing the progressive disintegration and kinematics of multi-deformable rock blocks during rockslides. The present study proposes a discrete-continuous numerical model, based on a cohesive zone model, to explicitly incorporate the progressive fragmentation and intricate interparticle interactions inherent in rockslides. Breakable rock granular assemblies are released along an inclined plane and flow onto a horizontal plane. The numerical scenarios are established to incorporate variations in slope angle, initial height, friction coefficient, and particle number. The evolution of the fragmentation, kinematic, runout, and depositional characteristics is quantitatively analyzed and compared with experimental and field data. A positive linear relationship between the equivalent friction coefficient and the apparent friction coefficient is identified. In general, the granular mass predominantly exhibits characteristics of a dense granular flow, with the Savage number exhibiting a decreasing trend as the volume of the mass increases. The process of particle breakage gradually occurs in a bottom-up manner, leading to a significant increase in the angular velocities of the rock blocks with increasing depth. The simulation results reproduce the field observations of inverse grading and source stratigraphy preservation in the deposit. We propose a disintegration index that incorporates factors such as drop height, rock mass volume, and rock strength. Our findings demonstrate a consistent linear relationship between this index and the fragmentation degree in all tested scenarios.
Natural slopes usually display complicated exposed rock surfaces that are characterized by complex and substantial terrain undulation and ubiquitous undesirable phenomena such as vegetation cover and rockfalls. This study presents a systematic outcrop study of fracture pattern variations in a complicated rock slope, and a qualitative and quantitative study of the impact of these complex phenomena on three-dimensional (3D) discrete fracture network (DFN) modeling. As studies of outcrop fracture patterns have so far focused on local variations, we put forward a statistical analysis of global variations. The entire outcrop is partitioned into several subzones, and the subzone-scale variability of fracture geometric properties is analyzed (including the orientation, the density, and the trace length). The results reveal significant variations in fracture characteristics (such as the concentrative degree, the average orientation, the density, and the trace length) among different subzones. Moreover, the density of the fracture set that is approximately parallel to the slope surface exhibits a notably higher value compared with the other fracture sets across all subzones. To improve the accuracy of DFN modeling, the effects of three common phenomena resulting from vegetation and rockfalls are qualitatively analyzed and the corresponding quantitative data processing solutions are proposed. Subsequently, the 3D fracture geometric parameters are determined for different areas of the high-steep rock slope in terms of the subzone dimensions. The results show significant variations in the same set of 3D fracture parameters across different regions, with density differing by up to tenfold and mean trace length exhibiting differences of 3–4 times. The study results present precise geological structural information, improve modeling accuracy, and provide practical solutions for addressing complex outcrop issues.
Deterministic compartment models (CMs) and stochastic models, including stochastic CMs and agent-based models, are widely utilized in epidemic modeling. However, the relationship between CMs and their corresponding stochastic models is not well understood. The present study aimed to address this gap by conducting a comparative study using the susceptible, exposed, infectious, and recovered (SEIR) model and its extended CMs from the coronavirus disease 2019 modeling literature. We demonstrated the equivalence of the numerical solution of CMs using the Euler scheme and their stochastic counterparts through theoretical analysis and simulations. Based on this equivalence, we proposed an efficient model calibration method that could replicate the exact solution of CMs in the corresponding stochastic models through parameter adjustment. The advancement in calibration techniques enhanced the accuracy of stochastic modeling in capturing the dynamics of epidemics. However, it should be noted that discrete-time stochastic models cannot perfectly reproduce the exact solution of continuous-time CMs. Additionally, we proposed a new stochastic compartment and agent mixed model as an alternative to agent-based models for large-scale population simulations with a limited number of agents. This model offered a balance between computational efficiency and accuracy. The results of this research contributed to the comparison and unification of deterministic CMs and stochastic models in epidemic modeling. Furthermore, the results had implications for the development of hybrid models that integrated the strengths of both frameworks. Overall, the present study has provided valuable epidemic modeling techniques and their practical applications for understanding and controlling the spread of infectious diseases.
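The Euler-scheme correspondence can be demonstrated with a minimal SEIR pair (parameter values are illustrative): the stochastic chain draws binomial event counts whose per-step probabilities 1 - exp(-rate*dt) match the deterministic Euler rates to first order, and its ensemble mean tracks the deterministic curve for large populations.

```python
import numpy as np

N, beta, sigma, gamma, dt, steps = 100_000, 0.4, 0.2, 0.1, 0.5, 400
rng = np.random.default_rng(9)

def seir_euler():
    s, e, i, r = N - 10, 0, 10, 0
    out = []
    for _ in range(steps):                    # deterministic Euler scheme
        new_e = beta * s * i / N * dt
        new_i = sigma * e * dt
        new_r = gamma * i * dt
        s, e, i, r = s - new_e, e + new_e - new_i, i + new_i - new_r, r + new_r
        out.append(i)
    return np.array(out)

def seir_stochastic():
    s, e, i, r = N - 10, 0, 10, 0
    out = []
    for _ in range(steps):                    # binomial-chain counterpart
        new_e = rng.binomial(s, 1 - np.exp(-beta * i / N * dt))
        new_i = rng.binomial(e, 1 - np.exp(-sigma * dt))
        new_r = rng.binomial(i, 1 - np.exp(-gamma * dt))
        s, e, i, r = s - new_e, e + new_e - new_i, i + new_i - new_r, r + new_r
        out.append(i)
    return np.array(out)

det = seir_euler()
sto = np.mean([seir_stochastic() for _ in range(20)], axis=0)
print("peak infectious (Euler vs stochastic mean):", det.max(), sto.max())
```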
Ethylene glycol (EG) plays a pivotal role as a primary raw material in the polyester industry, and the syngas-to-EG route has become a significant technical route in production. The carbon monoxide (CO) gas-phase catalytic coupling to synthesize dimethyl oxalate (DMO) is a crucial process in the syngas-to-EG route, whereby the composition of the reactor outlet exerts influence on the ultimate quality of the EG product and the energy consumption of the subsequent separation process. However, measuring product quality in real time or establishing accurate dynamic mechanism models is challenging. To effectively model the DMO synthesis process, this study proposes a hybrid modeling strategy that integrates process mechanisms and data-driven approaches. The CO gas-phase catalytic coupling mechanism model is developed based on intrinsic kinetics and material balance, while a long short-term memory (LSTM) neural network is employed to predict the macroscopic reaction rate by leveraging temporal relationships derived from archived measurements. The proposed model is trained in a semi-supervised manner to accommodate limited-label data scenarios, leveraging historical data. By integrating these predictions with the mechanism model, the hybrid modeling approach provides reliable and interpretable forecasts of mass fractions. Empirical investigations unequivocally validate the superiority of the proposed hybrid modeling approach over conventional data-driven models (DDMs) and other hybrid modeling techniques.
Recent studies have underscored the significance of the capillary fringe in hydrological and biochemical processes. Moreover, its role in shallow waters is expected to be considerable. Traditionally, the study of groundwater flow has centered on unsaturated-saturated zones, often overlooking the impact of the capillary fringe. In this study, we introduce a steady-state two-dimensional model that integrates the capillary fringe into a 2D numerical solution. Our novel approach employs the potential form of the Richards equation, facilitating the determination of boundaries, pressures, and velocities across different ground surface zones. We utilized a two-dimensional FreeFem++ finite element model to compute the stationary solution. The validation of the model was conducted using experimental data. We employed the OFAT (one-factor-at-a-time) method to identify the most sensitive soil parameters and understand how changes in these parameters may affect the behavior and water dynamics of the capillary fringe. The results emphasize the role of hydraulic conductivity as a key parameter influencing capillary fringe shape and dynamics. Velocity values within the capillary fringe suggest the prevalence of horizontal flow. By varying the water table level and the incoming flow q0, we have shown the correlation between the water table elevation and the upper limit of the capillary fringe.
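The OFAT screening itself is a simple loop; in the sketch below the soil parameters and the closed-form stand-in for the simulated fringe height are assumptions (the study evaluates the FreeFem++ model at each perturbed parameter set instead):

```python
import numpy as np

base = {"Ks": 1e-5, "alpha": 2.0, "n": 1.8, "theta_s": 0.40}

def fringe_height(p):
    # Placeholder response surface; the study would run the FreeFem++ model here.
    return (0.3 * (p["Ks"] / 1e-5) ** 0.4 * p["alpha"] ** -0.8
            * (p["n"] - 1) ** -0.3 * (p["theta_s"] / 0.4))

for name in base:
    outputs = []
    for factor in (0.8, 1.0, 1.2):             # vary one factor at a time by +/-20%
        p = dict(base)
        p[name] = base[name] * factor
        outputs.append(fringe_height(p))
    spread = (max(outputs) - min(outputs)) / outputs[1]
    print(f"{name:8s} relative output spread = {spread:.2%}")
```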
Simulating the total ionizing dose (TID) response of an electrical system using transistor-level models can be difficult and expensive, particularly for digital integrated circuits (ICs). In this study, a method for modeling TID effects in complementary metal-oxide-semiconductor (CMOS) digital ICs based on the input/output buffer information specification (IBIS) was proposed. The digital IC was first divided into three parts based on its internal structure: the input buffer, the output buffer, and the functional area. Each of these three parts was modeled separately. Using the IBIS model, the transistor V-I characteristic curves of the buffers were processed, and the physical parameters were extracted and modeled in VHDL-AMS. In the functional area, logic functions were modeled in VHDL according to the datasheet. A golden digital IC model was developed by combining the input buffer, output buffer, and functional area models. Furthermore, the golden model was reconstructed based on TID experimental data, enabling the assessment of TID effects on the threshold voltage, carrier mobility, and timing of the digital IC. TID experiments were conducted using a CMOS non-inverting multiplexer, the NC7SZ157, and the results were compared with the simulation results, which showed that the relative errors were less than 2% at each dose point. This confirms the practicality and accuracy of the proposed modeling method. The TID effect model for digital ICs developed using this modeling technique includes both the logical function of the IC and the changes in electrical properties and functional degradation caused by TID, which has potential applications in the design of radiation-tolerant digital ICs.
The surrounding geological conditions and supporting structures of underground engineering are often updated during construction, and these updates require repeated numerical modeling. To improve the numerical modeling efficiency of underground engineering, a modularized and parametric modeling cloud server is developed using Python code. The basic framework of the cloud server is as follows: the modeling parameters are input into the web platform, Rhino and FLAC3D are invoked in the cloud server to build the model and run the simulations, and the simulation results are returned to the web platform. The modeling program can automatically generate instructions that run the modeling process in Rhino based on the input modeling parameters. The main modules of the modeling program include modeling the 3D geological structures, the underground engineering structures, and the supporting structures, as well as meshing the geometric models. In particular, various cross-sections of underground caverns are crafted as parametric modules in the modeling program. The modularized and parametric modeling program is used for a finite element simulation of the underground powerhouse of the Shuangjiangkou Hydropower Station. This complicated model is rapidly generated for the simulation, and the simulation results are reasonable. Thus, this modularized and parametric modeling program is applicable to three-dimensional finite element simulations and analyses.
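A hedged sketch of the web-to-solver dispatch (the endpoint, script template, and placeholder command are inventions for illustration; the actual system renders Rhino and FLAC3D instructions from the input parameters):

```python
import subprocess
import tempfile
from pathlib import Path
from flask import Flask, jsonify, request

app = Flask(__name__)

TEMPLATE = """; auto-generated FLAC3D-style input (placeholder syntax)
; cavern span={span} m, height={height} m, depth={depth} m, section={section}
"""

@app.post("/model")
def model():
    # Receive modeling parameters from the web platform and render the job script.
    p = request.get_json()
    script = TEMPLATE.format(span=p["span"], height=p["height"],
                             depth=p["depth"], section=p.get("section", "horseshoe"))
    path = Path(tempfile.mkdtemp()) / "job.dat"
    path.write_text(script)
    # In the real system this would call Rhino for geometry and FLAC3D to solve;
    # a placeholder command keeps the sketch self-contained.
    job = subprocess.Popen(["echo", f"submitted {path}"])
    return jsonify({"job_file": str(path), "pid": job.pid})

if __name__ == "__main__":
    app.run(port=8080)
```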
基金supported in part by the National Natural Science Foundation of China(NSFC)(92167106,61833014)Key Research and Development Program of Zhejiang Province(2022C01206)。
文摘The curse of dimensionality refers to the problem o increased sparsity and computational complexity when dealing with high-dimensional data.In recent years,the types and vari ables of industrial data have increased significantly,making data driven models more challenging to develop.To address this prob lem,data augmentation technology has been introduced as an effective tool to solve the sparsity problem of high-dimensiona industrial data.This paper systematically explores and discusses the necessity,feasibility,and effectiveness of augmented indus trial data-driven modeling in the context of the curse of dimen sionality and virtual big data.Then,the process of data augmen tation modeling is analyzed,and the concept of data boosting augmentation is proposed.The data boosting augmentation involves designing the reliability weight and actual-virtual weigh functions,and developing a double weighted partial least squares model to optimize the three stages of data generation,data fusion and modeling.This approach significantly improves the inter pretability,effectiveness,and practicality of data augmentation in the industrial modeling.Finally,the proposed method is verified using practical examples of fault diagnosis systems and virtua measurement systems in the industry.The results demonstrate the effectiveness of the proposed approach in improving the accu racy and robustness of data-driven models,making them more suitable for real-world industrial applications.
基金the Six Talent Peaks Project in Jiangsu Province,China(Grant No.JXQC-002)。
文摘The dynamical modeling of projectile systems with sufficient accuracy is of great difficulty due to high-dimensional space and various perturbations.With the rapid development of data science and scientific tools of measurement recently,there are numerous data-driven methods devoted to discovering governing laws from data.In this work,a data-driven method is employed to perform the modeling of the projectile based on the Kramers–Moyal formulas.More specifically,the four-dimensional projectile system is assumed as an It?stochastic differential equation.Then the least square method and sparse learning are applied to identify the drift coefficient and diffusion matrix from sample path data,which agree well with the real system.The effectiveness of the data-driven method demonstrates that it will become a powerful tool in extracting governing equations and predicting complex dynamical behaviors of the projectile.
基金funded by the National Natural Science Foundation of China(No.52161135202)Hangzhou Key Scientific Research Plan Project(No.2023SZD0028).
文摘Conventional automated machine learning(AutoML)technologies fall short in preprocessing low-quality raw data and adapting to varying indoor and outdoor environments,leading to accuracy reduction in forecasting short-term building energy loads.Moreover,their predictions are not transparent because of their black box nature.Hence,the building field currently lacks an AutoML framework capable of data quality enhancement,environment self-adaptation,and model interpretation.To address this research gap,an improved AutoML-based end-to-end data-driven modeling framework is proposed.Bayesian optimization is applied by this framework to find an optimal data preprocessing process for quality improvement of raw data.It bridges the gap where conventional AutoML technologies cannot automatically handle missing data and outliers.A sliding window-based model retraining strategy is utilized to achieve environment self-adaptation,contributing to the accuracy enhancement of AutoML technologies.Moreover,a local interpretable model-agnostic explanations-based approach is developed to interpret predictions made by the improved framework.It overcomes the poor interpretability of conventional AutoML technologies.The performance of the improved framework in forecasting one-hour ahead cooling loads is evaluated using two-year operational data from a real building.It is discovered that the accuracy of the improved framework increases by 4.24%–8.79%compared with four conventional frameworks for buildings with not only high-quality but also low-quality operational data.Furthermore,it is demonstrated that the developed model interpretation approach can effectively explain the predictions of the improved framework.The improved framework offers a novel perspective on creating accurate and reliable AutoML frameworks tailored to building energy load prediction tasks and other similar tasks.
基金supported by the Research Council of Norway under contracts 223252/F50 and 300844/F50the Trond Mohn Foundation。
文摘Global images of auroras obtained by cameras on spacecraft are a key tool for studying the near-Earth environment.However,the cameras are sensitive not only to auroral emissions produced by precipitating particles,but also to dayglow emissions produced by photoelectrons induced by sunlight.Nightglow emissions and scattered sunlight can contribute to the background signal.To fully utilize such images in space science,background contamination must be removed to isolate the auroral signal.Here we outline a data-driven approach to modeling the background intensity in multiple images by formulating linear inverse problems based on B-splines and spherical harmonics.The approach is robust,flexible,and iteratively deselects outliers,such as auroral emissions.The final model is smooth across the terminator and accounts for slow temporal variations and large-scale asymmetries in the dayglow.We demonstrate the model by using the three far ultraviolet cameras on the Imager for Magnetopause-to-Aurora Global Exploration(IMAGE)mission.The method can be applied to historical missions and is relevant for upcoming missions,such as the Solar wind Magnetosphere Ionosphere Link Explorer(SMILE)mission.
基金supported by the National Key Research and Development Program of China(2021 YFB 4000500,2021 YFB 4000501,and 2021 YFB 4000502)。
文摘Steam cracking is the dominant technology for producing light olefins,which are believed to be the foundation of the chemical industry.Predictive models of the cracking process can boost production efficiency and profit margin.Rapid advancements in machine learning research have recently enabled data-driven solutions to usher in a new era of process modeling.Meanwhile,its practical application to steam cracking is still hindered by the trade-off between prediction accuracy and computational speed.This research presents a framework for data-driven intelligent modeling of the steam cracking process.Industrial data preparation and feature engineering techniques provide computational-ready datasets for the framework,and feedstock similarities are exploited using k-means clustering.We propose LArge-Residuals-Deletion Multivariate Adaptive Regression Spline(LARD-MARS),a modeling approach that explicitly generates output formulas and eliminates potentially outlying instances.The framework is validated further by the presentation of clustering results,the explanation of variable importance,and the testing and comparison of model performance.
文摘This paper demonstrated the fabrication,characterization,datadriven modeling,and practical application of a 1D SnO_(2)nanofiber-based memristor,in which a 1D SnO_(2)active layer wassandwiched between silver(Ag)and aluminum(Al)electrodes.Thisdevice yielded a very high ROFF:RON of~104(ION:IOFF of~105)with an excellent activation slope of 10 mV/dec,low set voltage ofVSET~1.14 V and good repeatability.This paper physically explained the conduction mechanism in the layered SnO_(2)nanofiber-basedmemristor.The conductive network was composed of nanofibersthat play a vital role in the memristive action,since more conductive paths could facilitate the hopping of electron carriers.Energyband structures experimentally extracted with the adoption of ultraviolet photoelectron spectroscopy strongly support the claimsreported in this paper.An machine learning(ML)–assisted,datadriven model of the fabricated memristor was also developedemploying different popular algorithms such as polynomialregression,support vector regression,k nearest neighbors,andartificial neural network(ANN)to model the data of the fabricateddevice.We have proposed two types of ANN models(type I andtype II)algorithms,illustrated with a detailed flowchart,to modelthe fabricated memristor.Benchmarking with standard ML techniques shows that the type II ANN algorithm provides the bestmean absolute percentage error of 0.0175 with a 98%R^(2)score.The proposed data-driven model was further validated with the characterization results of similar new memristors fabricated adoptingthe same fabrication recipe,which gave satisfactory predictions.Lastly,the ANN type II model was applied to design and implementsimple AND&OR logic functionalities adopting the fabricatedmemristors with expected,near-ideal characteristics.
基金funding support from the science and technology innovation Program of Hunan Province(Grant No.2023RC1017)Hunan Provincial Postgraduate Research and Innovation Project(Grant No.CX20220109)National Natural Science Foundation of China Youth Fund(Grant No.52208378).
文摘Machine learning(ML)provides a new surrogate method for investigating groundwater flow dynamics in unsaturated soils.Traditional pure data-driven methods(e.g.deep neural network,DNN)can provide rapid predictions,but they do require sufficient on-site data for accurate training,and lack interpretability to the physical processes within the data.In this paper,we provide a physics and equalityconstrained artificial neural network(PECANN),to derive unsaturated infiltration solutions with a small amount of initial and boundary data.PECANN takes the physics-informed neural network(PINN)as a foundation,encodes the unsaturated infiltration physical laws(i.e.Richards equation,RE)into the loss function,and uses the augmented Lagrangian method to constrain the learning process of the solutions of RE by adding stronger penalty for the initial and boundary conditions.Four unsaturated infiltration cases are designed to test the training performance of PECANN,i.e.one-dimensional(1D)steady-state unsaturated infiltration,1D transient-state infiltration,two-dimensional(2D)transient-state infiltration,and 1D coupled unsaturated infiltration and deformation.The predicted results of PECANN are compared with the finite difference solutions or analytical solutions.The results indicate that PECANN can accurately capture the variations of pressure head during the unsaturated infiltration,and present higher precision and robustness than DNN and PINN.It is also revealed that PECANN can achieve the same accuracy as the finite difference method with fewer initial and boundary training data.Additionally,we investigate the effect of the hyperparameters of PECANN on solving RE problem.PECANN provides an effective tool for simulating unsaturated infiltration.
基金support provided by the National Natural Science Foundation of China(22122802,22278044,and 21878028)the Chongqing Science Fund for Distinguished Young Scholars(CSTB2022NSCQ-JQX0021)the Fundamental Research Funds for the Central Universities(2022CDJXY-003).
文摘To equip data-driven dynamic chemical process models with strong interpretability,we develop a light attention–convolution–gate recurrent unit(LACG)architecture with three sub-modules—a basic module,a brand-new light attention module,and a residue module—that are specially designed to learn the general dynamic behavior,transient disturbances,and other input factors of chemical processes,respectively.Combined with a hyperparameter optimization framework,Optuna,the effectiveness of the proposed LACG is tested by distributed control system data-driven modeling experiments on the discharge flowrate of an actual deethanization process.The LACG model provides significant advantages in prediction accuracy and model generalization compared with other models,including the feedforward neural network,convolution neural network,long short-term memory(LSTM),and attention-LSTM.Moreover,compared with the simulation results of a deethanization model built using Aspen Plus Dynamics V12.1,the LACG parameters are demonstrated to be interpretable,and more details on the variable interactions can be observed from the model parameters in comparison with the traditional interpretable model attention-LSTM.This contribution enriches interpretable machine learning knowledge and provides a reliable method with high accuracy for actual chemical process modeling,paving a route to intelligent manufacturing.
基金financially supported by the National Key Research and Development Program of China(2022YFB3706800,2020YFB1710100)the National Natural Science Foundation of China(51821001,52090042,52074183)。
文摘The complex sand-casting process combined with the interactions between process parameters makes it difficult to control the casting quality,resulting in a high scrap rate.A strategy based on a data-driven model was proposed to reduce casting defects and improve production efficiency,which includes the random forest(RF)classification model,the feature importance analysis,and the process parameters optimization with Monte Carlo simulation.The collected data includes four types of defects and corresponding process parameters were used to construct the RF model.Classification results show a recall rate above 90% for all categories.The Gini Index was used to assess the importance of the process parameters in the formation of various defects in the RF model.Finally,the classification model was applied to different production conditions for quality prediction.In the case of process parameters optimization for gas porosity defects,this model serves as an experimental process in the Monte Carlo method to estimate a better temperature distribution.The prediction model,when applied to the factory,greatly improved the efficiency of defect detection.Results show that the scrap rate decreased from 10.16% to 6.68%.
基金supported by the National Key Research and Development Program of China(2021YFB3702005)the National Natural Science Foundation of China(52304352)+3 种基金the Central Government Guides Local Science and Technology Development Fund Projects(2023JH6/100100046)2022"Chunhui Program"Collaborative Scientific Research Project(202200042)the Doctoral Start-up Foundation of Liaoning Province(2023-BS-182)the Technology Development Project of State Key Laboratory of Metal Material for Marine Equipment and Application[HGSKL-USTLN(2022)01].
文摘Macrosegregation is a critical factor that limits the mechanical properties of materials.The impact of equiaxed crystal sedimentation on macrosegregation has been extensively studied,as it plays a significant role in determining the distribution of alloying elements and impurities within a material.To improve macrosegregation in steel connecting shafts,a multiphase solidification model that couples melt flow,heat transfer,microstructure evolution,and solute transport was established based on the volume-averaged Eulerian-Eulerian approach.In this model,the effects of liquid phase,equiaxed crystals,columnar dendrites,and columnar-to-equiaxed transition(CET)during solidification and evolution of microstructure can be considered simultaneously.The sedimentation of equiaxed crystals contributes to negative macrosegregation,where regions between columnar dendrites and equiaxed crystals undergo significant A-type positive macrosegregation due to the CET.Additionally,noticeable positive macrosegregation occurs in the area of final solidification in the ingot.The improvement in macrosegregation is beneficial for enhancing the mechanical properties of connecting shafts.To mitigate the thermal convection of molten steel resulting from excessive superheating,reducing the superheating during casting without employing external fields or altering the design of the ingot mold is indeed an effective approach to control macrosegregation.
基金Supported by Discipline Advancement Program of Shanghai Fourth People’s Hospital,No.SY-XKZT-2020-2013.
文摘BACKGROUND Postoperative delirium,particularly prevalent in elderly patients after abdominal cancer surgery,presents significant challenges in clinical management.AIM To develop a synthetic minority oversampling technique(SMOTE)-based model for predicting postoperative delirium in elderly abdominal cancer patients.METHODS In this retrospective cohort study,we analyzed data from 611 elderly patients who underwent abdominal malignant tumor surgery at our hospital between September 2020 and October 2022.The incidence of postoperative delirium was recorded for 7 d post-surgery.Patients were divided into delirium and non-delirium groups based on the occurrence of postoperative delirium or not.A multivariate logistic regression model was used to identify risk factors and develop a predictive model for postoperative delirium.The SMOTE technique was applied to enhance the model by oversampling the delirium cases.The model’s predictive accuracy was then validated.RESULTS In our study involving 611 elderly patients with abdominal malignant tumors,multivariate logistic regression analysis identified significant risk factors for postoperative delirium.These included the Charlson comorbidity index,American Society of Anesthesiologists classification,history of cerebrovascular disease,surgical duration,perioperative blood transfusion,and postoperative pain score.The incidence rate of postoperative delirium in our study was 22.91%.The original predictive model(P1)exhibited an area under the receiver operating characteristic curve of 0.862.In comparison,the SMOTE-based logistic early warning model(P2),which utilized the SMOTE oversampling algorithm,showed a slightly lower but comparable area under the curve of 0.856,suggesting no significant difference in performance between the two predictive approaches.CONCLUSION This study confirms that the SMOTE-enhanced predictive model for postoperative delirium in elderly abdominal tumor patients shows performance equivalent to that of traditional methods,effectively addressing data imbalance.
基金funding received by a grant from the Natural Sciences and Engineering Research Council of Canada(NSERC)(Grant No.CRDPJ 469057e14).
文摘We have proposed a methodology to assess the robustness of underground tunnels against potential failure.This involves developing vulnerability functions for various qualities of rock mass and static loading intensities.To account for these variations,we utilized a Monte Carlo Simulation(MCS)technique coupled with the finite difference code FLAC^(3D),to conduct two thousand seven hundred numerical simulations of a horseshoe tunnel located within a rock mass with different geological strength index system(GSIs)and subjected to different states of static loading.To quantify the severity of damage within the rock mass,we selected one stress-based(brittle shear ratio(BSR))and one strain-based failure criterion(plastic damage index(PDI)).Based on these criteria,we then developed fragility curves.Additionally,we used mathematical approximation techniques to produce vulnerability functions that relate the probabilities of various damage states to loading intensities for different quality classes of blocky rock mass.The results indicated that the fragility curves we obtained could accurately depict the evolution of the inner and outer shell damage around the tunnel.Therefore,we have provided engineers with a tool that can predict levels of damages associated with different failure mechanisms based on variations in rock mass quality and in situ stress state.Our method is a numerically developed,multi-variate approach that can aid engineers in making informed decisions about the robustness of underground tunnels.
基金supported in part by the National Natural Science Foundation of China(82072019)the Shenzhen Basic Research Program(JCYJ20210324130209023)+5 种基金the Shenzhen-Hong Kong-Macao S&T Program(Category C)(SGDX20201103095002019)the Mainland-Hong Kong Joint Funding Scheme(MHKJFS)(MHP/005/20),the Project of Strategic Importance Fund(P0035421)the Projects of RISA(P0043001)from the Hong Kong Polytechnic University,the Natural Science Foundation of Jiangsu Province(BK20201441)the Provincial and Ministry Co-constructed Project of Henan Province Medical Science and Technology Research(SBGJ202103038,SBGJ202102056)the Henan Province Key R&D and Promotion Project(Science and Technology Research)(222102310015)the Natural Science Foundation of Henan Province(222300420575),and the Henan Province Science and Technology Research(222102310322).
文摘Modern medicine is reliant on various medical imaging technologies for non-invasively observing patients’anatomy.However,the interpretation of medical images can be highly subjective and dependent on the expertise of clinicians.Moreover,some potentially useful quantitative information in medical images,especially that which is not visible to the naked eye,is often ignored during clinical practice.In contrast,radiomics performs high-throughput feature extraction from medical images,which enables quantitative analysis of medical images and prediction of various clinical endpoints.Studies have reported that radiomics exhibits promising performance in diagnosis and predicting treatment responses and prognosis,demonstrating its potential to be a non-invasive auxiliary tool for personalized medicine.However,radiomics remains in a developmental phase as numerous technical challenges have yet to be solved,especially in feature engineering and statistical modeling.In this review,we introduce the current utility of radiomics by summarizing research on its application in the diagnosis,prognosis,and prediction of treatment responses in patients with cancer.We focus on machine learning approaches,for feature extraction and selection during feature engineering and for imbalanced datasets and multi-modality fusion during statistical modeling.Furthermore,we introduce the stability,reproducibility,and interpretability of features,and the generalizability and interpretability of models.Finally,we offer possible solutions to current challenges in radiomics research.
Funding: Supported by the National Key R&D Plan (Grant No. 2022YFC3004303); the National Natural Science Foundation of China (Grant No. 42107161); the State Key Laboratory of Hydroscience and Hydraulic Engineering (Grant No. 2021-KY-04); the Open Research Fund Program of the State Key Laboratory of Hydroscience and Engineering (sklhse-2023-C-01); the Open Research Fund Program of the Key Laboratory of the Hydrosphere of the Ministry of Water Resources (mklhs-2023-04); and the China Three Gorges Corporation (XLD/2117).
Abstract: Rock fragmentation plays a critical role in rock avalanches, yet conventional approaches such as classical granular flow models or the bonded particle model have limitations in accurately characterizing the progressive disintegration and kinematics of multiple deformable rock blocks during rockslides. The present study proposes a discrete-continuous numerical model, based on a cohesive zone model, to explicitly incorporate the progressive fragmentation and intricate interparticle interactions inherent in rockslides. Breakable rock granular assemblies are released along an inclined plane and flow onto a horizontal plane. The numerical scenarios incorporate variations in slope angle, initial height, friction coefficient, and particle number. The evolution of the fragmentation, kinematic, runout, and depositional characteristics is quantitatively analyzed and compared with experimental and field data. A positive linear relationship between the equivalent friction coefficient and the apparent friction coefficient is identified. In general, the granular mass predominantly exhibits the characteristics of a dense granular flow, with the Savage number decreasing as the volume of the mass increases. Particle breakage occurs gradually in a bottom-up manner, leading to a significant increase in the angular velocities of the rock blocks with increasing depth. The simulation results reproduce the field observations of inverse grading and source stratigraphy preservation in the deposit. We propose a disintegration index that incorporates drop height, rock mass volume, and rock strength, and we find a consistent linear relationship between this index and the degree of fragmentation in all tested scenarios.
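As a toy illustration of the reported positive linear relation between the two friction measures (all values invented, not from the study), one could fit the relation directly from per-scenario simulation output:

```python
# Minimal sketch: fit the linear relation between the apparent friction
# coefficient (fall height over runout, H/L) and the energy-based
# equivalent friction coefficient across hypothetical scenarios.
import numpy as np

apparent = np.array([0.42, 0.47, 0.51, 0.55, 0.60])    # H/L per scenario
equivalent = np.array([0.38, 0.44, 0.49, 0.52, 0.58])  # from energy dissipation

slope, intercept = np.polyfit(apparent, equivalent, 1)
print(f"equivalent ~ {slope:.2f} * apparent + {intercept:.2f}")
```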
Funding: Supported by the National Key Research and Development Program of China (Grant No. 2022YFC3080200); the National Natural Science Foundation of China (Grant No. 42022053); and the China Postdoctoral Science Foundation (Grant No. 2023M731264).
Abstract: Natural slopes usually display complicated exposed rock surfaces characterized by substantial terrain undulation and ubiquitous undesirable phenomena such as vegetation cover and rockfalls. This study presents a systematic outcrop investigation of fracture pattern variations in a complicated rock slope, together with a qualitative and quantitative study of how these complex phenomena affect three-dimensional (3D) discrete fracture network (DFN) modeling. Because studies of outcrop fracture patterns have so far focused on local variations, we put forward a statistical analysis of global variations. The entire outcrop is partitioned into several subzones, and the subzone-scale variability of fracture geometric properties (orientation, density, and trace length) is analyzed. The results reveal significant variations in fracture characteristics (such as the concentrative degree, average orientation, density, and trace length) among subzones. Moreover, the density of the fracture set that is approximately parallel to the slope surface is notably higher than that of the other fracture sets across all subzones. To improve the accuracy of DFN modeling, the effects of three common phenomena resulting from vegetation and rockfalls are qualitatively analyzed, and corresponding quantitative data processing solutions are proposed. Subsequently, the 3D fracture geometric parameters are determined for different areas of the high, steep rock slope in terms of the subzone dimensions. The results show significant variations in the same set of 3D fracture parameters across different regions, with density differing by up to tenfold and mean trace length by a factor of 3-4. The study presents precise geological structural information, improves modeling accuracy, and provides practical solutions for addressing complex outcrop issues.
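A minimal sketch of the subzone-scale statistics described above is given below; the fracture table, column names, and window areas are hypothetical, and real mapping data would of course include full orientation (dip/dip-direction) rather than a single dip angle.

```python
# Minimal sketch: per-subzone fracture statistics (mean orientation proxy,
# mean trace length, and areal density) from a hypothetical mapping table.
import pandas as pd

fractures = pd.DataFrame({
    "subzone":     ["A", "A", "A", "B", "B", "C"],
    "dip_deg":     [62, 58, 65, 40, 44, 71],
    "trace_len_m": [1.8, 2.3, 2.0, 0.9, 1.1, 3.4],
})
subzone_area_m2 = {"A": 25.0, "B": 18.0, "C": 30.0}  # mapped window areas

stats = fractures.groupby("subzone").agg(
    mean_dip=("dip_deg", "mean"),
    mean_trace=("trace_len_m", "mean"),
    count=("trace_len_m", "size"),
)
stats["density_per_m2"] = stats["count"] / stats.index.map(subzone_area_m2)
print(stats)
```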
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 82173620 to Yang Zhao and 82041024 to Feng Chen); partially supported by the Bill & Melinda Gates Foundation (Grant No. INV-006371 to Feng Chen) and the Priority Academic Program Development of Jiangsu Higher Education Institutions.
Abstract: Deterministic compartment models (CMs) and stochastic models, including stochastic CMs and agent-based models, are widely utilized in epidemic modeling. However, the relationship between CMs and their corresponding stochastic models is not well understood. The present study addresses this gap through a comparative study using the susceptible, exposed, infectious, and recovered (SEIR) model and its extended CMs from the coronavirus disease 2019 modeling literature. We demonstrate the equivalence of the numerical solution of CMs under the Euler scheme and their stochastic counterparts through theoretical analysis and simulations. Based on this equivalence, we propose an efficient model calibration method that can replicate the exact solution of CMs in the corresponding stochastic models through parameter adjustment. This advance in calibration enhances the accuracy of stochastic modeling in capturing the dynamics of epidemics; note, however, that discrete-time stochastic models cannot perfectly reproduce the exact solution of continuous-time CMs. Additionally, we propose a new mixed stochastic compartment-and-agent model as an alternative to agent-based models for large-scale population simulations with a limited number of agents, offering a balance between computational efficiency and accuracy. The results contribute to the comparison and unification of deterministic CMs and stochastic models in epidemic modeling, and they have implications for the development of hybrid models that integrate the strengths of both frameworks. Overall, the present study provides valuable epidemic modeling techniques and practical applications for understanding and controlling the spread of infectious diseases.
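The following minimal sketch (not the paper's code; the daily time step and parameter values are assumptions) contrasts the Euler scheme of a deterministic SEIR model with a binomial-chain stochastic counterpart. The equivalence discussed above corresponds to choosing the binomial transition probabilities so that their expectations match the Euler increments.

```python
# Minimal sketch: deterministic SEIR Euler update vs. a binomial-chain
# stochastic counterpart, with illustrative parameters and a daily step.
import numpy as np

N, beta, sigma, gamma, dt = 1_000_000, 0.4, 1 / 5, 1 / 7, 1.0

def euler_step(S, E, I, R):
    """One Euler step of the deterministic SEIR compartment model."""
    dE, dI, dR = beta * S * I / N * dt, sigma * E * dt, gamma * I * dt
    return S - dE, E + dE - dI, I + dI - dR, R + dR

def stochastic_step(S, E, I, R, rng):
    """Binomial-chain analogue; calibrating p = beta*I/N*dt (instead of
    1 - exp(-beta*I/N*dt)) would match the Euler increments in expectation."""
    dE = rng.binomial(S, 1 - np.exp(-beta * I / N * dt))
    dI = rng.binomial(E, 1 - np.exp(-sigma * dt))
    dR = rng.binomial(I, 1 - np.exp(-gamma * dt))
    return S - dE, E + dE - dI, I + dI - dR, R + dR

rng = np.random.default_rng(1)
det = sto = (N - 100, 0, 100, 0)
for _ in range(120):
    det, sto = euler_step(*det), stochastic_step(*sto, rng)
print(det[2], sto[2])  # infectious counts from the two schemes
```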
Funding: Supported in part by the National Key Research and Development Program of China (2022YFB3305300) and the National Natural Science Foundation of China (62173178).
Abstract: Ethylene glycol (EG) plays a pivotal role as a primary raw material in the polyester industry, and the syngas-to-EG route has become a significant technical route in production. Carbon monoxide (CO) gas-phase catalytic coupling to synthesize dimethyl oxalate (DMO) is a crucial process in this route, and the composition at the reactor outlet influences both the ultimate quality of the EG product and the energy consumption of the subsequent separation process. However, measuring product quality in real time or establishing accurate dynamic mechanism models is challenging. To effectively model the DMO synthesis process, this study proposes a hybrid modeling strategy that integrates process mechanisms and data-driven approaches. The CO gas-phase catalytic coupling mechanism model is developed from intrinsic kinetics and material balances, while a long short-term memory (LSTM) neural network predicts the macroscopic reaction rate by leveraging temporal relationships derived from archived measurements. The model is trained in a semi-supervised manner on historical data to accommodate scenarios with limited labels. By integrating the LSTM predictions with the mechanism model, the hybrid approach provides reliable and interpretable forecasts of mass fractions. Empirical investigations validate the superiority of the proposed hybrid modeling approach over conventional data-driven models (DDMs) and other hybrid modeling techniques.
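A hedged sketch of the hybrid idea follows: an LSTM maps a window of recent process measurements to a macroscopic reaction rate, which a simplified material balance then integrates into a mass-fraction forecast. The architecture, sensor count, and toy balance are illustrative assumptions, not the paper's actual model.

```python
# Minimal sketch: LSTM rate predictor feeding a toy material-balance update.
import torch
import torch.nn as nn

class RatePredictor(nn.Module):
    def __init__(self, n_features, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                 # x: (batch, window, n_features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])      # predicted macroscopic reaction rate

def mass_fraction_step(w_dmo, rate, feed, holdup, dt=1.0):
    """Toy mechanistic update: accumulation = generation - outflow dilution."""
    return w_dmo + dt * (rate - feed / holdup * w_dmo)

model = RatePredictor(n_features=6)
window = torch.randn(1, 20, 6)            # 20 past timesteps, 6 sensors
rate = model(window).item()
print(mass_fraction_step(w_dmo=0.45, rate=rate, feed=1.2, holdup=10.0))
```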
Abstract: Recent studies have underscored the significance of the capillary fringe in hydrological and biochemical processes, and its role in shallow waters is expected to be considerable. Traditionally, the study of groundwater flow has centered on the unsaturated and saturated zones, often overlooking the impact of the capillary fringe. In this study, we introduce a steady-state two-dimensional model that integrates the capillary fringe into a 2D numerical solution. Our approach employs the potential form of the Richards equation, facilitating the determination of boundaries, pressures, and velocities across different ground surface zones. We used a two-dimensional FreeFem++ finite element model to compute the stationary solution and validated the model against experimental data. We employed the OFAT (one-factor-at-a-time) method to identify the most sensitive soil parameters and to understand how changes in these parameters affect the behavior and water dynamics of the capillary fringe. The results emphasize the role of hydraulic conductivity as a key parameter influencing the shape and dynamics of the capillary fringe. Velocity values within the capillary fringe suggest the prevalence of horizontal flow. By varying the water table level and the incoming flow q0, we show the correlation between the water table elevation and the upper limit of the capillary fringe.
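The abstract does not reproduce the governing equation; one common potential (Kirchhoff-transform) form of the steady-state Richards equation, consistent with the description and offered here only as a plausible reading, is

```latex
\Phi(h) = \int_{-\infty}^{h} K(h')\,\mathrm{d}h', \qquad
\nabla^{2}\Phi + \frac{\partial K}{\partial z} = 0,
```

where $h$ is the pressure head, $K(h)$ the hydraulic conductivity, $z$ the vertical coordinate, and $\Phi$ the Kirchhoff potential. Since $\nabla\Phi = K\nabla h$, the steady-state mass balance $\nabla\cdot\bigl(K\nabla(h+z)\bigr)=0$ reduces to the quasi-linear form above, which is convenient for finite element solvers.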
Funding: This work was supported by the special fund of the State Key Laboratory of Intense Pulsed Radiation Simulation and Effect (No. SKLIPR2011).
Abstract: Simulating the total ionizing dose (TID) response of an electrical system using transistor-level models can be difficult and expensive, particularly for digital integrated circuits (ICs). In this study, a method for modeling TID effects in complementary metal-oxide semiconductor (CMOS) digital ICs based on the input/output buffer information specification (IBIS) is proposed. The digital IC is first divided into three parts according to its internal structure, namely the input buffer, the output buffer, and the functional area, and each part is modeled separately. Using the IBIS model, the transistor V-I characteristic curves of the buffers are processed, and the physical parameters are extracted and modeled in VHDL-AMS. In the functional area, logic functions are modeled in VHDL according to the datasheet. A golden digital IC model is developed by combining the input buffer, output buffer, and functional area models. Furthermore, the golden model is reconstructed based on TID experimental data, enabling the assessment of TID effects on the threshold voltage, carrier mobility, and timing behavior of the digital IC. TID experiments were conducted on a CMOS non-inverting multiplexer, the NC7SZ157, and the measurements were compared with the simulation results; the relative errors were less than 2% at each dose point, confirming the practicality and accuracy of the proposed method. The TID effect model developed with this technique captures both the logical function of the IC and the TID-induced changes in electrical properties and functional degradation, and it has potential applications in designing radiation-hardening tolerance into digital ICs.
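Purely as an illustration of folding a dose-dependent parameter shift back into a buffer's V-I description (the curve shape, dose coefficient, and shift model below are invented, not extracted from any real IBIS file), consider:

```python
# Minimal sketch: approximate a TID effect as a dose-dependent
# threshold-voltage shift applied to an IBIS-style buffer V-I curve.
import numpy as np

v = np.linspace(0.0, 3.3, 34)                 # output voltage sweep (V)
i_pulldown = 0.02 * np.maximum(v - 0.7, 0.0)  # toy NMOS-like V-I curve (A)

def shift_vi_curve(v, i, dose_krad, k_vth=0.002):
    """Model TID degradation as a threshold shift of k_vth volts per krad,
    i.e. the whole V-I curve slides right by dvth (toy assumption)."""
    dvth = k_vth * dose_krad
    return np.interp(v, v + dvth, i)

i_100krad = shift_vi_curve(v, i_pulldown, dose_krad=100.0)
print(i_pulldown[10], i_100krad[10])          # pre- vs post-irradiation current
```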
Funding: Supported by the Construction S&T Project of the Department of Transportation of Sichuan Province (Grant No. 2023A02) and the National Natural Science Foundation of China (No. 52109135).
Abstract: The surrounding geological conditions and supporting structures of underground engineering are often updated during construction, and these updates require repeated numerical modeling. To improve the efficiency of numerical modeling in underground engineering, a modularized and parametric modeling cloud server was developed in Python. The basic framework of the cloud server is as follows: the modeling parameters are input on the web platform; Rhino and FLAC3D are invoked on the cloud server to build the model and run the simulations; and the simulation results are returned to the web platform. The modeling program automatically generates, from the input parameters, the instructions that drive the modeling process in Rhino. Its main modules build the 3D geological structures, the underground engineering structures, and the supporting structures, and mesh the geometric models. In particular, various cross-sections of underground caverns are crafted as parametric modules in the modeling program. The modularized and parametric modeling program was applied to a numerical simulation of the underground powerhouse of the Shuangjiangkou Hydropower Station; this complicated model was rapidly generated, and the simulation results are reasonable. Thus, the program is applicable to three-dimensional numerical simulations and analyses.
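In the spirit of the described program (this is not its actual code, and the command strings are illustrative rather than verified FLAC3D syntax), a parametric module might turn web-form parameters into a solver command snippet like this:

```python
# Minimal sketch: generate FLAC3D-style commands from modeling parameters
# for a parametric tunnel model; all commands and property values are
# placeholders for illustration only.
def build_tunnel_commands(span, height, depth, mesh_size):
    """Return a command snippet for a box-shaped excavation region."""
    return "\n".join([
        f"; auto-generated model: span={span} m, height={height} m",
        f"zone create brick size {int(span / mesh_size)} "
        f"{int(height / mesh_size)} {int(depth / mesh_size)}",
        "zone cmodel assign mohr-coulomb",
        "zone property bulk 2e9 shear 1e9 cohesion 1e6",
    ])

print(build_tunnel_commands(span=20.0, height=15.0, depth=50.0, mesh_size=1.0))
```

Keeping each cross-section type as such a parametric function is what allows the server to regenerate the whole model whenever the geological conditions or supporting structures are updated.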