Purpose: The purpose of this paper is to eliminate the fluctuations in train arrival and departure times caused by skewed distributions in interval operation times. These fluctuations arise from random origin and process factors during interval operations and can accumulate over multiple intervals. The aim is to enhance the robustness of high-speed rail station arrival and departure track utilization schemes.

Design/methodology/approach: To achieve this objective, the paper simulates actual train operations, incorporating the fluctuations in interval operation times into the utilization of arrival and departure tracks at the station. The Monte Carlo simulation method is adopted to solve this problem. This approach transforms a nonlinear model, which includes constraints from probability distribution functions and is difficult to solve directly, into a linear programming model that is easier to handle. The method then linearly weights two objectives to optimize the solution.

Findings: Through the application of Monte Carlo simulation, the study successfully converts the complex nonlinear model with probability distribution function constraints into a manageable linear programming model. By continuously adjusting the weighting coefficients of the linear objectives, the method is able to optimize the Pareto solution. Notably, this approach does not require extensive scene data to obtain a satisfactory Pareto solution set.

Originality/value: The paper contributes to the field by introducing a novel method for optimizing high-speed rail station arrival and departure track utilization in the presence of fluctuations in interval operation times. The use of Monte Carlo simulation to transform the problem into a tractable linear programming model represents a significant advancement. Furthermore, the method's ability to produce satisfactory Pareto solutions without relying on extensive data sets adds to its practical value and applicability in real-world scenarios.
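The weighted-sum scalarization described in the Findings can be sketched as follows. The two objectives, the skewed interval-time distribution, and the candidate buffer schemes below are hypothetical stand-ins for the paper's actual model, chosen only to illustrate how sweeping the weighting coefficient traces a Pareto set from Monte Carlo samples.

```python
import random

random.seed(0)

# Hypothetical candidate schemes, each scored by Monte Carlo sampling of
# skewed interval operation times (the paper's objectives differ; sketch only).
def simulate_objectives(buffer_min, n_samples=2000):
    delay = 0.0
    for _ in range(n_samples):
        # Skewed (lognormal-like) fluctuation of the interval operation time
        fluctuation = random.lognormvariate(0.0, 0.5) - 1.0
        delay += max(0.0, fluctuation - buffer_min)
    expected_delay = delay / n_samples   # objective 1: robustness (lower is better)
    track_occupation = buffer_min        # objective 2: capacity cost of the buffer
    return expected_delay, track_occupation

candidates = [simulate_objectives(b / 10) for b in range(0, 21)]

# Sweep the weighting coefficient to collect nondominated solutions
pareto = set()
for w in [i / 10 for i in range(11)]:
    best = min(candidates, key=lambda f: w * f[0] + (1 - w) * f[1])
    pareto.add(best)

print(sorted(pareto))
```

Each weight setting yields one minimizer of the scalarized objective; collecting the minimizers over all weights approximates the Pareto front without enumerating extensive scenario data.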
This study proposed a new real-time manufacturing process monitoring method to monitor and detect process shifts in manufacturing operations. Real-time production process monitoring is critical in today's smart manufacturing: the more robust the monitoring model, the more reliably a process can be kept under control. In the past, many researchers have developed real-time monitoring methods to detect process shifts early. However, these methods have limitations in detecting process shifts as quickly as possible and in handling various data volumes and varieties. In this paper, a robust monitoring model combining Gated Recurrent Unit (GRU) and Random Forest (RF) with Real-Time Contrast (RTC), called GRU-RF-RTC, was proposed to detect process shifts rapidly. The effectiveness of the proposed GRU-RF-RTC model is first evaluated using multivariate normal and non-normal distribution datasets. Then, to prove the applicability of the proposed model in a real manufacturing setting, the model was evaluated using real-world normal and non-normal problems. The results demonstrate that the proposed GRU-RF-RTC outperforms other methods in detecting process shifts quickly, with the lowest average out-of-control run length (ARL1) in all synthetic and real-world problems under normal and non-normal cases. The experimental results on real-world problems highlight the significance of the proposed GRU-RF-RTC model in modern manufacturing process monitoring applications. The results reveal that the proposed method improves the shift detection capability by 42.14% in normal and 43.64% in gamma distribution problems.
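The ARL1 metric used to compare the methods above is the average number of samples between the true process shift and the chart's first alarm. A minimal sketch of its computation, with a made-up shift point and alarm streams, is:

```python
def arl1(signals, shift_index):
    """Average out-of-control run length: mean number of samples from the
    true shift point until the chart first signals (lower is better)."""
    runs = []
    for run in signals:  # each run is a list of booleans (True = alarm)
        for i in range(shift_index, len(run)):
            if run[i]:
                runs.append(i - shift_index + 1)
                break
    return sum(runs) / len(runs)

# Three simulated monitoring runs; the shift occurs at sample index 10
runs = [
    [False] * 12 + [True] * 8,   # first alarm 3 samples after the shift
    [False] * 10 + [True] * 10,  # first alarm 1 sample after the shift
    [False] * 14 + [True] * 6,   # first alarm 5 samples after the shift
]
print(arl1(runs, 10))  # → 3.0
```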
The extensive high-speed railway (HSR) network resembles the intricate vascular system of the human body, crisscrossing mainlands. Seismic events, known for their unpredictability, pose a significant threat to both trains and bridges, given the HSR's extended operational duration. Therefore, ensuring the running safety of the train-bridge coupled (TBC) system, primarily composed of simply supported beam bridges, is paramount. Traditional methods like the Monte Carlo method fall short of analyzing this intricate system efficiently. Instead, an efficient algorithm, the new point estimate method combined with moment expansion approximation (NPEM-MEA), is applied to study the random responses of numerically simulated TBC systems. The feasibility of the NPEM-MEA is validated against the Monte Carlo method. Comparative analysis confirms the accuracy and efficiency of the method, with a recommended truncation order of four to six for the NPEM-MEA. Additionally, the influences of seismic magnitude and epicentral distance are discussed based on the random dynamic responses of the TBC system. This methodology not only facilitates seismic safety assessments for TBC systems but also contributes to standard-setting for these systems under earthquake conditions.
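The point-estimate idea underlying methods of this family can be illustrated with Rosenblueth's classic two-point estimate, which evaluates the response function at a few deterministic points instead of thousands of Monte Carlo samples. This is not the paper's NPEM-MEA (which uses higher-order points and moment expansion); it is only a minimal sketch of the principle.

```python
def two_point_estimate(g, mean, std):
    """Rosenblueth's two-point estimate of the mean and variance of g(X)
    for a symmetric input X with the given mean and standard deviation.
    The paper's NPEM-MEA is more elaborate; this shows the basic idea:
    replace random sampling with weighted deterministic evaluation points."""
    points = [mean - std, mean + std]   # the two sampling points
    weights = [0.5, 0.5]                # equal weights for a symmetric input
    m1 = sum(w * g(x) for w, x in zip(weights, points))
    m2 = sum(w * g(x) ** 2 for w, x in zip(weights, points))
    return m1, m2 - m1 ** 2

# Example: response g(x) = x^2 with X ~ N(0, 1); the exact mean of g(X) is 1
mean, var = two_point_estimate(lambda x: x * x, 0.0, 1.0)
print(mean)  # → 1.0
```

Only two response evaluations are needed per random variable, which is why point-estimate methods scale so much better than Monte Carlo for systems that are expensive to simulate, at the cost of approximation error controlled by the truncation order.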
Acid production with flue gas is a complex nonlinear process with multiple variables and strong coupling. The operation data are an important basis for state monitoring, optimal control, and fault diagnosis. However, the operating environment of acid production with flue gas is complex and involves much equipment, so the data obtained by the detection equipment are seriously polluted and prone to abnormal phenomena such as data loss and outliers. Therefore, to solve the problem of abnormal data in the process of acid production with flue gas, a data cleaning method based on an improved random forest is proposed. Firstly, an outlier recognition model based on isolation forest is designed to identify and eliminate the outliers in the dataset. Secondly, an improved random forest regression model is established: a genetic algorithm is used to optimize the hyperparameters of the random forest regression model, the optimal parameter combination is found in the search space, and the trend of the data is predicted. Finally, the improved random forest data cleaning method is used to compensate for the missing data after the abnormal data have been eliminated, and the data cleaning is realized. Results show that the proposed method can accurately eliminate and compensate for the abnormal data in the process of acid production with flue gas, and it improves the accuracy of compensation for missing data. With the cleaned data, a more accurate model can be established, which is significant for the subsequent temperature control. The conversion rate of SO₂ can be further improved, thereby improving the yield of sulfuric acid and the economic benefits.
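The genetic-algorithm hyperparameter search step can be sketched as below. The fitness function here is a hypothetical smooth stand-in; in the paper it would be the cross-validated error of the random forest regressor for a given hyperparameter combination such as (number of trees, maximum depth).

```python
import random

random.seed(1)

# Hypothetical fitness surface with an optimum near (120, 12); a real
# implementation would evaluate the random forest's validation error here.
def fitness(n_trees, max_depth):
    return -((n_trees - 120) ** 2 / 1e4 + (max_depth - 12) ** 2 / 1e2)

def evolve(pop, generations=30):
    """Minimal GA: rank by fitness, keep the top half as parents,
    fill the rest with mutated crossovers of random parent pairs."""
    for _ in range(generations):
        pop.sort(key=lambda p: fitness(*p), reverse=True)
        parents = pop[: len(pop) // 2]
        children = []
        for _ in range(len(pop) - len(parents)):
            a, b = random.sample(parents, 2)
            child = (random.choice([a[0], b[0]]) + random.randint(-5, 5),
                     random.choice([a[1], b[1]]) + random.randint(-1, 1))
            children.append(child)
        pop = parents + children
    return max(pop, key=lambda p: fitness(*p))

population = [(random.randint(10, 300), random.randint(2, 30)) for _ in range(20)]
best = evolve(population)
print(best)  # converges toward the optimum region around (120, 12)
```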
Random numbers are one of the key foundations of cryptography. This work implements a discrete quantum random number generator (QRNG) based on the tunneling effect of electrons in an avalanche photodiode. Without any post-processing or conditioning, this QRNG can output raw sequences at a rate of 100 Mbps. Remarkably, the statistical min-entropy of an 8,000,000-bit sequence reaches 0.9944 bits/bit, and the min-entropy validated by NIST SP 800-90B reaches 0.9872 bits/bit. This metric is currently the highest value we have investigated for QRNG raw sequences. Moreover, this QRNG can continuously and stably output raw sequences with high randomness over extended periods. The system produced a continuous output of 1,174 Gbits of raw sequence over a duration of 11,744 s; with every 8 Mbits forming a unit, the statistical min-entropy distribution has an average value of 0.9892 bits/bit, and the statistical min-entropy of all the data (1,174 Gbits) achieves a value of 0.9951 bits/bit. This QRNG can produce high-quality raw sequences with good randomness and stability, and it has the potential to meet the high demand in cryptography for high-quality random numbers.
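The per-bit min-entropy figures quoted above follow the standard definition H_min = -log2(max_x p(x)), normalized by the symbol width. A small sketch over 8-bit blocks, using a synthetic perfectly uniform sequence rather than real QRNG output, is:

```python
import collections
import math

def min_entropy_per_bit(bits, block=8):
    """Statistical min-entropy H_min = -log2(p_max) estimated over
    `block`-bit symbols, normalized per bit."""
    symbols = [tuple(bits[i:i + block])
               for i in range(0, len(bits) - block + 1, block)]
    counts = collections.Counter(symbols)
    p_max = max(counts.values()) / len(symbols)
    return -math.log2(p_max) / block

# A toy perfectly uniform sequence: every 8-bit value appears exactly once
bits = [int(b) for v in range(256) for b in format(v, "08b")]
print(round(min_entropy_per_bit(bits), 4))  # → 1.0
```

A biased source would push p_max above 1/256 and the estimate below 1.0 bits/bit, which is the sense in which 0.9944 bits/bit indicates near-ideal raw randomness.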
In practical environments, the simultaneous occurrence of base excitation and crosswind is very common, and scavenging the combined energy of vibration and wind with a single energy harvesting structure is fascinating. For this purpose, the effects of the wind speed and the random excitation level are investigated with the stochastic averaging method (SAM) based on the energy envelope. The results of the analytical prediction are verified with the Monte Carlo method (MCM). The numerical simulation shows that the introduction of wind can reduce the critical excitation level for triggering an inter-well jump and enables a bi-stable energy harvester (BEH) to achieve enhanced performance under weak base excitation. However, as the strength of the wind increases to a particular level, the influence of the random base excitation on the dynamic responses is weakened, and the system exhibits a periodic galloping response. A comparison between a BEH and a linear energy harvester (LEH) indicates that the BEH demonstrates inferior performance for high-speed wind. Relevant experiments are conducted to investigate the validity of the theoretical prediction and numerical simulation. The experimental findings also show that strong random excitation is favorable for the BEH in the range of low wind speeds. However, once the speed of the incoming wind reaches a particular level, the disadvantage of the BEH becomes clear and evident.
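A Monte Carlo check of the kind described can be sketched with an Euler-Maruyama integration of a canonical bi-stable (double-well) oscillator under random base excitation. All coefficients here are illustrative, not the paper's, and the wind-induced galloping term is omitted; the sketch only shows how inter-well jumps are counted from simulated trajectories.

```python
import math
import random

random.seed(2)

def simulate(noise_level, steps=20000, dt=1e-3):
    """Euler-Maruyama integration of x'' + c x' - x + x^3 = white noise,
    a canonical bi-stable oscillator; returns the inter-well jump count."""
    x, v = 1.0, 0.0       # start in the right potential well (x = +1)
    jumps, side = 0, 1
    c = 0.1               # illustrative damping coefficient
    for _ in range(steps):
        a = -c * v + x - x ** 3
        v += a * dt + noise_level * math.sqrt(dt) * random.gauss(0, 1)
        x += v * dt
        if x * side < 0:  # crossed the potential barrier at x = 0
            jumps += 1
            side = -side
    return jumps

# Stronger random excitation typically triggers more inter-well jumps
print(simulate(0.1), simulate(1.0))
```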
Gas hydrate drilling expeditions in the Pearl River Mouth Basin, South China Sea, have identified concentrated gas hydrates with variable thickness. Moreover, free gas and the coexistence of gas hydrate and free gas have been confirmed by logging, coring, and production tests in the foraminifera-rich silty sediments with complex bottom-simulating reflectors (BSRs). Broad-band processing is conducted on conventional three-dimensional (3D) seismic data to improve the imaging and detection accuracy of gas hydrate-bearing layers and to delineate the saturation and thickness of gas hydrate- and free gas-bearing sediments. Several geophysical attributes extracted along the base of the gas hydrate stability zone are used to demonstrate the variable distribution and the controlling factors for the differential enrichment of gas hydrate. The inverted gas hydrate saturation at the production zone is over 40% with a thickness of 90 m, showing an interbedded distribution with different boundaries between gas hydrate- and free gas-bearing layers. However, the gas hydrate saturation value at the adjacent canyon is 70%, with 30-m-thick patches and linear features. The lithological and fault controls on the gas hydrate and free gas distributions are demonstrated by tracing each gas hydrate-bearing layer. Moreover, the BSR depths based on the broad-band reprocessed 3D seismic data not only exhibit variations due to small-scale topographic changes caused by seafloor sedimentation and erosion but also show the upward shift of the BSR and the blocky distribution of the coexistence of gas hydrate and free gas in the Pearl River Mouth Basin.
The current methods used to industrially produce sinomenine hydrochloride involve several issues, including high solvent toxicity, long process flows, and low atomic utilization efficiency, and the greenness scores of the processes are below 65 points. To solve these problems, a new process using anisole as the extractant was proposed. Anisole exhibits high selectivity for sinomenine and can be connected to the subsequent water-washing steps. After alkalization of the medicinal material, heating extraction, water washing, and acidification crystallization were carried out. The process was modeled and optimized, and the design space was constructed. The recommended operating ranges for the critical process parameters were 3.0-4.0 h for alkalization time, 60.0-80.0 °C for extraction temperature, 2.0-3.0 (volume ratio) for washing solution amount, and 2.0-2.4 mol·L^(-1) for hydrochloric acid concentration. The new process shows good robustness, as different batches of medicinal materials did not greatly impact the crystal purity or the sinomenine transfer rate. The sinomenine transfer rate was about 20% higher than that of the industrial processes. The greenness score increased to 90 points, since the novel process proposed in this research solves the problems of long process flow, high solvent toxicity, and poor atomic economy, better aligning with the concept of green chemistry.
The sensory perception of food is a dynamic process, closely related to the release of flavor substances during oral processing. It is affected not only by the food material but also by the individual oral environment. To explore the oral processing characteristics of soft-boiled chicken, the sensory properties, texture, particle size, viscosity, and characteristic values of the electronic nose and tongue of different chicken samples were investigated. The correlation analysis showed that the physical characteristics, especially the cohesiveness, springiness, and resilience of the sample, determined oral processing behavior. The addition of chicken skin played a lubricating role during oral processing: the particle size of the bolus was increased at the early stage and its fluidity was enhanced at the end, which reduced the chewing time to the swallowing point and raised the aromatic-compound signal of the electronic nose. However, for chicken thigh, with its relatively high fat content, the effect of chicken skin on the electronic nose was the opposite, and it had a certain masking effect on the perception of umami and sweet taste. In conclusion, fat played a critical role in chicken oral processing, and chicken thigh, which was more popular among consumers, had obvious advantages in the comprehensive evaluation of soft-boiled chicken.
The travel time of rock compressional waves is an essential parameter used for estimating important rock properties, such as porosity, permeability, and lithology. Current methods, like wireline logging tests, provide broad measurements but lack finer resolution. Laboratory-based rock core measurements offer higher resolution but are resource-intensive. Conventionally, wireline logging and rock core measurements have been used independently. This study introduces a novel approach that integrates both data sources. The method leverages the detailed features from limited core data to enhance the resolution of the wireline logging data. By combining machine learning with random field theory, the method allows for probabilistic predictions in regions with sparse data sampling. In this framework, 12 parameters from wireline tests are used to predict trends in the rock core data, and the residuals are modeled using random field theory. The outcomes are high-resolution predictions that combine both the predicted trend and probabilistic realizations of the residual. By utilizing unconditional and conditional random field theories, the method enables unconditional and conditional simulations of the underlying high-resolution rock compressional wave travel time profile and provides uncertainty estimates. This integrated approach optimizes the use of existing core and logging data, and its applicability is confirmed in an oil project in West China.
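The trend-plus-random-field-residual idea can be sketched in one dimension: a deterministic trend (standing in for the machine-learning prediction from the 12 logging parameters) plus a Gaussian residual with exponential covariance, simulated here as an AR(1) series over depth. The trend, noise level, and correlation length are all hypothetical.

```python
import math
import random

random.seed(3)

def simulate_profile(depths, trend, sigma=2.0, corr_len=5.0):
    """Unconditional simulation: deterministic trend plus a stationary
    Gaussian residual with exponential covariance, generated as an AR(1)
    recursion over unit-spaced depth samples."""
    rho = math.exp(-1.0 / corr_len)  # lag-1 correlation for unit spacing
    r = random.gauss(0, sigma)
    profile = []
    for z in depths:
        profile.append(trend(z) + r)
        r = rho * r + math.sqrt(1 - rho ** 2) * random.gauss(0, sigma)
    return profile

depths = range(100)
trend = lambda z: 60 + 0.1 * z   # hypothetical travel-time trend (us/ft)
realization = simulate_profile(depths, trend)
print(len(realization))  # → 100
```

Repeating the simulation yields an ensemble of equally plausible high-resolution profiles around the predicted trend, from which uncertainty bands can be read off; conditional simulation would additionally pin each realization to the measured core values.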
BACKGROUND: The mucosal barrier's immune-brain interactions, pivotal for neural development and function, are increasingly recognized for their potential causal and therapeutic relevance to irritable bowel syndrome (IBS). Prior studies linking immune inflammation with IBS have been inconsistent. To further elucidate this relationship, we conducted a Mendelian randomization (MR) analysis of 731 immune cell markers to dissect the influence of various immune phenotypes on IBS. Our goal was to deepen our understanding of the disrupted brain-gut axis in IBS and to identify novel therapeutic targets.

AIM: To leverage publicly available data to perform MR analysis on 731 immune cell markers and explore their impact on IBS. We aimed to uncover immunophenotypic associations with IBS that could inform future drug development and therapeutic strategies.

METHODS: We performed a comprehensive two-sample MR analysis to evaluate the causal relationship between immune cell markers and IBS. Utilizing genetic data from public databases, we examined the causal associations of 731 immune cell markers, encompassing median fluorescence intensity, relative cell abundance, absolute cell count, and morphological parameters, with IBS susceptibility. Sensitivity analyses were conducted to validate our findings and address potential heterogeneity and pleiotropy.

RESULTS: Bidirectional false discovery rate correction indicated no significant influence of IBS on the immunophenotypes. However, our analysis revealed a causal impact of 30 out of the 731 immune phenotypes on IBS (P<0.05). Nine immune phenotypes demonstrated a protective effect against IBS [inverse variance weighting (IVW) P<0.05, odds ratio (OR)<1], while 21 others were associated with an increased risk of IBS onset (IVW P<0.05, OR>1).

CONCLUSION: Our findings underscore a substantial genetic correlation between immune cell phenotypes and IBS, providing valuable insights into the pathophysiology of the condition. These results pave the way for the development of more precise biomarkers and targeted therapies for IBS. Furthermore, this research enriches our comprehension of immune cell roles in IBS pathogenesis, offering a foundation for more effective, personalized treatment approaches. These advancements hold promise for improving IBS patients' quality of life and reducing the disease burden on individuals and their families.
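The IVW estimate used in MR analyses like the one above is a precision-weighted average of per-SNP Wald ratios. A minimal sketch, with made-up per-SNP effect sizes and standard errors, is:

```python
import math

def ivw(betas, ses):
    """Fixed-effect inverse-variance-weighted MR estimate: each SNP's
    exposure-outcome Wald ratio beta_i is weighted by 1/se_i^2."""
    weights = [1.0 / se ** 2 for se in ses]
    beta = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return beta, se

# Hypothetical per-SNP causal effect estimates and standard errors
betas = [0.12, 0.08, 0.15, 0.10]
ses = [0.05, 0.04, 0.06, 0.05]
beta, se = ivw(betas, ses)
odds_ratio = math.exp(beta)
print(round(beta, 3), round(se, 3), round(odds_ratio, 3))  # → 0.106 0.024 1.111
```

An OR below 1 on this scale corresponds to a negative pooled beta (a protective phenotype), and an OR above 1 to a positive one, matching the two groups reported in the Results.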
BACKGROUND: Clinical studies have reported that patients with gastroesophageal reflux disease (GERD) have a higher prevalence of hypertension.

AIM: To perform a bidirectional Mendelian randomization (MR) analysis to investigate the causal link between GERD and essential hypertension.

METHODS: Eligible single nucleotide polymorphisms (SNPs) were selected, and weighted median, inverse variance weighted (IVW), and MR-Egger regression were used to examine the potential causal association between GERD and hypertension. The MR-Pleiotropy RESidual Sum and Outlier analysis was used to detect and attempt to reduce horizontal pleiotropy by removing outlier SNPs. The MR-Egger intercept test, Cochran's Q test, and "leave-one-out" sensitivity analysis were performed to evaluate the horizontal pleiotropy, heterogeneities, and stability of single instrumental variables.

RESULTS: IVW analysis exhibited an increased risk of hypertension (OR=1.46, 95%CI: 1.33-1.59, P=2.14E-16) in GERD patients, and the same result was obtained in the replication practice (OR=1.002, 95%CI: 1.0008-1.003, P=0.000498). Meanwhile, the IVW analysis showed an increase in systolic blood pressure (β=0.78, 95%CI: 0.11-1.44, P=0.021) and an increased risk of hypertensive heart disease (OR=1.68, 95%CI: 1.36-2.08, P=0.0000016) in GERD patients. Moreover, we found a decreased risk of Barrett's esophagus (OR=0.91, 95%CI: 0.83-0.99, P=0.043) in essential hypertension patients.

CONCLUSION: We found that GERD increases the risk of essential hypertension, which provides a novel preventive and therapeutic perspective on essential hypertension.
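OR and 95% CI figures of the kind reported above come from exponentiating the log-odds estimate and its ±1.96-standard-error bounds. The beta and SE below are illustrative values chosen to land near the headline OR of 1.46, not the paper's actual raw numbers.

```python
import math

def or_with_ci(beta, se, z=1.96):
    """Convert a log-odds MR estimate and its standard error into an
    odds ratio with a 95% confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Illustrative log-odds estimate and standard error (hypothetical)
odds_ratio, lo, hi = or_with_ci(beta=0.3784, se=0.0455)
print(round(odds_ratio, 2), round(lo, 2), round(hi, 2))  # → 1.46 1.34 1.6
```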
A large-scale fine-grained Mg-Gd-Y-Zn-Zr alloy plate with high strength and ductility was successfully prepared by multi-pass friction stir processing (MFSP) technology in this work. The structure of the grains and the long-period stacking ordered (LPSO) phase was characterized, and the uniformity of the mechanical properties was investigated. Moreover, a quantitative relationship between the microstructure and the tensile yield strength was established. The results showed that the grains in the processed zone (PZ) and the interfacial zone (IZ) were refined from 50 μm to 3 μm and 4 μm, respectively, and numerous original LPSO phases were broken. In the IZ, some block-shaped 18R LPSO phases were transformed into needle-like 14H LPSO phases due to stacking faults and the short-range diffusion of solute atoms. The severe shear deformation, imparted in the form of kinetic energy, caused profuse stacking faults to be generated and to move rapidly, greatly increasing the transformation rate of the LPSO phase. After MFSP, the ultimate tensile strength, yield strength, and elongation to failure of the large-scale plate were 367 MPa, 305 MPa, and 18.0%, respectively. Grain refinement and LPSO phase strengthening were the major strengthening mechanisms for the MFSP sample. In particular, the strength of the IZ was comparable to that of the PZ because the strength contribution of the 14H LPSO phase offsets the lack of grain-refinement strengthening in the IZ. This result opposes the widely accepted notion that the IZ is a weak region in MFSP-prepared large-scale fine-grained plates.
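A quantitative microstructure-strength link of the kind mentioned above is typically built on a Hall-Petch grain-size term plus phase-strengthening contributions. The coefficients below are generic illustrative values for a Mg alloy, not the paper's fitted ones; the sketch only shows how grain refinement raises the predicted yield strength.

```python
import math

def yield_strength(d_um, sigma0=80.0, k=250.0, delta_lpso=60.0):
    """Hall-Petch grain-size term plus a lumped LPSO strengthening
    increment: sigma_y = sigma0 + k / sqrt(d) + delta_lpso.
    All strengths in MPa, grain size d in micrometres; sigma0, k, and
    delta_lpso are illustrative, not fitted to the paper's data."""
    return sigma0 + k / math.sqrt(d_um) + delta_lpso

coarse = yield_strength(50.0)   # 50 um grains, before MFSP
fine = yield_strength(3.0)      # 3 um grains in the processed zone
print(round(coarse, 1), round(fine, 1))  # → 175.4 284.3
```

The d^(-1/2) dependence is why refining grains from 50 μm to 3-4 μm dominates the strength gain, and why a strong 14H LPSO contribution can compensate for the slightly coarser grains in the IZ.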
The complex sand-casting process, combined with the interactions between process parameters, makes it difficult to control casting quality, resulting in a high scrap rate. A strategy based on a data-driven model was proposed to reduce casting defects and improve production efficiency, comprising a random forest (RF) classification model, a feature importance analysis, and process parameter optimization with Monte Carlo simulation. The collected data, covering four types of defects and the corresponding process parameters, were used to construct the RF model. Classification results show a recall rate above 90% for all categories. The Gini index was used to assess the importance of the process parameters in the formation of the various defects in the RF model. Finally, the classification model was applied to different production conditions for quality prediction. In the case of process parameter optimization for gas porosity defects, this model serves as the experimental process in the Monte Carlo method to estimate a better temperature distribution. The prediction model, when applied to the factory, greatly improved the efficiency of defect detection. Results show that the scrap rate decreased from 10.16% to 6.68%.
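The per-class recall figure quoted above is recall = TP / (TP + FN), computed separately for each defect category. The defect names and counts below are invented for illustration only.

```python
def recall_per_class(confusion):
    """confusion[c] = (true_positives, false_negatives) for class c;
    recall = TP / (TP + FN), the fraction of actual defects caught."""
    return {c: tp / (tp + fn) for c, (tp, fn) in confusion.items()}

# Hypothetical counts for four defect categories
confusion = {
    "gas_porosity": (95, 5),
    "shrinkage": (47, 3),
    "sand_inclusion": (91, 9),
    "cold_shut": (28, 2),
}
recalls = recall_per_class(confusion)
print(all(r >= 0.90 for r in recalls.values()))  # → True
```

High recall matters more than precision here: a missed defect (false negative) ships a bad casting, whereas a false alarm only triggers a re-inspection.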
Recently, the increasing interest in wearable technology for personal healthcare and smart virtual/augmented reality applications has led to the development of facile fabrication methods. Lasers have long been used to develop original solutions to such challenging technological problems due to their remote, sterile, rapid, and site-selective processing of materials. In this review, recent developments in relevant laser processes are summarized under two separate categories. First, transformative approaches, such as laser-induced graphene, are introduced. In addition to design optimization and the alteration of a native substrate, the latest advances under the transformative approach now enable more complex material compositions and multilayer device configurations through the simultaneous transformation of heterogeneous precursors or the sequential addition of functional layers coupled with other electronic elements. In addition, the more conventional laser techniques, such as ablation, sintering, and synthesis, can still be used to enhance the functionality of an entire system through the expansion of applicable materials and the adoption of new mechanisms. Finally, various wearable device components developed through the corresponding laser processes are discussed, with an emphasis on chemical/physical sensors and energy devices. Special attention is given to applications that use multiple laser sources or processes, which lay the foundation for the all-laser fabrication of wearable devices.
BACKGROUND: Non-alcoholic fatty liver disease (NAFLD) and alcohol-related liver disease (ArLD) constitute the primary forms of chronic liver disease, and their incidence is progressively increasing with changes in lifestyle habits. Earlier studies have documented a correlation between the occurrence and development of prevalent mental disorders and fatty liver.

AIM: To investigate the correlation between fatty liver and mental disorders through a Mendelian randomization (MR) study designed to elucidate this association.

METHODS: Data on NAFLD and ArLD were retrieved from the genome-wide association studies catalog, while information on mental disorders, including Alzheimer's disease, schizophrenia, anxiety disorder, attention deficit hyperactivity disorder (ADHD), bipolar disorder, major depressive disorder, multiple personality disorder, obsessive-compulsive disorder (OCD), and post-traumatic stress disorder (PTSD), was acquired from the Psychiatric Genomics Consortium. A two-sample MR method was applied to investigate mediators in significant associations.

RESULTS: After excluding weak instrumental variables, a causal relationship was identified between fatty liver disease and the occurrence and development of some psychiatric disorders. Specifically, the findings indicated that ArLD was associated with a significantly elevated risk of developing ADHD (OR: 5.81, 95%CI: 5.59-6.03, P<0.01), bipolar disorder (OR: 5.73, 95%CI: 5.42-6.05, P=0.03), OCD (OR: 6.42, 95%CI: 5.60-7.36, P<0.01), and PTSD (OR: 5.66, 95%CI: 5.33-6.01, P<0.01). Meanwhile, NAFLD significantly increased the risk of developing bipolar disorder (OR: 55.08, 95%CI: 3.59-845.51, P<0.01), OCD (OR: 61.50, 95%CI: 6.69-565.45, P<0.01), and PTSD (OR: 52.09, 95%CI: 4.24-639.32, P<0.01).

CONCLUSION: Associations were found between genetic predisposition to fatty liver disease and an increased risk of a broad range of psychiatric disorders, namely bipolar disorder, OCD, and PTSD, highlighting the significance of preventive measures against psychiatric disorders in patients with fatty liver disease.
Reducing process variation is a significant concern for resistive random access memory (RRAM). Due to its ultrahigh integration density, RRAM arrays are prone to lithographic variation during the lithography process, which introduces electrical variation among different RRAM devices. In this work, an optical physical verification methodology for the RRAM array is developed, and the effects of different layout parameters on important electrical characteristics are systematically investigated. The results indicate that the RRAM devices can be categorized into three clusters according to their locations and lithography environments. The read resistance is more sensitive to the location in the array (~30%) than the SET/RESET voltage is (<10%). Increasing the RRAM device length and applying the optical proximity correction technique can help to reduce the variation to less than 10%, but this reduces the RRAM read resistance by 4×, resulting in higher power and area consumption. As such, we provide design guidelines to minimize the electrical variation of RRAM arrays due to the lithography process.
Due to prolonged operation times and low mass transfer efficiency, the primary challenge in the aeration of non-Newtonian fluids is high energy consumption, which is closely related to the form and rate of the impeller, the ventilation, the rheological properties, and the bubble morphology in the reactor. From this perspective, through optimized computational fluid dynamics models and experiments, the relationship between power consumption, volumetric mass transfer rate (kLa), and initial bubble size (d0) was constructed to establish an efficient operation mode for the aeration of non-Newtonian fluids. It was found that reducing d0 could significantly increase the oxygen mass transfer rate, resulting in an obvious decrease in the required ventilation volume and impeller speed. When d0 was regulated within 2-5 mm, an optimal kLa could be achieved and 21% of the power consumption could be saved, compared to the case of bubbles with a diameter of 10 mm.
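The inverse dependence of kLa on bubble size noted above follows from the gas-liquid interfacial area per unit volume, a = 6·εg/d0 for spherical bubbles, so that kLa = kL·a scales as 1/d0. The gas hold-up and liquid-side coefficient kL below are illustrative round numbers, not the paper's measured values.

```python
def k_l_a(d0_m, gas_holdup=0.05, k_l=4e-4):
    """Volumetric mass transfer coefficient kLa = kL * a, with the
    specific interfacial area a = 6 * gas_holdup / d0 for spherical
    bubbles. gas_holdup and k_l are illustrative values (SI units)."""
    a = 6.0 * gas_holdup / d0_m   # interfacial area, m^2 per m^3 of liquid
    return k_l * a                # kLa in 1/s

small = k_l_a(0.003)   # 3 mm bubbles, within the optimal 2-5 mm range
large = k_l_a(0.010)   # 10 mm bubbles, the reference case
print(round(small / large, 2))  # → 3.33
```

At fixed kL and hold-up, shrinking bubbles from 10 mm to 3 mm roughly triples kLa, which is why a smaller d0 lets the same oxygen transfer be met at lower ventilation and impeller power.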
基金The authors appreciate support from the National Science and Technology Council of Taiwan (Contract Nos. 111-2221 E-011081 and 111-2622-E-011019) and from the Intelligent Manufacturing Innovation Center (IMIC), National Taiwan University of Science and Technology (NTUST), Taipei, Taiwan, which is a Featured Areas Research Center in the Higher Education Sprout Project of the Ministry of Education (MOE), Taiwan (since 2023). We also thank the Wang Jhan Yang Charitable Trust Fund (Contract No. WJY 2020-HR-01) for its financial support.
文摘This study proposed a new real-time manufacturing process monitoring method to monitor and detect process shifts in manufacturing operations, since real-time production process monitoring is critical in today's smart manufacturing. The more robust the monitoring model, the more reliably a process can be kept under control. In the past, many researchers have developed real-time monitoring methods to detect process shifts early. However, these methods have limitations in detecting process shifts as quickly as possible and in handling various data volumes and varieties. In this paper, a robust monitoring model combining the Gated Recurrent Unit (GRU) and Random Forest (RF) with Real-Time Contrast (RTC), called GRU-RF-RTC, was proposed to detect process shifts rapidly. The effectiveness of the proposed GRU-RF-RTC model is first evaluated using multivariate normal and non-normal distribution datasets. Then, to prove the applicability of the proposed model in a real manufacturing setting, the model was evaluated using real-world normal and non-normal problems. The results demonstrate that the proposed GRU-RF-RTC outperforms other methods in detecting process shifts quickly, with the lowest average out-of-control run length (ARL1) in all synthetic and real-world problems under normal and non-normal cases. The experimental results on real-world problems highlight the significance of the proposed GRU-RF-RTC model in modern manufacturing process monitoring applications. The results reveal that the proposed method improves the shift detection capability by 42.14% in normal and 43.64% in gamma distribution problems.
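The ARL1 metric cited in this abstract is simply the average number of post-shift samples a monitoring scheme needs before it signals. The toy sketch below estimates ARL1 by simulation; a plain Shewhart-style mean rule stands in for the GRU-RF-RTC statistic (which the paper does not specify here), and all thresholds and shift sizes are illustrative assumptions.

```python
import random
import statistics

random.seed(3)

def run_length_to_signal(shift, threshold=3.0, window=50):
    """Fit control limits on an in-control N(0,1) baseline window,
    then count how many shifted samples arrive before the statistic
    exceeds the limit. A simple Shewhart rule stands in for the
    paper's GRU-RF-RTC monitoring statistic."""
    baseline = [random.gauss(0.0, 1.0) for _ in range(window)]
    mu, sd = statistics.mean(baseline), statistics.stdev(baseline)
    n = 0
    while True:
        n += 1
        x = random.gauss(shift, 1.0)  # process after the mean shift
        if abs(x - mu) > threshold * sd:
            return n

# ARL1: average run length after a shift (smaller = faster detection).
arl1_small_shift = statistics.mean(run_length_to_signal(1.0) for _ in range(200))
arl1_large_shift = statistics.mean(run_length_to_signal(3.0) for _ in range(200))
```

Larger shifts are detected sooner, so their ARL1 is lower; the paper's comparisons amount to showing that GRU-RF-RTC attains the lowest ARL1 across distributions.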
基金National Natural Science Foundation of China under Grant Nos. 11972379 and 42377184; Hunan 100-Talent Plan; Natural Science Foundation of Hunan Province under Grant No. 2022JJ10079; Hunan High-Level Talent Plan under Grant No. 420030004; Central South University Research Project under Grant Nos. 202045006 (Innovation-Driven Project) and 502390001.
文摘Extensive high-speed railway (HSR) networks resemble the intricate vascular system of the human body, crisscrossing the mainland. Seismic events, known for their unpredictability, pose a significant threat to both trains and bridges, given the HSR's extended operational duration. Therefore, ensuring the running safety of the train-bridge coupled (TBC) system, primarily composed of simply supported beam bridges, is paramount. Traditional methods like the Monte Carlo method fall short of analyzing this intricate system efficiently. Instead, an efficient algorithm, the new point estimate method combined with moment expansion approximation (NPEM-MEA), is applied to study the random responses of numerically simulated TBC systems. Validation of the NPEM-MEA's feasibility is conducted using the Monte Carlo method. Comparative analysis confirms the accuracy and efficiency of the method, with a recommended truncation order of four to six for the NPEM-MEA. Additionally, the influences of seismic magnitude and epicentral distance are discussed based on the random dynamic responses of the TBC system. This methodology not only facilitates seismic safety assessments for TBC systems but also contributes to standard-setting for these systems under earthquake conditions.
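Point estimate methods of the kind generalized by the NPEM-MEA replace thousands of Monte Carlo draws with a handful of deterministic evaluation points. The minimal sketch below uses Rosenblueth's classical two-point estimate for one normal input (a much simpler cousin of the paper's method; the response function and parameters are invented) and checks it against brute-force Monte Carlo.

```python
import random
import statistics

random.seed(1)

def two_point_estimate(mu, sigma, g):
    """Rosenblueth's two-point estimate of E[g(X)] for X ~ N(mu, sigma^2):
    evaluate g at mu - sigma and mu + sigma and average the results.
    Exact for responses up to quadratic order."""
    return 0.5 * (g(mu + sigma) + g(mu - sigma))

g = lambda x: x * x          # toy structural response function
mu, sigma = 2.0, 0.5

# Two deterministic evaluations vs. 50,000 random ones.
pem = two_point_estimate(mu, sigma, g)   # analytic value: mu^2 + sigma^2
mc = statistics.mean(g(random.gauss(mu, sigma)) for _ in range(50_000))
```

Two model runs reproduce what Monte Carlo needs tens of thousands of runs to estimate, which is the efficiency argument the abstract makes for the TBC system.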
基金supported by the National Natural Science Foundation of China (61873006) and the Beijing Natural Science Foundation (4204087, 4212040).
文摘Acid production with flue gas is a complex nonlinear process with multiple variables and strong coupling. The operation data is an important basis for state monitoring, optimal control, and fault diagnosis. However, the operating environment of acid production with flue gas is complex and involves much equipment. The data obtained by the detection equipment are seriously polluted and prone to abnormal phenomena such as data loss and outliers. Therefore, to solve the problem of abnormal data in the process of acid production with flue gas, a data cleaning method based on an improved random forest is proposed. Firstly, an outlier recognition model based on isolation forest is designed to identify and eliminate the outliers in the dataset. Secondly, an improved random forest regression model is established; a genetic algorithm is used to optimize the hyperparameters of the random forest regression model, finding the optimal parameter combination in the search space and predicting the trend of the data. Finally, the improved random forest data cleaning method is used to compensate for the missing data after the abnormal data are eliminated, realizing the data cleaning. Results show that the proposed method can accurately eliminate and compensate for the abnormal data in the process of acid production with flue gas, improving the accuracy of compensation for missing data. With the cleaned data, a more accurate model can be established, which is significant for the subsequent temperature control. The conversion rate of SO_(2) can be further improved, thereby improving the yield of sulfuric acid and the economic benefits.
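The cleaning pipeline described above has two stages: flag and remove outliers, then fill the resulting gaps with a regression model. The sketch below keeps that two-stage shape but deliberately swaps in far simpler stand-ins: a robust median/MAD rule instead of the isolation forest, and neighbor interpolation instead of the genetically tuned random forest. The data values are fabricated for illustration.

```python
import statistics

def clean_series(values, z=3.5):
    """Two-stage cleaning sketch: (1) flag outliers with a median/MAD
    rule (stand-in for the isolation forest); (2) fill flagged and
    missing points by interpolating between valid neighbours
    (stand-in for the tuned random-forest regressor)."""
    med = statistics.median(v for v in values if v is not None)
    mad = statistics.median(abs(v - med) for v in values if v is not None) or 1.0
    kept = [v if v is not None and abs(v - med) / mad <= z else None
            for v in values]
    out = list(kept)
    for i, v in enumerate(out):
        if v is None:
            left = next((out[j] for j in range(i - 1, -1, -1)
                         if out[j] is not None), med)
            right = next((out[j] for j in range(i + 1, len(out))
                          if out[j] is not None), med)
            out[i] = (left + right) / 2.0
    return out

# Toy temperature series (degC): one missing point, one gross outlier.
cleaned = clean_series([420.1, 421.0, 419.8, None, 420.5, 999.0, 420.2])
```

The outlier (999.0) and the missing point are both replaced by locally plausible values, which is the behavior the paper achieves with its isolation forest plus improved random forest.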
基金supported by the National Natural Science Foundation of China (Grant No. 51727805).
文摘Random numbers are one of the key foundations of cryptography. This work implements a discrete quantum random number generator (QRNG) based on the tunneling effect of electrons in an avalanche photodiode. Without any post-processing or conditioning, this QRNG can output raw sequences at a rate of 100 Mbps. Remarkably, the statistical min-entropy of an 8,000,000-bit sequence reaches 0.9944 bits/bit, and the min-entropy validated by NIST SP 800-90B reaches 0.9872 bits/bit. This metric is currently the highest value we have investigated for QRNG raw sequences. Moreover, this QRNG can continuously and stably output raw sequences with high randomness over extended periods. The system produced a continuous output of 1,174 Gbits of raw sequence over a duration of 11,744 s; with every 8 Mbits forming a unit, the statistical min-entropy distribution has an average value of 0.9892 bits/bit. The statistical min-entropy of all the data (1,174 Gbits) achieves a value of 0.9951 bits/bit. This QRNG can produce high-quality raw sequences with good randomness and stability. It has the potential to meet the high demand in cryptography for random numbers of high quality.
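The "bits/bit" figures quoted above are statistical min-entropy values, H_min = -log2(max_i p_i), evaluated per output symbol. The sketch below computes this for single-bit symbols on a fabricated, slightly biased sequence (the bias is an illustrative assumption, not the paper's data); a perfectly uniform source would score exactly 1.0 bits/bit.

```python
import math
from collections import Counter

def min_entropy_per_bit(bits):
    """Statistical min-entropy H_min = -log2(max_i p_i), evaluated on
    single-bit symbols: 1.0 bits/bit means a perfectly uniform source,
    and any bias toward 0 or 1 pulls the value below 1.0."""
    counts = Counter(bits)
    p_max = max(counts.values()) / len(bits)
    return -math.log2(p_max)

# Toy sequence with a slight bias: 5,100 ones out of 10,000 bits.
bits = [1] * 5100 + [0] * 4900
h = min_entropy_per_bit(bits)  # -log2(0.51), just under 1.0 bits/bit
```

Reading the paper's numbers through this formula, 0.9944 bits/bit corresponds to a most-likely-symbol probability of about 2^(-0.9944), i.e., a source extremely close to uniform.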
基金Project supported by the National Natural Science Foundation of China (Nos. 12272355, 12025204, and 11902294), the Opening Foundation of the Shanxi Provincial Key Laboratory for Advanced Manufacturing Technology of China (No. XJZZ202304), and the Shanxi Provincial Graduate Innovation Project of China (No. 2023KY629).
文摘In the practical environment, the simultaneous occurrence of base excitation and crosswind is very common. Scavenging the combined energy of vibration and wind with a single energy harvesting structure is fascinating. For this purpose, the effects of the wind speed and the random excitation level are investigated with the stochastic averaging method (SAM) based on the energy envelope. The results of the analytical prediction are verified with the Monte Carlo method (MCM). The numerical simulation shows that the introduction of wind can reduce the critical excitation level for triggering an inter-well jump and enable a bi-stable energy harvester (BEH) to achieve enhanced performance under weak base excitation. However, as the strength of the wind increases to a particular level, the influence of the random base excitation on the dynamic responses is weakened, and the system exhibits a periodic galloping response. A comparison between a BEH and a linear energy harvester (LEH) indicates that the BEH demonstrates inferior performance for high-speed wind. Relevant experiments are conducted to investigate the validity of the theoretical prediction and numerical simulation. The experimental findings also show that strong random excitation is favorable for the BEH in the range of low wind speeds. However, as the speed of the incoming wind rises to a particular level, the disadvantage of the BEH becomes clear and evident.
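The inter-well jumps central to this abstract can be reproduced with a minimal Monte Carlo experiment: integrate a bistable (double-well) oscillator under random forcing and count crossings between wells. The sketch below omits the wind/galloping term entirely and uses illustrative coefficients, so it only demonstrates the baseline effect that stronger random excitation triggers more inter-well jumps.

```python
import random

random.seed(5)

def count_jumps(noise, steps=200_000, dt=1e-3):
    """Euler-Maruyama integration of a bistable Duffing oscillator
    x'' + c*x' - x + x^3 = noise * dW/dt, counting inter-well jumps
    as sign changes of x. Coefficients are illustrative only."""
    x, v, c = 1.0, 0.0, 0.1   # start in the right-hand potential well
    jumps = 0
    for _ in range(steps):
        a = -c * v + x - x ** 3                      # damping + bistable force
        v += a * dt + noise * random.gauss(0.0, 1.0) * dt ** 0.5
        x_new = x + v * dt
        if x_new * x < 0:                            # crossed the barrier at x = 0
            jumps += 1
        x = x_new
    return jumps

weak_jumps = count_jumps(0.1)    # below the critical excitation level
strong_jumps = count_jumps(0.8)  # well above it
```

Weak noise leaves the oscillator trapped in one well (few or no jumps), while strong noise drives frequent well-to-well motion, which is the regime where the BEH outperforms a linear harvester.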
基金supported by the State Key Laboratory of Natural Gas Hydrate (No. 2022-KFJJ-SHW), the National Natural Science Foundation of China (No. 42376058), the International Science & Technology Cooperation Program of China (No. 2023YFE0119900), the Hainan Province Key Research and Development Project (No. ZDYF2024GXJS002), and the Research Start-Up Funds of the Zhufeng Scholars Program.
文摘Gas hydrate drilling expeditions in the Pearl River Mouth Basin, South China Sea, have identified concentrated gas hydrates with variable thickness. Moreover, free gas and the coexistence of gas hydrate and free gas have been confirmed by logging, coring, and production tests in the foraminifera-rich silty sediments with complex bottom-simulating reflectors (BSRs). Broad-band processing is conducted on conventional three-dimensional (3D) seismic data to improve the imaging and detection accuracy of gas hydrate-bearing layers and to delineate the saturation and thickness of gas hydrate- and free gas-bearing sediments. Several geophysical attributes extracted along the base of the gas hydrate stability zone are used to demonstrate the variable distribution and the controlling factors for the differential enrichment of gas hydrate. The inverted gas hydrate saturation at the production zone is over 40% with a thickness of 90 m, showing an interbedded distribution with different boundaries between gas hydrate- and free gas-bearing layers. However, the gas hydrate saturation value at the adjacent canyon is 70%, with 30-m-thick patches and linear features. The lithological and fault controls on gas hydrate and free gas distributions are demonstrated by tracing each gas hydrate-bearing layer. Moreover, the BSR depths based on the broad-band reprocessed 3D seismic data not only exhibit variations due to small-scale topographic changes caused by seafloor sedimentation and erosion but also show the upward shift of the BSR and the blocky distribution of the coexistence of gas hydrate and free gas in the Pearl River Mouth Basin.
基金supported by the Innovation Team and Talents Cultivation Program of the National Administration of Traditional Chinese Medicine (ZYYCXTD-D-202002) and the Fundamental Research Funds for the Central Universities (226-2022-00226).
文摘The current methods used to industrially produce sinomenine hydrochloride involve several issues, including high solvent toxicity, long process flows, and low atomic utilization efficiency, and the greenness scores of these processes are below 65 points. To solve these problems, a new process using anisole as the extractant was proposed. Anisole exhibits high selectivity for sinomenine and can be connected to the subsequent water-washing steps. After alkalization of the medicinal material, heating extraction, water washing, and acidification crystallization were carried out. The process was modeled and optimized, and the design space was constructed. The recommended operating ranges for the critical process parameters were 3.0-4.0 h for the alkalization time, 60.0-80.0 ℃ for the extraction temperature, 2.0-3.0 (volume ratio) for the washing solution amount, and 2.0-2.4 mol·L^(-1) for the hydrochloric acid concentration. The new process shows good robustness, because different batches of medicinal materials did not greatly impact the crystal purity or the sinomenine transfer rate. The sinomenine transfer rate was about 20% higher than that of industrial processes. The greenness score increased to 90 points, since the novel process proposed in this research solves the problems of long process flow, high solvent toxicity, and poor atomic economy, better aligning with the concept of green chemistry.
基金supported by the China Agriculture Research System of MOF and MARA (CARS-41) and the Wens Fifth Five R&D Major Project (WENS-2020-1-ZDZX-007).
文摘The sensory perception of food is a dynamic process, which is closely related to the release of flavor substances during oral processing. It is affected not only by the food material but also by the individual oral environment. To explore the oral processing characteristics of soft-boiled chicken, the sensory properties, texture, particle size, viscosity, and electronic nose and tongue characteristic values of different chicken samples were investigated. The correlation analysis showed that the physical characteristics, especially the cohesiveness, springiness, and resilience of the sample, determined the oral processing behavior. The addition of chicken skin played a lubricating role during oral processing: the particle size of the bolus was increased at the early stage and its fluidity was enhanced at the end, which reduced the chewing time to the swallowing point and raised the aromatic compound signal of the electronic nose. However, for chicken thigh with a relatively high fat content, the effect of chicken skin in the electronic nose was the opposite, and it had a certain masking effect on the perception of umami and sweet taste. In conclusion, fat played a critical role in chicken oral processing, and chicken thigh, which was more popular among consumers, had obvious advantages in the comprehensive evaluation of soft-boiled chicken.
基金the Australian Government through the Australian Research Council's Discovery Projects funding scheme(Project DP190101592)the National Natural Science Foundation of China(Grant Nos.41972280 and 52179103).
文摘The travel time of rock compressional waves is an essential parameter used for estimating important rock properties, such as porosity, permeability, and lithology. Current methods, like wireline logging tests, provide broad measurements but lack finer resolution. Laboratory-based rock core measurements offer higher resolution but are resource-intensive. Conventionally, wireline logging and rock core measurements have been used independently. This study introduces a novel approach that integrates both data sources. The method leverages the detailed features from limited core data to enhance the resolution of wireline logging data. By combining machine learning with random field theory, the method allows for probabilistic predictions in regions with sparse data sampling. In this framework, 12 parameters from wireline tests are used to predict trends in the rock core data. The residuals are modeled using random field theory. The outcomes are high-resolution predictions that combine both the predicted trend and probabilistic realizations of the residual. By utilizing unconditional and conditional random field theories, this method enables unconditional and conditional simulations of the underlying high-resolution rock compressional wave travel time profile and provides uncertainty estimates. This integrated approach optimizes the use of existing core and logging data. Its applicability is confirmed in an oil project in West China.
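The "trend plus random-field residual" decomposition described above can be sketched very compactly: predict a smooth trend, then add a spatially correlated residual and draw many realizations to express uncertainty. The sketch below is an unconditional simulation only, with an AR(1) process standing in for the paper's random-field model; the trend values, correlation length, and residual scale are all invented.

```python
import random

random.seed(11)

def simulate_profile(trend, sigma=5.0, rho=0.8):
    """Unconditional simulation sketch: a smooth trend (playing the role
    of the machine-learning prediction from wireline logs) plus an AR(1)
    correlated residual standing in for the random-field model of the
    core-scale detail."""
    eps = random.gauss(0.0, sigma)
    profile = []
    for t in trend:
        profile.append(t + eps)
        # AR(1) update keeps the residual spatially correlated
        eps = rho * eps + random.gauss(0.0, sigma * (1 - rho ** 2) ** 0.5)
    return profile

# Toy depth-wise travel-time trend (illustrative units).
trend = [200.0 + 0.1 * i for i in range(100)]
realizations = [simulate_profile(trend) for _ in range(50)]
```

The spread across the 50 realizations at each depth is the uncertainty estimate; conditional simulation would additionally pin every realization to the measured core points.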
文摘BACKGROUND The mucosal barrier's immune-brain interactions, pivotal for neural development and function, are increasingly recognized for their potential causal and therapeutic relevance to irritable bowel syndrome (IBS). Prior studies linking immune inflammation with IBS have been inconsistent. To further elucidate this relationship, we conducted a Mendelian randomization (MR) analysis of 731 immune cell markers to dissect the influence of various immune phenotypes on IBS. Our goal was to deepen our understanding of the disrupted brain-gut axis in IBS and to identify novel therapeutic targets. AIM To leverage publicly available data to perform MR analysis on 731 immune cell markers and explore their impact on IBS. We aimed to uncover immunophenotypic associations with IBS that could inform future drug development and therapeutic strategies. METHODS We performed a comprehensive two-sample MR analysis to evaluate the causal relationship between immune cell markers and IBS. By utilizing genetic data from public databases, we examined the causal associations between 731 immune cell markers, encompassing median fluorescence intensity, relative cell abundance, absolute cell count, and morphological parameters, and IBS susceptibility. Sensitivity analyses were conducted to validate our findings and address potential heterogeneity and pleiotropy. RESULTS Bidirectional false discovery rate correction indicated no significant influence of IBS on immunophenotypes. However, our analysis revealed a causal impact of 30 out of the 731 immune phenotypes on IBS (P < 0.05). Nine immune phenotypes demonstrated a protective effect against IBS [inverse variance weighting (IVW) < 0.05, odds ratio (OR) < 1], while 21 others were associated with an increased risk of IBS onset (IVW ≥ 0.05, OR ≥ 1). CONCLUSION Our findings underscore a substantial genetic correlation between immune cell phenotypes and IBS, providing valuable insights into the pathophysiology of the condition. These results pave the way for the development of more precise biomarkers and targeted therapies for IBS. Furthermore, this research enriches our comprehension of immune cell roles in IBS pathogenesis, offering a foundation for more effective, personalized treatment approaches. These advancements hold promise for improving IBS patients' quality of life and reducing the disease burden on individuals and their families.
基金Supported by the National Natural Science Foundation of China (General Program), No. 82070631.
文摘BACKGROUND Clinical studies have reported that patients with gastroesophageal reflux disease (GERD) have a higher prevalence of hypertension. AIM To perform a bidirectional Mendelian randomization (MR) analysis to investigate the causal link between GERD and essential hypertension. METHODS Eligible single nucleotide polymorphisms (SNPs) were selected, and weighted median, inverse variance weighted (IVW), and MR-Egger regression were used to examine the potential causal association between GERD and hypertension. The MR-Pleiotropy RESidual Sum and Outlier analysis was used to detect, and attempt to reduce, horizontal pleiotropy by removing outlier SNPs. The MR-Egger intercept test, Cochran's Q test, and "leave-one-out" sensitivity analysis were performed to evaluate the horizontal pleiotropy, heterogeneities, and stability of each single instrumental variable. RESULTS IVW analysis exhibited an increased risk of hypertension (OR = 1.46, 95%CI: 1.33-1.59, P = 2.14E-16) in GERD patients, and the same result was obtained in the replication practice (OR = 1.002, 95%CI: 1.0008-1.003, P = 0.000498). Meanwhile, the IVW analysis showed an increased risk of elevated systolic blood pressure (β = 0.78, 95%CI: 0.11-1.44, P = 0.021) and hypertensive heart disease (OR = 1.68, 95%CI: 1.36-2.08, P = 0.0000016) in GERD patients. Moreover, we found a decreased risk of Barrett's esophagus (OR = 0.91, 95%CI: 0.83-0.99, P = 0.043) in essential hypertension patients. CONCLUSION We found that GERD increases the risk of essential hypertension, which provides a novel preventive and therapeutic perspective on essential hypertension.
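The IVW estimator used in both MR abstracts above is a fixed-effect, inverse-variance-weighted average of per-SNP Wald ratios. The sketch below implements that formula directly; the per-SNP effect sizes and standard errors are hypothetical numbers for illustration, not values from either study.

```python
import math

def ivw_estimate(betas, ses):
    """Inverse-variance-weighted MR estimate: each SNP's causal estimate
    beta_i is weighted by 1/se_i^2, and the pooled standard error is
    sqrt(1 / sum of weights)."""
    weights = [1.0 / se ** 2 for se in ses]
    beta = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    se = (1.0 / sum(weights)) ** 0.5
    return beta, se

# Hypothetical per-SNP estimates of the exposure's effect on the outcome
# (log-odds scale), e.g., GERD on hypertension risk.
betas = [0.40, 0.35, 0.42, 0.30]
ses = [0.10, 0.08, 0.12, 0.09]

beta_ivw, se_ivw = ivw_estimate(betas, ses)
odds_ratio = math.exp(beta_ivw)  # exponentiate log-odds to get an OR
```

Precise SNPs (small standard errors) dominate the pooled estimate, and exponentiating the pooled log-odds yields the odds ratios reported in the RESULTS sections.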
基金supported by the National Key Research and Development Program of China (2021YFB3501002), the State Key Program of the National Natural Science Foundation of China (5203405), the National Natural Science Foundation of China (51974220, 52104383), the National Key Research and Development Program of China (2021YFB3700902), the Key Research and Development Program of Shaanxi Province (2020ZDLGY13-06, 2017ZDXM-GY-037), and the Shaanxi Province National Science Fund for Distinguished Young Scholars (2022JC-24).
文摘A large-scale fine-grained Mg-Gd-Y-Zn-Zr alloy plate with high strength and ductility was successfully prepared by multi-pass friction stir processing (MFSP) technology in this work. The structure of the grains and the long-period stacking ordered (LPSO) phase were characterized, and the uniformity of the mechanical properties was investigated. Moreover, a quantitative relationship between the microstructure and the tensile yield strength was established. The results showed that the grains in the processed zone (PZ) and the interfacial zone (IZ) were refined from 50 μm to 3 μm and 4 μm, respectively, and numerous original LPSO phases were broken. In the IZ, some block-shaped 18R LPSO phases were transformed into needle-like 14H LPSO phases due to stacking faults and the short-range diffusion of solute atoms. The severe shear deformation, in the form of kinetic energy, caused profuse stacking faults to be generated and to move rapidly, greatly increasing the transformation rate of the LPSO phase. After MFSP, the ultimate tensile strength, yield strength, and elongation to failure of the large-scale plate were 367 MPa, 305 MPa, and 18.0%, respectively. Grain refinement and LPSO phase strengthening were the major strengthening mechanisms for the MFSP sample. In particular, the strength of the IZ was comparable to that of the PZ, because the strength contribution of the 14H LPSO phase offsets the lack of grain refinement strengthening in the IZ. This result opposes the widely accepted notion that the IZ is a weak region in MFSP-prepared large-scale fine-grained plates.
基金financially supported by the National Key Research and Development Program of China (2022YFB3706800, 2020YFB1710100) and the National Natural Science Foundation of China (51821001, 52090042, 52074183).
文摘The complex sand-casting process, combined with the interactions between process parameters, makes it difficult to control casting quality, resulting in a high scrap rate. A strategy based on a data-driven model was proposed to reduce casting defects and improve production efficiency, which includes a random forest (RF) classification model, feature importance analysis, and process parameter optimization with Monte Carlo simulation. The collected data, covering four types of defects and the corresponding process parameters, were used to construct the RF model. Classification results show a recall rate above 90% for all categories. The Gini Index was used to assess the importance of the process parameters in the formation of the various defects in the RF model. Finally, the classification model was applied to different production conditions for quality prediction. In the case of process parameter optimization for gas porosity defects, this model serves as the experimental process in the Monte Carlo method to estimate a better temperature distribution. The prediction model, when applied in the factory, greatly improved the efficiency of defect detection. Results show that the scrap rate decreased from 10.16% to 6.68%.
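The Gini Index importance mentioned above accumulates, over every tree node where a feature is used, the impurity decrease that the split achieves. The sketch below computes that per-split quantity for one toy node; the labels and the "pouring temperature" split are fabricated examples, not the paper's data.

```python
def gini(labels):
    """Gini impurity of a label multiset: 1 - sum_k p_k^2.
    Zero means the node is pure (one class only)."""
    n = len(labels)
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def gini_gain(parent, left, right):
    """Impurity decrease of one split: the quantity that Gini-based
    feature importance accumulates over every node a feature splits."""
    n = len(parent)
    return (gini(parent)
            - (len(left) / n) * gini(left)
            - (len(right) / n) * gini(right))

# Toy node: does a pouring-temperature threshold separate porosity defects?
parent = ['ok', 'ok', 'porosity', 'porosity', 'ok', 'porosity']
left = ['ok', 'ok', 'ok']                      # castings below the threshold
right = ['porosity', 'porosity', 'porosity']   # castings above the threshold
gain = gini_gain(parent, left, right)
```

A perfect split of a balanced two-class node yields the maximum gain of 0.5; summing such gains across the forest, normalized per feature, ranks the process parameters by their influence on each defect type.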
基金supported by the Basic Research Program through the National Research Foundation of Korea (NRF) (Nos. 2022R1C1C1006593, 2022R1A4A3031263, and RS-2023-00271166), the National Science Foundation (Nos. 2054098 and 2213693), the National Natural Science Foundation of China (No. 52105593), and the Zhejiang Provincial Natural Science Foundation of China (No. LDQ24E050001). EH acknowledges a fellowship from the Hyundai Motor Chung Mong-Koo Foundation.
文摘Recently, the increasing interest in wearable technology for personal healthcare and smart virtual/augmented reality applications has led to the development of facile fabrication methods. Lasers have long been used to develop original solutions to such challenging technological problems due to their remote, sterile, rapid, and site-selective processing of materials. In this review, recent developments in relevant laser processes are summarized under two separate categories. First, transformative approaches, such as laser-induced graphene, are introduced. In addition to design optimization and the alteration of a native substrate, the latest advances under the transformative approach now enable more complex material compositions and multilayer device configurations through the simultaneous transformation of heterogeneous precursors, or the sequential addition of functional layers coupled with other electronic elements. In addition, the more conventional laser techniques, such as ablation, sintering, and synthesis, can still be used to enhance the functionality of an entire system through the expansion of applicable materials and the adoption of new mechanisms. Later, various wearable device components developed through the corresponding laser processes are discussed, with an emphasis on chemical/physical sensors and energy devices. In addition, special attention is given to applications that use multiple laser sources or processes, which lay the foundation for the all-laser fabrication of wearable devices.
文摘BACKGROUND Non-alcoholic fatty liver disease (NAFLD) and alcohol-related liver disease (ArLD) constitute the primary forms of chronic liver disease, and their incidence is progressively increasing with changes in lifestyle habits. Earlier studies have documented a correlation between the occurrence and development of prevalent mental disorders and fatty liver. AIM To investigate the correlation between fatty liver and mental disorders, necessitating the implementation of a Mendelian randomization (MR) study to elucidate this association. METHODS Data on NAFLD and ArLD were retrieved from the genome-wide association studies catalog, while information on mental disorders, including Alzheimer's disease, schizophrenia, anxiety disorder, attention deficit hyperactivity disorder (ADHD), bipolar disorder, major depressive disorder, multiple personality disorder, obsessive-compulsive disorder (OCD), and post-traumatic stress disorder (PTSD), was acquired from the Psychiatric Genomics Consortium. A two-sample MR method was applied to investigate mediators in significant associations. RESULTS After excluding weak instrumental variables, a causal relationship was identified between fatty liver disease and the occurrence and development of some psychiatric disorders. Specifically, the findings indicated that ArLD was associated with a significantly elevated risk of developing ADHD (OR: 5.81, 95%CI: 5.59-6.03, P < 0.01), bipolar disorder (OR: 5.73, 95%CI: 5.42-6.05, P = 0.03), OCD (OR: 6.42, 95%CI: 5.60-7.36, P < 0.01), and PTSD (OR: 5.66, 95%CI: 5.33-6.01, P < 0.01). Meanwhile, NAFLD significantly increased the risk of developing bipolar disorder (OR: 55.08, 95%CI: 3.59-845.51, P < 0.01), OCD (OR: 61.50, 95%CI: 6.69-565.45, P < 0.01), and PTSD (OR: 52.09, 95%CI: 4.24-639.32, P < 0.01). CONCLUSION Associations were found between genetic predisposition to fatty liver disease and an increased risk of a broad range of psychiatric disorders, namely bipolar disorder, OCD, and PTSD, highlighting the significance of preventive measures against psychiatric disorders in patients with fatty liver disease.
基金supported in part by the Open Fund of the State Key Laboratory of Integrated Chips and Systems, Fudan University, and in part by the National Science Foundation of China under Grant Nos. 62304133 and 62350610271.
文摘Reducing process variation is a significant concern for resistive random access memory (RRAM). Due to its ultrahigh integration density, RRAM arrays are prone to lithographic variation during the lithography process, which introduces electrical variation among different RRAM devices. In this work, an optical physical verification methodology for the RRAM array is developed, and the effects of different layout parameters on important electrical characteristics are systematically investigated. The results indicate that the RRAM devices can be categorized into three clusters according to their locations and lithography environments. The read resistance is more sensitive to the location in the array (~30%) than the SET/RESET voltage (<10%). Increasing the RRAM device length and applying the optical proximity correction technique can help reduce the variation to less than 10%, but this reduces the RRAM read resistance by 4×, resulting in higher power and area consumption. As such, we provide design guidelines to minimize the electrical variation of RRAM arrays due to the lithography process.
基金financial support of the National Natural Science Foundation of China (21776122).
文摘Due to a prolonged operation time and low mass transfer efficiency, the primary challenge in the aeration process of non-Newtonian fluids is the high energy consumption, which is closely related to the form and rate of the impeller, the ventilation, the rheological properties, and the bubble morphology in the reactor. In this perspective, through optimal computational fluid dynamics models and experiments, the relationship between power consumption, volumetric mass transfer rate (kLa), and initial bubble size (d0) was constructed to establish an efficient operation mode for the aeration process of non-Newtonian fluids. It was found that reducing the d0 could significantly increase the oxygen mass transfer rate, resulting in an obvious decrease in the ventilation volume and impeller speed. When d0 was regulated within 2-5 mm, an optimal kLa could be achieved, and 21% of the power consumption could be saved, compared to the case of bubbles with a diameter of 10 mm.