Damage to electrical equipment in an earthquake can lead to power outages in power systems. Seismic fragility analysis is a common method for assessing the seismic reliability of electrical equipment. To further guarantee the efficiency of the analysis, multi-source uncertainties, including those of the structure itself and of the seismic excitation, need to be considered. A method for seismic fragility analysis that reflects structural and seismic parameter uncertainty was developed in this study. The proposed method used a random sampling scheme based on Latin hypercube sampling (LHS) to account for the structural parameter uncertainty and the group structure characteristics of electrical equipment. Then, logistic Lasso regression (LLR) was used to fit the seismic fragility surface based on double ground motion intensity measures (IMs). The seismic fragility of a finite element model of a ±1000 kV main transformer (UHVMT) was analyzed using the proposed method. The results show that the seismic fragility function obtained by this method can be used to construct the relationship between the uncertainty parameters and the failure probability. The seismic fragility surface not only provided the probabilities of seismic damage states under different IMs, but also had better stability than the fragility curve. Furthermore, the sensitivity analysis of the structural parameters revealed that the elastic modulus of the bushing and the height of the high-voltage bushing may have the greatest influence.
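The LHS-plus-logistic-Lasso workflow described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the parameter ranges, the two IMs, and the synthetic damage model are all assumptions for demonstration, and scikit-learn's L1-penalized logistic regression stands in for the LLR fit.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Latin hypercube sample of two structural parameters (illustrative ranges):
# bushing elastic modulus [GPa] and high-voltage bushing height [m].
sampler = qmc.LatinHypercube(d=2, seed=0)
u = sampler.random(n=500)
params = qmc.scale(u, l_bounds=[50.0, 8.0], u_bounds=[120.0, 14.0])

# Two ground-motion intensity measures, e.g. PGA and PGV (synthetic here).
im = rng.uniform(0.05, 1.0, size=(500, 2))

# Synthetic binary damage outcome: failure grows with the IMs and with
# bushing height, and shrinks with stiffness (assumed, for illustration).
logit = 6.0 * im[:, 0] + 4.0 * im[:, 1] - 0.03 * params[:, 0] + 0.2 * params[:, 1] - 2.0
failed = (rng.random(500) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

# Logistic Lasso regression (L1 penalty) on [IM1, IM2, structural parameters].
X = np.hstack([im, params])
model = LogisticRegression(penalty="l1", solver="liblinear", C=1.0).fit(X, failed)

# Fragility surface: failure probability over a grid of the two IMs,
# with structural parameters fixed at their median sampled values.
g1, g2 = np.meshgrid(np.linspace(0.05, 1.0, 20), np.linspace(0.05, 1.0, 20))
med = np.median(params, axis=0)
grid = np.column_stack([g1.ravel(), g2.ravel(), np.tile(med, (g1.size, 1))])
surface = model.predict_proba(grid)[:, 1].reshape(g1.shape)
print(surface.min(), surface.max())  # failure probabilities, within [0, 1]
```

The fitted surface gives the failure probability as a smooth function of both IMs at once, which is what the abstract contrasts with a single-IM fragility curve.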
Taking the determination of trace Fe by 1,10-phenanthroline spectrophotometry as an example, the application of combinatorial measurement and regression analysis in instrumental analysis is systematically introduced, covering the method principle, operating steps, and data processing, including: establishing the best linear equation for the calibration curve, establishing the best linear equation for the measurand, and calculating the best estimate of the concentration. The results showed that, for the mean of three determinations, s = 0 μg/mL and RSD = 0. Preliminary applications in basic instrumental analysis (atomic absorption spectrophotometry, ion-selective electrodes, coulometry, and polarographic analysis) are briefly introduced and contrasted with the results of conventional measurements.
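The "best linear equation for the calibration curve" step is ordinary least squares on standards of known concentration, followed by inverting the fitted line to read off an unknown. A minimal sketch with synthetic absorbance data (the slope, intercept, and sample absorbance below are assumed values, not the paper's measurements):

```python
import numpy as np

# Synthetic calibration data for 1,10-phenanthroline spectrophotometry:
# absorbance vs. Fe concentration (μg/mL); Beer's law gives A ≈ k*c + b.
conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 6.0])   # standards, μg/mL
absorb = 0.11 * conc + 0.005                       # assumed ideal response
absorb = absorb + np.array([0.001, -0.002, 0.002, -0.001, 0.001, -0.001])

# Best-fit linear calibration equation by ordinary least squares.
k, b = np.polyfit(conc, absorb, deg=1)

# Correlation coefficient to judge the linearity of the calibration curve.
r = np.corrcoef(conc, absorb)[0, 1]

# Invert the calibration line to estimate an unknown sample's concentration.
a_sample = 0.225
c_sample = (a_sample - b) / k
print(round(k, 4), round(r, 4), round(c_sample, 2))
```

In practice the same fit is repeated for each instrument (AAS, ISE, coulometry, polarography), which is what the abstract's cross-method comparison amounts to.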
The spread of tuberculosis (TB), especially multidrug-resistant TB and extensively drug-resistant TB, has strongly motivated the research and development of new anti-TB drugs. New strategies to facilitate drug combinations, including pharmacokinetics-guided dose optimization and toxicology studies of first- and second-line anti-TB drugs, have also been introduced and recommended. Liquid chromatography-mass spectrometry (LC-MS) has arguably become the gold standard in the analysis of both endogenous and exogenous compounds. This technique has been applied successfully not only for therapeutic drug monitoring (TDM) but also for pharmacometabolomic analysis. TDM improves the effectiveness of treatment and reduces adverse drug reactions and the likelihood of drug resistance development in TB patients by determining dosage regimens that produce concentrations within the therapeutic target window. Based on TDM, the dose can be optimized individually to achieve favorable outcomes. Pharmacometabolomics is essential in generating and validating hypotheses regarding the metabolism of anti-TB drugs, aiding in the discovery of potential biomarkers for TB diagnostics, treatment monitoring, and outcome evaluation. This article highlights the progress in TDM of anti-TB drugs based on LC-MS bioassays over the last two decades. In addition, we discuss the advantages and disadvantages of this technique in practical use. The pressing need for non-invasive sampling approaches and stability studies of anti-TB drugs is highlighted. Lastly, we provide perspectives on the prospects of combining LC-MS-based TDM and pharmacometabolomics with other advanced strategies (pharmacometrics, drug and vaccine development, machine learning/artificial intelligence, among others) in an all-inclusive approach to improve the treatment outcomes of TB patients.
Lung cancer is the most common and fatal malignant disease worldwide and has the highest mortality rate among tumor-related causes of death. Early diagnosis and precision medicine can significantly improve the survival rate and prognosis of lung cancer patients. At present, the clinical diagnosis of lung cancer is challenging due to a lack of effective non-invasive detection methods and biomarkers, and treatment is primarily hindered by drug resistance and high tumor heterogeneity. Liquid biopsy is a method for detecting circulating biomarkers in the blood and other body fluids containing genetic information from primary tumor tissues. Bronchoalveolar lavage fluid (BALF) is a potential liquid biopsy medium that is rich in a variety of bioactive substances and cell components. BALF contains information on the key characteristics of tumors, including the tumor subtype, gene mutation type, and tumor environment; thus, BALF may be used as a diagnostic supplement to lung biopsy. In this review, the current research on BALF in the diagnosis, treatment, and prognosis of lung cancer is summarized. The advantages and disadvantages of different components of BALF, including cells, cell-free DNA, extracellular vesicles, and microRNA, are introduced. In particular, the great potential of extracellular vesicles in precision diagnosis and drug-resistance detection for lung cancer is highlighted. In addition, the performance of liquid biopsies with different body fluid sources in lung cancer detection is compared to facilitate more selective studies involving BALF, thereby promoting the application of BALF for precision medicine in lung cancer patients in the future.
Identification of the ice channel is the basic technology for developing intelligent ships in ice-covered waters, and it is important for ensuring the safety and economy of navigation. In the Arctic, merchant ships with low ice class often navigate in channels opened up by icebreakers. Navigation in the ice channel depends to a large extent on the captain's maneuvering skills and experience, and the ship may get stuck if steered into ice fields off the channel. Under these circumstances, it is very important to study how to identify the boundary lines of ice channels with a reliable method. In this paper, a two-stage ice channel identification method is developed based on image segmentation and corner point regression. The first stage employs an image segmentation method to extract channel regions. In the second stage, an intelligent corner regression network is proposed to extract the channel boundary lines from the channel region. A non-intelligent, angle-based filtering and clustering method is also proposed and compared with the corner point regression network. The training and evaluation of the segmentation method and corner regression network are carried out on synthetic and real ice channel datasets. The evaluation results show that the method using the corner point regression network in the second stage achieves an accuracy as high as 73.33% on the synthetic ice channel dataset and 70.66% on the real ice channel dataset, and the processing speed can reach up to 14.58 frames per second.
Recent trends suggest that Chinese herbal medicine formulas (CHM formulas) are promising treatments for complex diseases. To characterize the precise syndromes, diseases, and targets linking complex diseases and CHM formulas, we developed an artificial intelligence-based quantitative predictive algorithm (DeepTCM). DeepTCM has gone through multilevel model calibration and validation against a comprehensive set of herb and disease data so that it accurately captures the complex cellular signaling, molecular, and theoretical levels of traditional Chinese medicine (TCM). As an example, our model simulated the optimal CHM formulas for the treatment of coronary heart disease (CHD) with depression, and through model sensitivity analysis, we calculated balanced scores for the formulas. Furthermore, we constructed a biological knowledge graph representing interactions by associating herb-target and gene-disease interactions. Finally, we experimentally confirmed the therapeutic effect and pharmacological mechanism of a novel model-predicted intervention in humans and mice. This novel multiscale model opens up a new avenue for combining "disease-syndrome" and "macro-micro" system modeling to facilitate translational research on CHM formulas.
Despite the maturity of ensemble numerical weather prediction (NWP), the resulting forecasts are still, more often than not, under-dispersed. As such, forecast calibration tools have become popular. Among those tools, quantile regression (QR) is highly competitive in terms of both flexibility and predictive performance. Nevertheless, a long-standing problem of QR is quantile crossing, which greatly limits the interpretability of QR-calibrated forecasts. On this point, this study proposes a non-crossing quantile regression neural network (NCQRNN) for calibrating ensemble NWP forecasts into a set of reliable quantile forecasts without crossing. The overarching design principle of NCQRNN is to add, on top of the conventional QRNN structure, another hidden layer that imposes a non-decreasing mapping between the combined output from the nodes of the last hidden layer and the nodes of the output layer, through a triangular weight matrix with positive entries. The empirical part of the work considers a solar irradiance case study, in which four years of ensemble irradiance forecasts at seven locations, issued by the European Centre for Medium-Range Weather Forecasts, are calibrated via NCQRNN as well as via an eclectic mix of benchmarking models, ranging from the naïve climatology to state-of-the-art deep-learning and other non-crossing models. Formal and stringent forecast verification suggests that the forecasts post-processed via NCQRNN attain the maximum sharpness subject to calibration amongst all competitors. Furthermore, the proposed conception for resolving quantile crossing is remarkably simple yet general, and thus has broad applicability, as it can be integrated with many shallow- and deep-learning-based neural networks.
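The triangular-matrix idea behind NCQRNN can be shown in a few lines of NumPy. This is a structural sketch of the output layer only, not the trained network: with non-negative activations multiplied by a positive lower-triangular matrix, each quantile is the previous one plus a non-negative increment, so crossing is impossible by construction.

```python
import numpy as np

rng = np.random.default_rng(1)

# Raw (unconstrained) outputs feeding the last layer, for 5 samples and
# 9 nominal quantile levels: nothing prevents these from crossing.
raw = rng.normal(size=(5, 9))

# NCQRNN-style output layer: a lower-triangular weight matrix with strictly
# positive entries. Applied to non-negative activations, it produces
# cumulative sums, so the quantile estimates are non-decreasing.
softplus = np.log1p(np.exp(raw))       # make the activations non-negative
L = np.tril(np.full((9, 9), 1.0))      # positive lower-triangular weights
base = rng.normal(size=(5, 1))         # free offset for the lowest quantile
quantiles = base + softplus @ L.T      # each row: q_1 <= q_2 <= ... <= q_9

diffs = np.diff(quantiles, axis=1)
print(bool(np.all(diffs >= 0)))  # True: no quantile crossing by construction
```

In the full NCQRNN the entries of `L` and the activations are learned, but the monotonicity argument is exactly this one.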
Field experiment designs with a single replication were frequently used for factorial experiments in which the number of field plots was limited, but the experimental error was difficult to estimate. To study a new statistical method for improving the precision of regression analysis of such experiments in rice, 84 fertilizer experiments were conducted in 15 provinces of China, including Zhejiang, Jiangsu, Anhui, Hunan, Sichuan, and Heilongjiang. Three factors with 14 treatments (N: 0-225 kg/ha, P: 0-112.5 kg/ha, K: 0-150 kg/ha) and two replications were employed using an approaching-optimum design. There were 2352 (84 × 14 × 2) yield deviations (d) between the individual treatment yields and their arithmetic means. The results indicated that:
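Fertilizer-response regression of the kind described above is typically a quadratic response surface fitted by least squares. A hedged sketch on synthetic data (the response coefficients and noise level below are assumptions, not the trial's results), including the marginal-response optimum for N:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic N-P-K fertilizer rates (kg/ha) within the trial's ranges.
n = rng.uniform(0, 225, 200)
p = rng.uniform(0, 112.5, 200)
k = rng.uniform(0, 150, 200)

# Assumed quadratic yield response (diminishing returns in N), plus noise.
yield_t = 4.0 + 0.02 * n - 5e-5 * n**2 + 0.01 * p + 0.006 * k \
          + rng.normal(0, 0.1, 200)

# Regression design matrix: intercept, N, N^2, P, K.
X = np.column_stack([np.ones_like(n), n, n**2, p, k])
beta, *_ = np.linalg.lstsq(X, yield_t, rcond=None)

# Optimum N rate where the marginal response is zero: dY/dN = 0.
n_opt = -beta[1] / (2 * beta[2])
print(round(n_opt, 1))
```

With the assumed coefficients the true optimum is 200 kg/ha, and the fitted optimum lands close to it; replication mainly tightens the standard errors of these coefficients.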
Purpose: The purpose of this study is to develop and compare model choice strategies in the context of logistic regression. Model choice means the choice of the covariates to be included in the model. Design/methodology/approach: The study is based on Monte Carlo simulations. The methods are compared in terms of three measures of accuracy: specificity and two kinds of sensitivity. A loss function combining sensitivity and specificity is introduced and used for a final comparison. Findings: The choice of method depends on how much the user emphasizes sensitivity against specificity. It also depends on the sample size. For a typical logistic regression setting with a moderate sample size and a small to moderate effect size, either BIC, BICc, or Lasso seems to be optimal. Research limitations: Numerical simulations cannot cover the whole range of data-generating processes occurring with real-world data, so more simulations are needed. Practical implications: Researchers can refer to these results if they believe that their data-generating process is somewhat similar to some of the scenarios presented in this paper. Alternatively, they could run their own simulations and calculate the loss function. Originality/value: This is a systematic comparison of model choice algorithms and heuristics in the context of logistic regression. The distinction between two types of sensitivity and a comparison based on a loss function are methodological novelties.
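A loss function combining sensitivity and specificity for covariate selection can be written down directly. The weighting scheme and the example selections below are illustrative assumptions, not the paper's definitions:

```python
def sens_spec(selected, true_in, true_out):
    """Sensitivity: share of truly relevant covariates the method selected.
    Specificity: share of truly irrelevant covariates it excluded."""
    sens = len(selected & true_in) / len(true_in)
    spec = len(true_out - selected) / len(true_out)
    return sens, spec

def loss(selected, true_in, true_out, w=0.5):
    """Weighted loss combining the two accuracy measures; the weight w
    encodes how much the user values sensitivity over specificity."""
    sens, spec = sens_spec(selected, true_in, true_out)
    return w * (1 - sens) + (1 - w) * (1 - spec)

# Covariates x1..x3 truly belong in the model; x4..x6 do not.
true_in, true_out = {"x1", "x2", "x3"}, {"x4", "x5", "x6"}

# Two hypothetical model-choice outcomes, e.g. from BIC and from Lasso.
picked_bic = {"x1", "x2"}                 # misses x3, no false inclusions
picked_lasso = {"x1", "x2", "x3", "x5"}   # finds all three, one false inclusion

print(loss(picked_bic, true_in, true_out),    # both equal 1/6 at w = 0.5
      loss(picked_lasso, true_in, true_out))
```

At equal weights the two methods tie; raising `w` (valuing sensitivity more) favors the Lasso-like selection, which is exactly the dependence on user emphasis the Findings describe.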
In oil and gas exploration, elucidating the complex interdependencies among geological variables is paramount. Our study introduces the application of sophisticated regression analysis methods, aiming not just at predicting geophysical logging curve values but also at innovatively mitigating the hydrocarbon depletion observed in geochemical logging. Through a rigorous assessment, we explore the efficacy of eight regression models, bifurcated into linear and nonlinear groups, to accommodate the multifaceted nature of geological datasets. Our linear model suite encompasses the Standard Equation, Ridge Regression, the Least Absolute Shrinkage and Selection Operator (Lasso), and Elastic Net, each presenting distinct advantages. The Standard Equation serves as a foundational benchmark, whereas Ridge Regression implements penalty terms to counteract overfitting, thus bolstering model robustness in the presence of multicollinearity. Lasso performs variable selection to streamline models and enhance their interpretability, while Elastic Net amalgamates the merits of Ridge Regression and Lasso, offering a harmonized trade-off between model complexity and comprehensibility. On the nonlinear front, Gradient Descent, Kernel Ridge Regression, Support Vector Regression, and Piecewise Function-Fitting methods introduce innovative approaches. Gradient Descent assures computational efficiency in optimizing solutions, Kernel Ridge Regression leverages the kernel trick to capture nonlinear patterns, and Support Vector Regression is proficient in forecasting extremities, which is pivotal for exploration risk assessment. The Piecewise Function-Fitting approach, tailored for geological data, facilitates adaptable modeling of variable interrelations and accommodates abrupt shifts in data trends. Our analysis identifies Ridge Regression, particularly when augmented by Piecewise Function-Fitting, as superior in recouping hydrocarbon losses, underscoring its utility in refining resource quantification. Meanwhile, Kernel Ridge Regression emerges as a noteworthy strategy for improving porosity-logging curve prediction for well A, evidencing its aptness for intricate geological structures. This research attests to the advantages and broad relevance of these regression techniques over conventional methods while heralding new horizons for their deployment in the oil and gas sector. The insights garnered from these advanced modeling strategies are set to transform geological and engineering practices in hydrocarbon prediction, evaluation, and recovery.
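Several of the models named above have direct scikit-learn counterparts, so their comparison can be sketched compactly. The data below is a synthetic stand-in for well-log curves (with multicollinearity induced on purpose), the hyperparameters are illustrative, and Gradient Descent and Piecewise Function-Fitting are omitted for brevity:

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, Lasso, ElasticNet
from sklearn.kernel_ridge import KernelRidge
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(3)

# Synthetic stand-in for logging data: predict a porosity-like curve from
# three input curves, with two of them strongly correlated.
X = rng.normal(size=(300, 3))
X[:, 2] = 0.9 * X[:, 0] + 0.1 * rng.normal(size=300)  # induce multicollinearity
y = 1.5 * X[:, 0] - 0.8 * X[:, 1] + np.sin(X[:, 1]) + rng.normal(0, 0.1, 300)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "standard": LinearRegression(),                  # foundational benchmark
    "ridge": Ridge(alpha=1.0),                       # L2, vs. multicollinearity
    "lasso": Lasso(alpha=0.01),                      # L1, variable selection
    "enet": ElasticNet(alpha=0.01, l1_ratio=0.5),    # blend of the two
    "kridge": KernelRidge(kernel="rbf", alpha=0.1),  # kernel trick, nonlinear
    "svr": SVR(C=10.0),                              # robust to extremes
}
scores = {name: r2_score(y_te, m.fit(X_tr, y_tr).predict(X_te))
          for name, m in models.items()}
for name, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name:8s} R^2 = {s:.3f}")
```

On real logs, the same loop with cross-validated hyperparameters is the usual way to decide which of the eight families suits a given curve.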
Precision therapy has become the preferred choice owing to optimal drug concentrations at target sites, increased therapeutic efficacy, and reduced adverse effects. Over the past few years, sprayable or injectable thermosensitive hydrogels have exhibited high therapeutic potential. They can be applied as cell-growing scaffolds or drug-releasing reservoirs simply by mixing in a free-flowing sol phase at room temperature. Inspired by their unique properties, thermosensitive hydrogels have been widely applied as drug delivery and treatment platforms for precision medicine. In this review, the state-of-the-art developments in thermosensitive hydrogels for precision therapy are surveyed, covering thermo-gelling mechanisms and main components through biomedical applications, including wound healing, anti-tumor activity, osteogenesis, and periodontal, sinonasal, and ophthalmic diseases. The most promising applications and trends of thermosensitive hydrogels for precision therapy are also discussed in light of their unique features.
Silicon (Si) diffraction microlens arrays are usually integrated with infrared focal plane arrays (IRFPAs) to improve their performance. Lithography errors are unavoidable when Si diffraction microlens arrays are prepared by the conventional engraving method, and they seriously affect performance and subsequent applications. To address this problem, a novel self-alignment method for preparing high-precision Si diffraction microlens arrays is proposed. In this method, the accuracy of the prepared arrays is determined by the accuracy of the first lithography mask. In the subsequent etching, the etched area is protected by the mask layer and the sacrificial layer or the protective layer; only the unprotected area is etched, which effectively shields the non-etching areas, accurately etches the required regions, and eliminates the alignment errors. High-precision Si diffraction microlens arrays were obtained by this self-alignment method, with a diffraction efficiency reaching 92.6%. After integration with IRFPAs, the average blackbody responsivity increased by 8.3% and the average blackbody detectivity increased by 10.3%. This indicates that Si diffraction microlens arrays can improve the filling factor and reduce the crosstalk of IRFPAs through light convergence, thereby improving their performance. The results are of great reference significance for improving device performance by optimizing the preparation of micro-nano devices.
Additive Runge-Kutta methods designed for preserving highly accurate solutions in mixed-precision computation were previously proposed and analyzed. These specially designed methods use reduced precision for the implicit computations and full precision for the explicit computations. In this work, we analyze the stability properties of these methods and their sensitivity to low-precision rounding errors, and we demonstrate their performance in terms of accuracy and efficiency. We develop codes in Fortran and Julia to solve nonlinear systems of ODEs and PDEs using the mixed-precision additive Runge-Kutta (MP-ARK) methods. The convergence, accuracy, and runtime of these methods are explored. We show that, for a given level of accuracy, suitably chosen MP-ARK methods may provide significant reductions in runtime.
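The division of labor in MP-ARK methods (reduced precision for the implicit solve, full precision for the explicit part) can be illustrated with a first-order additive (IMEX) Euler step; this is a deliberately simplified stand-in for the MP-ARK schemes, with an assumed stiff/nonstiff splitting and float32 standing in for the reduced precision:

```python
import numpy as np

# Model problem: y' = -1000*y (stiff, treated implicitly)
#                     + cos(t) (nonstiff, treated explicitly).
lam = -1000.0

def imex_euler_step(y, t, h):
    expl = np.float64(np.cos(t))          # explicit part in full precision
    # Implicit Euler solve (1 - h*lam) * y_new = y + h*expl, carried out
    # entirely in float32, mimicking the reduced-precision implicit stage.
    rhs32 = np.float32(y + h * expl)
    denom32 = np.float32(1.0 - h * lam)
    return np.float64(rhs32 / denom32)    # promote back to full precision

h, t, y = 1.0e-3, 0.0, 1.0
for _ in range(1000):                     # integrate from t = 0 to t = 1
    y = imex_euler_step(y, t, h)
    t += h

# The stiff transient (decay rate 1000) is fully damped despite the
# float32 implicit solves; y settles near the small quasi-steady response.
print(abs(y))
```

The point of the analysis in the paper is precisely when such low-precision implicit stages do, and do not, contaminate the full-precision solution.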
We develop a quantum precision measurement method for magnetic fields at the tesla level by utilizing a fiber diamond magnetometer. Central to our system is a micron-sized fiber diamond probe positioned on the surface of a coplanar waveguide made of nonmagnetic materials. Calibrated against a nuclear magnetic resonance magnetometer, this probe demonstrates a broad magnetic field range from 10 mT to 1.5 T, with a nonlinearity error better than 0.0028% under a standard magnetic field generator and a stability better than 0.0012% at a 1.5 T magnetic field. Finally, we demonstrate quantitative mapping of the vector magnetic field on the surface of a permanent magnet using the diamond magnetometer.
Background: Limited research has been conducted on the influence of autophagy-related long non-coding RNAs (ARLncRNAs) on the prognosis of hepatocellular carcinoma (HCC). Methods: We analyzed 371 HCC samples from TCGA, identifying expression networks of ARLncRNAs using autophagy-related genes. Screening for prognostically relevant ARLncRNAs involved univariate Cox regression, Lasso regression, and multivariate Cox regression. A nomogram was further employed to assess the reliability of the risk score, calculated from the signatures of the screened ARLncRNAs, in predicting outcomes. Additionally, we compared drug sensitivities in patient groups with differing risk levels and investigated potential biological pathways through enrichment analysis, using consensus clustering to identify ARLncRNA-related subgroups. Results: The screening process identified 27 ARLncRNAs, of which 13 were associated with HCC prognosis. Consequently, a set of signatures comprising 8 ARLncRNAs was successfully constructed as an independent prognostic factor for HCC. Patients in the high-risk group showed very poor prognoses in most clinical categories. The risk score was closely related to immune cell scores, such as those of macrophages, and the differentially expressed genes between the groups were implicated in metabolism, the cell cycle, and mitotic processes. Notably, high-risk patients demonstrated a significantly lower IC50 for Paclitaxel, suggesting that Paclitaxel could be an ideal treatment for those at elevated risk of HCC. We further identified C2 as the Paclitaxel subtype, in which patients exhibited higher risk scores, reduced survival rates, and more severe clinical progression. Conclusion: The 8 ARLncRNA-based signatures present novel targets for prognostic prediction in HCC. The drug candidate Paclitaxel may effectively treat HCC by affecting ARLncRNA expression. Together with the identification of ARLncRNA-related isoforms, these results provide valuable insights for the clinical exploration of autophagy mechanisms in HCC pathogenesis and offer potential avenues for precision medicine.
Extensible Markup Language (XML) files, widely used for storing and exchanging information on the web, require efficient parsing mechanisms to improve application performance. With existing Document Object Model (DOM)-based parsing, performance degrades due to sequential processing and large memory requirements, so an efficient XML parser is needed to mitigate these issues. In this paper, we propose a Parallel XML Tree Generator (PXTG) algorithm for accelerating the parsing of XML files, and a Regression-based XML Parsing Framework (RXPF) that analyzes and predicts performance through profiling, regression, and code generation for efficient parsing. The PXTG algorithm is based on dividing the XML file into n parts and producing n trees in parallel. The profiling phase of the RXPF framework produces a dataset by measuring the performance of various parsing models, including StAX, SAX, DOM, JDOM, and PXTG, on different numbers of cores and multiple file sizes. The regression phase produces the prediction model, based on which the final code for efficient parsing of XML files is produced through the code generation phase. The RXPF framework has shown a significant improvement in performance, varying from 9.54% to 32.34%, over other existing models used for parsing XML files.
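The divide-into-n-parts idea behind PXTG can be sketched on record-oriented XML. This is an illustration, not the PXTG algorithm: the input document, the record-boundary splitting rule, and the use of threads are all simplifying assumptions (a real implementation would need general boundary detection, and processes rather than threads for true parallelism in CPython).

```python
import xml.etree.ElementTree as ET
from concurrent.futures import ThreadPoolExecutor

# Illustrative record-oriented XML document.
xml_doc = "<items>" + "".join(
    f"<item id='{i}'><v>{i * i}</v></item>" for i in range(1000)
) + "</items>"

def split_records(doc, n):
    """Split the payload at record boundaries so each chunk stays
    well-formed once rewrapped (a simplification of PXTG's division step)."""
    body = doc[len("<items>"):-len("</items>")]
    recs = body.replace("</item>", "</item>\x00").split("\x00")[:-1]
    size = -(-len(recs) // n)  # ceiling division
    return ["<items>" + "".join(recs[i:i + size]) + "</items>"
            for i in range(0, len(recs), size)]

def parse_chunk(chunk):
    # Each worker builds its own subtree; the n subtrees can later be
    # queried independently or merged into one tree.
    return ET.fromstring(chunk)

with ThreadPoolExecutor(max_workers=4) as pool:
    trees = list(pool.map(parse_chunk, split_records(xml_doc, 4)))

total = sum(len(tree) for tree in trees)
print(total)  # 1000: all records preserved across the parallel subtrees
```

The profiling/regression phases of RXPF would then time loops like this one across parser types, core counts, and file sizes to fit the performance-prediction model.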
The electrical resistivity method is a geophysical tool used to characterize the subsoil and can provide important information for precision agriculture. A lack of knowledge about the agronomic properties of the soil tends to affect the agricultural coffee production system. Therefore, research on geoelectrical properties of the soil, such as resistivity, for characterizing the study region for coffee cultivation can improve and optimize production. The resistivity method allows the subsurface to be investigated through different techniques: 1D vertical electrical sounding and electrical imaging. Data acquired with these techniques permitted the creation of 2D resistivity cross sections of the study area. The geoelectrical data were acquired using a resistivity meter and processed in different software packages. The results of the geoelectrical characterization from the 1D resistivity model and the 2D electrical resistivity sections show that in the study area of Kabiri there are 8 geoelectrical layers with different resistivities or conductivities. Within the surveyed area, the lowest resistivity is around 0.322 Ω·m, while the highest is about 92.1 Ω·m. These values indicate where coffee can be planted and support the suggestion of a specific fertilization plan for some areas to improve cultivation.
We present a quantitative measurement of the horizontal component of the microwave magnetic field of a coplanar waveguide using a quantum diamond probe in fiber format. The measurement results are compared in detail with simulation, showing good consistency. Further simulation shows that the fiber diamond probe brings negligible disturbance to the field under measurement compared with a bulk diamond. This method will find important applications ranging from electromagnetic compatibility testing to failure analysis of high-frequency, high-complexity integrated circuits.
The picking efficiency of seismic first breaks (FBs) has been greatly accelerated by deep learning (DL) technology. However, the picking accuracy and efficiency of DL methods still face huge challenges in low signal-to-noise ratio (SNR) situations. To address this issue, we propose a regression approach to pick FBs based on a bidirectional long short-term memory (BiLSTM) neural network, which learns the implicit Eikonal equation of 3D inhomogeneous media with rugged topography in the target region. We employ a regressive model that relates the elevation of shots, the offset, and the elevation of receivers to their seismic traveltimes, in order to predict the unknown FBs from common-shot gathers with sparsely distributed traces. Different from image segmentation methods, which automatically extract image features and classify FBs from seismic data, the proposed method can learn the inner relationship between field geometry and FBs. In addition, the predicted results of the regressive model are continuous values of FBs rather than the discrete ones of a binary distribution. The picking results on synthetic data show that the proposed method has low dependence on label data and can obtain reliable and similar predictions using two types of label data with large differences. The picking results for 9380 shots of 3D seismic data generated by vibroseis indicate that the proposed method can still accurately predict FBs in low-SNR data. The subsequent stacked profiles further illustrate the reliability and effectiveness of the proposed method. The results on model data and field seismic data demonstrate that the proposed regression method is a robust first-break picker with high potential for field application.
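The regression setting (geometry features in, traveltime out) can be illustrated with plain linear regression on synthetic data. This is a hedged stand-in for the paper's BiLSTM: the constant-velocity medium, the feature set, and the noise level are assumptions, and a linear model replaces the network only to show the mapping being learned.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(4)

# Synthetic survey geometry: predict first-break traveltime from
# shot elevation, source-receiver offset, and receiver elevation.
n = 2000
shot_z = rng.uniform(100, 300, n)     # shot elevation (m)
recv_z = rng.uniform(100, 300, n)     # receiver elevation (m)
offset = rng.uniform(500, 3000, n)    # source-receiver offset (m)

# Assumed medium: constant velocity 2500 m/s along the straight ray path.
v = 2500.0
dist = np.sqrt(offset**2 + (shot_z - recv_z)**2)
t_fb = dist / v + rng.normal(0, 0.002, n)   # traveltime (s) plus picking noise

# Train on "picked" traces, predict the "unpicked" ones.
X = np.column_stack([shot_z, offset, recv_z])
model = LinearRegression().fit(X[:1500], t_fb[:1500])
pred = model.predict(X[1500:])
rmse = float(np.sqrt(np.mean((pred - t_fb[1500:]) ** 2)))
print(round(rmse, 4))  # RMSE in seconds on the held-out traces
```

In an inhomogeneous medium with rugged topography, this mapping is nonlinear, which is why the paper replaces the linear model with a BiLSTM that learns the implicit Eikonal relationship.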
Hepatitis B virus (HBV) infection is a major player in chronic hepatitis B, which may lead to the development of hepatocellular carcinoma (HCC). HBV genetics are diverse: the virus is classified into at least 9 genotypes (A to I) and 1 putative genotype (J), each with a specific geographical distribution and possibly different clinical outcomes in the patient. This diversity may be relevant to precision medicine for HBV-related HCC and to the success of therapeutic approaches against HCC, owing to the different pathogenicity of the virus and host responses. This Editorial discusses recent updates on whether the classification of HBV genetic diversity remains valid in terms of viral oncogenicity for HCC and its precision medicine, in light of recent advances in cellular and molecular biology technologies.
Funding: National Key R&D Program of China under Grant Nos. 2018YFC1504504 and 2018YFC0809404.
Abstract: Taking the determination of trace iron by 1,10-phenanthroline spectrophotometry as an example, the application of combinatorial measurement and regression analysis in instrumental analysis is introduced systematically, covering the method's principle, the operating steps, and the data processing: establishing the best linear equation of the calibration curve, establishing the best linear equation of the measurand, and calculating the best value of a concentration. The results showed that, for the mean of three determinations, s = 0 μg/mL and RSD = 0. Preliminary applications in basic instrumental analysis are briefly introduced for atomic absorption spectrophotometry, ion-selective electrodes, coulometry, and polarographic analysis, and the results are compared with those of conventional measurements.
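The calibration-curve step above can be pictured with a minimal numeric sketch. The standards, absorbances, and replicate readings are invented, and a single least-squares pass stands in for the combinatorial-measurement procedure the abstract describes.

```python
import numpy as np

# Hypothetical calibration standards: Fe concentration (ug/mL) vs absorbance.
conc = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
absorbance = np.array([0.002, 0.112, 0.221, 0.330, 0.441])

# Least-squares calibration line A = k*c + b.
k, b = np.polyfit(conc, absorbance, 1)

def concentration(a):
    """Invert the calibration line to estimate an unknown concentration."""
    return (a - b) / k

# An unknown sample measured three times; report mean, s, and RSD (%).
reps = np.array([concentration(a) for a in (0.276, 0.276, 0.276)])
mean, s = reps.mean(), reps.std(ddof=1)
rsd = 0.0 if mean == 0 else 100 * s / mean
```

Identical replicate readings give s = 0 and RSD = 0, the same ideal outcome the abstract reports for the combinatorial procedure.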
Funding: Sponsored by the National Research Foundation of Korea (NRF) grant funded by the Korean government (MSIT) (Grant No. 2018R1A5A2021242).
Abstract: The spread of tuberculosis (TB), especially multidrug-resistant TB and extensively drug-resistant TB, has strongly motivated the research and development of new anti-TB drugs. New strategies to facilitate drug combinations, including pharmacokinetics-guided dose optimization and toxicology studies of first- and second-line anti-TB drugs, have also been introduced and recommended. Liquid chromatography-mass spectrometry (LC-MS) has arguably become the gold standard in the analysis of both endo- and exogenous compounds. This technique has been applied successfully not only for therapeutic drug monitoring (TDM) but also for pharmacometabolomics analysis. TDM improves the effectiveness of treatment and reduces adverse drug reactions and the likelihood of drug resistance development in TB patients by determining dosage regimens that produce concentrations within the therapeutic target window. Based on TDM, the dose can be optimized individually to achieve favorable outcomes. Pharmacometabolomics is essential in generating and validating hypotheses regarding the metabolism of anti-TB drugs, aiding in the discovery of potential biomarkers for TB diagnostics, treatment monitoring, and outcome evaluation. This article highlights the progress in TDM of anti-TB drugs based on LC-MS bioassays over the last two decades. We also discuss the advantages and disadvantages of this technique in practical use. The pressing need for non-invasive sampling approaches and stability studies of anti-TB drugs is highlighted. Lastly, we provide perspectives on the prospects of combining LC-MS-based TDM and pharmacometabolomics with other advanced strategies (pharmacometrics, drug and vaccine development, machine learning/artificial intelligence, among others) in an all-inclusive approach to improve the treatment outcomes of TB patients.
Funding: Supported by grants from the National Natural Science Foundation of China (Grant No. 82173182), the Sichuan Science and Technology Program (Grant No. 2021YJ0117 to Weiya Wang and Grant No. 2023NSFSC1939 to Dan Liu), and the 1·3·5 Project for Disciplines of Excellence–Clinical Research Incubation Project, West China Hospital, Sichuan University (Grant Nos. 2019HXFH034 and ZYJC21074).
Abstract: Lung cancer is the most common and fatal malignant disease worldwide and has the highest mortality rate among tumor-related causes of death. Early diagnosis and precision medicine can significantly improve the survival rate and prognosis of lung cancer patients. At present, the clinical diagnosis of lung cancer is challenging due to a lack of effective non-invasive detection methods and biomarkers, and treatment is primarily hindered by drug resistance and high tumor heterogeneity. Liquid biopsy is a method for detecting circulating biomarkers in the blood and other body fluids containing genetic information from primary tumor tissues. Bronchoalveolar lavage fluid (BALF) is a potential liquid biopsy medium that is rich in a variety of bioactive substances and cell components. BALF contains information on the key characteristics of tumors, including the tumor subtype, gene mutation type, and tumor environment; thus, BALF may be used as a diagnostic supplement to lung biopsy. In this review, the current research on BALF in the diagnosis, treatment, and prognosis of lung cancer is summarized. The advantages and disadvantages of different components of BALF, including cells, cell-free DNA, extracellular vesicles, and microRNA, are introduced. In particular, the great potential of extracellular vesicles in precision diagnosis and the detection of drug resistance in lung cancer is highlighted. In addition, the performance of liquid biopsies with different body fluid sources in lung cancer detection is compared to facilitate more selective studies involving BALF, thereby promoting the application of BALF for precision medicine in lung cancer patients in the future.
Funding: Financially supported by the National Key Research and Development Program (Grant No. 2022YFE0107000), the General Projects of the National Natural Science Foundation of China (Grant No. 52171259), and the High-Tech Ship Research Project of the Ministry of Industry and Information Technology (Grant No. [2021]342).
Abstract: Identification of the ice channel is a basic technology for developing intelligent ships in ice-covered waters, and it is important for ensuring the safety and economy of navigation. In the Arctic, merchant ships with a low ice class often navigate in channels opened up by icebreakers. Navigation in an ice channel largely depends on the captain's maneuvering skills and experience; the ship may get stuck if steered into ice fields off the channel. Under these circumstances, it is very important to study how to identify the boundary lines of ice channels with a reliable method. In this paper, a two-stage ice channel identification method is developed based on image segmentation and corner point regression. The first stage employs an image segmentation method to extract channel regions. In the second stage, an intelligent corner regression network is proposed to extract the channel boundary lines from the channel region. A non-intelligent angle-based filtering and clustering method is also proposed and compared with the corner point regression network. The training and evaluation of the segmentation method and the corner regression network are carried out on synthetic and real ice channel datasets. The evaluation results show that the accuracy of the method using the corner point regression network in the second stage reaches 73.33% on the synthetic ice channel dataset and 70.66% on the real ice channel dataset, and the processing speed can reach up to 14.58 frames per second.
基金supported by the National Natural Science Foundation of China(Grant No.:82174246)the National Key R&D Program of China(Grant No.:2019YFC1708701)the Postdoctoral Innovation Talent Support Program(Grant No.:BX20220329).
Abstract: Recent trends suggest that Chinese herbal medicine formulas (CHM formulas) are promising treatments for complex diseases. To characterize the precise syndromes, diseases, and targets linking complex diseases and CHM formulas, we developed an artificial intelligence-based quantitative predictive algorithm (DeepTCM). DeepTCM has gone through multilevel model calibration and validation against a comprehensive set of herb and disease data so that it accurately captures the complex cellular signaling, molecular, and theoretical levels of traditional Chinese medicine (TCM). As an example, our model simulated the optimal CHM formulas for the treatment of coronary heart disease (CHD) with depression, and through model sensitivity analysis, we calculated balanced scores for the formulas. Furthermore, we constructed a biological knowledge graph representing interactions by associating herb-target and gene-disease interactions. Finally, we experimentally confirmed the therapeutic effect and pharmacological mechanism of a novel model-predicted intervention in humans and mice. This novel multiscale model opens a new avenue for combining "disease-syndrome" and "macro-micro" system modeling to facilitate translational research on CHM formulas.
基金supported by the National Natural Science Foundation of China (Project No.42375192)the China Meteorological Administration Climate Change Special Program (CMA-CCSP+1 种基金Project No.QBZ202315)support by the Vector Stiftung through the Young Investigator Group"Artificial Intelligence for Probabilistic Weather Forecasting."
Abstract: Despite the maturity of ensemble numerical weather prediction (NWP), the resulting forecasts are still, more often than not, under-dispersed. As such, forecast calibration tools have become popular. Among those tools, quantile regression (QR) is highly competitive in terms of both flexibility and predictive performance. Nevertheless, a long-standing problem of QR is quantile crossing, which greatly limits the interpretability of QR-calibrated forecasts. On this point, this study proposes a non-crossing quantile regression neural network (NCQRNN) for calibrating ensemble NWP forecasts into a set of reliable quantile forecasts without crossing. The overarching design principle of NCQRNN is to add, on top of the conventional QRNN structure, another hidden layer that imposes a non-decreasing mapping between the combined output from the nodes of the last hidden layer and the nodes of the output layer, through a triangular weight matrix with positive entries. The empirical part of the work considers a solar irradiance case study, in which four years of ensemble irradiance forecasts at seven locations, issued by the European Centre for Medium-Range Weather Forecasts, are calibrated via NCQRNN, as well as via an eclectic mix of benchmarking models, ranging from the naïve climatology to state-of-the-art deep-learning and other non-crossing models. Formal and stringent forecast verification suggests that the forecasts post-processed via NCQRNN attain the maximum sharpness subject to calibration among all competitors. Furthermore, the proposed conception for resolving quantile crossing is remarkably simple yet general, and thus has broad applicability, as it can be integrated with many shallow- and deep-learning-based neural networks.
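The non-crossing construction is easy to reproduce. The sketch below uses the cumulative-sum form of the idea: the first output node gives the lowest quantile and each further node adds a softplus-positive increment, which is algebraically what multiplying non-negative increments by a lower-triangular matrix of ones achieves. Layer sizes and inputs here are arbitrary placeholders, not the paper's architecture.

```python
import numpy as np

def softplus(x):
    # Numerically stable softplus log(1 + e^x); output is strictly positive.
    return np.logaddexp(0.0, x)

def noncrossing_head(h, W, b):
    """Map hidden features h (n, m) to q non-crossing quantile outputs.

    raw[:, 0] is the lowest quantile; raw[:, 1:] become strictly positive
    increments, so the cumulative sum is non-decreasing by construction.
    """
    raw = h @ W + b                      # (n, q) unconstrained outputs
    lowest = raw[:, :1]
    increments = softplus(raw[:, 1:])    # > 0, so crossing is impossible
    return np.hstack([lowest, lowest + np.cumsum(increments, axis=1)])

# Random forward pass: 5 samples, 8 hidden features, 9 quantile levels.
rng = np.random.default_rng(1)
q = noncrossing_head(rng.normal(size=(5, 8)),
                     rng.normal(size=(8, 9)),
                     rng.normal(size=9))
```

Because monotonicity is enforced structurally rather than by a penalty, the ordering holds for every input and every parameter setting, including untrained ones, which is what makes the layer safe to bolt onto an existing QRNN.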
Abstract: Field experiment designs with a single replication are frequently used for factorial experiments in which the number of field plots is limited, but the experimental error is then difficult to estimate. To study a new statistical method for improving the precision of regression analysis of such experiments in rice, 84 fertilizer experiments were conducted in 15 provinces of China, including Zhejiang, Jiangsu, Anhui, Hunan, Sichuan, and Heilongjiang. Three factors with 14 treatments (N: 0–225 kg/ha, P: 0–112.5 kg/ha, K: 0–150 kg/ha) and two replications were employed using an approaching-optimum design. There were 2352 (84×14×2=2352) yield deviations (d) between the individual treatment yields and their arithmetic means. The results indicated that:
Abstract: Purpose: The purpose of this study is to develop and compare model choice strategies in the context of logistic regression. Model choice means the choice of the covariates to be included in the model. Design/methodology/approach: The study is based on Monte Carlo simulations. The methods are compared in terms of three measures of accuracy: specificity and two kinds of sensitivity. A loss function combining sensitivity and specificity is introduced and used for a final comparison. Findings: The choice of method depends on how much the users emphasize sensitivity against specificity. It also depends on the sample size. For a typical logistic regression setting with a moderate sample size and a small to moderate effect size, either BIC, BICc or Lasso seems to be optimal. Research limitations: Numerical simulations cannot cover the whole range of data-generating processes occurring with real-world data. Thus, more simulations are needed. Practical implications: Researchers can refer to these results if they believe that their data-generating process is somewhat similar to some of the scenarios presented in this paper. Alternatively, they could run their own simulations and calculate the loss function. Originality/value: This is a systematic comparison of model choice algorithms and heuristics in the context of logistic regression. The distinction between two types of sensitivity and a comparison based on a loss function are methodological novelties.
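The accuracy measures and the combined loss can be stated compactly. The weight w and the example supports below are placeholders; the paper's exact loss may combine its two kinds of sensitivity differently, so this is only the generic convex-combination form.

```python
def selection_metrics(selected, true_support, p):
    """Sensitivity/specificity of covariate selection among p candidates.

    Sensitivity: fraction of true covariates that were selected.
    Specificity: fraction of noise covariates that were excluded.
    """
    selected, true_support = set(selected), set(true_support)
    noise = set(range(p)) - true_support
    sensitivity = len(selected & true_support) / len(true_support)
    specificity = len(noise - selected) / len(noise)
    return sensitivity, specificity

def selection_loss(selected, true_support, p, w=0.5):
    """Convex combination of the two error rates (w is a user choice)."""
    sens, spec = selection_metrics(selected, true_support, p)
    return w * (1 - sens) + (1 - w) * (1 - spec)
```

With w = 0.5 a model that selects everything and a model that selects nothing score the same loss, which is why the choice of w encodes how much the user values sensitivity against specificity.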
Abstract: In oil and gas exploration, elucidating the complex interdependencies among geological variables is paramount. Our study introduces the application of sophisticated regression analysis methods, aiming not just at predicting geophysical logging curve values but also at innovatively mitigating the hydrocarbon depletion observed in geochemical logging. Through a rigorous assessment, we explore the efficacy of eight regression models, bifurcated into linear and nonlinear groups, to accommodate the multifaceted nature of geological datasets. Our linear model suite encompasses the Standard Equation, Ridge Regression, the Least Absolute Shrinkage and Selection Operator, and Elastic Net, each presenting distinct advantages. The Standard Equation serves as a foundational benchmark, whereas Ridge Regression implements penalty terms to counteract overfitting, thus bolstering model robustness in the presence of multicollinearity. The Least Absolute Shrinkage and Selection Operator performs variable selection to streamline models and enhance their interpretability, while Elastic Net amalgamates the merits of Ridge Regression and the Least Absolute Shrinkage and Selection Operator, offering a harmonized solution to model complexity and comprehensibility. On the nonlinear front, Gradient Descent, Kernel Ridge Regression, Support Vector Regression, and Piecewise Function-Fitting methods introduce innovative approaches. Gradient Descent assures computational efficiency in optimizing solutions, Kernel Ridge Regression leverages the kernel trick to navigate nonlinear patterns, and Support Vector Regression is proficient in forecasting extremities, pivotal for exploration risk assessment. The Piecewise Function-Fitting approach, tailored for geological data, facilitates adaptable modeling of variable interrelations, accommodating abrupt shifts in data trends. Our analysis identifies Ridge Regression, particularly when augmented by Piecewise Function-Fitting, as superior in recouping hydrocarbon losses, underscoring its utility in refining resource quantification. Meanwhile, Kernel Ridge Regression emerges as a noteworthy strategy for improving porosity-logging curve prediction for well A, evidencing its aptness for intricate geological structures. This research attests to the scientific ascendancy and broad-spectrum relevance of these regression techniques over conventional methods, while heralding new horizons for their deployment in the oil and gas sector. The insights garnered from these advanced modeling strategies are set to transform geological and engineering practices in hydrocarbon prediction, evaluation, and recovery.
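As a concrete anchor for the linear suite, ridge regression differs from the standard equation (ordinary least squares) only by the λI term added to the normal equations. A minimal sketch on toy data (no claim is made about the paper's well-log features):

```python
import numpy as np

def ridge(X, y, lam):
    """Closed-form ridge estimator: beta = (X'X + lam*I)^(-1) X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# Toy data with known coefficients plus a little noise.
rng = np.random.default_rng(2)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(0, 0.1, 100)

beta_ols = ridge(X, y, 0.0)     # lam = 0 recovers ordinary least squares
beta_r = ridge(X, y, 10.0)      # the penalty shrinks coefficients toward zero
```

The shrinkage visible in `beta_r` is exactly what stabilizes the estimate under multicollinearity, at the cost of a small bias.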
Funding: Financially supported by the National Natural Science Foundation of China (Grant 52172276) and a fund from the Anhui Provincial Institute of Translational Medicine (2021zhyx-B15).
Abstract: Precision therapy has become the preferred choice attributed to the optimal drug concentration in target sites, increased therapeutic efficacy, and reduced adverse effects. Over the past few years, sprayable or injectable thermosensitive hydrogels have exhibited high therapeutic potential. These can be applied as cell-growing scaffolds or drug-releasing reservoirs by simply mixing in a free-flowing sol phase at room temperature. Inspired by their unique properties, thermosensitive hydrogels have been widely applied as drug delivery and treatment platforms for precision medicine. In this review, the state-of-the-art developments in thermosensitive hydrogels for precision therapy are investigated, covering the thermo-gelling mechanisms and main components through to biomedical applications, including wound healing, anti-tumor activity, osteogenesis, and periodontal, sinonasal, and ophthalmic diseases. The most promising applications and trends of thermosensitive hydrogels for precision therapy are also discussed in light of their unique features.
Funding: Supported by the National Natural Science Foundation of China (NSFC 62105100) and the National Key Research and Development Program in the 14th Five-Year Plan (2021YFA1200700).
Abstract: Silicon (Si) diffraction microlens arrays are usually integrated with infrared focal plane arrays (IRFPAs) to improve their performance. Lithography errors are unavoidable during the preparation of Si diffraction microlens arrays by the conventional engraving method, and they have a serious impact on performance and subsequent applications. In response to this problem, a novel self-alignment method for preparing high-precision Si diffraction microlens arrays is proposed. In this method, the accuracy of the prepared arrays is determined by the accuracy of the first lithography mask. In the subsequent etching, the etched area is protected by the mask layer and the sacrificial layer or the protective layer, while the unprotected area is etched; this effectively blocks the non-etching areas, accurately etches the required area, and solves the error problem. High-precision Si diffraction microlens arrays were obtained by the novel self-alignment method, and the diffraction efficiency reached 92.6%. After integration with IRFPAs, the average blackbody responsivity increased by 8.3%, and the average blackbody detectivity increased by 10.3%. This indicates that Si diffraction microlens arrays can improve the filling factor and reduce the crosstalk of IRFPAs through convergence, thereby improving their performance. The results are of great reference significance for improving device performance by optimizing the preparation of micro-nano devices.
Funding: Supported by ONR UMass Dartmouth Marine and UnderSea Technology (MUST) grant N00014-20-1-2849 under the project S31320000049160; by DOE grant DE-SC0023164, sub-award RC114586-UMD; by AFOSR grants FA9550-18-1-0383 and FA9550-23-1-0037; by Michigan State University; by AFOSR grants FA9550-19-1-0281 and FA9550-18-1-0383; and by DOE grant DE-SC0023164.
Abstract: Additive Runge-Kutta methods designed for preserving highly accurate solutions in mixed-precision computation were previously proposed and analyzed. These specially designed methods use reduced precision for the implicit computations and full precision for the explicit computations. In this work, we analyze the stability properties of these methods and their sensitivity to low-precision rounding errors, and demonstrate their performance in terms of accuracy and efficiency. We develop codes in FORTRAN and Julia to solve nonlinear systems of ODEs and PDEs using the mixed-precision additive Runge-Kutta (MP-ARK) methods. The convergence, accuracy, and runtime of these methods are explored. We show that for a given level of accuracy, suitably chosen MP-ARK methods may provide significant reductions in runtime.
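The division of labor can be illustrated with the simplest additive scheme, IMEX Euler, on a stiff linear test problem. The MP-ARK methods in the paper are higher-order additive RK schemes in FORTRAN and Julia, but the precision split is the same in spirit: the implicit solve runs in reduced precision (float32 here), everything else in full precision.

```python
import numpy as np

def imex_euler_mixed(y0, lam, g, t0, t1, nsteps):
    """IMEX Euler for y' = lam*y + g(t), with the stiff linear part
    treated implicitly in float32 and the rest kept in float64."""
    h = (t1 - t0) / nsteps
    y, t = np.float64(y0), float(t0)
    for _ in range(nsteps):
        rhs = y + h * g(t)  # explicit part and state update: float64
        # Implicit solve of (1 - h*lam) * y_new = rhs in reduced precision.
        denom = np.float32(1.0) - np.float32(h) * np.float32(lam)
        y = rhs / np.float64(denom)
        t += h
    return y

# Stiff decay y' = -1000*y: the implicit treatment stays stable at h = 0.01,
# far beyond the explicit stability limit h < 2/1000.
y_end = imex_euler_mixed(1.0, -1000.0, lambda t: 0.0, 0.0, 1.0, 100)
```

Because only the (cheap to round) implicit algebra is demoted to float32 while the accumulated state stays in float64, the solution remains stable and decays as it should.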
Funding: Project supported by the National Key R&D Program of China (Grant No. 2021YFB2012600).
Abstract: We develop a quantum precision measurement method for magnetic fields at the Tesla level by utilizing a fiber diamond magnetometer. Central to our system is a micron-sized fiber diamond probe positioned on the surface of a coplanar waveguide made of nonmagnetic materials. Calibrated with a nuclear magnetic resonance magnetometer, this probe demonstrates a broad magnetic field range from 10 mT to 1.5 T, with a nonlinear error better than 0.0028% under a standard magnetic field generator and a stability better than 0.0012% at a 1.5 T magnetic field. Finally, we demonstrate quantitative mapping of the vector magnetic field on the surface of a permanent magnet using the diamond magnetometer.
Abstract: Background: Limited research has been conducted on the influence of autophagy-associated long non-coding RNAs (ARLncRNAs) on the prognosis of hepatocellular carcinoma (HCC). Methods: We analyzed 371 HCC samples from TCGA, identifying expression networks of ARLncRNAs using autophagy-related genes. Screening for prognostically relevant ARLncRNAs involved univariate Cox regression, Lasso regression, and multivariate Cox regression. A nomogram was further employed to assess the reliability of the Riskscore, calculated from the signatures of the screened ARLncRNAs, in predicting outcomes. Additionally, we compared drug sensitivities in patient groups with differing risk levels and investigated potential biological pathways through enrichment analysis, using consensus clustering to identify subgroups related to ARLncRNAs. Results: The screening process identified 27 ARLncRNAs, with 13 being associated with HCC prognosis. Consequently, a set of signatures comprising 8 ARLncRNAs was successfully constructed as independent prognostic factors for HCC. Patients in the high-risk group showed very poor prognoses in most clinical categories. The Riskscore was closely related to immune cell scores, such as macrophages, and the DEGs between different groups were implicated in metabolism, cell cycle, and mitotic processes. Notably, high-risk group patients demonstrated a significantly lower IC50 for Paclitaxel, suggesting that Paclitaxel could be an ideal treatment for those at elevated risk of HCC. We further identified C2 as the Paclitaxel subtype, in which patients exhibited higher Riskscores, reduced survival rates, and more severe clinical progression. Conclusion: The 8 signatures based on ARLncRNAs present novel targets for prognostic prediction in HCC. The drug candidate Paclitaxel may effectively treat HCC by impacting ARLncRNAs expression. With the identification of ARLncRNAs-related isoforms, these results provide valuable insights for the clinical exploration of autophagy mechanisms in HCC pathogenesis and offer potential avenues for precision medicine.
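The Riskscore construction used in studies like this is the standard Cox linear predictor, with patients dichotomized at the median score. A minimal sketch with made-up coefficients and expression values (the real signature has 8 ARLncRNAs with fitted multivariate Cox coefficients):

```python
import numpy as np

def risk_score(expr, coef):
    """Cox-model linear predictor: sum over genes of coefficient * expression."""
    return expr @ coef

def split_high_low(scores):
    """Median split into high-/low-risk groups, as is common practice."""
    return np.where(scores > np.median(scores), "high", "low")

# Toy cohort: 20 patients x 8 lncRNAs, with hypothetical Cox coefficients.
rng = np.random.default_rng(3)
expr = rng.lognormal(size=(20, 8))
coef = rng.normal(size=8)
groups = split_high_low(risk_score(expr, coef))
```

The two groups are then compared by Kaplan-Meier curves, drug IC50s, and enrichment analysis, as the abstract describes.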
Abstract: Extensible Markup Language (XML) files, widely used for storing and exchanging information on the web, require efficient parsing mechanisms to improve the performance of applications. With the existing Document Object Model (DOM) based parsing, performance degrades due to sequential processing and large memory requirements, thereby requiring an efficient XML parser to mitigate these issues. In this paper, we propose a Parallel XML Tree Generator (PXTG) algorithm for accelerating the parsing of XML files and a Regression-based XML Parsing Framework (RXPF) that analyzes and predicts performance through profiling, regression, and code generation for efficient parsing. The PXTG algorithm is based on dividing the XML file into n parts and producing n trees in parallel. The profiling phase of the RXPF framework produces a dataset by measuring the performance of various parsing models, including StAX, SAX, DOM, JDOM, and PXTG, on different cores and with multiple file sizes. The regression phase produces the prediction model, based on which the final code for efficient parsing of XML files is produced through the code generation phase. The RXPF framework has shown a significant improvement in performance, varying from 9.54% to 32.34%, over other existing models used for parsing XML files.
Abstract: The electrical resistivity method is a geophysical tool used to characterize the subsoil and can provide important information for precision agriculture. A lack of knowledge about the agronomic properties of the soil tends to affect the agricultural coffee production system. Therefore, research on geoelectrical properties of the soil, such as resistivity, for characterizing the study region for coffee cultivation can improve and optimize production. The resistivity method allows the subsurface to be investigated through different techniques: 1D vertical electrical sounding and electrical imaging. The acquisition of data using these techniques permitted the creation of 2D resistivity cross sections of the study area. The geoelectrical data were acquired using a resistivity meter and processed in different software packages. The results of the geoelectrical characterization from the 1D resistivity model and the 2D resistivity electrical sections show that in the study area of Kabiri there are 8 varieties of geoelectrical layers with different resistivity or conductivity. Near the survey in the study area, the lowest resistivity is around 0.322 Ω·m, while the highest is about 92.1 Ω·m. These values illustrate where it is possible to plant coffee and suggest a specific fertilization plan for some areas to improve cultivation.
Funding: Project supported by the National Key Research and Development Program of China (Grant No. 2021YFB2012600).
Abstract: We present a quantitative measurement of the horizontal component of the microwave magnetic field of a coplanar waveguide using a quantum diamond probe in fiber format. The measurement results are compared in detail with simulation, showing good consistency. Further simulation shows that the fiber diamond probe brings negligible disturbance to the field under measurement compared to bulk diamond. This method will find important applications ranging from electromagnetic compatibility testing to failure analysis of high-frequency, high-complexity integrated circuits.
Funding: Financially supported by the National Key R&D Program of China (2018YFA0702504), the National Natural Science Foundation of China (42174152), the Strategic Cooperation Technology Projects of China National Petroleum Corporation (CNPC) and China University of Petroleum-Beijing (CUPB) (ZLZX2020-03), and the R&D Department of China National Petroleum Corporation (2022DQ0604-01).
Abstract: The picking efficiency of seismic first breaks (FBs) has been greatly accelerated by deep learning (DL) technology. However, the picking accuracy and efficiency of DL methods still face huge challenges in low signal-to-noise ratio (SNR) situations. To address this issue, we propose a regression approach to pick FBs based on a bidirectional long short-term memory (BiLSTM) neural network, by learning the implicit Eikonal equation of 3D inhomogeneous media with rugged topography in the target region. We employ a regressive model that represents the relationships among the elevation of shots, the offset, and the elevation of receivers with their seismic traveltimes to predict the unknown FBs, from common-shot gathers with sparsely distributed traces. Different from image segmentation methods, which automatically extract image features and classify FBs from seismic data, the proposed method can learn the inner relationship between the field geometry and FBs. In addition, the results predicted by the regressive model are continuous values of FBs rather than the discrete ones of a binary distribution. The picking results on synthetic data show that the proposed method has a low dependence on label data and can obtain reliable and similar predictions using two types of label data with large differences. The picking results for 9380 shots of 3D seismic data generated by vibroseis indicate that the proposed method can still accurately predict FBs in low-SNR data. The subsequent stacked profiles further illustrate the reliability and effectiveness of the proposed method. The results on model data and field seismic data demonstrate that the proposed regression method is a robust first-break picker with high potential for field application.
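The regression target can be pictured with a drastically simplified stand-in: a linear least-squares fit from (shot elevation, offset, receiver elevation) to traveltime. The paper learns this mapping with a BiLSTM over common-shot gathers; the linear model and synthetic geometry below are only placeholders showing the input/output contract, including that the output is a continuous time rather than a class label.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 500

# Synthetic acquisition geometry (units arbitrary, values invented).
shot_elev = rng.uniform(100, 200, n)
offset = rng.uniform(50, 2000, n)
recv_elev = rng.uniform(100, 200, n)

# Toy "observed" traveltimes: grow with offset and elevation difference.
t_obs = (0.0005 * offset
         + 0.002 * np.abs(shot_elev - recv_elev)
         + rng.normal(0, 0.005, n))

# Least-squares regression from geometry features to traveltime.
X = np.column_stack([np.ones(n), shot_elev, offset, recv_elev])
coef, *_ = np.linalg.lstsq(X, t_obs, rcond=None)

def predict_fb(shot_e, off, recv_e):
    """Predicted first-break time: a continuous value, not a class label."""
    return coef @ np.array([1.0, shot_e, off, recv_e])
```

Once fitted, the model predicts FBs for traces whose picks are missing, which is exactly how sparsely labeled common-shot gathers are filled in.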
Funding: Supported by the Rumah Program 2024 of the Research Organization for Health, National Research and Innovation Agency of Indonesia; a 2023 grant of the Fondazione Veronesi, Milan, Italy (Caecilia H C Sukowati); and a 2023/2024 Postdoctoral Fellowship of Manajemen Talenta, Badan Riset dan Inovasi Nasional, Indonesia (Sri Jayanti).
Abstract: Hepatitis B virus (HBV) infection is a major player in chronic hepatitis B that may lead to the development of hepatocellular carcinoma (HCC). HBV is genetically diverse and is classified into at least 9 genotypes (A to I) and 1 putative genotype (J), each with a specific geographical distribution and possibly different clinical outcomes in the patient. This diversity may be associated with precision medicine for HBV-related HCC and the success of therapeutic approaches against HCC, related to the different pathogenicity of the virus and the host response. This Editorial discusses recent updates on whether the classification of HBV genetic diversity is still valid in terms of viral oncogenicity in HCC and its precision medicine, in addition to recent advances in cellular and molecular biology technologies.