Journal Articles
72,531 articles found
1. A partial least-squares regression approach to land use studies in the Suzhou-Wuxi-Changzhou region (Cited by: 1)
Authors: ZHANG Yang, ZHOU Chenghu, ZHANG Yongmin. Journal of Geographical Sciences (SCIE, CSCD), 2007, No. 2, pp. 234-244 (11 pages).
In several LUCC studies, statistical methods are used to analyze land use data. A problem with conventional statistical methods in land use analysis is that they assume the data to be statistically independent. In fact, the data tend to be dependent, a phenomenon known as multicollinearity, especially when observations are few. In this paper, a Partial Least-Squares (PLS) regression approach is developed to study relationships between land use and its influencing factors through a case study of the Suzhou-Wuxi-Changzhou region in China. Multicollinearity exists in the dataset and the number of variables is high compared to the number of observations. Four PLS factors are selected through a preliminary analysis. The correlation analyses between land use and influencing factors demonstrate the land use character of rural industrialization and urbanization in the Suzhou-Wuxi-Changzhou region, and illustrate that the first PLS factor best describes land use patterns quantitatively, with most of the statistical relations derived from it according with the facts. As the explanatory capacity of the PLS factors decreases, the reliability of the model outcome decreases correspondingly.
Keywords: land use; multivariate data analysis; partial least-squares regression; Suzhou-Wuxi-Changzhou region; multicollinearity
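To make the PLS idea concrete, the minimal sketch below fits a PLS regression with a few latent factors to a small synthetic dataset with strongly collinear predictors. The data, variable counts, and factor number are invented for illustration and are not taken from the study.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

# Synthetic example: 20 observations, 8 collinear socio-economic predictors
# (invented for illustration), one land-use response variable.
n, p = 20, 8
base = rng.normal(size=(n, 2))
X = base @ rng.normal(size=(2, p)) + 0.05 * rng.normal(size=(n, p))  # strong collinearity
y = X[:, 0] - 0.5 * X[:, 3] + 0.1 * rng.normal(size=n)

# Keep a small number of latent PLS factors, echoing the study's choice of four.
pls = PLSRegression(n_components=4)
pls.fit(X, y)

print("R^2 on training data:", pls.score(X, y))
print("Loadings of predictors on the first PLS factor:")
print(pls.x_loadings_[:, 0])
```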
2. Quantitative energy-dispersive X-ray fluorescence analysis for unknown samples using full-spectrum least-squares regression (Cited by: 6)
Authors: Yong-Li Liu, Qing-Xian Zhang, Jian Zhang, Hai-Tao Bai, Liang-Quan Ge. Nuclear Science and Techniques (SCIE, CAS, CSCD), 2019, No. 3, pp. 149-159 (11 pages).
The full-spectrum least-squares (FSLS) method is introduced to perform quantitative energy-dispersive X-ray fluorescence analysis of unknown solid samples. Based on the conventional least-squares principle, this spectrum evaluation method obtains background-corrected and interference-free net peaks, which is essential for quantitative analysis. A variety of analytical parameters and functions describing the features of the fluorescence spectra of pure elements are established and used, such as the mass absorption coefficient, the Gi factor, and fundamental fluorescence formulas. The FSLS iterative program was written in C, and the calculation iterates until the content of each component reaches the convergence criterion. After a basic theoretical analysis and experimental preparation, 13 national standard soil samples were measured with a spectrometer to test the feasibility of the algorithm. The results show that the calculated contents of Ti, Fe, Ni, Cu, and Zn follow the same trend as the corresponding standard contents in the 13 reference samples. Accuracies of 0.35% and 14.03% are obtained for Fe and Ti, whose standard concentrations are 8.82% and 0.578%, respectively. However, the calculated results for trace elements (only tens of μg/g) deviate from the standard values, possibly because of measurement accuracy and mutual effects between the elements.
Keywords: energy-dispersive X-ray fluorescence analysis; full-spectrum least-squares method; effective atomic number; mass attenuation coefficient; fundamental parameter method
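The heart of a full-spectrum least-squares evaluation is describing the entire measured spectrum as a linear combination of pure-element reference spectra plus background. The sketch below, with synthetic Gaussian "reference spectra" invented for illustration, solves such a combination with non-negative least squares; it is not the iterative FSLS program described in the paper.

```python
import numpy as np
from scipy.optimize import nnls

channels = np.arange(1024)

def peak(center, width=8.0):
    """Toy pure-element reference spectrum: a single Gaussian peak."""
    return np.exp(-0.5 * ((channels - center) / width) ** 2)

# Invented reference spectra for three 'elements' plus a flat background term.
references = np.column_stack([peak(200), peak(450), peak(700), np.ones(channels.size)])

# Synthetic measured spectrum: known mixture plus a little noise.
true_weights = np.array([3.0, 1.5, 0.2, 0.05])
measured = references @ true_weights + 0.02 * np.random.default_rng(1).normal(size=channels.size)

# Full-spectrum fit: non-negative least squares over all channels at once.
weights, residual_norm = nnls(references, measured)
print("fitted weights:", np.round(weights, 3))
print("residual norm :", round(residual_norm, 3))
```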
3. Based on Partial Least-squares Regression to Build up and Analyze the Model of Rice Evapotranspiration
Authors: ZHAO Chang-shan, FU Hong, HUANG Bu-hai (Northeast Agricultural University, Harbin, Heilongjiang 150030, PRC). Journal of Northeast Agricultural University (English Edition) (CAS), 2003, No. 1, pp. 1-8 (8 pages).
When calculating rice evapotranspiration from weather factors, we often find that some independent variables are mutually correlated. This multicollinearity can distort the traditional multivariate regression model based on the least-squares method and undermine its stability. In this paper, a model is built based on partial least-squares regression: by applying the ideas of principal component analysis and canonical correlation analysis, components are extracted from the original data. A rice evapotranspiration model is thus built that resolves the multicollinearity among the independent variables (weather factors). Finally, the model is analyzed in detail and satisfactory results are obtained.
Keywords: partial least-squares regression; evapotranspiration
4. Linear-regression models and algorithms based on the Total-Least-Squares principle (Cited by: 1)
Authors: Ding Shijun, Jiang Weiping, Shen Zhijuani. Geodesy and Geodynamics, 2012, No. 2, pp. 42-46 (5 pages).
In classical regression analysis, the error of the independent variable is usually not taken into account. This paper presents two solution methods for the case in which both the independent and the dependent variables have errors. These methods are derived from the condition-adjustment and indirect-adjustment models based on the Total-Least-Squares principle. The equivalence of the two methods is also proven in theory.
Keywords: total least squares (TLS) principle; regression analysis; adjustment model; equivalence
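For a straight-line fit in which both variables carry errors, the textbook total-least-squares solution comes from the singular vector of the centered data matrix with the smallest singular value. The sketch below illustrates that construction on synthetic data and compares it with ordinary least squares; it does not reproduce the condition-adjustment or indirect-adjustment formulations of the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic data: both x and y observations are noisy.
x_true = np.linspace(0, 10, 50)
y_true = 2.0 * x_true + 1.0
x = x_true + rng.normal(scale=0.3, size=x_true.size)
y = y_true + rng.normal(scale=0.3, size=y_true.size)

# Total least squares for y = a*x + b: the right singular vector of the centered
# data with the smallest singular value is normal to the fitted line.
data = np.column_stack([x - x.mean(), y - y.mean()])
_, _, vt = np.linalg.svd(data)
normal = vt[-1]                      # normal vector (n_x, n_y) of the line
a_tls = -normal[0] / normal[1]       # slope
b_tls = y.mean() - a_tls * x.mean()  # intercept through the centroid

# Ordinary least squares for comparison (errors in y only).
a_ols, b_ols = np.polyfit(x, y, 1)
print(f"TLS: slope={a_tls:.3f}, intercept={b_tls:.3f}")
print(f"OLS: slope={a_ols:.3f}, intercept={b_ols:.3f}")
```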
5. Partial Least-Squares (PLS) Regression and Spectrophotometry as Applied to the Analysis of Multicomponent Mixtures
Authors: Xin An LIU, Le Ming SHI, Zhi Hong XU, Zhong Xiao PAN, Zhi Liang LI, Ying GAO (Laboratory No. 502, Institute of Chemical Defense, Beijing 102205; Laboratory of Computer Chemistry, Institute of Chemical Metallurgy, Chinese Academy of Sciences, Beijing 100080). Chinese Chemical Letters (SCIE, CAS, CSCD), 1991, No. 3, pp. 233-236 (4 pages).
The UV absorption spectra of o-naphthol, α-naphthylamine, 2,7-dihydroxynaphthalene, 2,4-dimethoxybenzaldehyde and methyl salicylate overlap severely; it is therefore impossible to determine them in mixtures by traditional spectrophotometric methods. In this paper, partial least-squares (PLS) regression is applied to the simultaneous determination of these compounds in mixtures by UV spectrophotometry without any pretreatment of the samples. Ten synthetic mixture samples are analyzed by the proposed method. The mean recoveries are 99.4%, 99.6%, 100.2%, 99.3% and 99.1%, and the relative standard deviations (RSD) are 1.87%, 1.98%, 1.94%, 0.960% and 0.672%, respectively.
Keywords: partial least-squares (PLS) regression; spectrophotometry; multicomponent mixtures
6. Fuzzy Least-Squares Linear Regression Approach to Ascertain Stochastic Demand in the Vehicle Routing Problem
Authors: Fatemeh Torfi, Reza Zanjirani Farahani, Iraj Mahdavi. Applied Mathematics, 2011, No. 1, pp. 64-73 (10 pages).
Estimation of stochastic demand in physical distribution in general, and efficient transport route management in particular, is emerging as a crucial factor in the urban planning domain. It is particularly important in municipalities such as Tehran, where sound demand management calls for a realistic analysis of the routing system. The methodology critically investigates a fuzzy least-squares linear regression approach (FLLR) to estimate the stochastic demands in the vehicle routing problem (VRP), bearing in mind the customers' preference order. An FLLR method is proposed for solving the VRP with stochastic demands: an approximate-distance fuzzy least-squares (ADFL) estimator. The ADFL estimator is applied to original data taken from a case study. The SSR values of the ADFL estimator and real demand are obtained and then compared to the SSR values of the nominal demand and real demand. Empirical results show that the proposed method is viable for problems with vague and imprecise performance ratings. The results further prove that the ADFL is a realistic and efficient estimator for facing stochastic demand challenges in vehicle routing system management and solving relevant problems.
Keywords: fuzzy least-squares; stochastic; location; routing problems
7. Characterizing and estimating rice brown spot disease severity using stepwise regression, principal component regression and partial least-square regression (Cited by: 13)
Authors: LIU Zhan-yu(1), HUANG Jing-feng(1), SHI Jing-jing(1), TAO Rong-xiang(2), ZHOU Wan(3), ZHANG Li-li(3) ((1) Institute of Agriculture Remote Sensing and Information System Application, Zhejiang University, Hangzhou 310029, China; (2) Institute of Plant Protection and Microbiology, Zhejiang Academy of Agricultural Sciences, Hangzhou 310021, China; (3) Plant Inspection Station of Hangzhou City, Hangzhou 310020, China). Journal of Zhejiang University-Science B (Biomedicine & Biotechnology) (SCIE, CAS, CSCD), 2007, No. 10, pp. 738-744 (7 pages).
Detecting plant health conditions plays a key role in farm pest management and crop protection. In this study, hyperspectral leaf reflectance of rice (Oryza sativa L.) was measured on groups of healthy leaves and leaves infected by the fungus Bipolaris oryzae (Helminthosporium oryzae Breda de Hann) over the wavelength range from 350 to 2500 nm. The percentage of leaf surface lesions was estimated and defined as the disease severity. Statistical methods including multiple stepwise regression, principal component analysis and partial least-square regression were used to estimate the disease severity of rice brown spot at the leaf level. Our results revealed that multiple stepwise linear regression could efficiently estimate disease severity with three wavebands selected in seven steps. The root mean square errors (RMSEs) for the training (n=210) and testing (n=53) datasets were 6.5% and 5.8%, respectively. Principal component analysis showed that the first principal component could explain approximately 80% of the variance of the original hyperspectral reflectance. The regression model with the first two principal components predicted disease severity with RMSEs of 16.3% and 13.9% for the training and testing datasets, respectively. Partial least-square regression with seven extracted factors predicted disease severity most effectively compared with the other statistical methods, with RMSEs of 4.1% and 2.0% for the training and testing datasets, respectively. Our research demonstrates that it is feasible to estimate the disease severity of rice brown spot using hyperspectral reflectance data at the leaf level.
Keywords: hyperspectral reflectance; rice brown spot; partial least-square (PLS) regression; stepwise regression; principal component regression (PCR)
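Of the three estimators compared in this study, principal component regression is the simplest to sketch: project the spectra onto a few principal components and regress disease severity on the scores. The toy example below uses random numbers with low-rank structure in place of hyperspectral reflectance, purely to show the mechanics; the sample sizes and component count are illustrative only.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)

# Stand-in for hyperspectral leaf reflectance: 263 samples x 500 wavebands with
# low-rank structure (random numbers here, not real reflectance data).
scores = rng.normal(size=(263, 3))
X = scores @ rng.normal(size=(3, 500)) + 0.1 * rng.normal(size=(263, 500))
severity = scores[:, 0] - 0.5 * scores[:, 1] + 0.05 * rng.normal(size=263)  # toy "disease severity"

X_train, X_test, y_train, y_test = train_test_split(X, severity, test_size=0.2, random_state=0)

# Principal component regression: PCA to a few components, then ordinary least squares.
pcr = make_pipeline(PCA(n_components=2), LinearRegression())
pcr.fit(X_train, y_train)

rmse = mean_squared_error(y_test, pcr.predict(X_test)) ** 0.5
print(f"PCR test RMSE: {rmse:.3f}")
```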
8. Two-Staged Method for Ice Channel Identification Based on Image Segmentation and Corner Point Regression (Cited by: 1)
Authors: DONG Wen-bo, ZHOU Li, DING Shi-feng, WANG Ai-ming, CAI Jin-yan. China Ocean Engineering (SCIE, EI, CSCD), 2024, No. 2, pp. 313-325 (13 pages).
Identification of the ice channel is a basic technology for developing intelligent ships in ice-covered waters, and it is important for ensuring the safety and economy of navigation. In the Arctic, merchant ships with low ice class often navigate in channels opened up by icebreakers. Navigation in the ice channel largely depends on good maneuvering skills and abundant experience from the captain. The ship may get stuck if steered into ice fields off the channel. Under these circumstances, it is very important to study how to identify the boundary lines of ice channels with a reliable method. In this paper, a two-staged ice channel identification method is developed based on image segmentation and corner point regression. The first stage employs an image segmentation method to extract channel regions. In the second stage, an intelligent corner regression network is proposed to extract the channel boundary lines from the channel region. A non-intelligent angle-based filtering and clustering method is also proposed and compared with the corner point regression network. The training and evaluation of the segmentation method and corner regression network are carried out on synthetic and real ice channel datasets. The evaluation results show that the method using the corner point regression network in the second stage achieves an accuracy as high as 73.33% on the synthetic ice channel dataset and 70.66% on the real ice channel dataset, and the processing speed can reach up to 14.58 frames per second.
Keywords: ice channel; ship navigation; identification; image segmentation; corner point regression
9. Non-crossing Quantile Regression Neural Network as a Calibration Tool for Ensemble Weather Forecasts (Cited by: 1)
Authors: Mengmeng SONG, Dazhi YANG, Sebastian LERCH, Xiang'ao XIA, Gokhan Mert YAGLI, Jamie M. BRIGHT, Yanbo SHEN, Bai LIU, Xingli LIU, Martin Janos MAYER. Advances in Atmospheric Sciences (SCIE, CAS, CSCD), 2024, No. 7, pp. 1417-1437 (21 pages).
Despite the maturity of ensemble numerical weather prediction (NWP), the resulting forecasts are still, more often than not, under-dispersed. As such, forecast calibration tools have become popular. Among those tools, quantile regression (QR) is highly competitive in terms of both flexibility and predictive performance. Nevertheless, a long-standing problem of QR is quantile crossing, which greatly limits the interpretability of QR-calibrated forecasts. On this point, this study proposes a non-crossing quantile regression neural network (NCQRNN) for calibrating ensemble NWP forecasts into a set of reliable quantile forecasts without crossing. The overarching design principle of NCQRNN is to add, on top of the conventional QRNN structure, another hidden layer that imposes a non-decreasing mapping between the combined output from nodes of the last hidden layer and the nodes of the output layer, through a triangular weight matrix with positive entries. The empirical part of the work considers a solar irradiance case study, in which four years of ensemble irradiance forecasts at seven locations, issued by the European Centre for Medium-Range Weather Forecasts, are calibrated via NCQRNN, as well as via an eclectic mix of benchmarking models, ranging from the naïve climatology to state-of-the-art deep-learning and other non-crossing models. Formal and stringent forecast verification suggests that the forecasts post-processed via NCQRNN attain the maximum sharpness subject to calibration amongst all competitors. Furthermore, the proposed conception to resolve quantile crossing is remarkably simple yet general, and thus has broad applicability, as it can be integrated with many shallow- and deep-learning-based neural networks.
Keywords: ensemble weather forecasting; forecast calibration; non-crossing quantile regression neural network; CORP reliability diagram; post-processing
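The non-crossing constraint can be sketched with an output layer that produces a base quantile plus non-negative increments that are cumulatively summed, so the predicted quantiles are ordered by construction. The toy PyTorch model below illustrates that idea together with the pinball loss; it is not the triangular-weight-matrix layer of the paper, and the data are synthetic.

```python
import torch
import torch.nn as nn

taus = torch.tensor([0.1, 0.25, 0.5, 0.75, 0.9])  # quantile levels

class NonCrossingQR(nn.Module):
    """Toy non-crossing quantile regressor: the first output is a base quantile and
    the remaining outputs pass through softplus and a cumulative sum, so predicted
    quantiles are non-decreasing in the quantile level by construction."""
    def __init__(self, n_features, n_quantiles):
        super().__init__()
        self.hidden = nn.Sequential(nn.Linear(n_features, 32), nn.ReLU())
        self.head = nn.Linear(32, n_quantiles)

    def forward(self, x):
        raw = self.head(self.hidden(x))
        base = raw[:, :1]
        increments = nn.functional.softplus(raw[:, 1:])
        return torch.cat([base, base + torch.cumsum(increments, dim=1)], dim=1)

def pinball_loss(pred, y, taus):
    """Average pinball (quantile) loss over all quantile levels."""
    err = y.unsqueeze(1) - pred                      # (batch, n_quantiles)
    return torch.mean(torch.maximum(taus * err, (taus - 1) * err))

# Synthetic heteroscedastic data standing in for ensemble-forecast features.
torch.manual_seed(0)
x = torch.rand(512, 3)
y = x.sum(dim=1) + (0.2 + x[:, 0]) * torch.randn(512)

model = NonCrossingQR(3, len(taus))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(300):
    opt.zero_grad()
    loss = pinball_loss(model(x), y, taus)
    loss.backward()
    opt.step()

q = model(x[:1])
print("predicted quantiles (monotone):", q.detach().numpy().round(3))
```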
10. A comparison of model choice strategies for logistic regression
Author: Markku Karhunen. Journal of Data and Information Science (CSCD), 2024, No. 1, pp. 37-52 (16 pages).
Purpose: The purpose of this study is to develop and compare model choice strategies in the context of logistic regression. Model choice means the choice of the covariates to be included in the model. Design/methodology/approach: The study is based on Monte Carlo simulations. The methods are compared in terms of three measures of accuracy: specificity and two kinds of sensitivity. A loss function combining sensitivity and specificity is introduced and used for a final comparison. Findings: The choice of method depends on how much the user emphasizes sensitivity against specificity. It also depends on the sample size. For a typical logistic regression setting with a moderate sample size and a small to moderate effect size, either BIC, BICc or Lasso seems to be optimal. Research limitations: Numerical simulations cannot cover the whole range of data-generating processes occurring with real-world data. Thus, more simulations are needed. Practical implications: Researchers can refer to these results if they believe that their data-generating process is somewhat similar to some of the scenarios presented in this paper. Alternatively, they could run their own simulations and calculate the loss function. Originality/value: This is a systematic comparison of model choice algorithms and heuristics in the context of logistic regression. The distinction between two types of sensitivity and a comparison based on a loss function are methodological novelties.
Keywords: model choice; logistic regression; logit regression; Monte Carlo simulations; sensitivity; specificity
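Two of the strategies mentioned, BIC-based subset selection and the Lasso, are easy to sketch for a logistic regression with a handful of candidate covariates. The example below uses synthetic data and scikit-learn; the penalty strength and exhaustive subset search are illustrative choices, not the simulation design of the paper.

```python
import numpy as np
from itertools import combinations
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss

rng = np.random.default_rng(4)

# Synthetic data: only the first two of five covariates truly matter.
n, p = 300, 5
X = rng.normal(size=(n, p))
logits = 1.0 * X[:, 0] - 1.5 * X[:, 1]
y = rng.binomial(1, 1 / (1 + np.exp(-logits)))

def bic(subset):
    """BIC of a (nearly unpenalized) logistic regression on the given covariate subset."""
    model = LogisticRegression(C=1e6, max_iter=1000).fit(X[:, subset], y)
    ll = -log_loss(y, model.predict_proba(X[:, subset]), normalize=False)
    k = len(subset) + 1                      # coefficients + intercept
    return k * np.log(n) - 2 * ll

# Exhaustive BIC search over all non-empty covariate subsets.
subsets = [list(c) for r in range(1, p + 1) for c in combinations(range(p), r)]
best = min(subsets, key=bic)
print("BIC-selected covariates:", best)

# Lasso-style selection: L1-penalized logistic regression, nonzero coefficients.
lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(X, y)
print("Lasso-selected covariates:", list(np.flatnonzero(lasso.coef_[0])))
```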
11. Regression analysis and its application to oil and gas exploration: A case study of hydrocarbon loss recovery and porosity prediction, China
Authors: Yang Li, Xiaoguang Li, Mingyu Guo, Chang Chen, Pengbo Ni, Zijian Huang. Energy Geoscience (EI), 2024, No. 4, pp. 240-252 (13 pages).
In oil and gas exploration, elucidating the complex interdependencies among geological variables is paramount. Our study applies sophisticated regression analysis methods, aiming not just at predicting geophysical logging curve values but also at innovatively mitigating the hydrocarbon depletion observed in geochemical logging. Through a rigorous assessment, we explore the efficacy of eight regression models, divided into linear and nonlinear groups, to accommodate the multifaceted nature of geological datasets. Our linear model suite encompasses the Standard Equation, Ridge Regression, the Least Absolute Shrinkage and Selection Operator (Lasso), and Elastic Net, each presenting distinct advantages. The Standard Equation serves as a foundational benchmark, whereas Ridge Regression implements penalty terms to counteract overfitting, thus bolstering model robustness in the presence of multicollinearity. The Lasso performs variable selection to streamline models and enhance their interpretability, while Elastic Net amalgamates the merits of Ridge Regression and the Lasso, offering a harmonized solution to model complexity and comprehensibility. On the nonlinear front, Gradient Descent, Kernel Ridge Regression, Support Vector Regression, and Piecewise Function-Fitting methods introduce innovative approaches. Gradient Descent assures computational efficiency in optimizing solutions, Kernel Ridge Regression leverages the kernel trick to navigate nonlinear patterns, and Support Vector Regression is proficient in forecasting extremities, which is pivotal for exploration risk assessment. The Piecewise Function-Fitting approach, tailored for geological data, facilitates adaptable modeling of variable interrelations and accommodates abrupt shifts in data trends. Our analysis identifies Ridge Regression, particularly when augmented by Piecewise Function-Fitting, as superior in recouping hydrocarbon losses, underscoring its utility in refining resource quantification. Meanwhile, Kernel Ridge Regression emerges as a noteworthy strategy for improving porosity-logging curve prediction for well A, evidencing its aptness for intricate geological structures. This research attests to the scientific ascendancy and broad relevance of these regression techniques over conventional methods, while heralding new horizons for their deployment in the oil and gas sector. The insights garnered from these advanced modeling strategies are set to transform geological and engineering practices in hydrocarbon prediction, evaluation, and recovery.
Keywords: regression analysis; oil and gas exploration; multiple linear regression model; nonlinear regression model; hydrocarbon loss recovery; porosity prediction
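Most of the linear and kernel-based estimators surveyed here have drop-in scikit-learn implementations. The sketch below cross-validates several of them on a synthetic dataset that merely stands in for well-log data; gradient descent and the paper's piecewise function-fitting scheme are not reproduced.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.kernel_ridge import KernelRidge
from sklearn.linear_model import LinearRegression, Ridge, Lasso, ElasticNet
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Synthetic stand-in for well-log predictors and a porosity-like target.
X, y = make_regression(n_samples=200, n_features=8, n_informative=4, noise=10.0, random_state=0)

models = {
    "Standard equation (OLS)": LinearRegression(),
    "Ridge": Ridge(alpha=1.0),
    "Lasso": Lasso(alpha=0.5),
    "Elastic Net": ElasticNet(alpha=0.5, l1_ratio=0.5),
    "Kernel Ridge (RBF)": KernelRidge(kernel="rbf", alpha=1.0, gamma=0.1),
    "SVR (RBF)": SVR(kernel="rbf", C=10.0),
}

for name, model in models.items():
    pipe = make_pipeline(StandardScaler(), model)       # scale features, then fit
    score = cross_val_score(pipe, X, y, cv=5, scoring="r2").mean()
    print(f"{name:26s} mean CV R^2 = {score:.3f}")
```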
12. Performance Enhancement of XML Parsing Using Regression and Parallelism
Authors: Muhammad Ali, Minhaj Ahmad Khan. Computer Systems Science & Engineering, 2024, No. 2, pp. 287-303 (17 pages).
The Extensible Markup Language (XML) files widely used for storing and exchanging information on the web require efficient parsing mechanisms to improve the performance of applications. With the existing Document Object Model (DOM) based parsing, performance degrades due to sequential processing and large memory requirements, thereby requiring an efficient XML parser to mitigate these issues. In this paper, we propose a Parallel XML Tree Generator (PXTG) algorithm for accelerating the parsing of XML files and a Regression-based XML Parsing Framework (RXPF) that analyzes and predicts performance through profiling, regression, and code generation for efficient parsing. The PXTG algorithm is based on dividing the XML file into n parts and producing n trees in parallel. The profiling phase of the RXPF framework produces a dataset by measuring the performance of various parsing models including StAX, SAX, DOM, JDOM, and PXTG on different cores using multiple file sizes. The regression phase produces the prediction model, based on which the final code for efficient parsing of XML files is produced through the code generation phase. The RXPF framework has shown a significant improvement in performance, varying from 9.54% to 32.34%, over other existing models used for parsing XML files.
Keywords: regression; parallel parsing; multi-cores; XML
13. A regression approach for seismic first-break picking
Authors: Huan Yuan, San-Yi Yuan, Jie Wu, Wen-Jing Sang, Yu-He Zhao. Petroleum Science (SCIE, EI, CAS, CSCD), 2024, No. 3, pp. 1584-1596 (13 pages).
The picking efficiency of seismic first breaks (FBs) has been greatly accelerated by deep learning (DL) technology. However, the picking accuracy and efficiency of DL methods still face huge challenges in low signal-to-noise ratio (SNR) situations. To address this issue, we propose a regression approach to pick FBs based on a bidirectional long short-term memory (BiLSTM) neural network, by learning the implicit Eikonal equation of 3D inhomogeneous media with rugged topography in the target region. We employ a regressive model that represents the relationships among the elevation of shots, the offset, and the elevation of receivers with their seismic traveltimes to predict the unknown FBs from common-shot gathers with sparsely distributed traces. Different from image segmentation methods, which automatically extract image features and classify FBs from seismic data, the proposed method can learn the inner relationship between field geometry and FBs. In addition, the results predicted by the regressive model are continuous values of FBs rather than discrete ones from a binary distribution. The picking results on synthetic data show that the proposed method has low dependence on label data and can obtain reliable and similar predictions using two types of label data with large differences. The picking results for 9380 shots of 3D seismic data generated by vibroseis indicate that the proposed method can still accurately predict FBs in low-SNR data. The subsequently stacked profiles further illustrate the reliability and effectiveness of the proposed method. The results on model data and field seismic data demonstrate that the proposed regression method is a robust first-break picker with high potential for field application.
Keywords: first-break picking; low signal-to-noise ratio; regression; BiLSTM; traveltime; geometry; noisy seismic data
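As a rough sketch of the regression idea, and not the authors' network or training setup, the toy model below runs a bidirectional LSTM over the traces of a common-shot gather and regresses a traveltime for each trace from invented geometry features (shot elevation, offset, receiver elevation).

```python
import torch
import torch.nn as nn

class FirstBreakRegressor(nn.Module):
    """Toy BiLSTM regressor: for each trace in a common-shot gather it maps
    (shot elevation, offset, receiver elevation) to a first-break traveltime."""
    def __init__(self, n_features=3, hidden=32):
        super().__init__()
        self.bilstm = nn.LSTM(n_features, hidden, batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden, 1)

    def forward(self, x):               # x: (batch, n_traces, 3)
        h, _ = self.bilstm(x)           # h: (batch, n_traces, 2*hidden)
        return self.out(h).squeeze(-1)  # (batch, n_traces) traveltimes

# Synthetic gathers: traveltime grows roughly linearly with offset plus noise.
torch.manual_seed(0)
geometry = torch.rand(64, 48, 3)                       # invented geometry features
traveltime = 0.5 + 1.8 * geometry[:, :, 1] + 0.02 * torch.randn(64, 48)

model = FirstBreakRegressor()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()
for _ in range(200):
    opt.zero_grad()
    loss = loss_fn(model(geometry), traveltime)
    loss.backward()
    opt.step()
print("final training MSE:", float(loss))
```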
14. Optimization of Artificial Viscosity in Production Codes Based on Gaussian Regression Surrogate Models
Authors: Vitaliy Gyrya, Evan Lieberman, Mark Kenamond, Mikhail Shashkov. Communications on Applied Mathematics and Computation (EI), 2024, No. 3, pp. 1521-1550 (30 pages).
To accurately model flows with shock waves using staggered-grid Lagrangian hydrodynamics, artificial viscosity has to be introduced to convert kinetic energy into internal energy, thereby increasing the entropy across shocks. Determining the appropriate strength of the artificial viscosity is an art and strongly depends on the particular problem and the experience of the researcher. The objective of this study is to pose the problem of finding the appropriate strength of the artificial viscosity as an optimization problem and to solve it using machine learning (ML) tools, specifically surrogate models based on Gaussian Process regression (GPR) and Bayesian analysis. We describe the optimization method and discuss various practical details of its implementation. The shock-containing problems to which we apply this method have all been implemented in the LANL code FLAG (Burton, Connectivity structures and differencing techniques for staggered-grid free-Lagrange hydrodynamics, Tech. Rep. UCRL-JC-110555, Lawrence Livermore National Laboratory, Livermore, CA, 1992; Consistent finite-volume discretization of hydrodynamic conservation laws for unstructured grids, Tech. Rep. CRL-JC-118788, Lawrence Livermore National Laboratory, Livermore, CA, 1994; Multidimensional discretization of conservation laws for unstructured polyhedral grids, Tech. Rep. UCRL-JC-118306, Lawrence Livermore National Laboratory, Livermore, CA, 1994; FLAG, a multi-dimensional, multiple mesh, adaptive free-Lagrange, hydrodynamics code, in: NECDC, 1992). First, we apply ML to find optimal values for isolated shock problems of different strengths. Second, we apply ML to optimize the viscosity for a one-dimensional (1D) propagating detonation problem based on Zel'dovich-von Neumann-Doring (ZND) detonation theory (Fickett and Davis, Detonation: theory and experiment, Dover books on physics, Dover Publications, Mineola, 2000) using a reactive burn model. We compare results for the default (currently used in FLAG) and optimized values of the artificial viscosity for these problems, demonstrating the potential for a significant improvement in the accuracy of computations.
Keywords: optimization; artificial viscosity; Gaussian regression surrogate model
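A surrogate-based search of this kind alternates between fitting a Gaussian Process to the simulation-quality metric as a function of the viscosity coefficient and sampling where an acquisition function such as expected improvement is largest. The sketch below uses a cheap invented objective in place of a FLAG run, so only the optimization loop, not the hydrodynamics, is illustrated.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def objective(c):
    """Stand-in for an expensive simulation-error metric versus viscosity coefficient c."""
    return (c - 1.3) ** 2 + 0.05 * np.sin(8 * c)

rng = np.random.default_rng(5)
c_samples = list(rng.uniform(0.5, 2.5, size=4))
errors = [objective(c) for c in c_samples]

gpr = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=0.5),
                               alpha=1e-6, normalize_y=True)
grid = np.linspace(0.5, 2.5, 401).reshape(-1, 1)

for _ in range(10):   # surrogate-guided search loop
    gpr.fit(np.array(c_samples).reshape(-1, 1), np.array(errors))
    mean, std = gpr.predict(grid, return_std=True)
    best = min(errors)
    # Expected improvement: where is the error likely to drop below the best so far?
    z = (best - mean) / np.maximum(std, 1e-9)
    ei = (best - mean) * norm.cdf(z) + std * norm.pdf(z)
    c_next = float(grid[np.argmax(ei), 0])
    c_samples.append(c_next)
    errors.append(objective(c_next))

print(f"suggested viscosity coefficient: {c_samples[int(np.argmin(errors))]:.3f}")
```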
15. Nuclear charge radius predictions by kernel ridge regression with odd-even effects
Authors: Lu Tang, Zhen-Hua Zhang. Nuclear Science and Techniques (SCIE, EI, CAS, CSCD), 2024, No. 2, pp. 94-102 (9 pages).
The extended kernel ridge regression (EKRR) method with odd-even effects was adopted to improve the description of the nuclear charge radius using five commonly used nuclear models: (i) the isospin-dependent A^(1/3) formula, (ii) relativistic continuum Hartree-Bogoliubov (RCHB) theory, (iii) the Hartree-Fock-Bogoliubov (HFB) model HFB25, (iv) the Weizsacker-Skyrme (WS) model WS*, and (v) the HFB25* model. In the last two models, the charge radii were calculated using a five-parameter formula with the nuclear shell corrections and deformations obtained from the WS and HFB25 models, respectively. For each model, the resulting root-mean-square deviation for the 1014 nuclei with proton number Z ≥ 8 can be significantly reduced to 0.009-0.013 fm after considering the modification with the EKRR method. The best among them was the RCHB model, with a root-mean-square deviation of 0.0092 fm. The extrapolation abilities of the KRR and EKRR methods for the neutron-rich region were examined, and it was found that after considering the odd-even effects, the extrapolation power was improved compared with that of the original KRR method. The strong odd-even staggering of the nuclear charge radii of Ca and Cu isotopes and the abrupt kinks across the neutron N = 126 and 82 shell closures were also calculated and could be reproduced quite well by calculations using the EKRR method.
Keywords: nuclear charge radius; machine learning; kernel ridge regression method
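A KRR-style correction amounts to training a kernel ridge model on the residuals between a baseline nuclear model and experiment, indexed by proton and neutron number, and adding the learned correction back to the baseline. The sketch below shows that residual-learning pattern on a fabricated dataset; the odd-even extension of the paper is not implemented.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(6)

# Fabricated "nuclei": proton number Z, neutron number N, and a residual between
# experimental charge radii and a baseline model (all numbers invented).
Z = rng.integers(8, 100, size=400)
N = rng.integers(8, 150, size=400)
residual = 0.02 * np.sin(Z / 7.0) + 0.01 * np.cos(N / 9.0) + 0.003 * rng.normal(size=400)

features = np.column_stack([Z, N]).astype(float)

# Kernel ridge regression with a Gaussian (RBF) kernel learns the smooth part of
# the residuals; predictions would then be added back to the baseline model.
krr = KernelRidge(kernel="rbf", alpha=1e-3, gamma=0.01)
krr.fit(features, residual)

rms = np.sqrt(np.mean((krr.predict(features) - residual) ** 2))
print(f"training RMS deviation of the learned correction: {rms:.4f} (arbitrary units)")
```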
16. Predicting uniaxial compressive strength of tuff after accelerated freeze-thaw testing: Comparative analysis of regression models and artificial neural networks
Author: Ogün Ozan VAROL. Journal of Mountain Science (SCIE, CSCD), 2024, No. 10, pp. 3521-3535 (15 pages).
Ignimbrites have been widely used as building materials in many historical and touristic structures in the Kayseri region of Türkiye. Their diverse colours and textures make them a popular choice for modern construction as well. However, ignimbrites are particularly vulnerable to atmospheric conditions, such as freeze-thaw (F-T) cycles, because of the high porosity that results from their formation process. When water enters the pores of the ignimbrite, it can freeze during cold weather. As the water freezes and expands, it generates internal stress within the stone, causing micro-cracks to develop. Over time, repeated F-T cycles cause these micro-cracks to grow into larger cracks, compromising the structural integrity of the ignimbrite and eventually making it unsuitable for use as a building material. Determining the long-term F-T performance of ignimbrites typically requires extensive experimental testing over prolonged F-T cycles. To streamline this process, developing accurate predictive equations becomes crucial. In this study, such equations were formulated using classical regression analyses and artificial neural networks (ANN) based on data obtained from these experiments, allowing the F-T performance of ignimbrites and other similar building stones to be predicted without the need for lengthy testing. The uniaxial compressive strength, ultrasonic propagation velocity, apparent porosity and mass loss of the ignimbrites after long-term F-T cycling were determined. Following the F-T cycles, the disintegration rate was evaluated using decay function approaches, while uniaxial compressive strength (UCS) values were predicted with minimal input parameters through both regression and ANN analyses. The ANN and regression models created for this purpose were first built with a single input value and then developed with combinations of two and three inputs. The predictive performance of the models was assessed against the regression models using the coefficient of determination (R2) as the evaluation criterion. Higher R2 values (0.87) were obtained for the models built with artificial neural networks. The results indicate that ANNs can produce results close to experimental outcomes in predicting the long-term F-T performance of ignimbrite samples.
Keywords: ignimbrite; uniaxial compressive strength; freeze-thaw; decay function; regression; artificial neural network
17. Geographically and Temporally Weighted Regression in Assessing Dengue Fever Spread Factors in Yunnan Border Regions
Authors: ZHU Xiao Xiang, WANG Song Wang, LI Yan Fei, ZHANG Ye Wu, SU Xue Mei, ZHAO Xiao Tao. Biomedical and Environmental Sciences (SCIE, CAS, CSCD), 2024, No. 5, pp. 511-520 (10 pages).
Objective: This study employs the Geographically and Temporally Weighted Regression (GTWR) model to assess the impact of meteorological elements and imported cases on dengue fever outbreaks, emphasizing the spatial-temporal variability of these factors in border regions. Methods: We conducted a descriptive analysis of the temporal-spatial distribution of dengue fever in the Yunnan border areas. Utilizing annual data from 2013 to 2019, with each county on the Yunnan border serving as a spatial unit, we constructed a GTWR model to investigate the determinants of dengue fever and their spatio-temporal heterogeneity in this region. Results: The GTWR model, proving more effective than Ordinary Least Squares (OLS) analysis, identified significant spatial and temporal heterogeneity in the factors influencing the spread of dengue fever along the Yunnan border. Notably, the GTWR model revealed substantial variation in the relationship between indigenous dengue fever incidence, meteorological variables, and imported cases across different counties. Conclusion: In the Yunnan border areas, local dengue incidence is affected by temperature, humidity, precipitation, wind speed, and imported cases, with the influence of these factors exhibiting notable spatial and temporal variation.
Keywords: dengue fever; meteorological factor; geographically and temporally weighted regression
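A GTWR fit solves a separate weighted least-squares problem at each location and time, with weights that decay with spatial and temporal distance. The toy sketch below shows that mechanic on fabricated county-year data; the bandwidths and covariates are invented and no bandwidth selection is performed.

```python
import numpy as np

rng = np.random.default_rng(7)

# Fabricated panel: 25 counties x 7 years, two covariates (e.g., temperature,
# imported cases) and a local incidence response; all values are synthetic.
n_sites, n_years = 25, 7
coords = rng.uniform(0, 100, size=(n_sites, 2))
records = []
for t in range(n_years):
    covars = rng.normal(size=(n_sites, 2))
    beta_true = np.column_stack([1 + coords[:, 0] / 100, 0.5 - coords[:, 1] / 200])
    response = (covars * beta_true).sum(axis=1) + 0.1 * rng.normal(size=n_sites)
    for i in range(n_sites):
        records.append((coords[i, 0], coords[i, 1], t, covars[i, 0], covars[i, 1], response[i]))
data = np.array(records)                 # columns: x, y, t, X1, X2, response

def gtwr_coefficients(x0, y0, t0, h_space=30.0, h_time=2.0):
    """Local coefficients at (x0, y0, t0) via Gaussian space-time kernel weights."""
    d2 = ((data[:, 0] - x0) ** 2 + (data[:, 1] - y0) ** 2) / h_space**2 \
         + (data[:, 2] - t0) ** 2 / h_time**2
    w = np.exp(-0.5 * d2)
    X = np.column_stack([np.ones(len(data)), data[:, 3:5]])
    W = np.diag(w)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ data[:, 5])

print("local coefficients at county 0, year 3:", gtwr_coefficients(*coords[0], 3).round(3))
```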
18. Composition Analysis and Identification of Ancient Glass Products Based on L1 Regularization Logistic Regression
Authors: Yuqiao Zhou, Xinyang Xu, Wenjing Ma. Applied Mathematics, 2024, No. 1, pp. 51-64 (14 pages).
For the composition analysis and identification of ancient glass products, L1 regularization, K-means cluster analysis, the elbow rule and other methods were comprehensively used to build logistic regression, cluster analysis and hyper-parameter test models, and SPSS, Python and other tools were used to obtain the classification rules of glass products under different fluxes, sub-classification under different chemical compositions, the hyper-parameter K value test and a rationality analysis. This research can provide theoretical support for the protection and restoration of ancient glass relics.
Keywords: glass composition; L1 regularization; logistic regression model; K-means clustering analysis; elbow rule; parameter verification
19. Nonparametric Feature Screening via the Variance of the Regression Function
Authors: Won Chul Song, Michael G. Akritas. Open Journal of Statistics, 2024, No. 4, pp. 413-438 (26 pages).
This article develops a procedure for screening variables, in ultra high-dimensional settings, based on their predictive significance. This is achieved by ranking the variables according to the variance of their respective marginal regression functions (RV-SIS). We show that, under some mild technical conditions, the RV-SIS possesses a sure screening property, as defined by Fan and Lv (2008). Numerical comparisons suggest that RV-SIS has competitive performance compared to other screening procedures, and outperforms them in many different model settings.
Keywords: sure independence screening; nonparametric regression; ultrahigh-dimensional data; variable selection
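The ranking statistic can be approximated crudely by binning each predictor and computing the variance of the within-bin response means, i.e., an estimate of Var(E[Y | X_j]). The sketch below applies that rough stand-in for the RV-SIS statistic, not the authors' estimator, to synthetic ultrahigh-dimensional data.

```python
import numpy as np

rng = np.random.default_rng(8)

# Ultra-high-dimensional toy data: only predictors 0 and 1 affect the response.
n, p = 400, 1000
X = rng.normal(size=(n, p))
y = np.sin(X[:, 0]) + 0.8 * X[:, 1] ** 2 + 0.3 * rng.normal(size=n)

def marginal_regression_variance(xj, y, n_bins=20):
    """Crude estimate of Var(E[Y | X_j]) via binning X_j and averaging Y within bins."""
    edges = np.quantile(xj, np.linspace(0, 1, n_bins + 1))
    idx = np.clip(np.digitize(xj, edges[1:-1]), 0, n_bins - 1)
    bin_means = np.array([y[idx == b].mean() for b in range(n_bins)])
    bin_counts = np.array([(idx == b).sum() for b in range(n_bins)])
    return float(np.average((bin_means - y.mean()) ** 2, weights=bin_counts))

scores = np.array([marginal_regression_variance(X[:, j], y) for j in range(p)])
top = np.argsort(scores)[::-1][:10]
print("top-ranked predictors by marginal-regression variance:", top)
```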
20. Regression Method for Rail Fastener Tightness Based on Center-Line Projection Distance Feature and Neural Network
Authors: Yuanhang Wang, Duxin Liu, Sheng Guo, Yifan Wu, Jing Liu, Wei Li, Hongjie Wang. Chinese Journal of Mechanical Engineering (SCIE, EI, CAS, CSCD), 2024, No. 2, pp. 356-371 (16 pages).
In the railway system, fasteners have the functions of damping, maintaining the track distance, and adjusting the track level. Therefore, routine maintenance and inspection of fasteners are important to ensure the safe operation of track lines. Current assessment methods for fastener tightness include manual observation, acoustic wave detection, and image detection. These have limitations such as low accuracy and efficiency and susceptibility to interference and misjudgment, and there is a lack of accurate, stable, and fast detection methods. Aiming at the small deformation characteristics and large elastic change of fasteners from fully loosened to fully tightened, this study proposes high-precision surface-structured-light technology for fastener detection, fastener deformation feature extraction based on the center-line projection distance, and a fastener tightness regression method based on neural networks. First, the method uses a 3D camera to obtain a fastener point cloud and then segments the elastic rod area based on iterative-closest-point registration. Principal component analysis is used to calculate the normal vector of the segmented elastic rod surface and extract the points on the centerline of the elastic rod. Each point is projected onto the upper surface of the bolt to calculate the projection distance. Subsequently, the mapping relationship between the projection distance sequence and fastener tightness is established, and the influence of each parameter on the fastener tightness prediction is analyzed. Finally, by setting up a fastener detection scene at the track experimental base, collecting data, and completing algorithm verification, the results show that the RMSE between the fastener tightness regression value obtained by the algorithm and the actual measured value was 0.2196 mm, a significant improvement over other tightness detection methods, realizing effective fastener tightness regression.
Keywords: railway system; fasteners; tightness inspection; neural network regression; 3D point cloud processing