Journal Articles
72,121 articles found
1. Two-Staged Method for Ice Channel Identification Based on Image Segmentation and Corner Point Regression (Cited by: 1)
Authors: DONG Wen-bo, ZHOU Li, DING Shi-feng, WANG Ai-ming, CAI Jin-yan. China Ocean Engineering (SCIE, EI, CSCD), 2024, Issue 2, pp. 313-325 (13 pages).
Identification of the ice channel is the basic technology for developing intelligent ships in ice-covered waters, and it is important for ensuring the safety and economy of navigation. In the Arctic, merchant ships with low ice class often navigate in channels opened up by icebreakers. Navigation in the ice channel depends to a large extent on good maneuvering skills and abundant experience on the part of the captain. The ship may get stuck if steered into ice fields off the channel. Under this circumstance, it is very important to study how to identify the boundary lines of ice channels with a reliable method. In this paper, a two-staged ice channel identification method is developed based on image segmentation and corner point regression. The first stage employs an image segmentation method to extract channel regions. In the second stage, an intelligent corner regression network is proposed to extract the channel boundary lines from the channel region. A non-intelligent angle-based filtering and clustering method is also proposed and compared with the corner point regression network. The training and evaluation of the segmentation method and the corner regression network are carried out on synthetic and real ice channel datasets. The evaluation results show that the accuracy of the method using the corner point regression network in the second stage reaches 73.33% on the synthetic ice channel dataset and 70.66% on the real ice channel dataset, and the processing speed can reach up to 14.58 frames per second.
Keywords: ice channel; ship navigation; identification; image segmentation; corner point regression
2. Non-crossing Quantile Regression Neural Network as a Calibration Tool for Ensemble Weather Forecasts (Cited by: 1)
Authors: Mengmeng SONG, Dazhi YANG, Sebastian LERCH, Xiang'ao XIA, Gokhan Mert YAGLI, Jamie M. BRIGHT, Yanbo SHEN, Bai LIU, Xingli LIU, Martin Janos MAYER. Advances in Atmospheric Sciences (SCIE, CAS, CSCD), 2024, Issue 7, pp. 1417-1437 (21 pages).
Despite the maturity of ensemble numerical weather prediction (NWP), the resulting forecasts are still, more often than not, under-dispersed. As such, forecast calibration tools have become popular. Among those tools, quantile regression (QR) is highly competitive in terms of both flexibility and predictive performance. Nevertheless, a long-standing problem of QR is quantile crossing, which greatly limits the interpretability of QR-calibrated forecasts. On this point, this study proposes a non-crossing quantile regression neural network (NCQRNN) for calibrating ensemble NWP forecasts into a set of reliable quantile forecasts without crossing. The overarching design principle of NCQRNN is to add, on top of the conventional QRNN structure, another hidden layer that imposes a non-decreasing mapping between the combined output from the nodes of the last hidden layer and the nodes of the output layer, through a triangular weight matrix with positive entries. The empirical part of the work considers a solar irradiance case study, in which four years of ensemble irradiance forecasts at seven locations, issued by the European Centre for Medium-Range Weather Forecasts, are calibrated via NCQRNN, as well as via an eclectic mix of benchmarking models, ranging from the naïve climatology to state-of-the-art deep-learning and other non-crossing models. Formal and stringent forecast verification suggests that the forecasts post-processed via NCQRNN attain the maximum sharpness subject to calibration, amongst all competitors. Furthermore, the proposed conception to resolve quantile crossing is remarkably simple yet general, and thus has broad applicability as it can be integrated with many shallow- and deep-learning-based neural networks.
Keywords: ensemble weather forecasting; forecast calibration; non-crossing quantile regression neural network; CORP reliability diagram; post-processing
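To illustrate the non-crossing mechanism described in the abstract above, the following minimal sketch (not the authors' NCQRNN; the softplus transform, quantile count, and data are assumptions for illustration) shows how a triangular, positive-entry mapping — equivalently a cumulative sum of positive increments — turns unconstrained network outputs into monotone quantile forecasts:

```python
# Minimal sketch (not the authors' NCQRNN): monotone quantiles produced by a
# cumulative sum of positive increments, i.e. a lower-triangular mapping with
# positive entries applied to unconstrained outputs.
import numpy as np

def softplus(x):
    # numerically stable softplus; keeps every increment strictly positive
    return np.log1p(np.exp(-np.abs(x))) + np.maximum(x, 0.0)

def non_crossing_quantiles(raw_outputs):
    """raw_outputs: (n_samples, n_quantiles) unconstrained network outputs.
    Returns quantile forecasts that are non-decreasing across quantile levels."""
    base = raw_outputs[:, :1]                    # lowest quantile, unconstrained
    increments = softplus(raw_outputs[:, 1:])    # strictly positive steps
    return np.concatenate([base, base + np.cumsum(increments, axis=1)], axis=1)

rng = np.random.default_rng(0)
raw = rng.normal(size=(3, 9))                    # e.g. 9 quantile levels per sample
q = non_crossing_quantiles(raw)
assert np.all(np.diff(q, axis=1) >= 0)           # no quantile crossing
print(q.round(2))
```

In a trained QRNN, the raw outputs would come from the last hidden layer and the whole mapping would be fitted with a pinball (quantile) loss; the monotone output layer then guarantees non-crossing by construction.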
3. Asymptotics for the Distribution Function Estimators of the Errors in Semi-parametric Regression Models
Authors: QIU Yuyang, FU Keang, HUANG Wei. Journal of Systems Science & Complexity (SCIE, EI, CSCD), 2014, Issue 2, pp. 360-369 (10 pages).
This paper considers the convergence rates for nonparametric estimators of the error distribution in semi-parametric regression models. By establishing some general laws of the iterated logarithm, it shows that the rates of convergence of either the empirical distribution or a smoothed version of the empirical distribution function match exactly the rates obtained for an independent sample from the error distribution.
Keywords: empirical distribution function; kernel distribution function; law of the iterated logarithm; semi-parametric regression model; residuals
4. A comparison of model choice strategies for logistic regression
Author: Markku Karhunen. Journal of Data and Information Science (CSCD), 2024, Issue 1, pp. 37-52 (16 pages).
Purpose: The purpose of this study is to develop and compare model choice strategies in context of logistic regression. Model choice means the choice of the covariates to be included in the model. Design/methodology/approach: The study is based on Monte Carlo simulations. The methods are compared in terms of three measures of accuracy: specificity and two kinds of sensitivity. A loss function combining sensitivity and specificity is introduced and used for a final comparison. Findings: The choice of method depends on how much the users emphasize sensitivity against specificity. It also depends on the sample size. For a typical logistic regression setting with a moderate sample size and a small to moderate effect size, either BIC, BICc or Lasso seems to be optimal. Research limitations: Numerical simulations cannot cover the whole range of data-generating processes occurring with real-world data. Thus, more simulations are needed. Practical implications: Researchers can refer to these results if they believe that their data-generating process is somewhat similar to some of the scenarios presented in this paper. Alternatively, they could run their own simulations and calculate the loss function. Originality/value: This is a systematic comparison of model choice algorithms and heuristics in context of logistic regression. The distinction between two types of sensitivity and a comparison based on a loss function are methodological novelties.
Keywords: model choice; logistic regression; logit regression; Monte Carlo simulations; sensitivity; specificity
5. P-norm Semi-parametric Maximum Likelihood Regression Model
Authors: X. Pan, S.L. Yuan. Journal of Environmental Science and Engineering, 2010, Issue 3, pp. 48-53 (6 pages).
In this paper, using the kernel weight function, we obtain the parameter estimates of the p-norm distribution in a semi-parametric regression model, which is effective for determining the distribution of the random errors. Under the assumption that the distribution of the observations is unimodal and symmetric, this method gives estimates of the parameters. Finally, two simulated adjustment problems are constructed to illustrate the method. The new method presented in this paper offers an effective way of solving the problem; the estimated values are closer to their theoretical ones than those obtained by the least squares adjustment approach.
Keywords: p-norm distributions; semi-parametric regression; kernel weight function; maximum likelihood adjustment
6. Performance Enhancement of XML Parsing Using Regression and Parallelism
Authors: Muhammad Ali, Minhaj Ahmad Khan. Computer Systems Science & Engineering, 2024, Issue 2, pp. 287-303 (17 pages).
The Extensible Markup Language (XML) files, widely used for storing and exchanging information on the web, require efficient parsing mechanisms to improve the performance of the applications. With the existing Document Object Model (DOM) based parsing, the performance degrades due to sequential processing and large memory requirements, thereby requiring an efficient XML parser to mitigate these issues. In this paper, we propose a Parallel XML Tree Generator (PXTG) algorithm for accelerating the parsing of XML files and a Regression-based XML Parsing Framework (RXPF) that analyzes and predicts performance through profiling, regression, and code generation for efficient parsing. The PXTG algorithm is based on dividing the XML file into n parts and producing n trees in parallel. The profiling phase of the RXPF framework produces a dataset by measuring the performance of various parsing models including StAX, SAX, DOM, JDOM, and PXTG on different cores by using multiple file sizes. The regression phase produces the prediction model, based on which the final code for efficient parsing of XML files is produced through the code generation phase. The RXPF framework has shown a significant improvement in performance varying from 9.54% to 32.34% over other existing models used for parsing XML files.
Keywords: regression; parallel parsing; multi-cores; XML
7. A regression approach for seismic first-break picking
Authors: Huan Yuan, San-Yi Yuan, Jie Wu, Wen-Jing Sang, Yu-He Zhao. Petroleum Science (SCIE, EI, CAS, CSCD), 2024, Issue 3, pp. 1584-1596 (13 pages).
The picking efficiency of seismic first breaks (FBs) has been greatly accelerated by deep learning (DL) technology. However, the picking accuracy and efficiency of DL methods still face huge challenges in low signal-to-noise ratio (SNR) situations. To address this issue, we propose a regression approach to pick FBs based on a bidirectional long short-term memory (BiLSTM) neural network by learning the implicit Eikonal equation of 3D inhomogeneous media with rugged topography in the target region. We employ a regressive model that represents the relationships among the elevation of shots, offset, and the elevation of receivers with their seismic traveltime to predict the unknown FBs from common-shot gathers with sparsely distributed traces. Different from image segmentation methods, which automatically extract image features and classify FBs from seismic data, the proposed method can learn the inner relationship between field geometry and FBs. In addition, the results predicted by the regressive model are continuous values of FBs rather than the discrete ones of a binary distribution. The picking results of synthetic data show that the proposed method has low dependence on label data and can obtain reliable and similar predicted results using two types of label data with large differences. The picking results of 9380 shots for 3D seismic data generated by vibroseis indicate that the proposed method can still accurately predict FBs in low SNR data. The subsequent stacked profiles further illustrate the reliability and effectiveness of the proposed method. The results of model data and field seismic data demonstrate that the proposed regression method is a robust first-break picker with high potential for field application.
Keywords: first-break picking; low signal-to-noise ratio; regression; BiLSTM; traveltime; geometry; noisy seismic data
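As a rough illustration of the geometry-to-traveltime regression idea in the abstract above, the sketch below uses a plain feed-forward regressor from scikit-learn as a stand-in for the paper's BiLSTM; the feature set (shot elevation, offset, receiver elevation), the effective velocity, and the synthetic picks are all assumptions:

```python
# Illustrative stand-in only: a feed-forward regressor (not the paper's BiLSTM)
# mapping acquisition geometry to first-break traveltime. Data are synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n = 2000
shot_elev = rng.uniform(100.0, 300.0, n)        # shot elevation, m (assumed range)
recv_elev = rng.uniform(100.0, 300.0, n)        # receiver elevation, m
offset = rng.uniform(50.0, 3000.0, n)           # source-receiver offset, m
v_eff = 2500.0                                  # assumed effective velocity, m/s
# toy traveltime: straight-ray time plus noise, standing in for field picks
t_fb = np.hypot(offset, shot_elev - recv_elev) / v_eff + rng.normal(0.0, 0.005, n)

X = np.column_stack([shot_elev, offset, recv_elev])
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(32, 32),
                                   max_iter=3000, random_state=0))
model.fit(X, t_fb)
print("predicted first breaks (s):", model.predict(X[:3]).round(3))
```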
8. Geographically and Temporally Weighted Regression in Assessing Dengue Fever Spread Factors in Yunnan Border Regions
Authors: ZHU Xiao Xiang, WANG Song Wang, LI Yan Fei, ZHANG Ye Wu, SU Xue Mei, ZHAO Xiao Tao. Biomedical and Environmental Sciences (SCIE, CAS, CSCD), 2024, Issue 5, pp. 511-520 (10 pages).
Objective: This study employs the Geographically and Temporally Weighted Regression (GTWR) model to assess the impact of meteorological elements and imported cases on dengue fever outbreaks, emphasizing the spatial-temporal variability of these factors in border regions. Methods: We conducted a descriptive analysis of dengue fever's temporal-spatial distribution in Yunnan border areas. Utilizing annual data from 2013 to 2019, with each county in the Yunnan border serving as a spatial unit, we constructed a GTWR model to investigate the determinants of dengue fever and their spatio-temporal heterogeneity in this region. Results: The GTWR model, proving more effective than Ordinary Least Squares (OLS) analysis, identified significant spatial and temporal heterogeneity in factors influencing dengue fever's spread along the Yunnan border. Notably, the GTWR model revealed a substantial variation in the relationship between indigenous dengue fever incidence, meteorological variables, and imported cases across different counties. Conclusion: In the Yunnan border areas, local dengue incidence is affected by temperature, humidity, precipitation, wind speed, and imported cases, with these factors' influence exhibiting notable spatial and temporal variation.
Keywords: dengue fever; meteorological factor; geographically and temporally weighted regression
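A minimal sketch of the local fitting step behind a GTWR-type model follows; the Gaussian space-time kernel, the fixed bandwidths, and the simulated county-level data are illustrative assumptions, not the study's specification:

```python
# Minimal GTWR-style local fit (assumed Gaussian kernel and fixed bandwidths):
# a separate weighted least-squares regression is solved at each space-time point.
import numpy as np

def gtwr_coefficients(X, y, coords, times, query_coord, query_time,
                      h_space=2.0, h_time=2.0):
    """X: (n, p) covariates, y: (n,) incidence, coords: (n, 2), times: (n,).
    Returns local coefficients (intercept first) at one space-time location."""
    d_space = np.linalg.norm(coords - query_coord, axis=1)
    d_time = np.abs(times - query_time)
    w = np.exp(-(d_space / h_space) ** 2 - (d_time / h_time) ** 2)  # space-time weights
    Xd = np.column_stack([np.ones(len(y)), X])
    XtW = Xd.T * w                               # weighted design matrix
    return np.linalg.solve(XtW @ Xd, XtW @ y)

# toy county-level data: 3 covariates, e.g. temperature, humidity, imported cases
rng = np.random.default_rng(1)
n = 200
coords = rng.uniform(0, 10, size=(n, 2))
times = rng.integers(2013, 2020, size=n).astype(float)
X = rng.normal(size=(n, 3))
y = 1.0 + X @ np.array([0.5, -0.2, 0.8]) + rng.normal(0, 0.1, n)
print(gtwr_coefficients(X, y, coords, times, coords[0], times[0]).round(2))
```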
9. Nuclear charge radius predictions by kernel ridge regression with odd-even effects
Authors: Lu Tang, Zhen-Hua Zhang. Nuclear Science and Techniques (SCIE, EI, CAS, CSCD), 2024, Issue 2, pp. 94-102 (9 pages).
The extended kernel ridge regression (EKRR) method with odd-even effects was adopted to improve the description of the nuclear charge radius using five commonly used nuclear models. These are: (i) the isospin-dependent A^(1/3) formula, (ii) relativistic continuum Hartree-Bogoliubov (RCHB) theory, (iii) the Hartree-Fock-Bogoliubov (HFB) model HFB25, (iv) the Weizsacker-Skyrme (WS) model WS*, and (v) the HFB25* model. In the last two models, the charge radii were calculated using a five-parameter formula with the nuclear shell corrections and deformations obtained from the WS and HFB25 models, respectively. For each model, the resultant root-mean-square deviation for the 1014 nuclei with proton number Z ≥ 8 can be significantly reduced to 0.009-0.013 fm after considering the modification with the EKRR method. The best among them was the RCHB model, with a root-mean-square deviation of 0.0092 fm. The extrapolation abilities of the KRR and EKRR methods for the neutron-rich region were examined, and it was found that after considering the odd-even effects, the extrapolation power was improved compared with that of the original KRR method. The strong odd-even staggering of nuclear charge radii of Ca and Cu isotopes and the abrupt kinks across the neutron N = 126 and 82 shell closures were also calculated and could be reproduced quite well by calculations using the EKRR method.
Keywords: nuclear charge radius; machine learning; kernel ridge regression method
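The residual-learning idea behind KRR-based corrections can be sketched as below: a kernel ridge model learns the gap between a baseline nuclear-model radius and measured radii as a smooth function of (Z, N), with an odd-even flag added as a feature to mimic the EKRR extension. The baseline formula, hyperparameters, and "experimental" radii here are synthetic placeholders, not the paper's data:

```python
# Sketch of a KRR correction to a baseline charge-radius formula; all "data"
# are synthetic and the odd-even flag feature only mimics the EKRR idea.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
Z = rng.integers(8, 100, size=400)              # proton number
N = rng.integers(8, 160, size=400)              # neutron number
A = Z + N
r_model = 1.2 * A ** (1.0 / 3.0)                # baseline A^(1/3) formula, fm
r_exp = r_model + 0.1 * np.sin(Z / 10) + 0.02 * (-1.0) ** N  # stand-in "experiment"

X = np.column_stack([Z, N, (-1.0) ** N])        # odd-even flag as an extra feature
krr = KernelRidge(kernel="rbf", alpha=1e-3, gamma=0.05)
krr.fit(X, r_exp - r_model)                     # learn the residual, not the radius
r_corrected = r_model + krr.predict(X)
rms = np.sqrt(np.mean((r_corrected - r_exp) ** 2))
print(f"RMS deviation after the KRR correction: {rms:.4f} fm")
```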
10. Predicting uniaxial compressive strength of tuff after accelerated freeze-thaw testing: Comparative analysis of regression models and artificial neural networks
Author: Ogün Ozan VAROL. Journal of Mountain Science (SCIE, CSCD), 2024, Issue 10, pp. 3521-3535 (15 pages).
Ignimbrites have been widely used as building materials in many historical and touristic structures in the Kayseri region of Türkiye. Their diverse colours and textures make them a popular choice for modern construction as well. However, ignimbrites are particularly vulnerable to atmospheric conditions, such as freeze-thaw cycles, due to their high porosity, which is a result of their formation process. When water enters the pores of the ignimbrites, it can freeze during cold weather. As the water freezes and expands, it generates internal stress within the stone, causing micro-cracks to develop. Over time, repeated freeze-thaw (F-T) cycles lead to the growth of these micro-cracks into larger cracks, compromising the structural integrity of the ignimbrites and eventually making them unsuitable for use as building materials. Determining the long-term F-T performance of ignimbrites typically requires extensive experimental testing over prolonged freeze-thaw cycles. To streamline this process, developing accurate predictive equations becomes crucial. In this study, such equations were formulated using classical regression analyses and artificial neural networks (ANN) based on data obtained from these experiments, allowing the F-T performance of ignimbrites and other similar building stones to be predicted without the need for lengthy testing. The uniaxial compressive strength, ultrasonic propagation velocity, apparent porosity, and mass loss of ignimbrites after long-term F-T were determined. Following the F-T cycles, the disintegration rate was evaluated using decay function approaches, while uniaxial compressive strength (UCS) values were predicted with minimal input parameters through both regression and ANN analyses. The ANN and regression models created for this purpose were first built with a single input value and then developed with combinations of two and three inputs. The predictive performance of the models was assessed by comparison with the regression models using the coefficient of determination (R2) as the evaluation criterion. As a result of the study, higher R2 values (0.87) were obtained in the models built with artificial neural networks. The results of the study indicate that ANN usage can produce results close to experimental outcomes in predicting the long-term F-T performance of ignimbrite samples.
Keywords: ignimbrite; uniaxial compressive strength; freeze-thaw; decay function; regression; artificial neural network
11. Composition Analysis and Identification of Ancient Glass Products Based on L1 Regularization Logistic Regression
Authors: Yuqiao Zhou, Xinyang Xu, Wenjing Ma. Applied Mathematics, 2024, Issue 1, pp. 51-64 (14 pages).
For the composition analysis and identification of ancient glass products, L1 regularization, K-means cluster analysis, the elbow rule, and other methods were used together to build logistic regression, cluster analysis, and hyper-parameter test models, and tools such as SPSS and Python were used to obtain the classification rules of glass products under different fluxes, the sub-classification under different chemical compositions, the hyper-parameter K value test, and a rationality analysis. The research can provide theoretical support for the protection and restoration of ancient glass relics.
Keywords: glass composition; L1 regularization; logistic regression model; K-means clustering analysis; elbow rule; parameter verification
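A hedged sketch of the two tools named in the abstract above — an L1-penalised logistic regression for the flux classification rule and a K-means elbow scan for sub-classification — is given below; the oxide-style features, labels, and parameter values are simulated stand-ins for the workflow only:

```python
# Sketch of the two tools named in the abstract; compositions, labels and
# parameter values are simulated and only illustrate the workflow.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 300
X = rng.normal(size=(n, 8))                     # e.g. oxide contents (SiO2, PbO, K2O, ...)
y = (X[:, 1] + 0.5 * X[:, 3] > 0).astype(int)   # toy flux-type label

# L1 regularisation shrinks uninformative composition features to exactly zero
clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(X, y)
print("features kept by L1:", np.flatnonzero(clf.coef_[0]))

# elbow rule: scan within-cluster inertia over candidate K values
inertias = [KMeans(n_clusters=k, n_init=10, random_state=0).fit(X).inertia_
            for k in range(1, 8)]
print("inertia by K:", [round(v, 1) for v in inertias])
```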
12. Nonparametric Feature Screening via the Variance of the Regression Function
Authors: Won Chul Song, Michael G. Akritas. Open Journal of Statistics, 2024, Issue 4, pp. 413-438 (26 pages).
This article develops a procedure for screening variables, in ultra high-dimensional settings, based on their predictive significance. This is achieved by ranking the variables according to the variance of their respective marginal regression functions (RV-SIS). We show that, under some mild technical conditions, the RV-SIS possesses a sure screening property, which is defined by Fan and Lv (2008). Numerical comparisons suggest that RV-SIS has competitive performance compared to other screening procedures, and outperforms them in many different model settings.
Keywords: sure independence screening; nonparametric regression; ultrahigh-dimensional data; variable selection
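The ranking idea behind RV-SIS can be sketched with a crude binned smoother: estimate each marginal regression function E[Y | X_j] and rank covariates by the variance of the fitted curve. The smoother, bin count, and toy model below are assumptions for illustration, not the authors' estimator:

```python
# Crude sketch of the RV-SIS ranking idea: a binned smoother estimates each
# marginal regression function and covariates are ranked by its variance.
import numpy as np

def rv_sis_scores(X, y, n_bins=20):
    n, p = X.shape
    scores = np.empty(p)
    for j in range(p):
        order = np.argsort(X[:, j])
        bins = np.array_split(y[order], n_bins)          # simple nonparametric smoother
        fitted = np.concatenate([np.full(len(b), b.mean()) for b in bins])
        scores[j] = fitted.var()                         # variance of the E[Y | X_j] estimate
    return scores

rng = np.random.default_rng(0)
n, p = 400, 1000                                 # ultrahigh-dimensional toy setting
X = rng.normal(size=(n, p))
y = np.sin(X[:, 3]) + 0.5 * X[:, 7] ** 2 + rng.normal(0, 0.2, n)
top = np.argsort(rv_sis_scores(X, y))[::-1][:5]
print("top-ranked covariates:", top)             # indices 3 and 7 should appear
```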
13. Regression Method for Rail Fastener Tightness Based on Center-Line Projection Distance Feature and Neural Network
Authors: Yuanhang Wang, Duxin Liu, Sheng Guo, Yifan Wu, Jing Liu, Wei Li, Hongjie Wang. Chinese Journal of Mechanical Engineering (SCIE, EI, CAS, CSCD), 2024, Issue 2, pp. 356-371 (16 pages).
In the railway system, fasteners have the functions of damping, maintaining the track distance, and adjusting the track level. Therefore, routine maintenance and inspection of fasteners are important to ensure the safe operation of track lines. Current assessment methods for fastener tightness include manual observation, acoustic wave detection, and image detection. These have limitations such as low accuracy and efficiency, susceptibility to interference and misjudgment, and a lack of accurate, stable, and fast detection capability. Aiming at the small deformation characteristics and large elastic change of fasteners from full loosening to full tightening, this study proposes high-precision surface-structured light technology for fastener detection, fastener deformation feature extraction based on the center-line projection distance, and a fastener tightness regression method based on neural networks. First, the method uses a 3D camera to obtain a fastener point cloud and then segments the elastic rod area based on iterative closest point algorithm registration. Principal component analysis is used to calculate the normal vector of the segmented elastic rod surface and extract the point on the centerline of the elastic rod. The point is projected onto the upper surface of the bolt to calculate the projection distance. Subsequently, the mapping relationship between the projection distance sequence and fastener tightness is established, and the influence of each parameter on the fastener tightness prediction is analyzed. Finally, a fastener detection scene was set up at the track experimental base, data were collected, and the algorithm was verified. The results showed that the RMSE between the fastener tightness regression value obtained by the algorithm and the actual measured value was 0.2196 mm, a significant improvement over other tightness detection methods, realizing effective fastener tightness regression.
Keywords: railway system; fasteners; tightness inspection; neural network regression; 3D point cloud processing
14. The Effect of Blood Lipid Profiles on Chronic Kidney Disease in a Prospective Cohort: Based on a Regression Discontinuity Design
Authors: Kang Lyu, Shaodong Liu, Yanli Liu, Jinlong You, Xue Wang, Min Jiang, Chun Yin, Desheng Zhang, Yana Bai, Minzhen Wang, Shan Zheng. Biomedical and Environmental Sciences (SCIE, CAS, CSCD), 2024, Issue 10, pp. 1158-1172 (15 pages).
Objective: Previous studies on the association between lipid profiles and chronic kidney disease (CKD) have yielded inconsistent results and no defined thresholds for blood lipids. Methods: A prospective cohort study including 32,351 subjects who completed baseline and follow-up surveys over 5 years was conducted. Restricted cubic splines and Cox models were used to examine the association between the lipid profiles and CKD. A regression discontinuity design was used to determine the cutoff value of lipid profiles that was significantly associated with increased risk of CKD. Results: Over a median follow-up time of 2.2 (0.5, 4.2) years, 648 (2.00%) subjects developed CKD. The lipid profiles that were significantly and linearly related to CKD included total cholesterol (TC), triglycerides (TG), high-density lipoprotein cholesterol (HDL-C), TC/HDL-C, and TG/HDL-C, whereas low-density lipoprotein cholesterol (LDL-C) and LDL-C/HDL-C were nonlinearly correlated with CKD. TC, TG, TC/HDL-C, and TG/HDL-C showed an upward jump at the cutoff value, increasing the risk of CKD by 0.90%, 1.50%, 2.30%, and 1.60%, respectively, whereas HDL-C showed a downward jump at the cutoff value, reducing this risk by 1.0%. Females and participants with dyslipidemia had a higher risk of CKD, while the cutoff values differed across population subgroups. Conclusion: There was a significant association between lipid profiles and CKD in a prospective cohort from Northwest China, with TG, TC/HDL-C, and TG/HDL-C showing a stronger risk association. The specific cutoff values of lipid profiles may provide a clinical reference for screening or diagnosing CKD risk.
Keywords: blood lipid profiles; chronic kidney disease; regression discontinuity design; prospective cohort
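A minimal local-linear regression discontinuity sketch follows, illustrating how a jump in outcome risk at a lipid cutoff can be estimated; the triglyceride cutoff of 1.7 mmol/L, the bandwidth, and the simulated cohort are illustrative assumptions rather than the study's data or estimates:

```python
# Illustrative local-linear regression discontinuity sketch; the cutoff,
# bandwidth and cohort below are simulated, not the study's values.
import numpy as np

def rd_jump(x, y, cutoff, bandwidth):
    """Difference between the fitted outcomes just above and just below the cutoff."""
    def fitted_at_cutoff(mask):
        design = np.column_stack([np.ones(mask.sum()), x[mask] - cutoff])
        beta, *_ = np.linalg.lstsq(design, y[mask], rcond=None)
        return beta[0]
    below = (x >= cutoff - bandwidth) & (x < cutoff)
    above = (x >= cutoff) & (x <= cutoff + bandwidth)
    return fitted_at_cutoff(above) - fitted_at_cutoff(below)

rng = np.random.default_rng(0)
n = 5000
tg = rng.uniform(0.5, 4.0, n)                            # triglycerides, mmol/L
risk = 0.01 + 0.004 * tg + 0.015 * (tg >= 1.7)           # jump built in at 1.7
ckd = (rng.uniform(size=n) < risk).astype(float)         # binary CKD outcome
print(f"estimated jump at the cutoff: {rd_jump(tg, ckd, 1.7, 0.5):.3%}")
```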
15. Predicting Purchasing Behavior on E-Commerce Platforms: A Regression Model Approach for Understanding User Features that Lead to Purchasing
Authors: Abraham Jallah Balyemah, Sonkarlay J. Y. Weamie, Jiang Bin, Karmue Vasco Jarnda, Felix Jwakdak Joshua. International Journal of Communications, Network and System Sciences, 2024, Issue 6, pp. 81-103 (23 pages).
This research introduces a novel approach to improve and optimize the predictive capacity of consumer purchase behaviors on e-commerce platforms. This study presented an introduction to the fundamental concepts of the logistic regression algorithm. In addition, it analyzed user data obtained from an e-commerce platform. The original data were preprocessed, and a consumer purchase prediction model was developed for the e-commerce platform using the logistic regression method. The comparison study used the classic random forest approach, further enhanced by including the K-fold cross-validation method. Evaluation of the accuracy of the model's classification was conducted using performance indicators that included the accuracy rate, the precision rate, the recall rate, and the F1 score. A visual examination determined the significance of the findings. The findings suggest that employing the logistic regression algorithm to forecast customer purchase behaviors on e-commerce platforms can improve the efficacy of the approach and yield more accurate predictions. This study serves as a valuable resource for improving the precision of forecasting customers' purchase behaviors on e-commerce platforms. It has significant practical implications for optimizing the operational efficiency of e-commerce platforms.
Keywords: e-commerce platform; purchasing behavior prediction; logistic regression algorithm
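The evaluation pipeline described in the abstract above can be sketched as follows: logistic regression against a random forest baseline under K-fold cross-validation, scored with accuracy, precision, recall, and F1. The behavioural features and simulated data are assumptions for illustration:

```python
# Sketch of the evaluation pipeline: logistic regression vs. a random forest
# under 5-fold cross-validation; features and data are simulated stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_validate

rng = np.random.default_rng(0)
n = 5000
X = np.column_stack([rng.poisson(3, n),                  # e.g. page views
                     rng.exponential(120, n),            # e.g. session duration, s
                     rng.integers(0, 2, n)])             # e.g. added to cart
logits = -3 + 0.2 * X[:, 0] + 0.004 * X[:, 1] + 1.5 * X[:, 2]
y = rng.uniform(size=n) < 1 / (1 + np.exp(-logits))      # purchase indicator

scoring = ["accuracy", "precision", "recall", "f1"]
for name, model in [("logistic regression", LogisticRegression(max_iter=1000)),
                    ("random forest", RandomForestClassifier(n_estimators=200))]:
    cv = cross_validate(model, X, y, cv=5, scoring=scoring)
    print(name, {m: round(cv[f"test_{m}"].mean(), 3) for m in scoring})
```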
16. Operational optimization of copper flotation process based on the weighted Gaussian process regression and index-oriented adaptive differential evolution algorithm
Authors: Zhiqiang Wang, Dakuo He, Haotian Nie. Chinese Journal of Chemical Engineering (SCIE, EI, CAS, CSCD), 2024, Issue 2, pp. 167-179 (13 pages).
Concentrate copper grade (CCG) is one of the important production indicators of copper flotation processes, and keeping the CCG at the set value is of great significance to the economic benefit of copper flotation industrial processes. This paper addresses the fluctuation problem of CCG through an operational optimization method. Firstly, a density-based affinity propagation algorithm is proposed so that more ideal working condition categories can be obtained for the complex raw ore properties. Next, a Bayesian network (BN) is applied to explore the relationship between the operational variables and the CCG. Based on the analysis results of the BN, a weighted Gaussian process regression model is constructed to predict the CCG so that a higher prediction accuracy can be obtained. To ensure that the predicted CCG is close to the set value with a smaller magnitude of operation adjustments and a smaller uncertainty of the prediction results, an index-oriented adaptive differential evolution (IOADE) algorithm is proposed, and the convergence performance of IOADE is superior to that of the traditional differential evolution and adaptive differential evolution methods. Finally, the effectiveness and feasibility of the proposed methods are verified by experiments on a copper flotation industrial process.
Keywords: weighted Gaussian process regression; index-oriented adaptive differential evolution; operational optimization; copper flotation process
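A simplified sketch of the surrogate-plus-optimizer loop is given below: a Gaussian process model predicts CCG from operational variables, and a generic differential evolution (standing in for the paper's IOADE, and without its weighting scheme) searches for settings whose predicted grade tracks the set value. The variables, bounds, and data are synthetic assumptions:

```python
# Simplified surrogate-plus-optimizer loop; a plain GP (not the paper's weighted
# GPR) and scipy's differential evolution (not IOADE) on synthetic data.
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(150, 3))     # e.g. reagent dosage, air flow, pulp level (scaled)
ccg = 20 + 5 * X[:, 0] - 3 * X[:, 1] ** 2 + 2 * np.sin(3 * X[:, 2]) + rng.normal(0, 0.2, 150)

gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True).fit(X, ccg)
setpoint = 23.0                          # desired concentrate copper grade, %

def objective(u):
    mean, std = gpr.predict(u.reshape(1, -1), return_std=True)
    # track the set value while penalising predictive uncertainty
    return (mean[0] - setpoint) ** 2 + 0.1 * std[0]

result = differential_evolution(objective, bounds=[(0, 1)] * 3, seed=0, maxiter=50)
print("suggested operating point:", result.x.round(3))
```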
17. Incorporating Lasso Regression to Physics-Informed Neural Network for Inverse PDE Problem
Authors: Meng Ma, Liu Fu, Xu Guo, Zhi Zhai. Computer Modeling in Engineering & Sciences (SCIE, EI), 2024, Issue 10, pp. 385-399 (15 pages).
Partial Differential Equations (PDEs) are among the most fundamental tools employed to model dynamic systems. Existing PDE modeling methods are typically derived from established knowledge and known phenomena, which is time-consuming and labor-intensive. Recently, discovering governing PDEs from collected actual data via Physics-Informed Neural Networks (PINNs) has provided a more efficient way to analyze fresh dynamic systems and establish PDE models. This study proposes Sequentially Threshold Least Squares-Lasso (STLasso), a module constructed by incorporating Lasso regression into the Sequentially Threshold Least Squares (STLS) algorithm, which can complete sparse regression of PDE coefficients under the constraint of the l0 norm. It further introduces PINN-STLasso, a physics-informed neural network combined with Lasso sparse regression, able to find underlying PDEs from data with reduced data requirements and better interpretability. In addition, this research conducts experiments on canonical inverse PDE problems and compares the results to several recent methods. The results demonstrate that the proposed PINN-STLasso outperforms other methods, achieving lower error rates even with less data.
Keywords: physics-informed neural network; inverse partial differential equation; Lasso regression; scientific machine learning
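The sequentially thresholded sparse regression at the core of the STLasso idea can be sketched as below: repeatedly fit a Lasso over a library of candidate terms and zero out coefficients below a threshold. The candidate library, threshold value, and synthetic data are stand-ins, not the paper's PINN-derived derivative terms:

```python
# Sketch of sequentially thresholded Lasso regression over a toy term library;
# the library and data stand in for the PINN-derived derivative terms.
import numpy as np
from sklearn.linear_model import Lasso

def st_lasso(theta, dudt, threshold=0.05, alpha=1e-3, n_iter=10):
    """theta: (n, k) candidate-term library, dudt: (n,) time derivative."""
    active = np.ones(theta.shape[1], dtype=bool)
    coef = np.zeros(theta.shape[1])
    for _ in range(n_iter):
        model = Lasso(alpha=alpha, max_iter=10000).fit(theta[:, active], dudt)
        coef[:] = 0.0
        coef[active] = model.coef_
        active = np.abs(coef) >= threshold       # hard-threshold the small terms
        if not active.any():
            break
    return coef

rng = np.random.default_rng(0)
u, u_x, u_xx = rng.normal(size=(3, 500))
theta = np.column_stack([u, u_x, u_xx, u * u_x, u ** 2])      # candidate PDE terms
dudt = 0.5 * u_xx - 1.0 * u * u_x + rng.normal(0, 0.01, 500)  # Burgers-like target
print("recovered coefficients:", st_lasso(theta, dudt).round(2))
```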
18. Prediction of Ground Vibration Induced by Rock Blasting Based on Optimized Support Vector Regression Models
Authors: Yifan Huang, Zikang Zhou, Mingyu Li, Xuedong Luo. Computer Modeling in Engineering & Sciences (SCIE, EI), 2024, Issue 6, pp. 3147-3165 (19 pages).
Accurately estimating blasting vibration during rock blasting is the foundation of blasting vibration management. In this study, Tuna Swarm Optimization (TSO), the Whale Optimization Algorithm (WOA), and Cuckoo Search (CS) were used to optimize two hyperparameters in support vector regression (SVR). Based on these methods, three hybrid models to predict peak particle velocity (PPV) for bench blasting were developed. Eighty-eight samples were collected to establish the PPV database, eight initial blasting parameters were chosen as input parameters for the prediction model, and the PPV was the output parameter. As predictive performance evaluation indicators, the coefficient of determination (R2), root mean square error (RMSE), mean absolute error (MAE), and the a10-index were selected. The normalized mutual information value is then used to evaluate the impact of various input parameters on the PPV prediction outcomes. According to the research findings, TSO, WOA, and CS can all enhance the predictive performance of the SVR model. The TSO-SVR model provides the most accurate predictions. The performances of the optimized hybrid SVR models are superior to the unoptimized traditional prediction model. The maximum charge per delay impacts the PPV prediction value the most.
Keywords: blasting vibration; metaheuristic algorithms; support vector regression; peak particle velocity; normalized mutual information
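A hedged sketch of metaheuristic hyperparameter tuning for SVR follows; scipy's differential evolution stands in for TSO/WOA/CS, the two tuned hyperparameters are assumed to be C and gamma, and the 88-sample blasting dataset is simulated:

```python
# Sketch of metaheuristic hyperparameter tuning for SVR; differential evolution
# stands in for TSO/WOA/CS, the tuned pair (C, gamma) and the data are assumptions.
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
n = 88                                           # same sample count as in the abstract
X = rng.uniform(0, 1, size=(n, 8))               # 8 blasting design parameters (scaled)
ppv = 2.0 * np.exp(-3 * X[:, 0]) * (1 + X[:, 1]) + rng.normal(0, 0.05, n)

def neg_cv_r2(log_params):
    C, gamma = 10.0 ** log_params                # search in log space
    model = make_pipeline(StandardScaler(), SVR(C=C, gamma=gamma))
    return -cross_val_score(model, X, ppv, cv=5, scoring="r2").mean()

res = differential_evolution(neg_cv_r2, bounds=[(-2, 3), (-3, 1)], seed=0, maxiter=30)
print("best C, gamma:", (10.0 ** res.x).round(3), "| CV R2:", round(-res.fun, 3))
```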
19. Prediction of the undrained shear strength of remolded soil with non-linear regression, fuzzy logic, and artificial neural network
Authors: YÜNKÜL Kaan, KARAÇOR Fatih, GÜRBÜZ Ayhan, BUDAK Tahsin Ömür. Journal of Mountain Science (SCIE, CSCD), 2024, Issue 9, pp. 3108-3122 (15 pages).
This study aims to predict the undrained shear strength of remolded soil samples using non-linear regression analyses, fuzzy logic, and artificial neural network modeling. A total of 1306 undrained shear strength results from 230 different remolded soil test settings reported in 21 publications were collected, utilizing six different measurement devices. Although water content, plastic limit, and liquid limit were used as input parameters for fuzzy logic and artificial neural network modeling, liquidity index or water content ratio was considered as an input parameter for non-linear regression analyses. In non-linear regression analyses, 12 different regression equations were derived for the prediction of undrained shear strength of remolded soil. Feed-forward backpropagation and the TANSIG transfer function were used for artificial neural network modeling, while the Mamdani inference system was preferred with trapezoidal and triangular membership functions for fuzzy logic modeling. The experimental results of 914 tests were used for training of the artificial neural network models, 196 for validation and 196 for testing. It was observed that the accuracy of the artificial neural network and fuzzy logic modeling was higher than that of the non-linear regression analyses. Furthermore, a simple and reliable regression equation was proposed for assessments of undrained shear strength values with higher coefficients of determination.
Keywords: undrained shear strength; liquidity index; water content ratio; non-linear regression; artificial neural networks; fuzzy logic
20. Integration of Multiple Spectral Data via a Logistic Regression Algorithm for Detection of Crop Residue Burned Areas: A Case Study of Songnen Plain, Northeast China
Authors: ZHANG Sumei, ZHANG Yuan, ZHAO Hongmei. Chinese Geographical Science (SCIE, CSCD), 2024, Issue 3, pp. 548-563 (16 pages).
The burning of crop residues in fields is a significant global biomass burning activity, a key element of the terrestrial carbon cycle, and an important source of atmospheric trace gasses and aerosols. Accurate estimation of cropland burned area is both crucial and challenging, especially for the small and fragmented burned scars in China. Here we developed an automated burned area mapping algorithm implemented with Sentinel-2 Multi Spectral Instrument (MSI) data, and its effectiveness was tested taking the Songnen Plain, Northeast China as a case using satellite images of 2020. We employed a logistic regression method for integrating multiple spectral data into a synthetic indicator, and compared the results with manually interpreted burned area reference maps and the Moderate-Resolution Imaging Spectroradiometer (MODIS) MCD64A1 burned area product. The overall accuracy of the single-variable logistic regression was 77.38% to 86.90% and 73.47% to 97.14% for the 52TCQ and 51TYM cases, respectively. In comparison, the accuracy of the burned area map was improved to 87.14% and 98.33% for the 52TCQ and 51TYM cases, respectively, by multiple-variable logistic regression of Sentinel-2 images. The balance of omission error and commission error was also improved. The integration of multiple spectral data combined with a logistic regression method proves to be effective for burned area detection, offering a highly automated process with an automatic threshold determination mechanism. This method exhibits excellent extensibility and flexibility, taking the image tile as the operating unit. It is suitable for burned area detection at a regional scale and can also be implemented with other satellite data.
Keywords: crop residue burning; burned area; Sentinel-2 Multi Spectral Instrument (MSI); logistic regression; Songnen Plain; China
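The integration step can be sketched as below: several spectral indices are combined into a single burned-area probability with logistic regression and then thresholded to a binary map. The index set (NBR, dNBR, NDVI), the simulated values, and the fixed 0.5 threshold are illustrative assumptions; the paper determines its threshold automatically:

```python
# Sketch of combining spectral indices into one burned probability; the index
# set, simulated values and fixed threshold are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10000
nbr = rng.normal(0.1, 0.3, n)        # post-fire Normalized Burn Ratio
dnbr = rng.normal(0.05, 0.2, n)      # pre/post NBR difference
ndvi = rng.normal(0.4, 0.2, n)       # vegetation index
burned = (dnbr + 0.5 * (0.2 - nbr) - 0.3 * ndvi + rng.normal(0, 0.1, n)) > 0.15

X = np.column_stack([nbr, dnbr, ndvi])
clf = LogisticRegression().fit(X, burned)
prob = clf.predict_proba(X)[:, 1]                # synthetic burned-area indicator
burned_map = prob >= 0.5                         # a data-driven threshold could replace 0.5
print("mapped burned fraction:", round(float(burned_map.mean()), 3))
```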