The immediate international spread of severe acute respiratory syn-drome revealed the potential threat of infectious diseases in a closely integrated and interdependent world.When an outbreak occurs,each country must ...The immediate international spread of severe acute respiratory syn-drome revealed the potential threat of infectious diseases in a closely integrated and interdependent world.When an outbreak occurs,each country must have a well-coordinated and preventative plan to address the situation.Information and Communication Technologies have provided innovative approaches to dealing with numerous facets of daily living.Although intelligent devices and applica-tions have become a vital part of our everyday lives,smart gadgets have also led to several physical and psychological health problems in modern society.Here,we used an artificial intelligence AI-based system for disease prediction using an Artificial Neural Network(ANN).The ANN improved the regularization of the classification model,hence increasing its accuracy.The unconstrained opti-mization model reduced the classifier’s cost function to obtain the lowest possible cost.To verify the performance of the intelligent system,we compared the out-comes of the suggested scheme with the results of previously proposed models.The proposed intelligent system achieved an accuracy of 0.89,and the miss rate 0.11 was higher than in previously proposed models.展开更多
In the Tano River Basin,groundwater serves as a crucial resource;however,its quantity and quality with regard to trace elements and microbiological loadings remain poorly understood due to the lack of groundwater logs...In the Tano River Basin,groundwater serves as a crucial resource;however,its quantity and quality with regard to trace elements and microbiological loadings remain poorly understood due to the lack of groundwater logs and limited water research.This study presents a comprehensive analysis of the Tano River Basin,focusing on three key objectives.First,it investigated the aquifer hydraulic parameters and the results showed significant spatial variations in borehole depths,yields,transmissivity,hydraulic conductivity,and specific capacity.Deeper boreholes were concentrated in the northeastern and southeastern zones,while geological formations,particu-larly the Apollonian Formation,exhibit a strong influence on borehole yields.The study identified areas with high transmissivity and hydraulic conductivity in the southern and eastern regions,suggesting good groundwater avail-ability and suitability for sustainable water supply.Sec-ondly,the research investigated the groundwater quality and observed that the majority of borehole samples fall within WHO(Guidelines for Drinking-water Quality,Environmental Health Criteria,Geneva,2011,2017.http://www.who.int)limit.However,some samples have pH levels below the standards,although the groundwater generally qualifies as freshwater.The study further explores hydrochemical facies and health risk assessment,highlighting the dominance of Ca–HCO3 water type.Trace element analysis reveals minimal health risks from most elements,with chromium(Cr)as the primary contributor to chronic health risk.Overall,this study has provided a key insights into the Tano River Basin’s hydrogeology and associated health risks.The outcome of this research has contributed to the broader understanding of hydrogeologi-cal dynamics and the importance of managing groundwater resources sustainably in complex geological environments.展开更多
The development of digital technology has brought about a substantial evolution in the multimedia field.The use of generative technologies to produce digital multimedia material is one of the newer developments in thi...The development of digital technology has brought about a substantial evolution in the multimedia field.The use of generative technologies to produce digital multimedia material is one of the newer developments in this field.The“Digital Generative Multimedia Tool Theory”(DGMTT)is therefore presented in this theoretical postulation by Timothy Ekeledirichukwu Onyejelem and Eric Msughter Aondover.It discusses and describes the principles behind the development and deployment of generative tools in multimedia creation.The DGMTT offers an all-encompassing structure for comprehending and evaluating the fundamentals and consequences of generative tools in the production of multimedia content.It provides information about the creation and use of these instruments,thereby promoting developments in the digital media industry.These tools create dynamic and interactive multimedia content by utilizing machine learning,artificial intelligence,and algorithms.This theory emphasizes how crucial it is to comprehend the fundamental ideas and principles of generative tools in order to use them efficiently when creating digital media content.A wide range of industries,including journalism,advertising,entertainment,education,and the arts,can benefit from the practical use of DGMTT.It gives artists the ability to use generative technologies to create unique and customized multimedia content for its viewers.展开更多
Urban traffic congestion is a severe and widely studied problem over the decade because of the negative impacts. However, in recent years some approaches emerge as proper and suitable solutions. The Carpooling initiat...Urban traffic congestion is a severe and widely studied problem over the decade because of the negative impacts. However, in recent years some approaches emerge as proper and suitable solutions. The Carpooling initiative is one of the most representative efforts to propitiate a responsible use of particular vehicles. Thus, the paper introduces a carpooling model considering the users’ preference to reach an appropriate match among drivers and passengers. In particular, the paper conducts a study of 6 of the most avid classified techniques in machine learning to create a model for the selection of travel companions. The experimental results show the models’ precision and assess the best cases using Friedman’s test. Finally, the conclusions emphasize the relevance of the proposed study and suggest that it is necessary to extend the proposal with more drives and passengers’ data.展开更多
The uniaxial compressive strength(UCS)of rock is an essential property of rock material in different relevant applications,such as rock slope,tunnel construction,and foundation.It takes enormous time and effort to obt...The uniaxial compressive strength(UCS)of rock is an essential property of rock material in different relevant applications,such as rock slope,tunnel construction,and foundation.It takes enormous time and effort to obtain the UCS values directly in the laboratory.Accordingly,an indirect determination of UCS through conducting several rock index tests that are easy and fast to carry out is of interest and importance.This study presents powerful boosting trees evaluation framework,i.e.,adaptive boosting machine,extreme gradient boosting machine(XGBoost),and category gradient boosting machine,for estimating the UCS of sandstone.Schmidt hammer rebound number,P-wave velocity,and point load index were chosen as considered factors to forecast UCS values of sandstone samples.Taylor diagrams and five regression metrics,including coefficient of determination(R2),root mean square error,mean absolute error,variance account for,and A-20 index,were used to evaluate and compare the performance of these boosting trees.The results showed that the proposed boosting trees are able to provide a high level of prediction capacity for the prepared database.In particular,itwas worth noting that XGBoost is the best model to predict sandstone strength and it achieved 0.999 training R^(2) and 0.958 testing R^(2).The proposed model had more outstanding capability than neural network with optimization techniques during training and testing phases.The performed variable importance analysis reveals that the point load index has a significant influence on predicting UCS of sandstone.展开更多
Difficulty in communicating and interacting with other people are mainly due to the neurological disorder called autism spectrum disorder(ASD)diseases.These diseases can affect the nerves at any stage of the human bein...Difficulty in communicating and interacting with other people are mainly due to the neurological disorder called autism spectrum disorder(ASD)diseases.These diseases can affect the nerves at any stage of the human being in childhood,adolescence,and adulthood.ASD is known as a behavioral disease due to the appearances of symptoms over thefirst two years that continue until adulthood.Most of the studies prove that the early detection of ASD helps improve the behavioral characteristics of patients with ASD.The detection of ASD is a very challenging task among various researchers.Machine learning(ML)algorithms still act very intelligent by learning the complex data and pre-dicting quality results.In this paper,ensemble ML techniques for the early detec-tion of ASD are proposed.In this detection,the dataset isfirst processed using three ML algorithms such as sequential minimal optimization with support vector machine,Kohonen self-organizing neural network,and random forest algorithm.The prediction results of these ML algorithms(ensemble)further use the bagging concept called max voting to predict thefinal result.The accuracy,sensitivity,and specificity of the proposed system are calculated using confusion matrix.The pro-posed ensemble technique performs better than state-of-the art ML algorithms.展开更多
The exponential growth of Internet and network usage has neces-sitated heightened security measures to protect against data and network breaches.Intrusions,executed through network packets,pose a significant challenge...The exponential growth of Internet and network usage has neces-sitated heightened security measures to protect against data and network breaches.Intrusions,executed through network packets,pose a significant challenge for firewalls to detect and prevent due to the similarity between legit-imate and intrusion traffic.The vast network traffic volume also complicates most network monitoring systems and algorithms.Several intrusion detection methods have been proposed,with machine learning techniques regarded as promising for dealing with these incidents.This study presents an Intrusion Detection System Based on Stacking Ensemble Learning base(Random For-est,Decision Tree,and k-Nearest-Neighbors).The proposed system employs pre-processing techniques to enhance classification efficiency and integrates seven machine learning algorithms.The stacking ensemble technique increases performance by incorporating three base models(Random Forest,Decision Tree,and k-Nearest-Neighbors)and a meta-model represented by the Logistic Regression algorithm.Evaluated using the UNSW-NB15 dataset,the pro-posed IDS gained an accuracy of 96.16%in the training phase and 97.95%in the testing phase,with precision of 97.78%,and 98.40%for taring and testing,respectively.The obtained results demonstrate improvements in other measurement criteria.展开更多
Condition based maintenance(CBM) is one of the solutions to machinery maintenance requirements. Latest approaches to CBM aim at reducing human engagement in the real-time fault detection and decision making. Machine l...Condition based maintenance(CBM) is one of the solutions to machinery maintenance requirements. Latest approaches to CBM aim at reducing human engagement in the real-time fault detection and decision making. Machine learning techniques like fuzzy-logic-based systems, neural networks, and support vector machines help to reduce human involvement. Most of these techniques provide fault information with 100% confidence. It is undeniably apparent that this area has a vast application scope. To facilitate future exploration, this review is presented describing the centrifugal pump faults, the signals they generate, their CBM based diagnostic schemes, and case studies for blockage and cavitation fault detection in centrifugal pump(CP) by performing the experiment on test rig. The classification accuracy is above 98% for fault detection. This review gives a head-start to new researchers in this field and identifies the un-touched areas pertaining to CP fault diagnosis.展开更多
In The Wireless Multimedia Sensor Network(WNSMs)have achieved popularity among diverse communities as a result of technological breakthroughs in sensor and current gadgets.By utilising portable technologies,it achieve...In The Wireless Multimedia Sensor Network(WNSMs)have achieved popularity among diverse communities as a result of technological breakthroughs in sensor and current gadgets.By utilising portable technologies,it achieves solid and significant results in wireless communication,media transfer,and digital transmission.Sensor nodes have been used in agriculture and industry to detect characteristics such as temperature,moisture content,and other environmental conditions in recent decades.WNSMs have also made apps easier to use by giving devices self-governing access to send and process data connected with appro-priate audio and video information.Many video sensor network studies focus on lowering power consumption and increasing transmission capacity,but the main demand is data reliability.Because of the obstacles in the sensor nodes,WMSN is subjected to a variety of attacks,including Denial of Service(DoS)attacks.Deep Convolutional Neural Network is designed with the stateaction relationship mapping which is used to identify the DDOS Attackers present in the Wireless Sensor Networks for Smart Agriculture.The Proposed work it performs the data collection about the traffic conditions and identifies the deviation between the network conditions such as packet loss due to network congestion and the presence of attackers in the network.It reduces the attacker detection delay and improves the detection accuracy.In order to protect the network against DoS assaults,an improved machine learning technique must be offered.An efficient Deep Neural Network approach is provided for detecting DoS in WMSN.The required parameters are selected using an adaptive particle swarm optimization technique.The ratio of packet transmission,energy consumption,latency,network length,and throughput will be used to evaluate the approach’s efficiency.展开更多
Cervical cancer is a serious public health issue worldwide, and early identification is crucial for better patient outcomes. Recent study has investigated how ML and DL approaches may be used to increase the accuracy ...Cervical cancer is a serious public health issue worldwide, and early identification is crucial for better patient outcomes. Recent study has investigated how ML and DL approaches may be used to increase the accuracy of vagina tests. In this piece, we conducted a thorough review of 50 research studies that applied these techniques. Our investigation compared the outcomes to well-known screening techniques and concentrated on the datasets used and performance measurements reported. According to the research, convolutional neural networks and other deep learning approaches have potential for lowering false positives and boosting screening precision. Although several research used small sample sizes or constrained datasets, this raises questions about how applicable the findings are. This paper discusses the advantages and disadvantages of the articles that were chosen, as well as prospective topics for future research, to further the application of ml and dl in cervical cancer screening. The development of cervical cancer screening technologies that are more precise, accessible, and can lead to better public health outcomes is significantly affected by these findings.展开更多
The Conference of the Parties(COP26 and 27)placed significant emphasis on climate financing policies with the objective of achieving net zero emissions and carbon neutrality.However,studies on the implementation of th...The Conference of the Parties(COP26 and 27)placed significant emphasis on climate financing policies with the objective of achieving net zero emissions and carbon neutrality.However,studies on the implementation of this policy proposition are limited.To address this gap in the literature,this study employs machine learning techniques,specifically natural language processing(NLP),to examine 77 climate bond(CB)policies from 32 countries within the context of climate financing.The findings indicate that“sustainability”and“carbon emissions control”are the most outlined policy objectives in these CB policies.Additionally,the study highlights that most CB funds are invested toward energy projects(i.e.,renewable,clean,and efficient initiatives).However,there has been a notable shift in the allocation of CB funds from climate-friendly energy projects to the construction sector between 2015 and 2019.This shift raises concerns about the potential redirection of funds from climate-focused investments to the real estate industry,potentially leading to the greenwashing of climate funds.Furthermore,policy sentiment analysis revealed that a minority of policies hold skeptical views on climate change,which may negatively influence climate actions.Thus,the findings highlight that the effective implementation of CB policies depends on policy goals,objectives,and sentiments.Finally,this study contributes to the literature by employing NLP techniques to understand policy sentiments in climate financing.展开更多
Forecasting travel demand requires a grasp of individual decision-making behavior.However,transport mode choice(TMC)is determined by personal and contextual factors that vary from person to person.Numerous characteris...Forecasting travel demand requires a grasp of individual decision-making behavior.However,transport mode choice(TMC)is determined by personal and contextual factors that vary from person to person.Numerous characteristics have a substantial impact on travel behavior(TB),which makes it important to take into account while studying transport options.Traditional statistical techniques frequently presume linear correlations,but real-world data rarely follows these presumptions,which may make it harder to grasp the complex interactions.Thorough systematic review was conducted to examine how machine learning(ML)approaches might successfully capture nonlinear correlations that conventional methods may ignore to overcome such challenges.An in-depth analysis of discrete choice models(DCM)and several ML algorithms,datasets,model validation strategies,and tuning techniques employed in previous research is carried out in the present study.Besides,the current review also summarizes DCM and ML models to predict TMC and recognize the determinants of TB in an urban area for different transport modes.The two primary goals of our study are to establish the present conceptual frameworks for the factors influencing the TMC for daily activities and to pinpoint methodological issues and limitations in previous research.With a total of 39 studies,our findings shed important light on the significance of considering factors that influence the TMC.The adjusted kernel algorithms and hyperparameter-optimized ML algorithms outperform the typical ML algorithms.RF(random forest),SVM(support vector machine),ANN(artificial neural network),and interpretable ML algorithms are the most widely used ML algorithms for the prediction of TMC where RF achieved an R2 of 0.95 and SVM achieved an accuracy of 93.18%;however,the adjusted kernel enhanced the accuracy of SVM 99.81%which shows that the interpretable algorithms outperformed the typical algorithms.The sensitivity analysis indicates that the most significant parameters influencing TMC are the age,total trip time,and the number of drivers.展开更多
Landslide is considered as one of the most severe threats to human life and property in the hilly areas of the world.The number of landslides and the level of damage across the globe has been increasing over time.Ther...Landslide is considered as one of the most severe threats to human life and property in the hilly areas of the world.The number of landslides and the level of damage across the globe has been increasing over time.Therefore,landslide management is essential to maintain the natural and socio-economic dynamics of the hilly region.Rorachu river basin is one of the most landslide-prone areas of the Sikkim selected for the present study.The prime goal of the study is to prepare landslide susceptibility maps(LSMs)using computer-based advanced machine learning techniques and compare the performance of the models.To properly understand the existing spatial relation with the landslide,twenty factors,including triggering and causative factors,were selected.A deep learning algorithm viz.convolutional neural network model(CNN)and three popular machine learning techniques,i.e.,random forest model(RF),artificial neural network model(ANN),and bagging model,were employed to prepare the LSMs.Two separate datasets including training and validation were designed by randomly taken landslide and nonlandslide points.A ratio of 70:30 was considered for the selection of both training and validation points.Multicollinearity was assessed by tolerance and variance inflation factor,and the role of individual conditioning factors was estimated using information gain ratio.The result reveals that there is no severe multicollinearity among the landslide conditioning factors,and the triggering factor rainfall appeared as the leading cause of the landslide.Based on the final prediction values of each model,LSM was constructed and successfully portioned into five distinct classes,like very low,low,moderate,high,and very high susceptibility.The susceptibility class-wise distribution of landslides shows that more than 90%of the landslide area falls under higher landslide susceptibility grades.The precision of models was examined using the area under the curve(AUC)of the receiver operating characteristics(ROC)curve and statistical methods like root mean square error(RMSE)and mean absolute error(MAE).In both datasets(training and validation),the CNN model achieved the maximum AUC value of 0.903 and 0.939,respectively.The lowest value of RMSE and MAE also reveals the better performance of the CNN model.So,it can be concluded that all the models have performed well,but the CNN model has outperformed the other models in terms of precision.展开更多
Alzheimer’s disease(AD)is a very complex disease that causes brain failure,then eventually,dementia ensues.It is a global health problem.99%of clinical trials have failed to limit the progression of this disease.The ...Alzheimer’s disease(AD)is a very complex disease that causes brain failure,then eventually,dementia ensues.It is a global health problem.99%of clinical trials have failed to limit the progression of this disease.The risks and barriers to detecting AD are huge as pathological events begin decades before appearing clinical symptoms.Therapies for AD are likely to be more helpful if the diagnosis is determined early before the final stage of neurological dysfunction.In this regard,the need becomes more urgent for biomarker-based detection.A key issue in understanding AD is the need to solve complex and high-dimensional datasets and heterogeneous biomarkers,such as genetics,magnetic resonance imaging(MRI),cerebrospinal fluid(CSF),and cognitive scores.Establishing an interpretable reasoning system and performing interoperability that achieves in terms of a semantic model is potentially very useful.Thus,our aim in this work is to propose an interpretable approach to detect AD based on Alzheimer’s disease diagnosis ontology(ADDO)and the expression of semantic web rule language(SWRL).This work implements an ontology-based application that exploits three different machine learning models.These models are random forest(RF),JRip,and J48,which have been used along with the voting ensemble.ADNI dataset was used for this study.The proposed classifier’s result with the voting ensemble achieves a higher accuracy of 94.1%and precision of 94.3%.Our approach provides effective inference rules.Besides,it contributes to a real,accurate,and interpretable classifier model based on various AD biomarkers for inferring whether the subject is a normal cognitive(NC),significant memory concern(SMC),early mild cognitive impairment(EMCI),late mild cognitive impairment(LMCI),or AD.展开更多
The study analyzes the performance of bank-specific characteristics,macroeconomic indicators,and global factors to predict the bank lending in Turkey for the period 2002Q4–2019Q2.The objective of this study is first,...The study analyzes the performance of bank-specific characteristics,macroeconomic indicators,and global factors to predict the bank lending in Turkey for the period 2002Q4–2019Q2.The objective of this study is first,to clarify the possible nonlinear and nonparametric relationships between outstanding bank loans and bank-specific,macroeconomic,and global factors.Second,it aims to propose various machine learning algorithms that determine drivers of bank lending and benefits from the advantages of these techniques.The empirical findings indicate favorable evidence that the drivers of bank lending exhibit some nonlinearities.Additionally,partial dependence plots depict that numerous bank-specific characteristics and macroeconomic indicators tend to be important variables that influence bank lending behavior.The study’s findings have some policy implications for bank managers,regulatory authorities,and policymakers.展开更多
In this paper,we use machine learning techniques to form a cancer cell model that displays the growth and promotion of synaptic and electrical signals.Here,such a technique can be applied directly to the spiking neura...In this paper,we use machine learning techniques to form a cancer cell model that displays the growth and promotion of synaptic and electrical signals.Here,such a technique can be applied directly to the spiking neural network of cancer cell synapses.The results show that machine learning techniques for the spiked network of cancer cell synapses have the powerful function of neuron models and potential supervisors for different implementations.The changes in the neural activity of tumor microenvironment caused by synaptic and electrical signals are described.It can be used to cancer cells and tumor training processes of neural networks to reproduce complex spatiotemporal dynamics and to mechanize the association of excitatory synaptic structures which are between tumors and neurons in the brain with complex human health behaviors.展开更多
The purpose of this study is to develop a standard methodology for measuring the surface free energy (SFE),and its component parts of bamboo fiber materials.The current methods was reviewed to determine the surface te...The purpose of this study is to develop a standard methodology for measuring the surface free energy (SFE),and its component parts of bamboo fiber materials.The current methods was reviewed to determine the surface tension of natural fibers and the disadvantages of techniques used were discussed.Although numerous techniques have been employed to characterize surface tension of natural fibers,it seems that the credibility of results obtained may often be dubious.In this paper,critical surface tension estimates were obtained from computer aided machine vision based measurement.Data were then analyzed by the least squares method to estimate the components of SFE.SFE was estimated by least squares analysis and also by Schultz' method.By using the Fowkes method the polar and disperse fractions of the surface free energy of bamboo fiber materials can be obtained.Strictly speaking,this method is based on a combination of the knowledge of Fowkes theory. SFE is desirable when adhesion is required,and it avoids some of the limitations of existing studies which has been proposed.The calculation steps described in this research are only intended to explain the methods.The results show that the method that only determines SFE as a single parameter may be unable to differentiate adequately between bamboo fiber materials,but it is feasible and very efficient.In order to obtain the maximum performance from the computer aided machine vision based measurement instruments,this measurement should be recommended and kept available for reference.展开更多
Maintaining software once implemented on the end-user side is laborious and,over its lifetime,is most often considerably more expensive than the initial software development.The prediction of software maintainability ...Maintaining software once implemented on the end-user side is laborious and,over its lifetime,is most often considerably more expensive than the initial software development.The prediction of software maintainability lias emerged as an important research topic to address industry expectations for reducing costs,in particular,maintenance costs.Researchers and practitioners have been working on proposing and identifying a variety of techniques ranging from statistical to machine learning(ML)for better prediction of software maintainability.This review has been carried out to analyze the empirical evidence on the accuracy of software product maintainability prediction(SPMP)using ML techniques.This paper analyzes and discusses the findings of 77 selected studies published from 2000 to 2018 according to the following criteria:maintainability prediction techniques,validation methods,accuracy criteria,overall accuracy of ML techniques,and the techniques offering the best performance.The review process followed the well-known syslematic review process.The results show that ML techniques are frequently used in predicting maintainability.In particular,artificial neural network(ANN),support vector machine/regression(SVM/R).regression&decision trees(DT),and fuzzy neuro fuzzy(FNF)techniques are more accurate in terms of PRED and MMRE.The N-fold and leave-one-out cross-validation methods,and the MMRE and PRED accuracy criteria are frequently used in empirical studies.In general,ML techniques outperformed non-machine learning techniques,e.g.,regression analysis(RA)techniques,while FNF outperformed SVM/R.DT.and ANN in most experiments.However,while many techniques were reported superior,no specific one can be identified as the best.展开更多
The compressive strength of self-compacting concrete(SCC)needs to be determined during the construction design process.This paper shows that the compressive strength of SCC(CS of SCC)can be successfully predicted from...The compressive strength of self-compacting concrete(SCC)needs to be determined during the construction design process.This paper shows that the compressive strength of SCC(CS of SCC)can be successfully predicted from mix design and curing age by a machine learning(ML)technique named the Extreme Gradient Boosting(XGB)algorithm,including non-hybrid and hybrid models.Nine ML techniques,such as Linear regression(LR),K-Nearest Neighbors(KNN),Support Vector Machine(SVM),Decision Trees(DTR),Random Forest(RF),Gradient Boosting(GB),and Artificial Neural Network using two training algorithms LBFGS and SGD(denoted as ANN_LBFGS and ANN_SGD),are also compared with the XGB model.Moreover,the hybrid models of eight ML techniques and Particle Swarm Optimization(PSO)are constructed to highlight the reliability and accuracy of SCC compressive strength prediction by the XGB_PSO hybrid model.The highest number of SCC samples available in the literature is collected for building the ML techniques.Compared with previously published works’performance,the proposed XGB method,both hybrid and non-hybrid models,is the most reliable and robust of the examined techniques,and is more accurate than existing ML methods(R2=0.9644,RMSE=4.7801,and MAE=3.4832).Therefore,the XGB model can be used as a practical tool for engineers in predicting the CS of SCC.展开更多
Accurately predicting and estimating the squeezing and ground response to tunneling remains challenging.Moreover,tunnel-squeezing hazards are much more likely to occur in deeply buried long tunnels with complex engine...Accurately predicting and estimating the squeezing and ground response to tunneling remains challenging.Moreover,tunnel-squeezing hazards are much more likely to occur in deeply buried long tunnels with complex engineering-geological environments.There-fore,a high-performance predictive model for tunnel squeezing is necessary.A superior ensemble classifier is put forward in this study,which is composed of four individual classifiers(gradient boosting classifier,extra-trees classifier,AdaBoost classifier,and Logistic regression classifier)and two optimization algorithms(Bayesian optimization(BO)and sparrow search algorithm(SSA)).The training database covers five parameters:tunnel depth(H),rock tunneling quality index(Q),tunnel diameter(D),support stiffness(K),and strength stress ratio(SSR),about which the basic information is accessible at the early design phases.However,the dataset compiled from the literature is insufficient.Thus,the ten proposed methods are used to replace the missing values.During the model training pro-cess,BO shows its strong ability to optimize seventeen hyperparameters.When applied to tune the classifiers’weights,SSA achieves a fast and efficient performance.The novel Shapley Additive Explanations–LightGBM method indicates that the K is the most important input feature,followed by SSR,Q,H,and D,respectively.The ensemble classifier is then validated using the test set and additional his-torical case projects.The validation shows that the model can achieve an accuracy of 98%(i.e.,the error rate is 2%)on the test set,higher than those achieved by previous prediction models.Moreover,the predicted probability could provide warning information for timely support measures.Finally,the application results are illustrated through tests on the tunnel sections that have not yet been excavated in the line of the Sichuan–Tibet railway project.The applied predictive tendencies and laws are in line with the practical experience.In sum-mary,the proposed model’s prediction results are reasonable,and its prediction will be more accurate as more data is collected and trained for prewarning the tunnel squeezing hazard.展开更多
文摘The immediate international spread of severe acute respiratory syn-drome revealed the potential threat of infectious diseases in a closely integrated and interdependent world.When an outbreak occurs,each country must have a well-coordinated and preventative plan to address the situation.Information and Communication Technologies have provided innovative approaches to dealing with numerous facets of daily living.Although intelligent devices and applica-tions have become a vital part of our everyday lives,smart gadgets have also led to several physical and psychological health problems in modern society.Here,we used an artificial intelligence AI-based system for disease prediction using an Artificial Neural Network(ANN).The ANN improved the regularization of the classification model,hence increasing its accuracy.The unconstrained opti-mization model reduced the classifier’s cost function to obtain the lowest possible cost.To verify the performance of the intelligent system,we compared the out-comes of the suggested scheme with the results of previously proposed models.The proposed intelligent system achieved an accuracy of 0.89,and the miss rate 0.11 was higher than in previously proposed models.
文摘In the Tano River Basin,groundwater serves as a crucial resource;however,its quantity and quality with regard to trace elements and microbiological loadings remain poorly understood due to the lack of groundwater logs and limited water research.This study presents a comprehensive analysis of the Tano River Basin,focusing on three key objectives.First,it investigated the aquifer hydraulic parameters and the results showed significant spatial variations in borehole depths,yields,transmissivity,hydraulic conductivity,and specific capacity.Deeper boreholes were concentrated in the northeastern and southeastern zones,while geological formations,particu-larly the Apollonian Formation,exhibit a strong influence on borehole yields.The study identified areas with high transmissivity and hydraulic conductivity in the southern and eastern regions,suggesting good groundwater avail-ability and suitability for sustainable water supply.Sec-ondly,the research investigated the groundwater quality and observed that the majority of borehole samples fall within WHO(Guidelines for Drinking-water Quality,Environmental Health Criteria,Geneva,2011,2017.http://www.who.int)limit.However,some samples have pH levels below the standards,although the groundwater generally qualifies as freshwater.The study further explores hydrochemical facies and health risk assessment,highlighting the dominance of Ca–HCO3 water type.Trace element analysis reveals minimal health risks from most elements,with chromium(Cr)as the primary contributor to chronic health risk.Overall,this study has provided a key insights into the Tano River Basin’s hydrogeology and associated health risks.The outcome of this research has contributed to the broader understanding of hydrogeologi-cal dynamics and the importance of managing groundwater resources sustainably in complex geological environments.
文摘The development of digital technology has brought about a substantial evolution in the multimedia field.The use of generative technologies to produce digital multimedia material is one of the newer developments in this field.The“Digital Generative Multimedia Tool Theory”(DGMTT)is therefore presented in this theoretical postulation by Timothy Ekeledirichukwu Onyejelem and Eric Msughter Aondover.It discusses and describes the principles behind the development and deployment of generative tools in multimedia creation.The DGMTT offers an all-encompassing structure for comprehending and evaluating the fundamentals and consequences of generative tools in the production of multimedia content.It provides information about the creation and use of these instruments,thereby promoting developments in the digital media industry.These tools create dynamic and interactive multimedia content by utilizing machine learning,artificial intelligence,and algorithms.This theory emphasizes how crucial it is to comprehend the fundamental ideas and principles of generative tools in order to use them efficiently when creating digital media content.A wide range of industries,including journalism,advertising,entertainment,education,and the arts,can benefit from the practical use of DGMTT.It gives artists the ability to use generative technologies to create unique and customized multimedia content for its viewers.
文摘Urban traffic congestion is a severe and widely studied problem over the decade because of the negative impacts. However, in recent years some approaches emerge as proper and suitable solutions. The Carpooling initiative is one of the most representative efforts to propitiate a responsible use of particular vehicles. Thus, the paper introduces a carpooling model considering the users’ preference to reach an appropriate match among drivers and passengers. In particular, the paper conducts a study of 6 of the most avid classified techniques in machine learning to create a model for the selection of travel companions. The experimental results show the models’ precision and assess the best cases using Friedman’s test. Finally, the conclusions emphasize the relevance of the proposed study and suggest that it is necessary to extend the proposal with more drives and passengers’ data.
基金funded by Act 211 Government of the Russian Federation,Contract No.02.A03.21.0011.
文摘The uniaxial compressive strength(UCS)of rock is an essential property of rock material in different relevant applications,such as rock slope,tunnel construction,and foundation.It takes enormous time and effort to obtain the UCS values directly in the laboratory.Accordingly,an indirect determination of UCS through conducting several rock index tests that are easy and fast to carry out is of interest and importance.This study presents powerful boosting trees evaluation framework,i.e.,adaptive boosting machine,extreme gradient boosting machine(XGBoost),and category gradient boosting machine,for estimating the UCS of sandstone.Schmidt hammer rebound number,P-wave velocity,and point load index were chosen as considered factors to forecast UCS values of sandstone samples.Taylor diagrams and five regression metrics,including coefficient of determination(R2),root mean square error,mean absolute error,variance account for,and A-20 index,were used to evaluate and compare the performance of these boosting trees.The results showed that the proposed boosting trees are able to provide a high level of prediction capacity for the prepared database.In particular,itwas worth noting that XGBoost is the best model to predict sandstone strength and it achieved 0.999 training R^(2) and 0.958 testing R^(2).The proposed model had more outstanding capability than neural network with optimization techniques during training and testing phases.The performed variable importance analysis reveals that the point load index has a significant influence on predicting UCS of sandstone.
文摘Difficulty in communicating and interacting with other people are mainly due to the neurological disorder called autism spectrum disorder(ASD)diseases.These diseases can affect the nerves at any stage of the human being in childhood,adolescence,and adulthood.ASD is known as a behavioral disease due to the appearances of symptoms over thefirst two years that continue until adulthood.Most of the studies prove that the early detection of ASD helps improve the behavioral characteristics of patients with ASD.The detection of ASD is a very challenging task among various researchers.Machine learning(ML)algorithms still act very intelligent by learning the complex data and pre-dicting quality results.In this paper,ensemble ML techniques for the early detec-tion of ASD are proposed.In this detection,the dataset isfirst processed using three ML algorithms such as sequential minimal optimization with support vector machine,Kohonen self-organizing neural network,and random forest algorithm.The prediction results of these ML algorithms(ensemble)further use the bagging concept called max voting to predict thefinal result.The accuracy,sensitivity,and specificity of the proposed system are calculated using confusion matrix.The pro-posed ensemble technique performs better than state-of-the art ML algorithms.
文摘The exponential growth of Internet and network usage has neces-sitated heightened security measures to protect against data and network breaches.Intrusions,executed through network packets,pose a significant challenge for firewalls to detect and prevent due to the similarity between legit-imate and intrusion traffic.The vast network traffic volume also complicates most network monitoring systems and algorithms.Several intrusion detection methods have been proposed,with machine learning techniques regarded as promising for dealing with these incidents.This study presents an Intrusion Detection System Based on Stacking Ensemble Learning base(Random For-est,Decision Tree,and k-Nearest-Neighbors).The proposed system employs pre-processing techniques to enhance classification efficiency and integrates seven machine learning algorithms.The stacking ensemble technique increases performance by incorporating three base models(Random Forest,Decision Tree,and k-Nearest-Neighbors)and a meta-model represented by the Logistic Regression algorithm.Evaluated using the UNSW-NB15 dataset,the pro-posed IDS gained an accuracy of 96.16%in the training phase and 97.95%in the testing phase,with precision of 97.78%,and 98.40%for taring and testing,respectively.The obtained results demonstrate improvements in other measurement criteria.
文摘Condition based maintenance(CBM) is one of the solutions to machinery maintenance requirements. Latest approaches to CBM aim at reducing human engagement in the real-time fault detection and decision making. Machine learning techniques like fuzzy-logic-based systems, neural networks, and support vector machines help to reduce human involvement. Most of these techniques provide fault information with 100% confidence. It is undeniably apparent that this area has a vast application scope. To facilitate future exploration, this review is presented describing the centrifugal pump faults, the signals they generate, their CBM based diagnostic schemes, and case studies for blockage and cavitation fault detection in centrifugal pump(CP) by performing the experiment on test rig. The classification accuracy is above 98% for fault detection. This review gives a head-start to new researchers in this field and identifies the un-touched areas pertaining to CP fault diagnosis.
文摘In The Wireless Multimedia Sensor Network(WNSMs)have achieved popularity among diverse communities as a result of technological breakthroughs in sensor and current gadgets.By utilising portable technologies,it achieves solid and significant results in wireless communication,media transfer,and digital transmission.Sensor nodes have been used in agriculture and industry to detect characteristics such as temperature,moisture content,and other environmental conditions in recent decades.WNSMs have also made apps easier to use by giving devices self-governing access to send and process data connected with appro-priate audio and video information.Many video sensor network studies focus on lowering power consumption and increasing transmission capacity,but the main demand is data reliability.Because of the obstacles in the sensor nodes,WMSN is subjected to a variety of attacks,including Denial of Service(DoS)attacks.Deep Convolutional Neural Network is designed with the stateaction relationship mapping which is used to identify the DDOS Attackers present in the Wireless Sensor Networks for Smart Agriculture.The Proposed work it performs the data collection about the traffic conditions and identifies the deviation between the network conditions such as packet loss due to network congestion and the presence of attackers in the network.It reduces the attacker detection delay and improves the detection accuracy.In order to protect the network against DoS assaults,an improved machine learning technique must be offered.An efficient Deep Neural Network approach is provided for detecting DoS in WMSN.The required parameters are selected using an adaptive particle swarm optimization technique.The ratio of packet transmission,energy consumption,latency,network length,and throughput will be used to evaluate the approach’s efficiency.
文摘Cervical cancer is a serious public health issue worldwide, and early identification is crucial for better patient outcomes. Recent study has investigated how ML and DL approaches may be used to increase the accuracy of vagina tests. In this piece, we conducted a thorough review of 50 research studies that applied these techniques. Our investigation compared the outcomes to well-known screening techniques and concentrated on the datasets used and performance measurements reported. According to the research, convolutional neural networks and other deep learning approaches have potential for lowering false positives and boosting screening precision. Although several research used small sample sizes or constrained datasets, this raises questions about how applicable the findings are. This paper discusses the advantages and disadvantages of the articles that were chosen, as well as prospective topics for future research, to further the application of ml and dl in cervical cancer screening. The development of cervical cancer screening technologies that are more precise, accessible, and can lead to better public health outcomes is significantly affected by these findings.
基金supported by the funding of Belt and Road Research Institute,Xiamen University(No:1500-X2101200)National Natural Science Foundation of China(Key Program,No:72133003).
文摘The Conference of the Parties(COP26 and 27)placed significant emphasis on climate financing policies with the objective of achieving net zero emissions and carbon neutrality.However,studies on the implementation of this policy proposition are limited.To address this gap in the literature,this study employs machine learning techniques,specifically natural language processing(NLP),to examine 77 climate bond(CB)policies from 32 countries within the context of climate financing.The findings indicate that“sustainability”and“carbon emissions control”are the most outlined policy objectives in these CB policies.Additionally,the study highlights that most CB funds are invested toward energy projects(i.e.,renewable,clean,and efficient initiatives).However,there has been a notable shift in the allocation of CB funds from climate-friendly energy projects to the construction sector between 2015 and 2019.This shift raises concerns about the potential redirection of funds from climate-focused investments to the real estate industry,potentially leading to the greenwashing of climate funds.Furthermore,policy sentiment analysis revealed that a minority of policies hold skeptical views on climate change,which may negatively influence climate actions.Thus,the findings highlight that the effective implementation of CB policies depends on policy goals,objectives,and sentiments.Finally,this study contributes to the literature by employing NLP techniques to understand policy sentiments in climate financing.
文摘Forecasting travel demand requires a grasp of individual decision-making behavior.However,transport mode choice(TMC)is determined by personal and contextual factors that vary from person to person.Numerous characteristics have a substantial impact on travel behavior(TB),which makes it important to take into account while studying transport options.Traditional statistical techniques frequently presume linear correlations,but real-world data rarely follows these presumptions,which may make it harder to grasp the complex interactions.Thorough systematic review was conducted to examine how machine learning(ML)approaches might successfully capture nonlinear correlations that conventional methods may ignore to overcome such challenges.An in-depth analysis of discrete choice models(DCM)and several ML algorithms,datasets,model validation strategies,and tuning techniques employed in previous research is carried out in the present study.Besides,the current review also summarizes DCM and ML models to predict TMC and recognize the determinants of TB in an urban area for different transport modes.The two primary goals of our study are to establish the present conceptual frameworks for the factors influencing the TMC for daily activities and to pinpoint methodological issues and limitations in previous research.With a total of 39 studies,our findings shed important light on the significance of considering factors that influence the TMC.The adjusted kernel algorithms and hyperparameter-optimized ML algorithms outperform the typical ML algorithms.RF(random forest),SVM(support vector machine),ANN(artificial neural network),and interpretable ML algorithms are the most widely used ML algorithms for the prediction of TMC where RF achieved an R2 of 0.95 and SVM achieved an accuracy of 93.18%;however,the adjusted kernel enhanced the accuracy of SVM 99.81%which shows that the interpretable algorithms outperformed the typical algorithms.The sensitivity analysis indicates that the most significant parameters influencing TMC are the age,total trip time,and the number of drivers.
文摘Landslide is considered as one of the most severe threats to human life and property in the hilly areas of the world.The number of landslides and the level of damage across the globe has been increasing over time.Therefore,landslide management is essential to maintain the natural and socio-economic dynamics of the hilly region.Rorachu river basin is one of the most landslide-prone areas of the Sikkim selected for the present study.The prime goal of the study is to prepare landslide susceptibility maps(LSMs)using computer-based advanced machine learning techniques and compare the performance of the models.To properly understand the existing spatial relation with the landslide,twenty factors,including triggering and causative factors,were selected.A deep learning algorithm viz.convolutional neural network model(CNN)and three popular machine learning techniques,i.e.,random forest model(RF),artificial neural network model(ANN),and bagging model,were employed to prepare the LSMs.Two separate datasets including training and validation were designed by randomly taken landslide and nonlandslide points.A ratio of 70:30 was considered for the selection of both training and validation points.Multicollinearity was assessed by tolerance and variance inflation factor,and the role of individual conditioning factors was estimated using information gain ratio.The result reveals that there is no severe multicollinearity among the landslide conditioning factors,and the triggering factor rainfall appeared as the leading cause of the landslide.Based on the final prediction values of each model,LSM was constructed and successfully portioned into five distinct classes,like very low,low,moderate,high,and very high susceptibility.The susceptibility class-wise distribution of landslides shows that more than 90%of the landslide area falls under higher landslide susceptibility grades.The precision of models was examined using the area under the curve(AUC)of the receiver operating characteristics(ROC)curve and statistical methods like root mean square error(RMSE)and mean absolute error(MAE).In both datasets(training and validation),the CNN model achieved the maximum AUC value of 0.903 and 0.939,respectively.The lowest value of RMSE and MAE also reveals the better performance of the CNN model.So,it can be concluded that all the models have performed well,but the CNN model has outperformed the other models in terms of precision.
基金This work was supported by the National Research Foundation of Korea(NRF)grant funded by the Korea government(MSIT)(No.2021R1A2C1011198).
文摘Alzheimer’s disease(AD)is a very complex disease that causes brain failure,then eventually,dementia ensues.It is a global health problem.99%of clinical trials have failed to limit the progression of this disease.The risks and barriers to detecting AD are huge as pathological events begin decades before appearing clinical symptoms.Therapies for AD are likely to be more helpful if the diagnosis is determined early before the final stage of neurological dysfunction.In this regard,the need becomes more urgent for biomarker-based detection.A key issue in understanding AD is the need to solve complex and high-dimensional datasets and heterogeneous biomarkers,such as genetics,magnetic resonance imaging(MRI),cerebrospinal fluid(CSF),and cognitive scores.Establishing an interpretable reasoning system and performing interoperability that achieves in terms of a semantic model is potentially very useful.Thus,our aim in this work is to propose an interpretable approach to detect AD based on Alzheimer’s disease diagnosis ontology(ADDO)and the expression of semantic web rule language(SWRL).This work implements an ontology-based application that exploits three different machine learning models.These models are random forest(RF),JRip,and J48,which have been used along with the voting ensemble.ADNI dataset was used for this study.The proposed classifier’s result with the voting ensemble achieves a higher accuracy of 94.1%and precision of 94.3%.Our approach provides effective inference rules.Besides,it contributes to a real,accurate,and interpretable classifier model based on various AD biomarkers for inferring whether the subject is a normal cognitive(NC),significant memory concern(SMC),early mild cognitive impairment(EMCI),late mild cognitive impairment(LMCI),or AD.
文摘The study analyzes the performance of bank-specific characteristics,macroeconomic indicators,and global factors to predict the bank lending in Turkey for the period 2002Q4–2019Q2.The objective of this study is first,to clarify the possible nonlinear and nonparametric relationships between outstanding bank loans and bank-specific,macroeconomic,and global factors.Second,it aims to propose various machine learning algorithms that determine drivers of bank lending and benefits from the advantages of these techniques.The empirical findings indicate favorable evidence that the drivers of bank lending exhibit some nonlinearities.Additionally,partial dependence plots depict that numerous bank-specific characteristics and macroeconomic indicators tend to be important variables that influence bank lending behavior.The study’s findings have some policy implications for bank managers,regulatory authorities,and policymakers.
基金Project supported by the National Natural Science Foundation of China(Nos.11772046 and 81870345)。
文摘In this paper,we use machine learning techniques to form a cancer cell model that displays the growth and promotion of synaptic and electrical signals.Here,such a technique can be applied directly to the spiking neural network of cancer cell synapses.The results show that machine learning techniques for the spiked network of cancer cell synapses have the powerful function of neuron models and potential supervisors for different implementations.The changes in the neural activity of tumor microenvironment caused by synaptic and electrical signals are described.It can be used to cancer cells and tumor training processes of neural networks to reproduce complex spatiotemporal dynamics and to mechanize the association of excitatory synaptic structures which are between tumors and neurons in the brain with complex human health behaviors.
基金the National Natural Science Foundation of China(No.31101085)the Scientific Research and Development Foundation for Start-up Projects of Zhejiang Agriculture and Forestry University (No.2034020044)
文摘The purpose of this study is to develop a standard methodology for measuring the surface free energy (SFE),and its component parts of bamboo fiber materials.The current methods was reviewed to determine the surface tension of natural fibers and the disadvantages of techniques used were discussed.Although numerous techniques have been employed to characterize surface tension of natural fibers,it seems that the credibility of results obtained may often be dubious.In this paper,critical surface tension estimates were obtained from computer aided machine vision based measurement.Data were then analyzed by the least squares method to estimate the components of SFE.SFE was estimated by least squares analysis and also by Schultz' method.By using the Fowkes method the polar and disperse fractions of the surface free energy of bamboo fiber materials can be obtained.Strictly speaking,this method is based on a combination of the knowledge of Fowkes theory. SFE is desirable when adhesion is required,and it avoids some of the limitations of existing studies which has been proposed.The calculation steps described in this research are only intended to explain the methods.The results show that the method that only determines SFE as a single parameter may be unable to differentiate adequately between bamboo fiber materials,but it is feasible and very efficient.In order to obtain the maximum performance from the computer aided machine vision based measurement instruments,this measurement should be recommended and kept available for reference.
文摘Maintaining software once implemented on the end-user side is laborious and,over its lifetime,is most often considerably more expensive than the initial software development.The prediction of software maintainability lias emerged as an important research topic to address industry expectations for reducing costs,in particular,maintenance costs.Researchers and practitioners have been working on proposing and identifying a variety of techniques ranging from statistical to machine learning(ML)for better prediction of software maintainability.This review has been carried out to analyze the empirical evidence on the accuracy of software product maintainability prediction(SPMP)using ML techniques.This paper analyzes and discusses the findings of 77 selected studies published from 2000 to 2018 according to the following criteria:maintainability prediction techniques,validation methods,accuracy criteria,overall accuracy of ML techniques,and the techniques offering the best performance.The review process followed the well-known syslematic review process.The results show that ML techniques are frequently used in predicting maintainability.In particular,artificial neural network(ANN),support vector machine/regression(SVM/R).regression&decision trees(DT),and fuzzy neuro fuzzy(FNF)techniques are more accurate in terms of PRED and MMRE.The N-fold and leave-one-out cross-validation methods,and the MMRE and PRED accuracy criteria are frequently used in empirical studies.In general,ML techniques outperformed non-machine learning techniques,e.g.,regression analysis(RA)techniques,while FNF outperformed SVM/R.DT.and ANN in most experiments.However,while many techniques were reported superior,no specific one can be identified as the best.
文摘The compressive strength of self-compacting concrete(SCC)needs to be determined during the construction design process.This paper shows that the compressive strength of SCC(CS of SCC)can be successfully predicted from mix design and curing age by a machine learning(ML)technique named the Extreme Gradient Boosting(XGB)algorithm,including non-hybrid and hybrid models.Nine ML techniques,such as Linear regression(LR),K-Nearest Neighbors(KNN),Support Vector Machine(SVM),Decision Trees(DTR),Random Forest(RF),Gradient Boosting(GB),and Artificial Neural Network using two training algorithms LBFGS and SGD(denoted as ANN_LBFGS and ANN_SGD),are also compared with the XGB model.Moreover,the hybrid models of eight ML techniques and Particle Swarm Optimization(PSO)are constructed to highlight the reliability and accuracy of SCC compressive strength prediction by the XGB_PSO hybrid model.The highest number of SCC samples available in the literature is collected for building the ML techniques.Compared with previously published works’performance,the proposed XGB method,both hybrid and non-hybrid models,is the most reliable and robust of the examined techniques,and is more accurate than existing ML methods(R2=0.9644,RMSE=4.7801,and MAE=3.4832).Therefore,the XGB model can be used as a practical tool for engineers in predicting the CS of SCC.
基金supported by the National Natural Science Foundation of China(Grant Nos.U21A20153,41941018,52074258,41807250,42177140)the Key Research and Development Project of Hubei Province,China(Grant No.2021BCA133).
文摘Accurately predicting and estimating the squeezing and ground response to tunneling remains challenging.Moreover,tunnel-squeezing hazards are much more likely to occur in deeply buried long tunnels with complex engineering-geological environments.There-fore,a high-performance predictive model for tunnel squeezing is necessary.A superior ensemble classifier is put forward in this study,which is composed of four individual classifiers(gradient boosting classifier,extra-trees classifier,AdaBoost classifier,and Logistic regression classifier)and two optimization algorithms(Bayesian optimization(BO)and sparrow search algorithm(SSA)).The training database covers five parameters:tunnel depth(H),rock tunneling quality index(Q),tunnel diameter(D),support stiffness(K),and strength stress ratio(SSR),about which the basic information is accessible at the early design phases.However,the dataset compiled from the literature is insufficient.Thus,the ten proposed methods are used to replace the missing values.During the model training pro-cess,BO shows its strong ability to optimize seventeen hyperparameters.When applied to tune the classifiers’weights,SSA achieves a fast and efficient performance.The novel Shapley Additive Explanations–LightGBM method indicates that the K is the most important input feature,followed by SSR,Q,H,and D,respectively.The ensemble classifier is then validated using the test set and additional his-torical case projects.The validation shows that the model can achieve an accuracy of 98%(i.e.,the error rate is 2%)on the test set,higher than those achieved by previous prediction models.Moreover,the predicted probability could provide warning information for timely support measures.Finally,the application results are illustrated through tests on the tunnel sections that have not yet been excavated in the line of the Sichuan–Tibet railway project.The applied predictive tendencies and laws are in line with the practical experience.In sum-mary,the proposed model’s prediction results are reasonable,and its prediction will be more accurate as more data is collected and trained for prewarning the tunnel squeezing hazard.