Debris flows are rapid mass movements consisting of a mixture of rock, soil and water. High-intensity rainfall events have triggered numerous debris flows around the globe, making them an important concern from the disaster management perspective. This study presents a numerical model called debris flow simulation 2D (DFS 2D); the applicability of the proposed model is investigated through the values of the model parameters used to reproduce a debris flow that occurred at Yindongzi gully in China on 13 August 2010. The model can simulate debris flows using three different rheologies and has a user-friendly interface for providing the inputs. Using DFS 2D, flow parameters can be estimated with respect to space and time. The flow resistance parameters of the model, dry-Coulomb and turbulent friction, were calibrated through back analysis, yielding values of 0.1 and 1000 m/s², respectively. Two new methods of calibration are proposed in this study, considering the cross-sectional area of flow and the topographic changes induced by the debris flow. The proposed calibration methods provide an effective solution to the cumulative errors induced by coarse-resolution digital elevation models (DEMs) in numerical modelling of debris flows. The statistical indices of the calibrated model, Willmott's index of agreement, mean absolute error and normalized root-mean-square error, are 0.5, 1.02 and 1.44, respectively. The comparison between simulated and observed topographic changes indicates that DFS 2D provides satisfactory results and can be used for dynamic modelling of debris flows.
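The three goodness-of-fit statistics used for the calibration above are straightforward to reproduce. Below is a minimal sketch (the observed/simulated arrays are illustrative toy values, not the study's data); note that several normalization conventions exist for NRMSE, so the range-based one used here is an assumption:

```python
import numpy as np

def willmott_d(obs, sim):
    """Willmott's index of agreement: 1 = perfect match, 0 = no skill."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    num = np.sum((obs - sim) ** 2)
    den = np.sum((np.abs(sim - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
    return 1.0 - num / den

def mae(obs, sim):
    """Mean absolute error."""
    return np.mean(np.abs(np.asarray(obs, float) - np.asarray(sim, float)))

def nrmse(obs, sim):
    """RMSE normalized by the range of observations (one common convention)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    rmse = np.sqrt(np.mean((obs - sim) ** 2))
    return rmse / (obs.max() - obs.min())

# Toy example: observed vs. simulated deposition depth (m) at checkpoints
obs = [0.5, 1.2, 2.0, 1.1]
sim = [0.7, 1.0, 2.4, 0.9]
print(willmott_d(obs, sim), mae(obs, sim), nrmse(obs, sim))
```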
As massive underground projects have become common in dense urban areas, a question has arisen: which model best predicts Tunnel Boring Machine (TBM) performance in these tunnelling projects? Estimating the performance of TBMs in complex geological conditions remains a great challenge for practitioners and researchers, yet a reliable and accurate prediction of TBM performance is essential to planning a workable tunnel construction schedule. TBM performance is very difficult to estimate owing to various geotechnical and geological factors and machine specifications. The intelligent techniques previously proposed in this field are mostly based on a single or base model with a low level of accuracy. Hence, this study introduces a hybrid random forest (RF) technique optimized by global harmony search with generalized opposition-based learning (GOGHS) for forecasting TBM advance rate (AR). The main objective of the GOGHS-RF model is to optimize the RF hyper-parameters, e.g., the number of trees and the maximum tree depth. In the modelling, a comprehensive database with the most influential parameters on TBM performance, together with TBM AR, was used as the input and output variables, respectively. To examine the capability of the GOGHS-RF model, three more hybrid models, particle swarm optimization-RF, genetic algorithm-RF and artificial bee colony-RF, were also constructed to forecast TBM AR. The developed models were evaluated using several performance indices, including the coefficient of determination (R²), root-mean-square error (RMSE) and mean absolute percentage error (MAPE). The results showed that GOGHS-RF estimates TBM AR more accurately than the other applied models. The newly developed GOGHS-RF model achieved R² = 0.9937 and 0.9844 for the training and test stages, respectively, higher than a previously developed RF. The importance of the input parameters was also interpreted through the SHapley Additive exPlanations (SHAP) method, which identified thrust force per cutter as the most important variable for TBM AR. The GOGHS-RF model can be used in mechanized tunnelling projects for predicting and checking performance.
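GOGHS itself is a bespoke metaheuristic with no off-the-shelf implementation, so the sketch below substitutes scikit-learn's random search over the same two hyper-parameters the abstract names (tree number and maximum depth) and computes the three reported indices; the data are synthetic stand-ins, not the TBM database:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import RandomizedSearchCV, train_test_split
from sklearn.metrics import r2_score, mean_squared_error, mean_absolute_percentage_error

# Toy stand-in data: columns could be thrust per cutter, RPM, rock strength, etc.
rng = np.random.default_rng(0)
X = rng.random((200, 5))
y = X @ np.array([2.0, 0.5, 1.0, 0.2, 0.8]) + rng.normal(0, 0.1, 200)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Random search over the hyper-parameters GOGHS tunes (tree number, depth);
# a metaheuristic would explore this space more systematically.
search = RandomizedSearchCV(
    RandomForestRegressor(random_state=0),
    {"n_estimators": list(range(50, 501, 50)), "max_depth": list(range(2, 21))},
    n_iter=20, cv=5, random_state=0,
)
search.fit(X_tr, y_tr)
pred = search.predict(X_te)
print("R2  :", r2_score(y_te, pred))
print("RMSE:", mean_squared_error(y_te, pred) ** 0.5)
print("MAPE:", mean_absolute_percentage_error(y_te, pred))
```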
The deep-sea ground contains a huge amount of energy and mineral resources, for example, oil, gas and minerals. Various infrastructures, such as floating structures, seabed structures and foundations, have been developed to exploit these resources. Seabed structures and foundations can be classified into three main types: subsea production structures, offshore pipelines and anchors. This study reviewed the development, installation and operation of these infrastructures, including their structures, design, installation, marine environmental loads and applications. On this basis, research gaps and further research directions were explored through the literature review. First, different floating structures were briefly analyzed and reviewed to introduce the design requirements of seabed structures and foundations. Second, subsea production structures, including subsea manifolds and their foundations, were reviewed and discussed. Third, the basic characteristics and design methods of deep-sea pipelines, including subsea pipelines and risers, were analyzed and reviewed. Finally, the installation and bearing capacity of deep-sea anchors and the influence of seabed trenches on anchors were reviewed. The review found that marine environmental conditions are the key inputs for any offshore structure design, and that the fabrication, installation and operation of infrastructures should carefully consider marine loads and geological conditions. Different structures have their own mechanical problems: the fatigue and stability of pipelines mainly depend on soil-structure interaction, and anchor selection should consider soil types and possible trench formation. These focuses and research gaps can provide a helpful guide for further research on, and the installation and operation of, deep-sea structures and foundations.
In this study, a novel approach combining the landslide numerical risk factor (LNRF) bivariate model in ensemble with linear multivariate regression (LMR) and boosted regression tree (BRT) models, coupled with radar remote sensing data and a geographic information system (GIS), was used for landslide susceptibility mapping (LSM) in the Gorganroud watershed, Iran. Fifteen topographic, hydrological, geological and environmental conditioning factors and a landslide inventory (70%, or 298 landslides) were used in mapping. Phased array-type L-band synthetic aperture radar data were used to extract topographic parameters. Tolerance coefficients and variance inflation factors were used to assess multicollinearity among the conditioning factors. Data for the landslide inventory map were obtained from various resources, such as the Iranian Landslide Working Party (ILWP), the Forestry, Rangeland and Watershed Organisation (FRWO), extensive field surveys, interpretation of aerial photos and satellite images, and radar data. Of the total data, 30% were used to validate the LSMs, using the area under the curve (AUC), frequency ratio (FR) and seed cell area index (SCAI). Normalised difference vegetation index, land use/land cover and slope degree (in the BRT model), together with elevation, rainfall and distance from streams, were found to be important factors and were given the highest weighting in modelling. Validation using AUC showed that the ensemble LNRF-BRT and LNRF-LMR models (AUC = 0.912 (91.2%) and 0.907 (90.7%), respectively) had higher predictive accuracy than the LNRF model alone (AUC = 0.855 (85.5%)). The FR and SCAI analyses showed that all models divided the parameter classes with high precision. Overall, our novel approach of combining multivariate and machine learning methods with bivariate models, radar remote sensing data and GIS proved to be a powerful tool for landslide susceptibility mapping.
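The frequency ratio used in the validation is a simple class-wise statistic: the share of landslide cells falling in a factor class divided by the share of all cells in that class. A minimal sketch with toy rasters (not the Gorganroud data):

```python
import numpy as np

def frequency_ratio(factor_classes, landslide_mask):
    """FR per class = (% of landslide cells in class) / (% of all cells in class).
    FR > 1 means the class is disproportionately landslide-prone."""
    fr = {}
    total_cells = factor_classes.size
    total_slides = landslide_mask.sum()
    for c in np.unique(factor_classes):
        in_class = factor_classes == c
        pct_slides = landslide_mask[in_class].sum() / total_slides
        pct_cells = in_class.sum() / total_cells
        fr[c] = pct_slides / pct_cells
    return fr

# Toy raster: slope classes 0-2 and a binary landslide inventory grid
classes = np.random.default_rng(1).integers(0, 3, (100, 100))
slides = np.random.default_rng(2).random((100, 100)) < 0.02
print(frequency_ratio(classes, slides))
```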
In recent years, landslide susceptibility mapping has improved substantially with advances in machine learning. However, challenges remain in landslide mapping owing to the limited availability of inventory data. This paper presents a novel method that improves the performance of machine learning techniques by creating synthetic inventory data with Generative Adversarial Networks (GANs) to improve the prediction of landslides. In this research, landslide inventory data for 156 landslide locations in Cameron Highlands, Malaysia, were taken from previous projects the authors worked on. Elevation, slope, aspect, plan curvature, profile curvature, total curvature, lithology, land use and land cover (LULC), distance to the road, distance to the river, stream power index (SPI), sediment transport index (STI), terrain roughness index (TRI), topographic wetness index (TWI) and vegetation density are the geo-environmental factors considered in this study, based on suggestions from previous works on Cameron Highlands. To show the capability of GANs to improve landslide prediction models, this study compares the proposed GAN model with benchmark models, namely Artificial Neural Network (ANN), Support Vector Machine (SVM), Decision Tree (DT), Random Forest (RF), and Bagging ensembles with ANN and SVM. The models were validated using the area under the receiver operating characteristic curve (AUROC). The DT, RF, SVM, ANN and Bagging ensemble achieved AUROC values of 0.90, 0.94, 0.86, 0.69 and 0.82 for training and 0.76, 0.81, 0.85, 0.72 and 0.75 for testing, respectively. With the additional synthetic samples, the same models achieved AUROC values of 0.92, 0.94, 0.88, 0.75 and 0.84 for training and 0.78, 0.82, 0.82, 0.78 and 0.80 for testing, respectively. Using the additional samples improved the test accuracy of all models except SVM. In data-scarce environments, this research thus shows that using GANs to generate supplementary samples is promising, as it can improve the predictive capability of common landslide prediction models.
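The evaluation protocol, training once on the original inventory and once on the inventory plus synthetic samples and then comparing test AUROC, can be sketched as below. The GAN generator is replaced here by a jittering placeholder, since the point is the augmentation-and-comparison loop rather than the network itself; all data are toy values:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.random((300, 15))                      # 15 geo-environmental factors
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)      # toy presence/absence labels
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

def synthesize(X_pos, n):
    """Placeholder for the GAN generator: jittered resamples of the
    positive class stand in for generator output."""
    idx = rng.integers(0, len(X_pos), n)
    return X_pos[idx] + rng.normal(0, 0.02, (n, X_pos.shape[1]))

X_syn = synthesize(X_tr[y_tr == 1], 100)
X_aug = np.vstack([X_tr, X_syn])
y_aug = np.concatenate([y_tr, np.ones(len(X_syn), dtype=int)])

for name, Xt, yt in [("baseline", X_tr, y_tr), ("augmented", X_aug, y_aug)]:
    clf = RandomForestClassifier(random_state=0).fit(Xt, yt)
    auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
    print(name, round(auc, 3))
```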
Catastrophic natural hazards, such as earthquakes, pose serious threats to property and human lives in urban areas. Earthquake risk assessment (ERA) is therefore indispensable in disaster management. ERA integrates the probability of the hazard with the vulnerability of assets. This study develops an integrated artificial neural network-analytic hierarchy process (ANN-AHP) model for constructing an ERA map. The aim of the study is to quantify the urban population risk that may be caused by impending earthquakes. The model is applied to the city of Banda Aceh in Indonesia, a seismically active zone of Aceh province frequently affected by devastating earthquakes. The ANN is used for probability mapping, whereas the AHP is used to assess urban vulnerability after the hazard map is created with the aid of thematic layers of earthquake intensity variation. The risk map is subsequently created by combining the probability, hazard and vulnerability maps, and the risk levels of the various zones are then obtained. Validation reveals that the proposed model can map earthquake probability based on historical events with an accuracy of 84%. Furthermore, the results show that the central and southeastern regions of the city fall under moderate to very high risk classifications, whereas the other parts of the city fall under low to very low earthquake risk classifications. The findings of this research are useful to government agencies and decision makers, particularly for estimating risk dimensions in urban areas, and to future studies projecting preparedness strategies for Banda Aceh.
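The AHP side of the model reduces to deriving criterion weights from a pairwise comparison matrix via its principal eigenvector and checking judgment consistency. A minimal sketch with a hypothetical three-criterion matrix (the criteria and judgments are illustrative, not the study's):

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three vulnerability criteria
# (e.g., population density, building quality, road access); Saaty's 1-9 scale.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = eigvecs[:, k].real
weights /= weights.sum()             # criterion weights from principal eigenvector

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1) # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random consistency index
print("weights:", np.round(weights, 3), "CR:", round(ci / ri, 3))  # CR < 0.1 is acceptable
```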
One important step in binary modelling of environmental problems is the generation of absence datasets, which are traditionally generated by random sampling and can undermine the quality of the outputs. To address this problem, this study develops the Absence Point Generation (APG) toolbox, a Python-based ArcGIS toolbox for the automated construction of absence datasets for geospatial studies. The APG employs a frequency ratio analysis of four commonly used and important driving factors, altitude, slope degree, topographic wetness index and distance from rivers, and considers buffer and density layers of the presence locations to define the low-potential or low-susceptibility zones in which absence points are generated. To test the APG toolbox, we applied two benchmark algorithms, random forest (RF) and boosted regression trees (BRT), in a case study investigating groundwater potential using three absence datasets: APG, random, and the selection of absence samples (SAS) toolbox. BRT-APG and RF-APG attained area under the receiver operating curve (AUC) values of 0.947 and 0.942, while BRT and RF performed more weakly with the SAS and random datasets. The APG improved the AUC of BRT and RF by 7.2% and 9.7% over the random dataset, and by 6.1% and 5.4% over the SAS dataset, respectively. The APG also affected the importance of the input factors and the pattern of the groundwater potential maps, which demonstrates the importance of absence points in environmental binary problems. The proposed APG toolbox can easily be applied to other environmental hazards, such as landslides, floods, gully erosion and land subsidence.
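The APG workflow described above, scoring cells by summed frequency ratios, masking a buffer around presence points, and sampling absences from the remaining low-potential zone, can be sketched roughly as follows (toy rasters and thresholds; the real toolbox runs inside ArcGIS):

```python
import numpy as np
from scipy.ndimage import binary_dilation

rng = np.random.default_rng(0)

# Toy 100x100 rasters of the four driving factors, already expressed as
# frequency-ratio scores (higher = conditions more similar to presence sites).
fr_altitude, fr_slope, fr_twi, fr_rivers = (rng.random((100, 100)) for _ in range(4))
fr_sum = fr_altitude + fr_slope + fr_twi + fr_rivers

# Presence cells (e.g., productive wells) and a simple buffer around them
presence = rng.random((100, 100)) < 0.01
buffer_mask = binary_dilation(presence, iterations=5)

# Candidate absence zone: lowest quartile of summed FR, outside the buffers
low_zone = (fr_sum < np.quantile(fr_sum, 0.25)) & ~buffer_mask

# Draw absence points from that zone
rows, cols = np.nonzero(low_zone)
pick = rng.choice(len(rows), size=50, replace=False)
absence_points = list(zip(rows[pick], cols[pick]))
print(absence_points[:5])
```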
Earthquake prediction is currently a most crucial task for probability, hazard and risk mapping and for mitigation purposes, and it attracts the attention of researchers from both academia and industry. Traditionally, risk assessment approaches have used various conventional and machine learning models, but deep learning techniques have rarely been tested for earthquake probability mapping. This study therefore develops a convolutional neural network (CNN) model for earthquake probability assessment in NE India, then conducts vulnerability assessment using the analytic hierarchy process (AHP), hazard assessment using Venn's intersection theory, and risk mapping with an integrated model. A classification prediction task was performed in which the model, using nine indicators, predicts events with magnitudes greater than 4 Mw. The classification results and intensity variation were then used for probability and hazard mapping, respectively. Finally, the earthquake risk map was produced by multiplying hazard, vulnerability and coping capacity. Vulnerability was prepared from six vulnerability factors, and coping capacity was estimated from the number of hospitals and associated variables, including the budget available for disaster management. The CNN model for the probability distribution is a robust technique that provides good accuracy. The results show that the CNN is superior to the other algorithms, completing the classification prediction task with an accuracy of 0.94, precision of 0.98, recall of 0.85 and F1 score of 0.91. These indicators were used for probability mapping, and the total areas of hazard (21,412.94 km²), vulnerability (480.98 km²) and risk (34,586.10 km²) were estimated.
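The four reported indicators all derive from the binary confusion matrix. A minimal sketch (toy labels, not the NE India predictions):

```python
import numpy as np

def classification_report(y_true, y_pred):
    """Accuracy, precision, recall and F1 from a binary confusion matrix."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, f1

# Toy labels: 1 = event of magnitude > 4 Mw
y_true = [1, 1, 0, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 0, 1, 0]
print(classification_report(y_true, y_pred))
```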
The devastating effect of soil erosion is one of the major sources of land degradation affecting human lives in many ways; it occurs mainly due to deforestation, poor agricultural practices, overgrazing, wildfire and urbanization. Soil erosion often leads to soil truncation, loss of fertility and slope instability, with irreversible effects on the poorly renewable soil resource. In view of this, a study was conducted in the Kelantan River basin with the aim of predicting and assessing soil loss as influenced by long-term land use/land cover (LULC) changes in the area. The 13,100 km² watershed was delineated into four sub-catchments, Galas, Pergau, Lebir and Nenggiri, for precise estimation and ease of execution. A GIS-based Universal Soil Loss Equation (USLE) model was used to predict soil loss. The model inputs used for the temporal and spatial calculation of soil erosion include the rainfall erosivity factor, the topographic factor, the land cover and management factor, and the erodibility factor. The results showed that, under the 2013 LULC condition, the area under low erosion potential (reversible soil loss, 0-1 t ha⁻¹ yr⁻¹) was 67.54% in Galas, 59.17% in Pergau, 53.32% in Lebir and 56.76% in Nenggiri. Correlating soil erosion rates with LULC changes indicated that cleared land in all four catchments, under all LULC conditions (1984-2013), was dominant, with the highest erosion losses. Grassland and forest were likewise observed to regulate erosion rates in the area, because the vegetation cover provided by these LULC types protects the soil from the direct impact of raindrops, which reduces soil loss to the barest minimum. Overall, the results demonstrate the significance of LULC in the control of erosion. Maps generated in the study may be useful to planners and land use managers for making appropriate soil conservation decisions.
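The USLE estimate itself is a cell-wise product of the factor rasters, A = R · K · LS · C · P. A minimal sketch with synthetic rasters (the units noted are one common convention and the value ranges are illustrative):

```python
import numpy as np

# USLE: A = R * K * LS * C * P, evaluated cell-by-cell on co-registered rasters.
# Units (one common convention): R in MJ mm ha^-1 h^-1 yr^-1 and K in
# t ha h ha^-1 MJ^-1 mm^-1, so A comes out in t ha^-1 yr^-1.
rng = np.random.default_rng(0)
shape = (100, 100)
R = rng.uniform(800, 1200, shape)    # rainfall erosivity
K = rng.uniform(0.01, 0.05, shape)   # soil erodibility
LS = rng.uniform(0.1, 5.0, shape)    # slope length-steepness factor
C = rng.uniform(0.001, 0.5, shape)   # cover and management factor
P = np.ones(shape)                   # support practice factor (none here)

A = R * K * LS * C * P               # annual soil loss per cell
low_potential = (A >= 0) & (A <= 1)  # the paper's "reversible" 0-1 t/ha/yr class
print("mean loss:", A.mean(), "| % low potential:", 100 * low_potential.mean())
```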
The rabbit has been recognized as a valuable model in various biomedical and biological research fields because of its intermediate size and phylogenetic proximity to primates. However, the technology for precise genome manipulation in rabbits had stalled for decades, severely limiting its applications in biomedical research. Novel genome editing technologies, especially CRISPR/Cas9, have remarkably enhanced precise genome manipulation in rabbits and shown their superiority and promise for generating rabbit models of human genetic diseases. In this review, we summarize the brief history of transgenic rabbit technology and the development of novel genome editing technologies in rabbits.
Gully erosion is a disruptive phenomenon that extensively affects the Iranian territory, especially in the northern provinces. A number of studies have recently been undertaken to study this process, to predict it over space and, ultimately, in a broader national effort, to limit its negative effects on local communities. We focused on the Bastam watershed, where 9.3% of the surface is currently affected by gullying. Machine learning algorithms are currently under the magnifying glass across the geomorphological community for their high predictive ability. However, unlike bivariate statistical models, their structure does not provide intuitive and quantifiable measures of the environmental preconditioning factors. To cope with this weakness, we interpret the preconditioning causes on the basis of a bivariate approach, namely the Index of Entropy, and we performed the susceptibility mapping procedure by testing three extensions of a decision tree model, namely the Alternating Decision Tree (ADTree), the Naive-Bayes Tree (NBTree) and the Logistic Model Tree (LMT). We dichotomized the gully information over space into presence/absence conditions, which we further explored in the calibration and validation stages. Because the presence/absence information and associated factors are identical, the resulting differences are due only to the algorithmic structures of the three models we chose. These differences are not significant in terms of performance; in fact, all three models produce outstanding predictive AUC measures (ADTree = 0.922; NBTree = 0.939; LMT = 0.944). However, the associated mapping results depict very different patterns, and only the LMT is associated with reasonable susceptibility patterns. This is a strong indication of which model best combines performance and mapping quality for any natural-hazard-oriented application.
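The Index of Entropy turns per-class frequency ratios into a factor weight: the more unevenly the ratios are distributed across classes, the more discriminating, and hence the more heavily weighted, the factor. A sketch assuming one common formulation from the literature (not necessarily the exact variant used here):

```python
import numpy as np

def entropy_index_weight(fr_values):
    """Weight of a conditioning factor from its per-class frequency ratios,
    following one common Index of Entropy formulation:
        P_ij = FR_ij / sum(FR_ij)
        H_j  = -sum(P_ij * log2 P_ij),  H_max = log2(n_classes)
        I_j  = (H_max - H_j) / H_max,   W_j = I_j * mean(FR_ij)"""
    fr = np.asarray(fr_values, float)
    p = fr / fr.sum()
    p = p[p > 0]                 # treat 0 * log(0) as 0
    h = -np.sum(p * np.log2(p))
    h_max = np.log2(len(fr))
    i_j = (h_max - h) / h_max
    return i_j * fr.mean()

# Toy per-class frequency ratios for two factors (e.g., slope, lithology)
print(entropy_index_weight([0.4, 0.8, 1.6, 2.2]))  # discriminating factor
print(entropy_index_weight([1.0, 1.0, 1.0, 1.0]))  # uninformative -> weight 0
```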
Geogenic dust is commonly believed to be one of the most important environmental problems in the Middle East. The present study investigated the geochemical characteristics of atmospheric dust particles in Shiraz City (southern Iran). Atmospheric dust samples were collected with a dry collector method, using glass trays, at 10 sites in May 2018. Elemental composition was analysed through inductively coupled plasma optical emission spectrometry. Meteorological data showed that the dustiest days were usually in spring and summer, particularly in April. X-ray diffraction analysis indicated the mineralogical composition of the atmospheric dust to be calcite + dolomite (24%) > palygorskite (18%) > quartz (14%) > muscovite (13%) > albite (11%) > kaolinite (7%) > gypsum (7%) > zircon = anatase (3%). The high occurrence of palygorskite (16%-23%) could serve as a tracer of dust storm source areas from the deserts of Iraq and Saudi Arabia to southern Iran. Scanning electron microscopy indicated that the sizes of the collected dust particles varied from 50 μm to 0.8 μm, with 10 μm predominant. The collected dust had prismatic trigonal-rhombohedral crystals and semi-rounded irregular shapes. Moreover, diatoms were detected in several samples, suggesting that emissions from dry lake beds, such as the Hoor Al-Azim Wetland (in the southwest of Iran), also contributed to the dust load. Backward trajectory simulations were performed for the sampling dates using the NOAA HYSPLIT model. The results showed that the sources of atmospheric dust in the study area were eastern Iraq, the eastern desert of Saudi Arabia, Kuwait and Khuzestan Province. The Ca/Al ratio of the collected samples (1.14) differed from the upper continental crust (UCC) value (0.37), whereas the Mg/Al (0.29), K/Al (0.22) and Ti/Al (0.07) ratios were close to the UCC value (0.04). This condition favours desert calcisols as the main mineral dust sources. Analysis of the crustal enrichment factor (EF_crustal) revealed geogenic sources for V, Mo, Pb, Sr, Cu and Zn (EF < 2), whereas anthropogenic sources affected As, Cd, Cr and Ni.
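The crustal enrichment factor referenced at the end is an Al-normalized double ratio. A minimal sketch (the UCC and sample concentrations are illustrative placeholders, not the study's values):

```python
# Crustal enrichment factor, Al-normalized:
#   EF_crustal(X) = (X / Al)_sample / (X / Al)_UCC
# EF < 2 is usually read as geogenic; larger values point to anthropogenic input.
# The concentrations below (mg/kg) are illustrative placeholders.
UCC = {"Al": 80400, "Pb": 17, "Zn": 67, "Cd": 0.09}
sample = {"Al": 65000, "Pb": 30, "Zn": 110, "Cd": 0.8}

def ef_crustal(element, sample, ucc):
    return (sample[element] / sample["Al"]) / (ucc[element] / ucc["Al"])

for el in ("Pb", "Zn", "Cd"):
    print(el, round(ef_crustal(el, sample, UCC), 2))
```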
Energy is a vital commodity that sustains human lives as well as economic processes. The challenges surrounding energy generation, demand and supply are plenty, owing to the use of fossil fuels, which leads to climate change and environmental problems such as water and air pollution. With increasing awareness of climate change after the Paris Agreement, energy plays a key role in achieving the proposed targets. The contributions to this Special Issue of Geoscience Frontiers on Energy include 8 papers from esteemed research groups worldwide that explore, highlight and provide new insights into the various aspects of energy.
Increased expression of matrix metalloproteinase-1 (MMP-1) has been observed in the lesions of atherosclerosis and aneurysms; however, it is not fully understood whether macrophage-derived MMP-1 affects these diseases. To investigate whether macrophage-derived MMP-1 participates in the development of vascular diseases, we generated transgenic (Tg) rabbits expressing human MMP-1 in the monocyte/macrophage lineage under the control of the human scavenger receptor enhancer/promoter. Tg rabbits exhibited no visible abnormalities throughout their bodies. Western blotting analysis revealed that the amount of MMP-1 protein in the conditioned media secreted by peritoneal macrophages of Tg rabbits was up to 3-fold higher than that of non-Tg rabbits. In the first experiment, Tg and non-Tg rabbits were fed a cholesterol diet for 16 weeks, and aortic and coronary atherosclerosis were evaluated. The gross lesion area of aortic atherosclerosis in Tg rabbits was not significantly different from that in non-Tg rabbits, but on microscopic examination Tg rabbits showed marked destruction of the medial elastic lamina in the aortic lesions. In the second experiment, we generated aortic aneurysms by incubation with elastase. Compared with non-Tg rabbits, Tg rabbits exhibited significantly greater aortic dilation. Increased macrophage-derived MMP-1 thus led to increased medial destruction in both aortic atherosclerosis and aneurysms. These results demonstrate that MMP-1 plays a different role in the pathogenesis of atherosclerosis and aneurysms.
Sub-watershed prioritization is the ranking of different areas of a river basin according to their need for proper planning and management of soil and water resources. Decision makers should optimally allocate investments to critical sub-watersheds in an economically effective and technically efficient manner. Hence, this study aimed at developing a user-friendly geographic information system (GIS) tool, the Sub-Watershed Prioritization Tool (SWPT), using the Python programming language to decrease any possible uncertainty. It uses geospatial-statistical techniques to analyze morphometric and topo-hydrological factors and automatically identify critical and priority sub-watersheds. To assess the capability and reliability of the SWPT, it was successfully applied in a watershed in Golestan Province, northern Iran. Historical records of flood and landslide events indicated that the SWPT correctly recognized critical sub-watersheds, providing a cost-effective approach for the prioritization of sub-watersheds. The SWPT is therefore practically applicable and replicable to other regions where gauge data are not available for each sub-watershed.
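Morphometric prioritization of sub-watersheds is classically done by ranking each parameter and averaging the ranks into a compound parameter. The sketch below illustrates that generic scheme (the parameters, values and rank directions are assumptions, not the SWPT's exact scoring):

```python
import pandas as pd

# Toy morphometric values per sub-watershed; the classic compound-parameter
# scheme ranks each factor and averages the ranks.
df = pd.DataFrame(
    {
        "drainage_density": [2.1, 3.4, 2.8, 4.0],   # higher -> more critical
        "bifurcation_ratio": [3.0, 4.5, 3.8, 5.1],  # higher -> more critical
        "elongation_ratio": [0.9, 0.6, 0.7, 0.5],   # lower  -> more critical
    },
    index=["SW1", "SW2", "SW3", "SW4"],
)

ranks = pd.DataFrame(index=df.index)
for col in ("drainage_density", "bifurcation_ratio"):
    ranks[col] = df[col].rank(ascending=False)       # rank 1 = most critical
ranks["elongation_ratio"] = df["elongation_ratio"].rank(ascending=True)

df["compound_rank"] = ranks.mean(axis=1)
print(df.sort_values("compound_rank"))               # lowest rank = top priority
```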
Driven by the challenge of integrating large amounts of experimental data, classification has emerged as a major and popular tool in computational biology and bioinformatics research. Machine learning methods, especially kernel methods with Support Vector Machines (SVMs), are very popular and effective tools. From the perspective of the kernel matrix, a technique called Eigen-matrix translation has been introduced for protein data classification. The Eigen-matrix translation strategy has many nice properties that deserve further exploration. This paper investigates the major role of Eigen-matrix translation in classification. The authors propose that its importance lies in the dimension reduction of the predictor attributes within the data set, which matters greatly when the dimension of the features is huge. The authors show by numerical experiments on real biological data sets that the proposed framework is crucial and effective in improving classification accuracy. This can therefore serve as a novel perspective for future research on dimension reduction problems.
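In the precomputed-kernel setting that such kernel-matrix techniques operate in, the classifier consumes a kernel matrix directly, so any spectrum-level manipulation can be applied before fitting. The sketch below shows only the generic precomputed-kernel pipeline, not the Eigen-matrix translation itself, with synthetic high-dimensional data:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(0)
X_train, X_test = rng.random((80, 200)), rng.random((20, 200))  # high-dim features
y_train = (X_train[:, 0] > 0.5).astype(int)

# Any manipulation of the kernel matrix (such as a translation of its
# eigen-spectrum) could be plugged in here before fitting.
K_train = rbf_kernel(X_train, X_train)   # (n_train, n_train)
K_test = rbf_kernel(X_test, X_train)     # (n_test, n_train)

clf = SVC(kernel="precomputed").fit(K_train, y_train)
print(clf.predict(K_test))
```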
The application of Artificial Intelligence to various fields has witnessed tremendous progress in recent years. The fields of geoscience and natural hazard modelling have also benefitted immensely from the introduction of novel algorithms, the availability of large quantities of data, and the increase in computational capacity. The enhancement in algorithms can largely be attributed to the elevated complexity of network architectures and the heightened level of abstraction found in a network's later layers. As a result, AI models lack transparency and accountability and are often dubbed "black box" models. Explainable AI (XAI) is emerging as a solution for making AI models more transparent, especially in domains where transparency is essential. Much discussion surrounds the use of XAI for diverse purposes as researchers explore its applications across various domains. With the growing body of research papers on XAI case studies, it has become increasingly important to address existing gaps in the literature, which currently lacks a comprehensive understanding of the capabilities, limitations and practical implications of XAI. This study provides a comprehensive overview of what constitutes XAI, how it is being used, and its potential applications in hydrometeorological natural hazards. It aims to serve as a useful reference for researchers, practitioners and stakeholders who are currently using or intending to adopt XAI, thereby contributing to the wider acceptance of XAI in the future.
Speech recognition systems have become a unique member of the human-computer interaction (HCI) family. Speech is one of the most naturally developed human abilities, and speech signal processing opens up a transparent and hands-free computing experience. This paper presents a retrospective yet modern view of the world of speech recognition systems. The development journey of Automatic Speech Recognition (ASR) has seen quite a few milestones and breakthrough technologies, which are highlighted in this paper. A step-by-step rundown of the fundamental stages in developing speech recognition systems is presented, along with a brief discussion of various modern-day developments and applications in this domain. This review aims to summarize the field and provide a starting point for those entering the vast field of speech signal processing. Since speech recognition has vast potential in industries such as telecommunications, emotion recognition and healthcare, this review should be helpful to researchers who aim to explore further applications that society can readily adopt in the coming years.
Contactless verification is possible with iris biometric identification, which helps prevent infections like COVID-19 from spreading. Biometric systems have become unreliable and vulnerable as a result of spoofing attacks employing contact lenses, replayed videos and print attacks. This work demonstrates an iris liveness detection approach that, for the first time, utilizes fragmental coefficients of Haar-transformed iris images as signatures to prevent spoofing attacks. Seven assorted feature creation methods are studied in the presented solution, and the created features are used to train eight distinct machine learning classifiers and ensembles. The predicted iris liveness identification variants are evaluated using recall, F-measure, precision, accuracy, APCER, BPCER and ACER. Three standard datasets were used in the investigation. The main contribution of the study is achieving a good accuracy of 99.18% with a small feature vector: the 8×8 fragmental coefficients of the Haar-transformed iris image, fed to the random forest algorithm, showed superior iris liveness detection with a reduced feature vector size (64 features). Additionally, an extensive cross-dataset experiment was conducted for detailed analysis. The experimental results show that the iris biometric template is reduced in size, making the proposed framework suitable for algorithmic verification in real-time environments and settings.
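Fragmental Haar coefficients amount to keeping only the low-frequency approximation band of a repeated 2D Haar decomposition until an 8×8 block remains, giving the 64-element feature vector mentioned above. A minimal sketch on a synthetic image (the input size and normalization are assumptions):

```python
import numpy as np

def haar_approximation(img, target=8):
    """Repeated one-level 2D Haar decomposition, keeping only the
    approximation (low-low) band, until the image is target x target."""
    img = img.astype(float)
    while img.shape[0] > target:
        # LL band of one Haar level: orthonormal 2x2 block average
        img = (img[0::2, 0::2] + img[0::2, 1::2]
               + img[1::2, 0::2] + img[1::2, 1::2]) / 2.0
    return img

# Toy 128x128 grayscale iris image -> 8x8 block -> 64-element feature vector,
# ready to feed a classifier such as RandomForestClassifier
iris_img = np.random.default_rng(0).random((128, 128))
features = haar_approximation(iris_img).ravel()
print(features.shape)  # (64,)
```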
In the post-genomic era, the construction and control of genetic regulatory networks using gene expression data is a hot research topic. Boolean networks (BNs) and their extension, Probabilistic Boolean Networks (PBNs), have served as an effective tool for this purpose. However, PBNs are difficult to use in practice when the number of genes is large, because of the huge computational cost. In this paper, we propose a simplified multivariate Markov model for approximating a PBN. The new model preserves the strength of PBNs, the ability to capture the interdependence of the genes in the network, while reducing the complexity of the network and therefore the computational cost. We then present an optimal control model with hard constraints for the purpose of control/intervention of a genetic regulatory network. Numerical experiments based on yeast data demonstrate the effectiveness of our proposed model and control policy.
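In a multivariate Markov model of this kind, each gene keeps its own state-probability vector and is updated as a weighted mixture of transition matrices applied to the genes' current vectors, so the state space grows linearly rather than exponentially with the number of genes. A two-gene sketch with illustrative parameters (not the yeast data):

```python
import numpy as np

# A PBN on n genes needs a 2^n x 2^n transition matrix; the multivariate
# Markov approximation instead tracks one 2-vector per gene and couples
# genes through pairwise column-stochastic matrices and mixing weights.
P = {  # P[(i, j)]: influence of gene j's distribution on gene i
    (0, 0): np.array([[0.9, 0.2], [0.1, 0.8]]),
    (0, 1): np.array([[0.5, 0.3], [0.5, 0.7]]),
    (1, 0): np.array([[0.6, 0.4], [0.4, 0.6]]),
    (1, 1): np.array([[0.8, 0.1], [0.2, 0.9]]),
}
lam = {(0, 0): 0.7, (0, 1): 0.3, (1, 0): 0.4, (1, 1): 0.6}  # rows sum to 1

x = [np.array([1.0, 0.0]), np.array([0.5, 0.5])]  # [P(gene off), P(gene on)]
for _ in range(10):
    x = [sum(lam[(i, j)] * P[(i, j)] @ x[j] for j in range(2)) for i in range(2)]
print([np.round(v, 3) for v in x])  # approximate stationary distributions
```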
Funding (DFS 2D debris flow study): financially supported by the Department of Space, India (Grant No. ISRO/RES/4/663/18-19).
Funding (TBM advance rate study): the National Natural Science Foundation of China (Grant 42177164); the Distinguished Youth Science Foundation of Hunan Province of China (2022JJ10073).
Funding (deep-sea infrastructure review): Key Research and Development Program of Zhejiang Province (Grant No. 2018C03031); the Open Foundation of the Key Laboratory of Offshore Geotechnical and Material Engineering of Zhejiang Province (Grant No. OGME21003); Natural Science Foundation of Zhejiang Province (Grant Nos. LHZ19E090003, LY15E090002); Norges Forskningsråd (Grant No. OGME21003); National Natural Science Foundation of China (Grant Nos. 51209183, 51779220, 52101334).
Funding (LNRF landslide susceptibility study): supported by the Centre for Advanced Modelling and Geospatial Information Systems (CAMGIS), UTS, under grant numbers 321740.2232335, 323930 and 321740.2232357.
Funding (GAN landslide prediction study): funded by the Centre for Advanced Modelling and Geospatial Information Systems (CAMGIS), Faculty of Engineering and Information Technology, University of Technology Sydney, Australia.
Funding (earthquake risk assessment study): funded by the Centre for Advanced Modelling and Geospatial Information Systems, University of Technology Sydney: 323930, 321740.2232335 and 321740.2232357.
Funding (APG toolbox study): supported by the MECW research program, the Centre for Advanced Middle Eastern Studies, Lund University.
Funding (earthquake probability CNN study): fully funded by the Center for Advanced Modeling and Geospatial Information Systems (CAMGIS), Faculty of Engineering and IT, University of Technology Sydney; supported by Researchers Supporting Project number RSP-2020/14, King Saud University, Riyadh, Saudi Arabia.
Funding (Kelantan soil erosion study): funded by the Fundamental Research Grant Scheme (FRGS) 2015-1 from the Ministry of Higher Education (MOHE), Malaysia.
Funding (Shiraz atmospheric dust study): financially supported by Shiraz University and the INSF (Iran National Science Foundation, Project No. 97002616).
文摘Geogenic dust is commonly believed to be one of the most important environmental problems in the Middle East.The present study investigated the geochemical characteristics of atmospheric dust particles in Shiraz City(south of Iran).Atmospheric dust samples were collected through a dry collector method by using glass trays at 10 location sites in May 2018.Elemental composition was analysed through inductively coupled plasma optical emission spectrometry.Meteorological data showed that the dustiest days were usually in spring and summer,particularly in April.X-ray diffraction analysis of atmospheric dust samples indicated that the mineralogical composition of atmospheric dust was calcite+dolomite(24%)>palygorskite(18%)>quartz(14%)>muscovite(13%)>albite(11%)>kaolinite(7%)>gypsum(7%)>zircon=anatase(3%).The high occurrence of palygorskite(16%-23%) could serve as a tracer of the source areas of dust storms from the desert of Iraq and Saudi Arabia to the South of Iran.Scanning electron microscopy indicated that the sizes of the collected dust varied from 50 μm to0.8 μm,but 10 μm was the predominant size.The atmospheric dust collected had prismatic trigonal-rhombohedral crystals and semi-rounded irregular shapes.Moreover,diatoms were detected in several samples,suggesting that emissions from dry-bed lakes,such as Hoor Al-Azim Wetland(located in the southwest of Iran),also contributed to the dust load.Backward trajectory simulations were performed at the date of sampling by using the NOAA HYSPLIT model.Results showed that the sources of atmospheric dust in the study area were the eastern area of Iraq,eastern desert of Saudi Arabia,Kuwait and Khuzestan Province.The Ca/Al ratio of the collected samples(1.14) was different from the upper continental crust(UCC) value(UCC=0.37),whereas Mg/A1(0.29),K/Al(0.22) and Ti/Al(0.07) ratios were close to the UCC value(0.04).This condition favours desert calcisols as the main mineral dust sources.Analysis of the crustal enrichment factor(EF_(crustal)) revealed geogenic sources for V,Mo,Pb,Sr,Cu and Zn(<2),whereas anthropogenic sources affected As,Cd,Cr and Ni.
Abstract: Energy is a vital commodity that sustains human lives as well as economic processes. The challenges in energy generation, demand and supply are many, owing to the use of fossil fuels, which drives climate change and environmental problems such as water and air pollution. With the increasing awareness of climate change after the Paris Agreement, energy plays a key role in achieving the proposed targets. This Special Issue of Geoscience Frontiers on Energy includes eight papers from esteemed research groups worldwide that explore, highlight and provide new insights into various aspects of energy.
Funding: Supported in part by research grants from JSPS KAKENHI (JP26460486 to MN and JP15H04718 to JF) and NIH grants (R01HL117491 and R01HL129778 to YEC).
Abstract: Increased expression of matrix metalloproteinase-1 (MMP-1) has been observed in the lesions of atherosclerosis and aneurysms; however, it is not fully understood whether macrophage-derived MMP-1 affects these diseases. To investigate whether macrophage-derived MMP-1 participates in the development of vascular diseases, we generated transgenic (Tg) rabbits expressing human MMP-1 in the monocyte/macrophage lineage under the control of the human scavenger receptor enhancer/promoter. Tg rabbits exhibited no visible abnormalities. Western blotting revealed that the amount of MMP-1 protein in the conditioned media secreted by peritoneal macrophages of Tg rabbits was up to 3-fold higher than that of non-Tg rabbits. In the first experiment, Tg and non-Tg rabbits were fed a cholesterol diet for 16 weeks, and aortic and coronary atherosclerosis were evaluated. The gross lesion area of aortic atherosclerosis in Tg rabbits was not significantly different from that in non-Tg rabbits, but on microscopic examination Tg rabbits showed marked destruction of the medial elastic lamina in the aortic lesions. In the second experiment, we generated aortic aneurysms by incubation with elastase. Compared with non-Tg rabbits, Tg rabbits exhibited significantly greater aortic dilation. Increased macrophage-derived MMP-1 thus led to increased medial destruction in both aortic atherosclerosis and aneurysms. These results demonstrate that MMP-1 plays different roles in the pathogenesis of atherosclerosis and aneurysms.
Funding: Supported by the Geographic Information Science Research Group, Ton Duc Thang University, Ho Chi Minh City, Viet Nam.
Abstract: Sub-watershed prioritization is the ranking of different areas of a river basin according to their need for proper planning and management of soil and water resources. Decision makers should allocate investments to critical sub-watersheds in an economically effective and technically efficient manner. This study therefore developed a user-friendly geographic information system (GIS) tool, the Sub-Watershed Prioritization Tool (SWPT), written in the Python programming language to reduce possible uncertainty. The tool uses geospatial-statistical techniques to analyse morphometric and topo-hydrological factors and automatically identify critical, high-priority sub-watersheds. To assess its capability and reliability, the SWPT was successfully applied in a watershed in Golestan Province, northern Iran. Historical records of flood and landslide events indicated that the SWPT correctly recognized the critical sub-watersheds. It provides a cost-effective approach for the prioritization of sub-watersheds and is therefore practically applicable and replicable in other regions where gauge data are not available for each sub-watershed.
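A common scheme behind morphometric prioritization of this kind is compound ranking: each sub-watershed is ranked on each factor, the ranks are averaged into a compound score, and the lowest score receives top priority. The sketch below illustrates that generic scheme in Python; the factor names and values are hypothetical, and the SWPT's exact weighting may differ.

```python
# Sketch of compound-ranking prioritization for sub-watersheds.
# Values are hypothetical; SWPT's actual scheme may differ.
import pandas as pd

# Rows: sub-watersheds; columns: morphometric factors where higher
# values imply greater erosion/flood risk (e.g., drainage density Dd,
# stream frequency Fs, relief ratio Rh).
factors = pd.DataFrame(
    {"Dd": [2.1, 3.4, 1.8, 2.9], "Fs": [4.0, 6.2, 3.1, 5.5], "Rh": [0.12, 0.30, 0.08, 0.22]},
    index=["SW1", "SW2", "SW3", "SW4"],
)

# Rank 1 = most critical on each factor, then average the ranks;
# the lowest compound score gets top priority.
ranks = factors.rank(ascending=False)
compound = ranks.mean(axis=1).sort_values()
print(compound)  # SW2 comes out as the highest-priority sub-watershed
```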
Funding: Supported by the Research Grants Council of Hong Kong under Grant No. 17301214, HKU CERG Grants, the Fundamental Research Funds for the Central Universities, the Research Funds of Renmin University of China, the Hung Hing Ying Physical Research Grant, and the Natural Science Foundation of China under Grant No. 11271144.
Abstract: Driven by the challenge of integrating large amounts of experimental data, classification has emerged as one of the major and most popular tools in computational biology and bioinformatics research. Machine learning methods, especially kernel methods such as Support Vector Machines (SVMs), are very popular and effective tools. From the perspective of the kernel matrix, a technique named Eigen-matrix translation has been introduced for protein data classification. The Eigen-matrix translation strategy has many attractive properties that deserve further exploration. This paper investigates the major role of Eigen-matrix translation in classification. The authors propose that its importance lies in the dimension reduction of the predictor attributes within the data set, which is essential when the feature dimension is huge. The authors show by numerical experiments on real biological data sets that the proposed framework is crucial and effective in improving classification accuracy. It can therefore serve as a novel perspective for future research on dimension reduction problems.
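The general setting is classification directly on a kernel matrix, which can then be modified at the spectral level before training. The sketch below shows a precomputed-kernel SVM with a simple eigenvalue shift; this shift is purely illustrative and is not the paper's actual Eigen-matrix translation operator, whose definition is not reproduced here.

```python
# Sketch of classification on a spectrally modified kernel matrix.
# The eigenvalue shift below is illustrative only -- it is NOT the
# paper's Eigen-matrix translation operator.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=50, random_state=0)
K = rbf_kernel(X, X)  # kernel matrix in place of raw features

# Shift the kernel spectrum by a constant; this keeps K symmetric
# positive semi-definite, so it remains a valid kernel.
w, V = np.linalg.eigh(K)
K_shift = V @ np.diag(w + 0.1) @ V.T

clf = SVC(kernel="precomputed").fit(K_shift, y)
print("train accuracy:", clf.score(K_shift, y))  # evaluated on the training kernel
```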
Funding: Supported by the Centre for Advanced Modelling and Geospatial Information Systems, Faculty of Engineering and Information Technology, University of Technology Sydney, and by an IRTP scholarship funded by the Department of Education and Training, Government of Australia.
Abstract: The application of Artificial Intelligence (AI) in various fields has witnessed tremendous progress in recent years. The field of geosciences and natural hazard modelling has also benefitted immensely from the introduction of novel algorithms, the availability of large quantities of data, and the increase in computational capacity. The improvement in algorithms can be largely attributed to the elevated complexity of network architectures and the heightened level of abstraction in a network's later layers. As a result, AI models lack transparency and accountability and are often dubbed "black box" models. Explainable AI (XAI) is emerging as a solution that makes AI models more transparent, especially in domains where transparency is essential. Much discussion surrounds the use of XAI for diverse purposes, as researchers explore its applications across various domains. With the growing body of research papers on XAI case studies, it has become increasingly important to address the gaps in the literature, which currently lacks a comprehensive understanding of the capabilities, limitations, and practical implications of XAI. This study provides a comprehensive overview of what constitutes XAI, how it is being used, and its potential applications in hydrometeorological natural hazards. It aims to serve as a useful reference for researchers, practitioners, and stakeholders who are using or intending to adopt XAI, thereby contributing to wider acceptance of XAI in the future.
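To make the idea concrete, one widely used XAI technique is SHAP, which attributes a model's prediction to its input features. The minimal sketch below applies SHAP's TreeExplainer to a tree ensemble on synthetic data; it assumes the `shap` package is installed, and the feature interpretations in the comments are made up for illustration.

```python
# Minimal XAI example: SHAP attributions for a tree ensemble on
# synthetic data (requires the `shap` package; names are illustrative).
import numpy as np
import shap
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 4))  # e.g., rainfall, slope, soil moisture, NDVI
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=300)

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# Mean |SHAP| per feature approximates global importance; feature 0
# should dominate, given how y was generated.
print(np.abs(shap_values).mean(axis=0))
```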
Abstract: Speech recognition systems have become a distinctive family of human-computer interaction (HCI). Speech is one of the most naturally developed human abilities, and speech signal processing opens up a transparent, hands-free computing experience. This paper presents a retrospective yet modern view of speech recognition systems. The development of ASR (Automatic Speech Recognition) has seen quite a few milestones and breakthrough technologies, which are highlighted in this paper. A step-by-step rundown of the fundamental stages in developing speech recognition systems is presented, along with a brief discussion of various modern developments and applications in this domain. This review aims to summarize the field and provide a starting point for those entering the vast area of speech signal processing. Since speech recognition has great potential in industries such as telecommunication, emotion recognition and healthcare, this review should be helpful to researchers exploring further applications that society can adopt in the coming years.
Funding: Supported by the Researchers Supporting Project No. RSP-2021/14, King Saud University, Riyadh, Saudi Arabia.
Abstract: Contactless verification is possible with iris biometric identification, which helps prevent the spread of infections such as COVID-19. Biometric systems have become unreliable and vulnerable as a result of spoofing attacks employing contact lenses, replayed videos, and printed images. This work demonstrates, for the first time, an iris liveness detection approach that uses fragmental coefficients of Haar-transformed iris images as signatures to prevent spoofing attacks. Seven assorted feature-creation methods are studied in the presented solutions, and the created features are used to train eight distinct machine learning classifiers and ensembles. The predicted iris liveness identification variants are evaluated using recall, F-measure, precision, accuracy, APCER, BPCER, and ACER. Three standard datasets were used in the investigation. The main contribution of the study is achieving a good accuracy of 99.18% with a small feature vector: the fragmental coefficients of an 8×8 Haar-transformed iris image classified with the random forest algorithm gave superior iris liveness detection with a reduced feature vector of only 64 features. Additionally, an extensive cross-dataset experiment was conducted for detailed analysis. The results show that the iris biometric template is reduced in size, making the proposed framework suitable for algorithmic verification in real-time environments and settings.
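A sketch of this feature pipeline follows: successive 2D Haar decompositions of an iris image, keeping only the approximation sub-band until an 8×8 block remains, which is flattened into a 64-element feature vector for a random forest. It uses PyWavelets and synthetic arrays in place of real segmented iris images, so it is an assumption-laden illustration of the idea, not the paper's exact preprocessing.

```python
# Sketch: successive 2D Haar decompositions down to an 8x8
# approximation block -> 64-element feature vector -> random forest.
# Images and labels are synthetic stand-ins for live/spoofed irises.
import numpy as np
import pywt
from sklearn.ensemble import RandomForestClassifier

def haar_features(img: np.ndarray, target: int = 8) -> np.ndarray:
    """Keep only Haar approximation coefficients until the block is target x target."""
    while img.shape[0] > target:
        img, _ = pywt.dwt2(img, "haar")  # discard the detail sub-bands
    return img.ravel()                    # 64 features for an 8x8 block

rng = np.random.default_rng(0)
images = rng.random((100, 128, 128))      # stand-ins for segmented iris images
labels = rng.integers(0, 2, size=100)     # 1 = live, 0 = spoof (synthetic)

X = np.array([haar_features(im) for im in images])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)
print("feature vector length:", X.shape[1])
```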
Abstract: In the post-genomic era, the construction and control of genetic regulatory networks using gene expression data is a hot research topic. Boolean networks (BNs) and their extension, Probabilistic Boolean Networks (PBNs), have served as effective tools for this purpose. However, PBNs are difficult to use in practice when the number of genes is large because of the huge computational cost. In this paper, we propose a simplified multivariate Markov model for approximating a PBN. The new model preserves the strength of PBNs, the ability to capture the inter-dependence of the genes in the network, and at the same time reduces the complexity of the network and therefore the computational cost. We then present an optimal control model with hard constraints for the control/intervention of a genetic regulatory network. Numerical examples based on yeast data are given to demonstrate the effectiveness of the proposed model and control policy.
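For intuition about what is being approximated: in a PBN, each gene updates by one of several candidate Boolean predictor functions, selected at random with fixed probabilities at every step, so the full state space grows as 2^n in the number of genes. The toy simulation below illustrates these dynamics only; the predictors and probabilities are invented, and the paper's simplified multivariate Markov approximation itself is not reproduced.

```python
# Toy probabilistic Boolean network: each gene applies one of its
# candidate Boolean predictors, chosen with the given selection
# probabilities at each step. Predictors here are invented; this
# illustrates the dynamics the multivariate Markov model approximates.
import random

# Predictors for 3 genes: (Boolean function of current state, probability).
predictors = {
    0: [(lambda s: s[1] and s[2], 0.7), (lambda s: s[1], 0.3)],
    1: [(lambda s: not s[0], 1.0)],
    2: [(lambda s: s[0] or s[1], 0.6), (lambda s: s[2], 0.4)],
}

def step(state):
    nxt = []
    for gene, funcs in predictors.items():
        fs, ps = zip(*funcs)
        f = random.choices(fs, weights=ps, k=1)[0]  # pick one predictor
        nxt.append(int(f(state)))
    return tuple(nxt)

random.seed(0)
state = (1, 0, 1)
for t in range(5):
    state = step(state)
    print(t, state)
```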