Present trends in the smart world reflect the extensive use of limited resources through information and communication technology. Limited resources such as space, mobility, and energy have been consumed rigorously toward creating optimized yet smart instances. Thus, a new concept of an IoT-integrated smart city vision is yet to be proposed, one that combines systems such as noise and air-loss monitoring, web monitoring and fire detection systems, and smart waste-bin systems, which have not been clearly addressed in previous research. This paper focuses on developing an effective system for monitoring losses and managing traffic, thereby advancing the smart city at large with digitalized, integrated systems and software for fast and effective implementation. In the proposed system, real-time analysis is performed on data collected by various sensors to identify the factors responsible for such losses. The proposed work is validated on a real case study.
At present, the domains of the Internet of Things (IoT) and Wireless Sensor Networks (WSN) are combined to enhance sensor-related data transmission in forthcoming networking applications. Clustering and routing techniques are regarded as effective methods for reducing energy consumption and lengthening the lifetime of WSN-assisted IoT networks. In this view, this paper presents an Ensemble of Metaheuristic Optimization based QoS-aware Clustering with Multihop Routing (EMO-QoSCMR) protocol for IoT-assisted WSN. The proposed EMO-QoSCMR protocol aims to achieve QoS parameters such as energy, throughput, delay, and lifetime. The proposed model involves a two-stage process, namely clustering and routing. First, the EMO-QoSCMR protocol applies a cross-entropy rain optimization algorithm based clustering (CEROAC) technique to select an optimal set of cluster heads (CHs) and construct clusters. Besides, an oppositional chaos game optimization based routing (OCGOR) technique is employed to derive the optimal set of routes in the IoT-assisted WSN. The proposed model derives a fitness function based on parameters of the IoT nodes, such as residual energy and distance to the sink node. The proposed EMO-QoSCMR technique achieved an enhanced number of alive nodes (NAN) of 64, whereas the LEACH, PSO-ECHS, E-OEERP, and iCSHS methods resulted in lesser NAN values of 2, 10, 42, and 51, respectively. The performance of the presented protocol has been evaluated in terms of energy efficiency and network lifetime.
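A fitness function of the kind described above, rewarding high residual energy and a short distance to the sink, can be sketched as follows. The weights, normalization bounds, and node fields are illustrative assumptions, not the EMO-QoSCMR formulation:

```python
import math

def ch_fitness(node, sink, w_energy=0.6, w_dist=0.4):
    """Illustrative weighted fitness for cluster-head selection:
    favor high residual energy and a short distance to the sink.
    Weights and the max_range normalizer are assumptions."""
    e = node["residual_energy"] / node["initial_energy"]  # in [0, 1]
    d_norm = math.dist(node["pos"], sink) / node["max_range"]
    return w_energy * e + w_dist * (1.0 - d_norm)

# Hypothetical candidate nodes and sink position.
nodes = [
    {"residual_energy": 0.9, "initial_energy": 1.0, "pos": (10, 10), "max_range": 150},
    {"residual_energy": 0.4, "initial_energy": 1.0, "pos": (5, 5),   "max_range": 150},
]
sink = (50, 50)
best = max(nodes, key=lambda n: ch_fitness(n, sink))
print(best["pos"])  # (10, 10): more energy and closer to the sink
```

A metaheuristic such as the paper's CEROAC stage would maximize a score of this shape over candidate CH sets rather than picking node-by-node.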
Raw data are classified using clustering techniques in a reasonable manner to create disjoint clusters. Many clustering algorithms based on specific parameters have been proposed to handle high-volume datasets. This paper focuses on cluster analysis based on neutrosophic set implication, i.e., a k-means algorithm combined with a threshold-based clustering technique. This algorithm addresses the shortcomings of the k-means clustering algorithm while overcoming the limitations of the threshold-based clustering algorithm. To evaluate the validity of the proposed method, several validity measures and validity indices are applied to the Iris dataset (from the University of California, Irvine, Machine Learning Repository) along with the k-means and threshold-based clustering algorithms. The proposed method results in more segregated datasets with compact clusters, thus achieving higher validity indices. The method also eliminates the limitations of the threshold-based clustering algorithm and validates the measures and respective indices along with the k-means and threshold-based clustering algorithms.
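The combination of k-means with a distance threshold can be sketched minimally as below: ordinary k-means assignment, except that points farther than the threshold from every centroid are left unassigned instead of being forced into a cluster. This is a generic illustration, not the paper's neutrosophic formulation:

```python
import math

def kmeans_threshold(points, k, threshold, iters=50):
    """Plain k-means plus a distance threshold: points beyond the
    threshold from their nearest centroid get label -1 (unassigned).
    Naive init (first k points); real code would use k-means++."""
    centroids = list(points[:k])
    labels = [-1] * len(points)
    for _ in range(iters):
        # Assignment step: nearest centroid, subject to the threshold.
        for i, p in enumerate(points):
            dists = [math.dist(p, c) for c in centroids]
            j = dists.index(min(dists))
            labels[i] = j if dists[j] <= threshold else -1
        # Update step: recompute each centroid from its members.
        for j in range(k):
            members = [p for p, l in zip(points, labels) if l == j]
            if members:
                centroids[j] = tuple(sum(x) / len(members)
                                     for x in zip(*members))
    return labels, centroids

pts = [(1.0, 1.0), (8.0, 8.0), (1.2, 0.9), (8.1, 7.9), (30.0, 30.0)]
labels, cents = kmeans_threshold(pts, k=2, threshold=5.0)
print(labels)  # [0, 1, 0, 1, -1]: the distant point is left unassigned
```

The `-1` label plays the role the threshold variant is after: outliers no longer drag centroids toward them.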
Automated segmentation and classification of biomedical images play a vital part in the diagnosis of brain tumors (BT). Early brain tumor analysis enables a quicker treatment response, which improves the patient survival rate. Locating and classifying BTs in huge medical image databases, obtained from routine medical tasks, with manual processes is costly in both effort and time. An automatic recognition, localization, and classification process is therefore desirable and useful. This study introduces an Automated Deep Residual U-Net Segmentation with Classification model (ADRU-SCM) for brain tumor diagnosis. The presented ADRU-SCM model focuses mainly on the segmentation and classification of BT. To accomplish this, the presented ADRU-SCM model applies Wiener filtering (WF) based preprocessing to eradicate the noise in the input image. In addition, the ADRU-SCM model follows a deep residual U-Net segmentation model to determine the affected brain regions. Moreover, the VGG-19 model is exploited as a feature extractor. Finally, a tunicate swarm optimization (TSO) with gated recurrent unit (GRU) model is applied for classification, and the TSO algorithm effectually tunes the GRU hyperparameters. The performance of the ADRU-SCM model was validated using the FigShare dataset, and the outcomes pointed out the better performance of the ADRU-SCM approach over recent approaches.
Phishing is a type of cybercrime in which attackers pose as authorized persons or entities and steal the victims' sensitive data. E-mails, instant messages, and phone calls are some of the common modes used in cyberattacks. Though security models are continuously upgraded to prevent cyberattacks, hackers find innovative ways to target victims. Against this background, a drastic increase has been observed in the number of phishing e-mails sent to potential targets, which necessitates an effective classification model. Though numerous conventional models are available in the literature for proficient classification of phishing e-mails, Machine Learning (ML) techniques and Deep Learning (DL) models have also been employed. The current study presents an Intelligent Cuckoo Search (CS) Optimization Algorithm with a Deep Learning-based Phishing Email Detection and Classification (ICSOA-DLPEC) model. The aim of the proposed ICSOA-DLPEC model is to effectually distinguish e-mails as either legitimate or phishing. At the initial stage, pre-processing is performed in three steps: e-mail cleaning, tokenization, and stop-word elimination. Then, the N-gram approach is applied to extract useful feature vectors. Next, the Gated Recurrent Unit (GRU) model is employed to detect and classify phishing e-mails, and the CS algorithm is used to fine-tune the parameters involved in the GRU model. The performance of the proposed ICSOA-DLPEC model was experimentally validated using a benchmark dataset, and the results were assessed under several dimensions. Extensive comparative studies confirmed the superior performance of the proposed ICSOA-DLPEC model over other existing approaches, with a maximum accuracy of 99.72%.
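The three pre-processing stages plus N-gram extraction can be sketched as follows. The cleaning rules and stop-word list are illustrative assumptions, not the paper's exact pipeline:

```python
import re
from collections import Counter

# A tiny illustrative stop-word list; real pipelines use a full list.
STOP_WORDS = {"the", "a", "an", "is", "to", "and", "of", "your", "our"}

def preprocess(email_text, n=2):
    """Cleaning, tokenization, stop-word elimination, then n-grams."""
    # 1. Cleaning: lowercase, drop HTML tags, mask URLs, keep letters.
    text = email_text.lower()
    text = re.sub(r"<[^>]+>", " ", text)           # HTML remnants
    text = re.sub(r"https?://\S+", " url ", text)  # mask links
    text = re.sub(r"[^a-z ]+", " ", text)
    # 2. Tokenization.
    tokens = text.split()
    # 3. Stop-word elimination.
    tokens = [t for t in tokens if t not in STOP_WORDS]
    # N-gram extraction: counts of consecutive token n-tuples.
    grams = Counter(tuple(tokens[i:i + n])
                    for i in range(len(tokens) - n + 1))
    return tokens, grams

tokens, grams = preprocess("<b>Verify</b> your account NOW at http://x.test/login")
print(tokens)  # ['verify', 'account', 'now', 'at', 'url']
```

The resulting n-gram counts would then be vectorized and fed to a classifier such as the GRU described above.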
Owing to advances in intelligent transportation systems (ITS), traffic forecasting has gained significant interest, as robust traffic prediction plays an important part in different ITS applications, namely traffic signal control, navigation, and route mapping. A traffic prediction model aims to predict traffic conditions based on past traffic data. For more accurate traffic prediction, this study proposes an optimal deep learning-enabled statistical analysis model: an optimal convolutional neural network with attention long short-term memory (OCNN-ALSTM) model for traffic prediction. The proposed OCNN-ALSTM technique first preprocesses the traffic data using min-max normalization. The OCNN-ALSTM technique is then executed to classify and predict the traffic data in real-time cases. To enhance the predictive outcomes of the OCNN-ALSTM technique, the bird swarm algorithm (BSA) is employed, thereby improving the overall efficacy of the network. The design of the BSA for optimal hyperparameter tuning of the CNN-ALSTM model shows the novelty of the work. The experimental validation of the OCNN-ALSTM technique was performed using benchmark datasets, and the results were examined under several aspects. The simulation results reported enhanced outcomes of the OCNN-ALSTM model over recent methods under several dimensions.
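Min-max normalization, the preprocessing step named above, rescales each value by x' = lo + (x - min)(hi - lo)/(max - min). A minimal sketch, with hypothetical traffic-flow counts:

```python
def min_max_normalize(series, lo=0.0, hi=1.0):
    """Rescale a 1-D series into [lo, hi]."""
    mn, mx = min(series), max(series)
    if mx == mn:                     # constant series: avoid divide-by-zero
        return [lo for _ in series]
    scale = (hi - lo) / (mx - mn)
    return [lo + (x - mn) * scale for x in series]

flow = [120, 80, 200, 160]           # hypothetical vehicles-per-minute counts
out = min_max_normalize(flow)
print([round(v, 3) for v in out])    # [0.333, 0.0, 1.0, 0.667]
```

Keeping inputs in [0, 1] is a standard way to stabilize CNN/LSTM training; the stored `mn`/`mx` would be reused to invert predictions back to real units.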
Software-defined networking (SDN) is one of the most progressive and prominent innovations in Information and Communications Technology. It mitigates the issues that conventional networks experience. However, traffic data generated by various applications is increasing day by day; in addition, as organizations' digital transformation accelerates, the amount of information to be processed inside an organization has increased explosively, so a software-defined network can become a bottleneck and unavailable. Various models have been proposed in the literature to balance the load, but most works consider only limited parameters and neglect controller and transmission-media loads, which also degrade the performance of software-defined networks. This work illustrates how a software-defined network can tackle the load at its software layer and distribute it effectively. We propose a deep learning-dependent convolutional neural network based load-balancing technique to handle a software-defined network's load. The simulation results show that the proposed model requires fewer resources compared to existing machine learning-based load-balancing techniques.
Fruit classification utilizing a deep convolutional neural network (CNN) is among the most promising applications in computer vision (CV). Deep learning-based classification has made it possible to recognize fruits from pictures, but, due to similarity and complexity, fruit recognition becomes an issue for stacked fruits on a weighing scale. Recently, Machine Learning (ML) methods have been used in fruit farming and agriculture and have brought great convenience to human life. An automated ML-based system could perform the fruit classification and sorting tasks previously managed by human experts. CNNs have attained incredible outcomes in image classification across several domains. Considering the success of transfer learning and CNNs in other image classification problems, this study introduces an Artificial Hummingbird Optimization with Siamese Convolutional Neural Network based Fruit Classification (AMO-SCNNFC) model. In the presented AMO-SCNNFC technique, image preprocessing is performed to enhance the contrast level of the image. In addition, spiral optimization (SPO) with the VGG-16 model is utilized to derive feature vectors. For fruit classification, AHO with an end-to-end SCNN (ESCNN) model is applied to identify different classes of fruits. The performance of the AMO-SCNNFC technique was validated using a dataset comprising diverse classes of fruit images. Extensive comparison studies reported improvements of the AMO-SCNNFC technique over other approaches, with a higher accuracy of 99.88%.
The exponential growth of data necessitates an effective data storage scheme that helps manage large quantities of data. To accomplish this, a Deoxyribonucleic Acid (DNA) digital data storage process can be employed, which encodes and decodes binary data to and from synthesized strands of DNA. Vector quantization (VQ) is a commonly employed scheme for image compression, and optimal codebook generation is an effective way to reach maximum compression efficiency. This article introduces a new DNA Computing with Water Strider Algorithm based Vector Quantization (DNAC-WSAVQ) technique for data storage systems. The proposed DNAC-WSAVQ technique encodes data using DNA computing and then compresses it for effective data storage. The DNAC-WSAVQ model initially performs DNA encoding on the input images to generate a binary encoded form. In addition, a Water Strider Algorithm with Linde-Buzo-Gray (WSA-LBG) model is applied for the compression process, whereby the storage area can be considerably minimized; to generate an optimal codebook for LBG, the WSA is applied to it. The performance validation of the DNAC-WSAVQ model was carried out, and the results were inspected under several measures. The comparative study highlighted the improved outcomes of the DNAC-WSAVQ model over existing methods.
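The encode/decode step between binary data and DNA strands can be sketched with the common 2-bits-per-nucleotide convention. The specific mapping (00→A, 01→C, 10→G, 11→T) is an illustrative assumption; the paper's exact encoding is not specified here:

```python
# One common convention: each 2-bit pair maps to one nucleotide.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {b: p for p, b in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Bytes -> bit string -> DNA strand (4 bases per byte)."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]]
                   for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    """DNA strand -> bit string -> bytes."""
    bits = "".join(BASE_TO_BITS[b] for b in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"Hi")
print(strand)  # CAGACGGC
```

Real DNA-storage codecs add constraints this sketch ignores (GC balance, homopolymer limits, error correction); compression such as the WSA-LBG stage would run before encoding.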
In recent times, financial globalization has increased drastically in different ways to improve the quality of services with advanced resources. Successful applications of Bitcoin blockchain (BC) techniques lead stakeholders to be concerned with the return and risk of financial products, and they focus on predicting the return rate and risk rate of those products. Therefore, an automatic Bitcoin return rate prediction model becomes essential for BC financial products. Newly designed machine learning (ML) and deep learning (DL) approaches pave the way for such return rate prediction methods. This study introduces a novel Jellyfish Search Optimization based Extreme Learning Machine with Autoencoder (JSO-ELMAE) for return rate prediction of BC financial products. The presented JSO-ELMAE model designs a new ELMAE model for predicting the return rate of financial products; besides, the JSO algorithm is exploited to tune the parameters of the ELMAE model, which in turn boosts the prediction results. The application of the JSO technique assists in optimal parameter adjustment of the ELMAE model to predict Bitcoin return rates. The experimental validation of the JSO-ELMAE model was executed, and the outcomes were inspected in many aspects. The experimental values demonstrated the enhanced performance of the JSO-ELMAE model over recent state-of-the-art approaches, with a minimal RMSE of 0.1562.
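The RMSE figure reported above is the root-mean-square error, sqrt(mean((a_i - p_i)^2)). A minimal sketch with hypothetical return-rate values (not the paper's data):

```python
import math

def rmse(actual, predicted):
    """Root-mean-square error between observed and predicted values."""
    assert len(actual) == len(predicted)
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted))
                     / len(actual))

# Hypothetical daily return rates vs. model predictions.
actual    = [0.012, -0.034, 0.051, 0.007]
predicted = [0.010, -0.030, 0.045, 0.011]
print(round(rmse(actual, predicted), 4))  # 0.0042
```

Lower is better, so the cited 0.1562 is the smallest error among the compared models.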
Due to the latest advancements in remote sensing, it has become easier to acquire high-quality images using various satellites along with sensing components. But the massive quantity of data poses a challenge in storing and effectively transmitting remote sensing images; therefore, image compression techniques can be utilized to process them. In this aspect, vector quantization (VQ) can be employed for image compression, and the widely applied VQ approach is Linde-Buzo-Gray (LBG), which creates a locally optimal codebook for image construction. Constructing the codebook can be treated as an optimization issue, and metaheuristic algorithms can be utilized to resolve it. With this motivation, this article presents an intelligent satin bowerbird optimizer based compression technique (ISBO-CT) for remote sensing images. The goal of the ISBO-CT technique is to proficiently compress remote sensing images through effective design of the codebook; to this end, the satin bowerbird optimizer (SBO) is employed with the LBG approach. The design of the SBO algorithm for remote sensing image compression depicts the novelty of the work. To showcase the enhanced efficiency of the ISBO-CT approach, an extensive range of simulations was applied, and the outcomes reported optimum performance of the ISBO-CT technique relative to recent state-of-the-art image compression approaches.
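The baseline LBG codebook construction mentioned above can be sketched as follows: start from the global centroid, split each codeword into a perturbed pair, then refine with Lloyd (k-means style) iterations. This is the classic algorithm only; the metaheuristic replacement (SBO here, WSA in related work) is out of scope:

```python
import math

def lbg_codebook(vectors, size, eps=0.01, iters=20):
    """Classic LBG: centroid -> split by +/-eps -> Lloyd refinement,
    repeated until the codebook reaches the requested size."""
    dim = len(vectors[0])
    centroid = [sum(v[d] for v in vectors) / len(vectors) for d in range(dim)]
    book = [centroid]
    while len(book) < size:
        # Split every codeword into a perturbed pair.
        book = [[x * (1 + s) for x in c] for c in book for s in (eps, -eps)]
        for _ in range(iters):  # Lloyd refinement
            cells = [[] for _ in book]
            for v in vectors:
                j = min(range(len(book)),
                        key=lambda i: math.dist(v, book[i]))
                cells[j].append(v)
            for j, cell in enumerate(cells):
                if cell:  # recenter each non-empty cell
                    book[j] = [sum(v[d] for v in cell) / len(cell)
                               for d in range(dim)]
    return book

# Two tight groups of 2-D training vectors (e.g. image blocks).
train = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
book = lbg_codebook(train, size=2)
print(sorted(book))  # roughly [[0.33, 0.33], [10.33, 10.33]]
```

Because LBG only reaches a local optimum, the papers above hand the codebook search to a metaheuristic and score candidates by distortion.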
Traditional Wireless Sensor Networks (WSNs) comprise cost-effective sensors that can send the physical parameters of the target environment to an intended user. With the evolution of technology, multimedia sensor nodes have become a hot research topic, since they can continuously gather multimedia and scalar content from the target domain. The existence of multimedia sensors, integrated with effective signal processing and multimedia source coding approaches, has led to the increased application of Wireless Multimedia Sensor Networks (WMSN). This sort of network has the potential to capture, transmit, and receive multimedia content. Since energy is a major resource in WMSN, novel clustering approaches are essential to deal with the adaptive topologies of WMSN and prolong network lifetime. With this motivation, the current study develops an Enhanced Spider Monkey Optimization based Energy-Aware Clustering Scheme (ESMO-EACS) for WMSN. The proposed ESMO-EACS model derives the ESMO algorithm by incorporating the concepts of the SMO algorithm and quantum computing. The proposed ESMO-EACS model involves the design of fitness functions using distinct input parameters for effective construction of clusters. A comprehensive experimental analysis was conducted to validate the effectiveness of the proposed ESMO-EACS technique in terms of different performance measures. The simulation outcomes established the superiority of the proposed ESMO-EACS technique over other methods under various measures.
In recent times, the Internet of Medical Things (IoMT) has gained much attention in the medical services and healthcare management domain. Since the healthcare sector generates massive volumes of data, such as personal details, historical medical data, hospitalization records, and discharge records, IoMT devices have evolved with the potential to handle such high quantities of data. Privacy and security of the data gathered by IoMT gadgets are major issues when transmitting it or saving it in the cloud. Advancements in Artificial Intelligence (AI) and encryption techniques offer a way to handle massive quantities of medical data and achieve security. In this view, the current study presents a new Optimal Privacy Preserving and Deep Learning (DL)-based Disease Diagnosis (OPPDL-DD) model for the IoMT environment. Initially, the proposed model enables IoMT devices to collect patient data, which is then preprocessed to optimize quality. To decrease the computational difficulty during diagnosis, a radix tree structure is employed. In addition, the ElGamal public key cryptosystem with Rat Swarm Optimizer (EIG-RSO) is applied to encrypt the data. Upon transmission of the encrypted data to the cloud, the respective decryption process occurs and the actual data is reconstructed. Finally, a hybridized methodology combining a Gated Recurrent Unit (GRU) with a Convolutional Neural Network (CNN) is exploited as the classification model to diagnose the disease. Extensive sets of simulations were conducted to highlight the performance of the proposed model on a benchmark dataset. The experimental outcomes ensure that the proposed model is superior to existing methods under different measures.
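The ElGamal cryptosystem at the core of the encryption stage can be sketched in textbook form. This is plain ElGamal with toy parameters for illustration only (real deployments use large safe primes or elliptic curves), and the RSO-based key tuning of EIG-RSO is omitted:

```python
import random

P = 2579  # small demo prime modulus (toy parameter only)
G = 2     # group element used as the base

def keygen(rng):
    x = rng.randrange(2, P - 1)          # private key
    return x, pow(G, x, P)               # (private, public = G^x mod P)

def encrypt(pub, m, rng):
    k = rng.randrange(2, P - 1)          # fresh ephemeral key per message
    return pow(G, k, P), (m * pow(pub, k, P)) % P   # ciphertext (c1, c2)

def decrypt(priv, c1, c2):
    s = pow(c1, priv, P)                 # shared secret c1^x
    return (c2 * pow(s, P - 2, P)) % P   # m = c2 * s^(-1) (Fermat inverse)

rng = random.Random(42)
priv, pub = keygen(rng)
c1, c2 = encrypt(pub, 1234, rng)
print(decrypt(priv, c1, c2))  # 1234
```

Each message uses a fresh ephemeral `k`, so encrypting the same plaintext twice yields different ciphertexts, which is the property that makes ElGamal randomized.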
Cyberattacks are becoming gradually more sophisticated, requiring effective intrusion detection systems (IDSs) for monitoring computer resources and creating reports on anomalous or suspicious actions. With the popularity of Internet of Things (IoT) technology, the security of IoT networks is becoming a vital problem. Because of the huge number and varied kinds of IoT devices, protecting the IoT framework with a typical IDS can be a challenging task; typical IDSs have their restrictions once executed in IoT networks because of resource constraints and complexity. Therefore, this paper presents a new Blockchain Assisted Intrusion Detection System using Differential Flower Pollination with Deep Learning (BAIDS-DFPDL) model for the IoT environment. The presented BAIDS-DFPDL model mainly focuses on the identification and classification of intrusions in the IoT environment. To accomplish this, the presented BAIDS-DFPDL model follows blockchain (BC) technology for effective and secure data transmission among the agents. Besides, the presented BAIDS-DFPDL model designs a Differential Flower Pollination based feature selection (DFPFS) technique to elect features. Finally, sailfish optimization (SFO) with a Restricted Boltzmann Machine (RBM) model is applied for effectual recognition of intrusions. The simulation results on a benchmark dataset exhibit the enhanced performance of the BAIDS-DFPDL model over other models in the recognition of intrusions.
In recent times, Internet of Things (IoT) applications on the cloud may not be an effective solution for every IoT scenario, particularly for time-sensitive applications. A significant alternative is edge computing, which resolves the problem of end devices requiring high bandwidth. Edge computing can be considered a method of moving processing and communication resources from the cloud toward the edge. One consideration in the edge computing environment is resource management, which involves resource scheduling, load balancing, task scheduling, and quality of service (QoS) to accomplish improved performance. With this motivation, this paper presents a new soft computing based metaheuristic algorithm for resource scheduling (RS) in the edge computing environment. The SCBMA-RS model involves the hybridization of the Group Teaching Optimization Algorithm (GTOA) with the Rat Swarm Optimizer (RSO) algorithm for optimal resource allocation. The goal of the SCBMA-RS model is to identify and allocate resources to every incoming user request in such a way that the client's requirements are satisfied with the minimum number of possible resources and optimal energy consumption. The problem is formulated based on the availability of VMs, task characteristics, and queue dynamics. The integration of the GTOA and RSO algorithms helps to improve the allocation of resources among VMs in the data center. For experimental validation, a comprehensive set of simulations was performed using the CloudSim tool. The experimental results showcased the superior performance of the SCBMA-RS model in terms of different measures.
Wireless sensor networks (WSN) encompass a set of inexpensive, battery-powered sensor nodes, commonly employed for data gathering and tracking applications. Optimal energy utilization of the nodes in WSN is essential to capture data effectively and transmit it to the destination. The latest developments in energy-efficient clustering techniques can be widely applied to accomplish energy efficiency in the network. In this aspect, this paper presents an enhanced Archimedes optimization algorithm based cluster head selection (EAOA-CHS) approach for WSN. The goal of the EAOA-CHS method is to optimally choose the CHs from the available nodes in the WSN and then organize the nodes into a set of clusters. The EAOA is derived by incorporating a chaotic map and pseudo-random performance. Moreover, the EAOA-CHS technique determines a fitness function involving the total energy consumption and lifetime of the WSN. The design of the EAOA for CH election in the WSN depicts the novelty of the work. To exhibit the enhanced efficiency of the EAOA-CHS technique, a set of simulations was applied under 3 distinct conditions dependent upon the placement of the base station (BS). The simulation results pointed out the better outcomes of the EAOA-CHS technique over recent methods under all scenarios.
In network-based intrusion detection practice, there are more regular instances than intrusion instances. Because there is always a statistical imbalance between the instances, it is difficult to train an intrusion detection system effectively. In this work, we compare intrusion detection performance achieved by increasing the rarely appearing instances rather than by eliminating the frequently appearing duplicate instances; our technique mitigates the statistical imbalance in these instances. We also carried out an experiment on the training model by increasing the instances, stepping up the attack instances through 13 levels. The experiments included not only known attacks but also unknown new intrusions. The results are compared with existing studies from the literature and show an improvement in accuracy, sensitivity, and specificity over previous studies. The detection rates for the remote-to-local (R2L) and user-to-root (U2R) categories are improved significantly by adding relatively few instances; the detection of many intrusions increases from a very low to a very high detection rate, and the detection of newer attacks that had not been used in training improved from 9% to 12%. This study has practical applications in network administration for protecting against known and unknown attacks: if network administrators are running short of instances for some attacks, they can increase the number of rarely appearing instances, thereby improving the detection of both known and unknown new attacks.
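The core idea of increasing rare instances rather than discarding frequent ones can be sketched as plain random oversampling. This is a generic illustration, not the paper's 13-level schedule:

```python
import random
from collections import Counter

def oversample_rare(records, label_of, target, seed=0):
    """Duplicate instances of rare classes until each class has at
    least `target` instances; frequent classes are left untouched."""
    rng = random.Random(seed)
    by_class = {}
    for r in records:
        by_class.setdefault(label_of(r), []).append(r)
    out = list(records)
    for label, rows in by_class.items():
        for _ in range(max(0, target - len(rows))):
            out.append(rng.choice(rows))   # duplicate a random rare instance
    return out

# Hypothetical class mix mimicking an imbalanced IDS training set.
data = [("normal",)] * 95 + [("r2l",)] * 3 + [("u2r",)] * 2
balanced = oversample_rare(data, label_of=lambda r: r[0], target=20)
print(Counter(r[0] for r in balanced))
# Counter({'normal': 95, 'r2l': 20, 'u2r': 20})
```

Raising the R2L/U2R counts in steps, as the study does level by level, lets one measure how much each increment improves detection of those categories.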
The Internet of Things (IoT) is envisioned as a network of various wireless sensor nodes communicating with each other to offer state-of-the-art solutions to real-time problems. These networks of wireless sensors monitor the physical environment and report the collected data to the base station, allowing for smarter decisions. Localization in wireless sensor networks commonly places a sensor node in a two-dimensional plane. However, in some application areas, such as surveillance, underwater monitoring systems, and various environmental monitoring applications, wireless sensors are deployed in a three-dimensional space. Recently, localization-based applications have emerged as one of the most promising services related to IoT. In this paper, we propose a novel distributed range-free algorithm for node localization in wireless sensor networks. The proposed three-dimensional hop localization algorithm is based on a distance error correction factor, and the error decreases as the localization process proceeds. The distance correction factor is used at various stages of the localization process, which ultimately mitigates the error. We simulated the proposed algorithm using MATLAB and verified its accuracy. The simulation results are compared with some well-known existing algorithms from the literature and show that the proposed three-dimensional error-correction based algorithm performs better than the existing algorithms.
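A range-free, hop-based 3-D position estimate of the general family described above can be sketched in DV-hop style: anchors derive an average hop size from known inter-anchor distances, hop counts become distance estimates, and a position fix follows. The final weighted-average fix below is a crude stand-in for the usual least-squares trilateration, and the paper's error-correction factor is omitted:

```python
import math

def dv_hop_estimate(anchors, anchor_hops, node_hops):
    """DV-hop-flavored 3-D sketch.
    anchors     : list of known (x, y, z) anchor positions
    anchor_hops : anchor_hops[i][j] = hop count between anchors i and j
    node_hops   : node_hops[i] = hops from the unknown node to anchor i
    """
    n = len(anchors)
    # 1. Each anchor's average hop size = total true distance / total hops.
    hop_size = []
    for i in range(n):
        d = sum(math.dist(anchors[i], anchors[j]) for j in range(n) if j != i)
        h = sum(anchor_hops[i][j] for j in range(n) if j != i)
        hop_size.append(d / h)
    # 2. Estimated distance to each anchor.
    dist_est = [hop_size[i] * node_hops[i] for i in range(n)]
    # 3. Crude fix: inverse-distance-weighted average of anchor positions.
    w = [1.0 / max(d, 1e-9) for d in dist_est]
    tot = sum(w)
    return tuple(sum(w[i] * anchors[i][k] for i in range(n)) / tot
                 for k in range(3))

# Hypothetical anchors and hop counts; the node sits near the first anchor.
anchors = [(0, 0, 0), (30, 0, 0), (0, 30, 0), (0, 0, 30)]
anchor_hops = [[0, 3, 3, 3], [3, 0, 4, 4], [3, 4, 0, 4], [3, 4, 4, 0]]
node_hops = [1, 2, 2, 2]
print(dv_hop_estimate(anchors, anchor_hops, node_hops))
```

The estimate lands closest to the anchor with the fewest hops, which is the behavior the correction factor then refines stage by stage.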
Diabetic Retinopathy (DR) is a significant blinding disease that poses a serious and rapidly growing threat to human vision. Classification and severity grading of DR are difficult processes to accomplish. Traditionally, grading depends on ophthalmoscopically visible symptoms of growing severity, ranked on a stepwise scale from no retinopathy to various levels of DR severity. This paper presents an ensemble of Orthogonal Learning Particle Swarm Optimization (OPSO) algorithm-based Convolutional Neural Network model (EOPSO-CNN) to perform DR detection and grading. The proposed EOPSO-CNN model involves three main processes: preprocessing, feature extraction, and classification. The model initially involves a preprocessing stage that removes the noise present in the input image. Then, the watershed algorithm is applied to segment the preprocessed images. Feature extraction then takes place by leveraging the EOPSO-CNN model. Finally, the extracted feature vectors are provided to a Decision Tree (DT) classifier to classify the DR images. The experiments were carried out using the Messidor DR dataset, and the results showed considerably better performance by the proposed method over the compared methods. The simulation outcome offered maximum classification accuracy, sensitivity, and specificity values of 98.47%, 96.43%, and 99.02%, respectively.
Restructuring in the power system has brought significant transformation to the power sector by ending the dominance of single consolidated utilities. However, the collaboration of various manufacturing agencies, independent power producers, and buyers has created complex installation processes. Continuously varying active load and the lack of coordinated best practices among the varied participants pose a huge hazard: any sudden load deviation gives rise to immediate changes in frequency and tie-line power errors. It is essential to keep every zone's frequency and tie-line power within permitted confines following fluctuations in the load. This can be accomplished by implementing Load Frequency Control under the bilateral case, stabilizing the power and frequency deviations within the interconnected power grid. Balancing the net deviation across multiple areas is possible by minimizing the imbalance of bilateral contracts with the help of proportional-integral and advanced controllers such as the Harris Hawks Optimizer. We proposed the advanced Harris Hawks Optimizer based controller model and validated it on a test bench. The experimental results show a delay time of 0.0029 s and a settling time of only 20.86 s. This model can also be leveraged to examine the decision boundaries of the bilateral case.
Abstract: Present trends in the smart world reflect the extensive use of limited resources through information and communication technology. Limited resources such as space, mobility, and energy have been consumed rigorously toward creating optimized yet smart instances. An IoT-integrated smart city vision that combines systems such as noise and air-loss monitoring, web monitoring and fire detection, and smart waste bins had yet to be proposed; such a combination has not been clearly addressed in previous research. This paper focuses on developing an effective system for monitoring losses and managing traffic, thereby innovating the smart city at large with digitalized, integrated systems and software for fast and effective implementation. In the proposed system, real-time analysis is performed on data collected by various sensors to analyze the different factors responsible for such losses. The proposed work is validated on a real case study.
Abstract: In recent years, the domains of the Internet of Things (IoT) and Wireless Sensor Networks (WSN) have been combined to enhance sensor-related data transmission in forthcoming networking applications. Clustering and routing techniques are widely treated as effective methods for reducing energy consumption and lengthening the lifetime of WSN-assisted IoT networks. In this view, this paper presents an Ensemble of Metaheuristic Optimization based QoS-aware Clustering with Multihop Routing (EMO-QoSCMR) protocol for IoT-assisted WSN. The proposed EMO-QoSCMR protocol aims to achieve QoS in terms of energy, throughput, delay, and lifetime. The model involves a two-stage process, namely clustering and routing. First, the protocol applies a cross-entropy rain optimization algorithm based clustering (CEROAC) technique to select an optimal set of cluster heads (CHs) and construct clusters. Then, an oppositional chaos game optimization based routing (OCGOR) technique is employed to derive an optimal set of routes in the IoT-assisted WSN. The model derives a fitness function from parameters of the IoT nodes such as residual energy and distance to the sink node. The proposed EMO-QoSCMR technique resulted in an enhanced NAN of 64 nodes, whereas the LEACH, PSO-ECHS, E-OEERP, and iCSHS methods resulted in lesser NAN values of 2, 10, 42, and 51 nodes, respectively. The performance of the presented protocol has been evaluated in terms of energy efficiency and network lifetime.
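The fitness function named in the abstract above (residual energy plus distance to the sink) can be sketched as a simple weighted score used to rank candidate cluster heads. This is a minimal illustration only: the weights, the node representation, and the normalization constants below are assumptions, not the actual EMO-QoSCMR formulation.

```python
import math

def ch_fitness(node, sink, w_energy=0.6, w_dist=0.4,
               max_energy=1.0, max_dist=100.0):
    """Score a candidate cluster head: higher residual energy and a
    shorter distance to the sink both raise the fitness."""
    dist = math.dist(node["pos"], sink)
    return (w_energy * (node["energy"] / max_energy)
            + w_dist * (1.0 - dist / max_dist))

def select_cluster_heads(nodes, sink, k):
    """Pick the k highest-scoring nodes as cluster heads (CHs)."""
    return sorted(nodes, key=lambda n: ch_fitness(n, sink), reverse=True)[:k]

nodes = [
    {"id": 0, "pos": (10, 10), "energy": 0.9},   # close to sink, fresh battery
    {"id": 1, "pos": (80, 80), "energy": 0.9},   # far from sink
    {"id": 2, "pos": (15, 12), "energy": 0.2},   # close but depleted
]
heads = select_cluster_heads(nodes, sink=(0, 0), k=1)
```

Here the nearby, high-energy node wins; a real protocol would fold in further QoS terms such as delay and throughput.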
Abstract: Raw data are classified using clustering techniques to create disjoint clusters in a reasonable manner, and many clustering algorithms based on specific parameters have been proposed to process high-volume datasets. This paper focuses on cluster analysis based on neutrosophic set implication, i.e., a k-means algorithm combined with a threshold-based clustering technique. The algorithm addresses the shortcomings of the k-means clustering algorithm while overcoming the limitations of the threshold-based clustering algorithm. To evaluate the validity of the proposed method, several validity measures and validity indices are applied to the Iris dataset (from the University of California, Irvine, Machine Learning Repository) alongside the k-means and threshold-based clustering algorithms. The proposed method yields more segregated datasets with compact clusters, thus achieving higher validity indices, and eliminates the limitations of the threshold-based clustering algorithm.
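The threshold-based side of the hybrid described above can be sketched as a one-pass assignment rule: a point joins its nearest centroid unless that centroid is farther than a distance threshold, in which case it seeds a new cluster. The incremental mean update and the threshold value are illustrative assumptions, not the authors' exact procedure.

```python
import math

def threshold_cluster(points, threshold):
    """Assign each point to the nearest existing centroid; if the nearest
    centroid is farther than `threshold`, start a new cluster there.
    Centroids are updated incrementally as running means."""
    centroids, counts, labels = [], [], []
    for p in points:
        if centroids:
            d, j = min((math.dist(p, c), j) for j, c in enumerate(centroids))
        else:
            d, j = float("inf"), -1
        if d > threshold:
            centroids.append(list(p))
            counts.append(1)
            labels.append(len(centroids) - 1)
        else:
            counts[j] += 1
            centroids[j] = [c + (x - c) / counts[j]
                            for c, x in zip(centroids[j], p)]
            labels.append(j)
    return labels, centroids

pts = [(0, 0), (0.5, 0.2), (10, 10), (10.2, 9.8)]
labels, cents = threshold_cluster(pts, threshold=3.0)
```

A k-means refinement pass over these seeded centroids would then give the hybrid behavior the abstract describes.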
Funding: Supported by the 2022 Yeungnam University Research Grant.
Abstract: Automated segmentation and classification of biomedical images play a vital part in the diagnosis of brain tumors (BT). Early analysis of a primary brain tumor enables a quicker treatment response, which improves patient survival rates. Locating and classifying BTs in large medical-image databases obtained from routine clinical tasks by manual processes incurs high costs in both effort and time, so an automatic detection, localization, and classification process is desirable and useful. This study introduces an Automated Deep Residual U-Net Segmentation with Classification model (ADRU-SCM) for brain tumor diagnosis. The presented ADRU-SCM model focuses mainly on the segmentation and classification of BT. To accomplish this, the model applies Wiener filtering (WF) based preprocessing to eradicate noise in the input. In addition, the ADRU-SCM model follows a deep residual U-Net segmentation model to determine the affected brain regions. Moreover, the VGG-19 model is exploited as a feature extractor. Finally, a tunicate swarm optimization (TSO) with gated recurrent unit (GRU) model is applied for classification, where the TSO algorithm effectually tunes the GRU hyperparameters. The performance of the ADRU-SCM model was validated on the FigShare dataset, and the outcomes pointed out its better performance over recent approaches.
Funding: This research was supported in part by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (NRF-2021R1A6A1A03039493), and in part by the NRF grant funded by the Korea government (MSIT) (NRF-2022R1A2C1004401).
Abstract: Phishing is a type of cybercrime in which cyber-attackers pose as authorized persons or entities and steal the victims' sensitive data. E-mails, instant messages, and phone calls are common vectors for such attacks. Though security models are continuously upgraded to prevent cyberattacks, hackers find innovative ways to target victims, and a drastic increase has been observed in the number of phishing emails sent to potential targets. This scenario necessitates an effective classification model. Numerous conventional models are available in the literature for classifying phishing emails, and both Machine Learning (ML) techniques and Deep Learning (DL) models have been employed. The current study presents an Intelligent Cuckoo Search (CS) Optimization Algorithm with a Deep Learning-based Phishing Email Detection and Classification (ICSOA-DLPEC) model. The aim of the proposed ICSOA-DLPEC model is to effectually distinguish emails as either legitimate or phishing. At the initial stage, pre-processing is performed in three steps, namely email cleaning, tokenization, and stop-word elimination. Then, the N-gram approach is applied to extract useful feature vectors. Next, a Gated Recurrent Unit (GRU) model detects and classifies phishing emails, and the CS algorithm is used to fine-tune the parameters of the GRU model. The performance of the proposed ICSOA-DLPEC model was experimentally validated using a benchmark dataset, and the results were assessed under several dimensions. Extensive comparative studies confirmed the superior performance of the proposed ICSOA-DLPEC model over other existing approaches, with a maximum accuracy of 99.72%.
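The N-gram feature-extraction step described above can be illustrated with plain word bigrams mapped onto a fixed vocabulary; the whitespace tokenizer and the tiny vocabulary here are toy assumptions, not the ICSOA-DLPEC pipeline.

```python
def ngrams(text, n=2):
    """Return the list of word n-grams after simple lowercasing and
    whitespace tokenization."""
    tokens = text.lower().split()
    return [" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def ngram_features(text, vocabulary, n=2):
    """Binary feature vector: 1 if the vocabulary n-gram occurs in the text."""
    grams = set(ngrams(text, n))
    return [1 if g in grams else 0 for g in vocabulary]

# Hypothetical bigram vocabulary; phrases like "verify your" often
# occur in phishing mail, "meeting at" in legitimate mail.
vocab = ["verify your", "your account", "meeting at"]
vec = ngram_features("Please verify your account now", vocab)
```

Vectors of this kind would then be fed to the GRU classifier the abstract describes.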
Funding: This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (NRF-2021R1A6A1A03039493).
Abstract: With advances in intelligent transportation systems (ITSs), traffic forecasting has gained significant interest, as robust traffic prediction plays an important part in ITS applications such as traffic signal control, navigation, and route mapping. A traffic prediction model aims to predict traffic conditions from past traffic data. For more accurate traffic prediction, this study proposes an optimal deep learning-enabled statistical analysis model: an optimal convolutional neural network with attention long short-term memory (OCNN-ALSTM) model for traffic prediction. The proposed OCNN-ALSTM technique first preprocesses the traffic data using min-max normalization. The OCNN-ALSTM technique is then executed for classifying and predicting traffic data in real-time cases. To enhance the predictive outcomes of the OCNN-ALSTM technique, the bird swarm algorithm (BSA) is employed, thereby improving the overall efficacy of the network. The design of the BSA for optimal hyperparameter tuning of the CNN-ALSTM model constitutes the novelty of the work. Experimental validation of the OCNN-ALSTM technique was performed using benchmark datasets, and the results, examined under several aspects, reported enhanced outcomes over recent methods.
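Min-max normalization, the preprocessing step named above, linearly rescales each series into a fixed range; a minimal sketch with an invented traffic-flow series:

```python
def min_max_normalize(values, new_min=0.0, new_max=1.0):
    """Linearly rescale values into [new_min, new_max]:
    x' = new_min + (x - min) / (max - min) * (new_max - new_min)."""
    lo, hi = min(values), max(values)
    if hi == lo:                      # constant series: map to the lower bound
        return [new_min for _ in values]
    return [new_min + (v - lo) / (hi - lo) * (new_max - new_min)
            for v in values]

flows = [120, 300, 480, 210]          # hypothetical vehicles per interval
norm = min_max_normalize(flows)
```

Scaling every input feature into the same range like this keeps gradient-based training of the CNN-ALSTM numerically well behaved.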
Funding: Supported by the Ulsan Metropolitan City-ETRI joint cooperation project [21AS1600], "Development of intelligent technology for key industries and autonomous human-mobile-space autonomous collaboration intelligence technology".
Abstract: Software-defined networking (SDN) is one of the most progressive and prominent innovations in Information and Communications Technology, mitigating issues that conventional networks experience. However, the traffic data generated by various applications is increasing day by day. In addition, as organizations accelerate their digital transformation, the amount of information to be processed inside an organization has increased explosively, and a software-defined network can become a bottleneck or even unavailable. Various models have been proposed in the literature to balance this load, but most consider only a limited set of parameters and ignore controller and transmission-media loads, which also degrade the performance of software-defined networks. This work illustrates how a software-defined network can tackle the load at its software layer and distribute it effectively. We propose a deep learning-based convolutional neural network load-balancing technique to handle the load on a software-defined network. Simulation results show that the proposed model requires fewer resources than existing machine learning-based load-balancing techniques.
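The paper's CNN-based balancer is not reproduced here; as a point of comparison for what any SDN balancer must decide, a greedy baseline that always routes the next flow to the least-loaded controller can be sketched (controller names and flow costs are made up):

```python
def assign_flows(flows, controllers):
    """Greedy baseline: send each new flow's load to the controller
    with the smallest current load."""
    load = {c: 0 for c in controllers}
    placement = []
    for flow_id, cost in flows:
        target = min(load, key=load.get)   # least-loaded controller
        load[target] += cost
        placement.append((flow_id, target))
    return placement, load

flows = [("f1", 5), ("f2", 3), ("f3", 4)]
placement, load = assign_flows(flows, ["c1", "c2"])
```

A learned balancer would replace the `min(...)` decision with a model that also accounts for controller and transmission-media loads, as the abstract argues.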
Funding: Supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (2020R1A6A1A03038540); by the Korea Institute of Planning and Evaluation for Technology in Food, Agriculture, Forestry and Fisheries (IPET) through the Digital Breeding Transformation Technology Development Program, funded by the Ministry of Agriculture, Food and Rural Affairs (MAFRA) (322063-03-1-SB010); and by the Technology Development Program (RS-2022-00156456), funded by the Ministry of SMEs and Startups (MSS, Korea).
Abstract: Fruit classification utilizing a deep convolutional neural network (CNN) is among the most promising applications of computer vision (CV). Deep learning-based classification has made it possible to recognize fruits from pictures, but owing to similarity and complexity, fruit recognition becomes difficult for fruits stacked on a weighing scale. Recently, Machine Learning (ML) methods have been used in fruit farming and agriculture, bringing great convenience to human life: an automated ML-based system can perform the fruit classification and sorting tasks previously managed by human experts. CNNs have attained incredible outcomes in image classification across several domains. Considering the success of transfer learning and CNNs in other image classification problems, this study introduces an Artificial Hummingbird Optimization with Siamese Convolutional Neural Network based Fruit Classification (AMO-SCNNFC) model. In the presented AMO-SCNNFC technique, image preprocessing is performed to enhance the contrast level of the image. In addition, spiral optimization (SPO) with the VGG-16 model is utilized to derive feature vectors. For fruit classification, AHO with an end-to-end SCNN (ESCNN) model is applied to identify different classes of fruits. The performance of the AMO-SCNNFC technique was validated using a dataset comprising diverse classes of fruit images. Extensive comparison studies reported improvements of the AMO-SCNNFC technique over other approaches, with a higher accuracy of 99.88%.
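The abstract does not say which contrast-enhancement method is used; a common minimal choice is a linear contrast stretch, sketched here on a toy grayscale image represented as nested lists.

```python
def stretch_contrast(pixels, out_min=0, out_max=255):
    """Linear contrast stretch of a grayscale image (list of rows):
    map the darkest pixel to out_min and the brightest to out_max."""
    flat = [p for row in pixels for p in row]
    lo, hi = min(flat), max(flat)
    if hi == lo:                       # flat image: nothing to stretch
        return [[out_min] * len(row) for row in pixels]
    scale = (out_max - out_min) / (hi - lo)
    return [[round(out_min + (p - lo) * scale) for p in row]
            for row in pixels]

img = [[50, 100], [150, 200]]          # toy 2x2 grayscale image
out = stretch_contrast(img)
```

Real pipelines often prefer histogram equalization or CLAHE, but the effect on downstream feature extraction is the same in spirit: the full intensity range is used.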
Funding: This research was supported in part by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (NRF-2021R1A6A1A03039493); in part by the NRF grant funded by the Korea government (MSIT) (NRF-2022R1A2C1004401); and in part by the 2022 Yeungnam University Research Grant.
Abstract: The exponential growth of data necessitates an effective data storage scheme to manage large quantities of data. To accomplish this, a Deoxyribonucleic Acid (DNA) digital data storage process can be employed, which encodes and decodes binary data to and from synthesized strands of DNA. Vector quantization (VQ) is a commonly employed scheme for image compression, and optimal codebook generation is an effective process for reaching maximum compression efficiency. This article introduces a new DNA Computing with Water Strider Algorithm based Vector Quantization (DNAC-WSAVQ) technique for data storage systems. The proposed DNAC-WSAVQ technique encodes data using DNA computing and then compresses it for effective storage. The DNAC-WSAVQ model first performs DNA encoding on the input images to generate a binary encoded form. In addition, a Water Strider algorithm with Linde-Buzo-Gray (WSA-LBG) model is applied for the compression process, whereby the storage area can be considerably minimized; the WSA is applied to generate an optimal codebook for LBG. The performance validation of the DNAC-WSAVQ model was carried out, and the results, inspected under several measures, highlighted its improved outcomes over existing methods.
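The LBG codebook loop that the Water Strider algorithm is said to optimize alternates nearest-codeword assignment with centroid updates. A toy version is sketched below, seeded with the first k training vectors rather than a metaheuristic; the data are invented 2-D vectors standing in for image blocks.

```python
import math

def lbg_codebook(vectors, k, iters=10):
    """Toy LBG: start from the first k training vectors, then alternate
    nearest-codeword assignment and centroid update."""
    codebook = [list(v) for v in vectors[:k]]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in vectors:
            j = min(range(k), key=lambda i: math.dist(v, codebook[i]))
            groups[j].append(v)
        for i, g in enumerate(groups):
            if g:  # empty cells keep their previous codeword
                codebook[i] = [sum(col) / len(g) for col in zip(*g)]
    return codebook

data = [(0, 0), (1, 1), (9, 9), (10, 10)]
cb = lbg_codebook(data, k=2)
```

LBG converges only to a local optimum, which is exactly why metaheuristics such as the WSA are layered on top to search for better codebooks.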
Funding: Supported in part by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (NRF-2021R1A6A1A03039493), and by the NRF grant funded by the Korea government (MSIT) (NRF-2022R1A2C1004401).
Abstract: In recent times, financial globalization has drastically increased in various ways to improve the quality of services with advanced resources. Successful applications of bitcoin blockchain (BC) techniques lead stockholders to weigh the return and risk of financial products, with particular focus on predicting the return rate and risk rate. Therefore, an automatic bitcoin return-rate prediction model becomes essential for BC financial products. Newly designed machine learning (ML) and deep learning (DL) approaches pave the way for such predictive methods. This study introduces a novel Jellyfish Search Optimization based Extreme Learning Machine with Autoencoder (JSO-ELMAE) for return-rate prediction of BC financial products. The presented JSO-ELMAE model designs a new ELMAE model for predicting the return rate of financial products. Besides, the JSO algorithm is exploited to tune the parameters of the ELMAE model, which in turn boosts the prediction results; the JSO technique assists in optimal parameter adjustment of the ELMAE model for predicting bitcoin return rates. Experimental validation of the JSO-ELMAE model was executed, and the outcomes, inspected in many aspects, demonstrated enhanced performance over recent state-of-the-art approaches with a minimal RMSE of 0.1562.
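The quantities being predicted and scored above can be pinned down concretely: the period-over-period return rate of a price series, and the RMSE used to report figures such as 0.1562. The ELMAE model itself is not reproduced; the price series below is invented.

```python
import math

def return_rates(prices):
    """Simple period-over-period return rate: r_t = (p_t - p_{t-1}) / p_{t-1}."""
    return [(b - a) / a for a, b in zip(prices, prices[1:])]

def rmse(pred, actual):
    """Root-mean-square error used to score return-rate predictions."""
    return math.sqrt(sum((p - a) ** 2 for p, a in zip(pred, actual)) / len(pred))

prices = [100.0, 110.0, 99.0]          # hypothetical daily closes
rates = return_rates(prices)
```

Any predictor, ELMAE-based or otherwise, is then judged by the RMSE between its predicted and realized rates.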
Funding: This work was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (2020R1A6A1A03038540), and by the NRF grant funded by the Korea government, Ministry of Science and ICT (MSIT) (2021R1F1A1046339).
Abstract: Owing to the latest advancements in remote sensing, it has become easier to acquire high-quality images using various satellites and sensing components. However, the massive quantity of data poses a challenge for storing and effectively transmitting remote sensing images, so image compression techniques can be utilized to process them. In this aspect, vector quantization (VQ) can be employed for image compression, and the widely applied VQ approach is Linde-Buzo-Gray (LBG), which creates a locally optimal codebook for image reconstruction. Constructing the codebook can be treated as an optimization problem, and metaheuristic algorithms can be utilized to resolve it. With this motivation, this article presents an intelligent satin bowerbird optimizer based compression technique (ISBO-CT) for remote sensing images. The goal of the ISBO-CT technique is to proficiently compress remote sensing images through the effective design of the codebook. To this end, the ISBO-CT technique employs the satin bowerbird optimizer (SBO) together with the LBG approach; the design of the SBO algorithm for remote sensing image compression depicts the novelty of the work. To showcase the enhanced efficiency of the ISBO-CT approach, an extensive range of simulations was applied, and the outcomes reported optimal performance relative to recent state-of-the-art image compression approaches.
Abstract: Traditional Wireless Sensor Networks (WSNs) comprise cost-effective sensors that send physical parameters of the target environment to an intended user. With the evolution of technology, multimedia sensor nodes have become a hot research topic, since they can continuously gather multimedia and scalar content from the target domain. The existence of multimedia sensors, integrated with effective signal processing and multimedia source coding approaches, has led to the increased application of Wireless Multimedia Sensor Networks (WMSN). This sort of network has the potential to capture, transmit, and receive multimedia content. Since energy is a major constraint in WMSN, novel clustering approaches are essential to deal with adaptive WMSN topologies and to prolong network lifetime. With this motivation, the current study develops an Enhanced Spider Monkey Optimization-based Energy-Aware Clustering Scheme (ESMO-EACS) for WMSN. The proposed ESMO-EACS model derives the ESMO algorithm by incorporating the concepts of the SMO algorithm and quantum computing, and designs fitness functions using distinct input parameters for effective construction of clusters. A comprehensive experimental analysis was conducted to validate the effectiveness of the proposed ESMO-EACS technique in terms of different performance measures, and the simulation outcome established its superiority over other methods.
Funding: This work was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (2020R1A6A1A03038540); by an NRF grant funded by the Korea government, Ministry of Science and ICT (MSIT) (2021R1F1A1046339); and by a grant (20212020900150) from "Development and Demonstration of Technology for Customers Bigdata-based Energy Management in the Field of Heat Supply Chain", funded by the Ministry of Trade, Industry and Energy of the Korean government.
Abstract: In recent times, the Internet of Medical Things (IoMT) has gained much attention in the medical services and healthcare management domain. Since the healthcare sector generates massive volumes of data, such as personal details, historical medical data, hospitalization records, and discharge records, IoMT devices have evolved with the potential to handle such quantities of data. The privacy and security of the data gathered by IoMT gadgets are major issues when transmitting or storing it in the cloud, and advancements in Artificial Intelligence (AI) and encryption techniques offer ways to handle massive quantities of medical data securely. In this view, the current study presents a new Optimal Privacy Preserving and Deep Learning (DL)-based Disease Diagnosis (OPPDL-DD) model for the IoMT environment. Initially, the proposed model enables IoMT devices to collect patient data, which is then preprocessed to optimize quality. To decrease computational difficulty during diagnosis, a Radix Tree structure is employed. In addition, the ElGamal public key cryptosystem with Rat Swarm Optimizer (EIG-RSO) is applied to encrypt the data. Upon transmission of the encrypted data to the cloud, the respective decryption process reconstructs the actual data. Finally, a hybridized methodology combining a Gated Recurrent Unit (GRU) with a Convolutional Neural Network (CNN) is exploited as the classification model to diagnose the disease. Extensive sets of simulations were conducted on a benchmark dataset, and the experimental outcomes ensure that the proposed model is superior to existing methods under different measures.
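The ElGamal step above can be illustrated end to end with deliberately tiny, insecure parameters (a 13-bit prime, no padding); real deployments use large primes or elliptic-curve groups, and the RSO-based parameter tuning from the paper is not modeled here.

```python
import random

def elgamal_keys(p=7919, g=2):
    """Toy ElGamal key pair over a small prime (illustration only)."""
    x = random.randrange(2, p - 1)            # private key
    return (p, g, pow(g, x, p)), x            # (public key, private key)

def elgamal_encrypt(pub, m):
    p, g, h = pub
    k = random.randrange(2, p - 1)            # fresh ephemeral key per message
    return pow(g, k, p), (m * pow(h, k, p)) % p

def elgamal_decrypt(pub, x, cipher):
    p, _, _ = pub
    c1, c2 = cipher
    s = pow(c1, x, p)                         # shared secret g^(kx) mod p
    return (c2 * pow(s, p - 2, p)) % p        # divide by s via Fermat inverse

pub, priv = elgamal_keys()
cipher = elgamal_encrypt(pub, 1234)           # message must be < p
plain = elgamal_decrypt(pub, priv, cipher)
```

Decryption recovers the plaintext because c2 / g^(kx) = m * h^k * g^(-kx) = m (mod p).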
Funding: This research was supported in part by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (NRF-2021R1A6A1A03039493); in part by the NRF grant funded by the Korea government (MSIT) (NRF-2022R1A2C1004401); and in part by the 2022 Yeungnam University Research Grant.
Abstract: Cyberattacks are gradually becoming more sophisticated, requiring effective intrusion detection systems (IDSs) to monitor computer resources and generate reports on anomalous or suspicious actions. With the popularity of Internet of Things (IoT) technology, the security of IoT networks is becoming a vital problem. Because of the huge number and varied kinds of IoT devices, protecting the IoT framework using a typical IDS can be challenging; typical IDSs have their restrictions when applied to IoT networks because of resource constraints and complexity. Therefore, this paper presents a new Blockchain Assisted Intrusion Detection System using Differential Flower Pollination with Deep Learning (BAIDS-DFPDL) model for the IoT environment. The presented BAIDS-DFPDL model mainly focuses on the identification and classification of intrusions in the IoT environment. To accomplish this, the model follows blockchain (BC) technology for effective and secure data transmission among agents. Besides, the model designs a Differential Flower Pollination based Feature Selection (DFPFS) technique to elect features. Finally, sailfish optimization (SFO) with a Restricted Boltzmann Machine (RBM) model is applied for effectual recognition of intrusions. The simulation results on a benchmark dataset exhibit the enhanced performance of the BAIDS-DFPDL model over other models in the recognition of intrusions.
Funding: This research was supported by the Hankuk University of Foreign Studies Research Fund of 2021, and by the MSIT (Ministry of Science and ICT), Korea, under the National Program for Excellence in SW, supervised by the IITP (Institute of Information & Communications Technology Planning & Evaluation) in 2021 (2019-0-01816).
Abstract: In recent times, running Internet of Things (IoT) applications in the cloud may not be the effective solution for every IoT scenario, particularly for time-sensitive applications. A significant alternative is edge computing, which resolves the high-bandwidth requirement of end devices by moving processing and communication resources from the cloud toward the edge. One key consideration in the edge computing environment is resource management, which involves resource scheduling, load balancing, task scheduling, and quality of service (QoS) to accomplish improved performance. With this motivation, this paper presents a new soft computing based metaheuristic algorithm for resource scheduling (SCBMA-RS) in the edge computing environment. The SCBMA-RS model involves the hybridization of the Group Teaching Optimization Algorithm (GTOA) with the Rat Swarm Optimizer (RSO) algorithm for optimal resource allocation. The goal of the SCBMA-RS model is to identify and allocate resources to every incoming user request such that the client's requirements are satisfied with the minimum number of resources and optimal energy consumption. The problem is formulated based on the availability of VMs, task characteristics, and queue dynamics; the integration of the GTOA and RSO algorithms improves the allocation of resources among VMs in the data center. For experimental validation, a comprehensive set of simulations was performed using the CloudSim tool. The experimental results showcased the superior performance of the SCBMA-RS model in terms of different measures.
Funding: This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (NRF-2021R1A6A1A03039493).
Abstract: Wireless sensor networks (WSN) encompass a set of inexpensive, battery-powered sensor nodes commonly employed for data gathering and tracking applications. Optimal energy utilization of the nodes in a WSN is essential to capture data effectively and transmit it to the destination, and the latest energy-efficient clustering techniques can be widely applied to accomplish this. In this aspect, this paper presents an enhanced Archimedes optimization algorithm based cluster head selection (EAOA-CHS) approach for WSN. The goal of the EAOA-CHS method is to optimally choose the CHs from the available nodes in the WSN and then organize the nodes into a set of clusters. The EAOA is derived by incorporating a chaotic map and pseudo-random behavior. Moreover, the EAOA-CHS technique determines a fitness function involving the total energy consumption and lifetime of the WSN. The design of the EAOA for CH election in the WSN depicts the novelty of this work. To exhibit the enhanced efficiency of the EAOA-CHS technique, a set of simulations was conducted under three distinct conditions depending on the placement of the base station (BS). The simulation results pointed out the better outcomes of the EAOA-CHS technique over recent methods under all scenarios.
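The chaotic-map ingredient mentioned above is commonly a logistic map used to generate well-spread values for initializing or perturbing a metaheuristic population; the specific map, seed, and growth rate below are assumptions, not the EAOA's published parameters.

```python
def logistic_map_sequence(n, x0=0.7, r=4.0):
    """Chaotic logistic map x_{t+1} = r * x_t * (1 - x_t). For r = 4 and
    x0 in (0, 1), iterates stay in [0, 1] and cover it chaotically,
    which can seed candidate solutions more evenly than plain PRNG draws."""
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        xs.append(x)
    return xs

seq = logistic_map_sequence(5)
```

Each value in `seq` could then be scaled into the search-space bounds of one candidate cluster-head configuration.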
Funding: Supported by the Institute for Information and Communications Technology Planning and Evaluation (IITP), funded by the Korea Government (MSIT) under Grant 20190007960022002 (2020000000110).
Abstract: In network-based intrusion detection practice, there are more regular instances than intrusion instances. Because of this statistical imbalance, it is difficult to train an intrusion detection system effectively. In this work, we compare intrusion detection performance when increasing the rarely appearing instances rather than eliminating the frequently appearing duplicate instances; our technique mitigates the statistical imbalance. We also carried out an experiment on the training model by increasing the attack instances step by step, up to 13 levels. The experiments included not only known attacks but also unknown new intrusions. The results are compared with existing studies from the literature and show an improvement in accuracy, sensitivity, and specificity over previous studies. The detection rates for the remote-to-local (R2L) and user-to-root (U2R) categories are improved significantly by adding relatively few instances; the detection of many intrusions increases from a very low to a very high detection rate, and the detection of newer attacks that had not been used in training improved from 9% to 12%. This study has practical applications in network administration for protecting against known and unknown attacks: if network administrators are running short of instances for some attacks, they can increase the number of rarely appearing instances, thereby improving the detection of both known and unknown new attacks.
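The core idea described above, duplicating rarely appearing instances instead of deleting frequent ones, can be sketched as simple random oversampling; the class names and target count below are illustrative, not the paper's 13-level schedule.

```python
import random

def oversample(records, labels, target_count, seed=0):
    """Duplicate randomly chosen minority-class instances until every
    class has at least `target_count` examples."""
    rng = random.Random(seed)
    by_class = {}
    for rec, lab in zip(records, labels):
        by_class.setdefault(lab, []).append(rec)
    out_recs, out_labs = list(records), list(labels)
    for lab, recs in by_class.items():
        while out_labs.count(lab) < target_count:
            out_recs.append(rng.choice(recs))   # re-draw a minority instance
            out_labs.append(lab)
    return out_recs, out_labs

# 8 "normal" records vs only 2 "r2l" attack records: boost r2l to 5.
recs2, labs2 = oversample(list(range(10)),
                          ["normal"] * 8 + ["r2l"] * 2, target_count=5)
```

Majority classes are left untouched, which is the paper's point: no information is discarded, only the rare side is amplified.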
Funding: Supported by the Research Grant of Kwangwoon University in 2020.
Abstract: The Internet of Things (IoT) is envisioned as a network of wireless sensor nodes communicating with each other to offer state-of-the-art solutions to real-time problems. These networks of wireless sensors monitor the physical environment and report the collected data to the base station, allowing for smarter decisions. Localization in wireless sensor networks typically places a sensor node in a two-dimensional plane; however, in application areas such as surveillance, underwater monitoring systems, and various environmental monitoring applications, wireless sensors are deployed in three-dimensional space. Recently, localization-based applications have emerged as one of the most promising IoT services. In this paper, we propose a novel distributed range-free algorithm for node localization in wireless sensor networks. The proposed three-dimensional hop-based localization algorithm relies on a distance-error correction factor applied at various stages of the localization process, so that the error decreases as localization proceeds. We simulated the proposed algorithm using MATLAB and verified its accuracy. The simulation results are compared with several well-known existing algorithms from the literature and show that the proposed three-dimensional error-correction-based algorithm performs better.
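The hop-based distance estimate underlying such range-free schemes can be sketched DV-Hop style: anchors with known positions derive an average per-hop distance, and an unknown node multiplies it by its hop count. The paper's specific correction factor is not reproduced here, and the positions and hop counts below are invented.

```python
import math

def average_hop_size(anchor_positions, hop_matrix):
    """Average single-hop distance estimated from anchor-to-anchor
    geometric distances and their known hop counts (DV-Hop style)."""
    total_dist = total_hops = 0
    n = len(anchor_positions)
    for i in range(n):
        for j in range(i + 1, n):
            total_dist += math.dist(anchor_positions[i], anchor_positions[j])
            total_hops += hop_matrix[i][j]
    return total_dist / total_hops

anchors = [(0, 0), (30, 0), (0, 40)]          # anchor coordinates
hops = [[0, 3, 4], [3, 0, 5], [4, 5, 0]]      # hop counts between anchors
hop_size = average_hop_size(anchors, hops)
est = 2 * hop_size     # unknown node two hops away from an anchor
```

A correction factor of the kind the abstract describes would then shrink the gap between such hop-count estimates and the true geometric distances at each stage.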
Abstract: Diabetic Retinopathy (DR) is a significant blinding disease that poses a serious and rapidly growing threat to human vision. Classification and severity grading of DR are difficult processes to accomplish: traditionally, grading depends on ophthalmoscopically visible symptoms of growing severity, ranked on a stepwise scale from no retinopathy to various levels of DR severity. This paper presents an ensemble of Orthogonal Learning Particle Swarm Optimization (OPSO) algorithm-based Convolutional Neural Network (CNN) model, EOPSO-CNN, to perform DR detection and grading. The proposed EOPSO-CNN model involves three main processes: preprocessing, feature extraction, and classification. The model initially involves a preprocessing stage that removes noise from the input image. Then, the watershed algorithm is applied to segment the preprocessed images, followed by feature extraction using the EOPSO-CNN model. Finally, the extracted feature vectors are provided to a Decision Tree (DT) classifier to classify the DR images. The experiments were carried out using the Messidor DR dataset, and the results showed a considerable performance advantage of the proposed method over compared methods, with maximum classification accuracy, sensitivity, and specificity values of 98.47%, 96.43%, and 99.02%, respectively.
Funding: The Deanship of Scientific Research (DSR) at King Abdulaziz University, Jeddah, Saudi Arabia, has funded this project under grant no. (FP-221-43).
Abstract: Restructuring of the power system has brought significant transformation to the power sector by ending the dominance of single consolidated utilities. However, the collaboration of various manufacturing agencies, independent power producers, and buyers has created complex installation processes. The continuously varying active load, and the lack of coordinated measures among the varied participants, is a serious hazard: any sudden load deviation gives rise to immediate changes in frequency and tie-line power errors. It is essential to keep every zone's frequency and tie-line power within permitted confines following fluctuations in the load. This can be accomplished by implementing Load Frequency Control under the bilateral case, stabilizing the power and frequency deviations within the interconnected power grid. Balancing the net deviation across multiple areas is possible by minimizing the imbalance of bilateral contracts with the help of proportional-integral and advanced controllers such as the Harris Hawks Optimizer. We propose the advanced Harris Hawks optimizer-based controller model and validate it on a test bench. The experimental results show a delay time of 0.0029 s and a settling time of only 20.86 s. This model can also be leveraged to examine the decision boundaries of the bilateral case.
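The control loop being tuned above can be illustrated with a one-area linear frequency model under a PI controller; every parameter below (inertia, damping, gains, load step) is an assumption for illustration, not the paper's test-bench data, and the Harris Hawks tuning itself is not modeled.

```python
def simulate_lfc(kp=2.0, ki=4.0, load_step=0.1, dt=0.01, steps=2000,
                 M=0.2, D=1.0):
    """Euler simulation of a one-area linear frequency model
    M * df/dt = -D*f + u - load_step, with PI control
    u = -kp*f - ki * integral(f). Returns the frequency-deviation history."""
    f, integral, history = 0.0, 0.0, []
    for _ in range(steps):
        u = -kp * f - ki * integral        # PI control action
        dfdt = (-D * f + u - load_step) / M
        f += dfdt * dt
        integral += f * dt
        history.append(f)
    return history

hist = simulate_lfc()
```

The integral term drives the steady-state frequency deviation back to zero after the load step; a metaheuristic such as the Harris Hawks Optimizer would search over kp and ki to minimize settling time and overshoot.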