Since COVID-19 infections are increasing all over the world, there is a need to develop solutions for early and accurate diagnosis. Detection methods for COVID-19 include screening methods like chest X-rays and Computed Tomography (CT) scans. More work must be done on preprocessing the datasets, such as eliminating the diaphragm portions, enhancing the image intensity, and minimizing noise. In addition to the detection of COVID-19, the severity of the infection needs to be estimated. The HSDC model is proposed to solve these problems; it detects and classifies the severity of COVID-19 from X-ray and CT-scan images. For CT-scan images, the histogram threshold of the input image is adaptively determined using the ICH Swarm Optimization Segmentation (ICHSeg) algorithm. Based on statistical and shape-based feature vectors (FVs), the extracted regions are classified using a hybrid model for CT images (the HSDCCT algorithm). When infections are detected, they are classified as Normal, Moderate, or Severe. For X-ray images, a fused FHI is formed by extracting Histogram of Oriented Gradients (HOG) and Image Profile (IP) features. The FHI features of X-ray images are classified by the hybrid Support Vector Machine (SVM) and Deep Convolutional Neural Network (DCNN) HSDCX algorithm into COVID-19, Pneumonia, or Normal. Experimental results have shown that the accuracy of the HSDC model reaches a highest of 94.6% for CT-scan images and 95.6% for X-ray images when compared to SVM and DCNN. This study thus significantly helps medical professionals and doctors diagnose COVID-19 infections quickly, which is much needed at present.
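The fused FHI descriptor combines gradient-orientation histograms (HOG) with row/column intensity profiles (IP). A minimal numpy sketch of that idea follows; the bin count, the whole-image histogram, and the simple concatenation are illustrative assumptions, not the exact HSDC pipeline:

```python
import numpy as np

def hog_like_histogram(img, bins=9):
    # Gradient magnitude and unsigned orientation over the whole image
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.degrees(np.arctan2(gy, gx)) % 180.0
    hist, _ = np.histogram(ang, bins=bins, range=(0, 180), weights=mag)
    return hist / (hist.sum() + 1e-12)            # normalise

def image_profiles(img):
    # Image profile (IP): mean intensity along rows and columns
    return np.concatenate([img.mean(axis=1), img.mean(axis=0)])

def fused_fhi(img, bins=9):
    # Fuse the HOG-style histogram with the intensity profiles
    return np.concatenate([hog_like_histogram(img, bins), image_profiles(img)])

img = np.zeros((32, 32))
img[:, 16:] = 255.0                               # vertical edge
fv = fused_fhi(img)
print(fv.shape)                                   # (9 + 32 + 32,) = (73,)
```

For the vertical edge, all gradient energy lies in the 0-degree orientation bin, so the first histogram bin dominates the descriptor.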
Efficient routing protocols are crucial for enabling secure communication among the highly mobile and self-configurable nodes in Vehicular Ad-Hoc Networks (VANETs). In this work, we present a performance evaluation of different routing protocols in VANETs based on the currently available research. Our study focuses on analyzing the strengths and weaknesses of the routing protocols, namely Ad-Hoc On-demand Distance Vector (AODV), Dynamic Source Routing (DSR), and Destination-Sequenced Distance-Vector (DSDV), under varying network conditions. We examine the protocols' performance based on key metrics such as throughput, delay, and energy consumption. We also highlight the advantages and limitations of each protocol in different scenarios, such as varying vehicular densities and mobility patterns. Our results show that AODV outperforms DSR and DSDV in terms of throughput and delay, while DSR consumes the least energy. We also observe that the performance of the routing protocols varies with the density of vehicles and the mobility patterns of the nodes. Our study highlights the importance of conducting real-world experiments to evaluate the performance of routing protocols in VANETs, as they provide more realistic and accurate results than simulation-based studies. Our findings can help in the selection and design of efficient and secure routing protocols for VANETs.
The introduction of new technologies has increased communication network coverage and the number of associating nodes in dynamic communication networks (DCN). Because the network is decentralized and dynamic, a few nodes in the network may not cooperate with other nodes. These uncooperative nodes, also known as selfish nodes, degrade the performance of the cooperative nodes: they cause congestion, high delay, security concerns, and resource depletion. This study presents an effective selfish node detection method to address these problems. The Price of Anarchy (PoA) and the Price of Stability (PoS) in game theory, in the presence of a Nash Equilibrium (NE), are discussed for selfish node detection. This is a novel experiment to detect selfish nodes in a network using the PoA. Moreover, a least-response-dynamics-based Capacitated Selfish Resource Allocation (CSRA) game is introduced to improve resource usage among the nodes. The suggested strategy is simulated using the SolarWinds simulator, and the simulation results show that, when compared to earlier methods, the new scheme offers promising performance in terms of delivery rate, delay, and throughput.
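The Price of Anarchy is the ratio between the social cost of the worst Nash equilibrium and the optimal social cost; the Price of Stability uses the best equilibrium instead. A small, self-contained illustration on a two-player cost game (the game matrix is a textbook prisoner's-dilemma-style example, not the paper's DCN model):

```python
from itertools import product

# costs[(a, b)] = (cost to player 1, cost to player 2); lower is better.
# "c" = cooperate (share resources), "d" = defect (act selfishly)
costs = {("c", "c"): (1, 1), ("c", "d"): (3, 0),
         ("d", "c"): (0, 3), ("d", "d"): (2, 2)}
strategies = ["c", "d"]

def is_nash(profile):
    a, b = profile
    # No player can lower its own cost by deviating unilaterally
    p1_ok = all(costs[(a, b)][0] <= costs[(x, b)][0] for x in strategies)
    p2_ok = all(costs[(a, b)][1] <= costs[(a, y)][1] for y in strategies)
    return p1_ok and p2_ok

social = {p: sum(costs[p]) for p in product(strategies, strategies)}
equilibria = [p for p in social if is_nash(p)]
opt = min(social.values())

poa = max(social[p] for p in equilibria) / opt   # worst equilibrium
pos = min(social[p] for p in equilibria) / opt   # best equilibrium
print(equilibria, poa, pos)   # [('d', 'd')] 2.0 2.0
```

Here mutual defection is the only equilibrium, so PoA = PoS = 2: selfish behavior doubles the social cost relative to cooperation, which is exactly the kind of inefficiency a PoA-based detector looks for.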
Fog computing is a rapidly growing technology that helps mitigate breaches between the cloud and edge servers. It offers the benefits of the network edge while maximizing the probability of interaction with the cloud. However, fog computing's characteristics make it susceptible to security challenges. Issues in the Physical Layer Security (PLS) aspect of fog computing, including authentication, integrity, and confidentiality, are considered potential causes of security breaches. In this work, the Octonion Algebra-inspired Non-Commutative Ring-based Fully Homomorphic Encryption scheme (NCR-FHE) is proposed as a secrecy improvement technique to overcome impersonation attacks. The proposed approach draws on the properties of octonion algebra to provide maximum security for big data-based applications. The major physical layer security issues that may potentially lead to security breaches, and the issues enabling impersonation attacks in the fog computing environment, were identified. The proposed approach was compared with existing encryption approaches and shown to be a robust technique for identifying impersonation attacks in fog and edge networks. The computation cost of the proposed NCR-FHE is reduced by 7.18%, 8.64%, 9.42%, and 10.36% in terms of communication overhead for varying packet sizes, when compared to the benchmark ECDH-DH, LHPPS, BF-PHE, and SHE-PABF schemes.
The creation of a 3D rendering model involves the prediction of an accurate depth map for the input images. A modified semi-global block matching algorithm with variable window size, combined with gradient assessment of objects, is proposed to predict the depth map. 3D modeling and view synthesis algorithms can effectively handle the obtained disparity maps. This work uses the consistency check method to find an accurate depth map and identify occluded pixels. The prediction of the disparity map by semi-global block matching is evaluated on the Middlebury stereo benchmark dataset. The improved depth map quality within a reasonable processing time outperforms the other existing depth map prediction algorithms. The experimental results have shown that the proposed depth map prediction can identify the inter-object boundary even in the presence of occlusion, with less detection error and runtime. We observed that the Middlebury stereo dataset has very few images with occluded objects, which made the attainment of gain cumbersome. Considering this, we created our own dataset with occlusion using the structured lighting technique. The proposed regularization term, used as an optimization step in the graph cut algorithm, handles occlusion for different smoothing coefficients. The experimental results demonstrated that our dataset outperformed the Tsukuba dataset regarding the percentage of occluded pixels.
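Block matching searches, for every left-image pixel, the horizontal shift that minimizes a window dissimilarity cost. A toy sum-of-absolute-differences (SAD) matcher on a synthetic pair illustrates the disparity idea; this is plain local block matching, not the paper's modified semi-global algorithm, and the window and image sizes are illustrative:

```python
import numpy as np

def sad_disparity(left, right, max_disp=4, win=2):
    """Per-pixel disparity by minimising the sum of absolute differences
    (SAD) over a (2*win+1) x (2*win+1) window."""
    h, w = left.shape
    disp = np.zeros((h, w), dtype=int)
    for y in range(win, h - win):
        for x in range(win + max_disp, w - win):
            patch = left[y - win:y + win + 1, x - win:x + win + 1]
            costs = [np.abs(patch - right[y - win:y + win + 1,
                                          x - d - win:x - d + win + 1]).sum()
                     for d in range(max_disp + 1)]
            disp[y, x] = int(np.argmin(costs))   # best-matching shift
    return disp

rng = np.random.default_rng(0)
left = rng.integers(0, 255, size=(20, 30)).astype(float)
true_d = 2
right = np.zeros_like(left)
right[:, :-true_d] = left[:, true_d:]     # shift: right(x) = left(x + d)
disp = sad_disparity(left, right)
print(np.median(disp[2:-2, 6:-2]))        # recovers the shift in the interior
```

On this noise-free pair the correct shift gives a SAD of exactly zero, so the interior disparities recover the true value; real matchers add the semi-global smoothness terms and consistency checks the abstract describes to cope with texture-less and occluded regions.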
This paper presents a new approach to the delineation of local labor markets based on evolutionary computation. The aim of the exercise is the division of a given territory into functional regions based on travel-to-work flows. Such regions are defined so that a high degree of inter-regional separation and of intra-regional integration, in both cases in terms of commuting flows, is guaranteed. Additional requirements include the absence of overlap between delineated regions and the exhaustive coverage of the whole territory. The procedure is based on the maximization of a fitness function that measures aggregate intra-region interaction under constraints of inter-region separation and minimum size. In the experimentation stage, two variations of the fitness function are used, and the process is also applied as a final stage for the optimization of the results from one of the most successful existing methods, which is used by the British authorities for the delineation of travel-to-work areas (TTWAs). The empirical exercise is conducted using real data for a sufficiently large territory that is considered representative given the density and variety of travel-to-work patterns that it embraces. The paper includes a quantitative comparison with alternative traditional methods, an assessment of the performance of the set of operators specifically designed to handle the regionalization problem, and an evaluation of the convergence process. The robustness of the solutions, which is crucial in a research and policy-making context, is also discussed.
Clustering algorithm optimization can minimize the topology maintenance overhead that results from dynamic topology, limited resources, and non-centralized architecture in large-scale vehicular ad hoc networks (VANETs) for smart transportation. The performance of a clustering algorithm varies with the underlying mobility model. To design a robust clustering algorithm, careful attention must be paid to components like mobility models and performance objectives, since a clustering algorithm may not perform well with every mobility pattern. Therefore, we propose a supervisory protocol (SP) that observes the mobility pattern of vehicles and identifies the realistic mobility model through microscopic features. An analytical model can be used to determine an efficient clustering algorithm for a specific mobility model (MM). SP selects the best clustering scheme according to the mobility model and guarantees consistent performance throughout VANET operations. The simulation is performed in three parts: the central part sets up the clustering environment; in the second part, the clustering algorithms are tested for efficiency in a constrained environment for some time; and the third part represents the proposed scheme. The simulation results show that the proposed scheme outperforms clustering algorithms such as honey bee algorithm-based clustering and memetic clustering in terms of cluster count, re-affiliation rate, control overhead, and cluster lifetime.
The world is rapidly changing with the advance of information technology. The expansion of the Internet of Things (IoT) is a huge step in the development of the smart city. The IoT consists of connected devices that transfer information. The IoT architecture permits on-demand services to a public pool of resources. Cloud computing plays a vital role in developing IoT-enabled smart applications. The integration of cloud computing enhances the offering of distributed resources in the smart city. Improper management of the security requirements of cloud-assisted IoT systems can bring about risks to availability, security, performance, confidentiality, and privacy. The key reason for cloud- and IoT-enabled smart city application failure is improper security practices at the early stages of development. This article proposes a framework to collect security requirements during the initial development phase of cloud-assisted IoT-enabled smart city applications. Its three-layered architecture includes privacy-preserved stakeholder analysis (PPSA), security requirement modeling and validation (SRMV), and secure cloud-assistance (SCA). A case study highlights the applicability and effectiveness of the proposed framework. A hybrid survey enables the identification and evaluation of significant challenges.
An intrusion detection system (IDS) has become an important tool for ensuring security in the network. In recent times, machine learning (ML) and deep learning (DL) models can be applied for the effective identification of intrusions over the network. To resolve the security issues, this paper presents a new Binary Butterfly Optimization algorithm-based Feature Selection with Deep Reinforcement Learning (DRL) technique, called BBOFS-DRL, for intrusion detection. The proposed BBOFS-DRL model mainly accomplishes the recognition of intrusions in the network. To attain this, the BBOFS-DRL model initially designs the BBOFS algorithm based on the traditional butterfly optimization algorithm (BOA) to elect feature subsets. Besides, the DRL model is employed for the proper identification and classification of intrusions that exist in the network. Furthermore, the beetle antenna search (BAS) technique is applied to tune the DRL parameters for enhanced intrusion detection efficiency. To ensure the superior intrusion detection outcomes of the BBOFS-DRL model, a wide-ranging experimental analysis is performed against a benchmark dataset. The simulation results report the supremacy of the BBOFS-DRL model over recent state-of-the-art approaches.
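Binary variants of swarm optimizers such as BOA typically map a continuous position through a transfer function and threshold it to obtain a 0/1 feature mask. A minimal sketch of that binarization step; the sigmoid transfer and the 0.5 threshold are common conventions assumed here, not details taken from the paper:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def binarize_position(position, threshold=0.5):
    """Map a continuous butterfly position to a binary feature mask:
    a 1 in the mask means the corresponding feature is selected."""
    return (sigmoid(position) > threshold).astype(int)

rng = np.random.default_rng(1)
position = rng.normal(size=8)          # one butterfly in an 8-feature space
mask = binarize_position(position)
selected = np.flatnonzero(mask)        # indices of selected features
print(mask, selected)
```

The selected subset would then be scored by a fitness function (e.g. classifier accuracy minus a penalty on subset size), and the continuous positions updated by the usual butterfly fragrance-based moves.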
The Brain Tumor (BT) is created by an uncontrollable rise of anomalous cells in brain tissue, and brain tumors are of two types: malignant and benign. A benign BT does not affect the neighbouring healthy and normal tissue; however, a malignant one can affect the adjacent brain tissues, which results in death. Initial recognition of a BT is highly significant to protecting the patient's life. Generally, a BT can be identified through the magnetic resonance imaging (MRI) scanning technique. But radiotherapists do not achieve effective tumor segmentation in MRI images because of the position and unequal shape of the tumor in the brain. Recently, machine learning (ML) has prevailed against standard image processing techniques, and several studies denote the superiority of ML techniques over standard techniques. Therefore, this study develops a novel brain tumor detection and classification model using metaheuristic optimization with machine learning (BTDC-MOML). To accomplish the detection of brain tumors effectively, a Computer-Aided Design (CAD) model using ML techniques is proposed in this research manuscript. Initially, input image pre-processing is performed using Gabor filtering (GF)-based noise removal, contrast enhancement, and skull stripping. Next, mayfly optimization with Kapur's thresholding-based segmentation takes place. For feature extraction purposes, local diagonal extreme patterns (LDEP) are exploited. At last, the Extreme Gradient Boosting (XGBoost) model is used for the BT classification process. The accuracy analysis is performed in terms of learning accuracy, and the validation accuracy is measured to determine the efficiency of the proposed research work. The experimental validation of the proposed model demonstrates its promising performance over other existing methods.
Phishing is a type of cybercrime in which cyber-attackers pose as authorized persons or entities and hack the victims' sensitive data. E-mails, instant messages, and phone calls are some of the common modes used in cyberattacks. Though security models are continuously upgraded to prevent cyberattacks, hackers find innovative ways to target the victims. Against this background, a drastic increase is observed in the number of phishing emails sent to potential targets. This scenario necessitates the design of an effective classification model. Though numerous conventional models are available in the literature for the proficient classification of phishing emails, Machine Learning (ML) techniques and Deep Learning (DL) models have also been employed. The current study presents an Intelligent Cuckoo Search (CS) Optimization Algorithm with a Deep Learning-based Phishing Email Detection and Classification (ICSOA-DLPEC) model. The aim of the proposed ICSOA-DLPEC model is to effectually distinguish emails as either legitimate or phishing. At the initial stage, pre-processing is performed through three stages: email cleaning, tokenization, and stop-word elimination. Then, the N-gram approach is applied to extract the useful feature vectors. Moreover, the CS algorithm is employed with the Gated Recurrent Unit (GRU) model to detect and classify phishing emails, and the CS algorithm is also used to fine-tune the parameters involved in the GRU model. The performance of the proposed ICSOA-DLPEC model was experimentally validated using a benchmark dataset, and the results were assessed under several dimensions. Extensive comparative studies were conducted, and the results confirmed the superior performance of the proposed ICSOA-DLPEC model over other existing approaches. The proposed model achieved a maximum accuracy of 99.72%.
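The three pre-processing stages, cleaning, tokenization, and stop-word elimination, followed by N-gram extraction, can be sketched in plain Python. The tiny stop-word list and the regex cleaning rule below are illustrative assumptions, not the paper's exact configuration:

```python
import re

STOP_WORDS = {"a", "an", "the", "is", "to", "of", "and", "you", "your"}

def clean(text):
    # Email cleaning: lowercase, keep only letters and whitespace
    return re.sub(r"[^a-z\s]", " ", text.lower())

def tokenize(text):
    return text.split()

def remove_stop_words(tokens):
    return [t for t in tokens if t not in STOP_WORDS]

def ngrams(tokens, n=2):
    # Sliding window of n consecutive tokens
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

email = "Claim your FREE prize now!!! Click http://spam.example"
tokens = remove_stop_words(tokenize(clean(email)))
print(tokens)
print(ngrams(tokens, 2))
```

The resulting N-grams would then be mapped to feature vectors (e.g. counts or TF-IDF weights) before being fed to the CS-tuned GRU classifier.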
Because of their strong ability to solve problems, evolutionary multitask optimization (EMTO) algorithms have been widely studied recently. Evolutionary algorithms have the advantage of fast searching for the optimal solution, but they easily fall into local optima and are difficult to generalize. Combining evolutionary multitask algorithms with evolutionary optimization algorithms can be an effective method for solving these problems. Through the implicit parallelism of the tasks themselves and the knowledge transfer between tasks, more promising individuals can be generated in the evolution process, which can jump out of the local optimum. How to better combine the two has also been studied more and more. This paper explores the existing evolutionary multitasking theory and improvement schemes in detail. Then, it summarizes the applications of EMTO in different scenarios. Finally, according to the existing research, future research trends and potential exploration directions are revealed.
Numerous methods are analysed in detail to improve task scheduling and data security performance in the cloud environment. The methods involve scheduling according to factors like makespan, waiting time, cost, deadline, and popularity. However, these methods are inappropriate for achieving higher scheduling performance. Regarding data security, existing methods use various encryption schemes but introduce significant service interruption. This article sketches a practical Real-time Application-Centric TRS (Throughput-Resource utilization-Success) Scheduling with Data Security (RATRSDS) model by considering all these issues in task scheduling and data security. The method identifies the required resources and their claim time by receiving the service requests. Further, for the list of resources as services, the method computes throughput support (Thrs) according to the number of statements executed and the complete statements of the service. Similarly, the method computes resource utilization support (Ruts) according to the idle time on any duty cycle and the total servicing time. Also, the method computes the value of success support (Sus) according to the number of completions for the number of allocations. The method estimates the TRS score for different resources using all these support measures. According to the value of the TRS score, the services are ranked and scheduled. On the other side, based on the requirements of service requests, the method computes Requirement Support (RS), and the selection and allocation of services are performed. Similarly, route security is enforced by choosing the route according to the Route Support Measure (RSM). Finally, data security is implemented with a service-based encryption technique. The RATRSDS scheme has claimed higher performance in data security and scheduling.
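The abstract defines three support measures: Thrs from statements executed versus total statements, Ruts from idle time versus total servicing time, and Sus from completions versus allocations. A minimal sketch of scoring and ranking resources follows; combining the three supports by a simple average is an assumption for illustration, since the exact aggregation is not given in the abstract:

```python
def trs_score(executed, total_stmts, idle, service_time, completed, allocated):
    thrs = executed / total_stmts        # throughput support
    ruts = 1.0 - idle / service_time     # resource-utilization support
    sus = completed / allocated          # success support
    return (thrs + ruts + sus) / 3.0     # assumed aggregation

# Hypothetical telemetry for two candidate services
resources = {
    "svc-A": trs_score(90, 100, 10, 100, 45, 50),
    "svc-B": trs_score(60, 100, 40, 100, 30, 50),
}
ranking = sorted(resources, key=resources.get, reverse=True)
print(resources, ranking)   # svc-A ranks first
```

Requests would then be dispatched to services in ranking order, with the RS and RSM checks applied on the selection and routing side.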
The impact of a Distributed Denial of Service (DDoS) attack on Software Defined Networks (SDN) is briefly analyzed. Many approaches to detecting DDoS attacks exist, varying in the feature being considered and the method used. Still, these methods fall short in detecting DDoS attacks and mitigating them. To improve the performance of SDN, an efficient Real-time Multi-Constrained Adaptive Replication and Traffic Approximation Model (RMCARTAM) is sketched in this article. The RMCARTAM considers different parameters or constraints in running the different controllers responsible for handling incoming packets. The model is designed with multiple controllers to handle network traffic but can turn controllers on or off according to requirements. The multi-constraint adaptive replication model monitors different features of network traffic, like the rate of packet reception, class-based packet reception, and target-specific reception. According to these features, the method estimates the Replication Turning Weight (RTW), based on which the triggering of controllers is performed. Similarly, the method applies Traffic Approximation (TA) in the detection of DDoS attacks. The detection of a DDoS attack is performed by approximating the incoming traffic to any service and using various features like hop count, payload, service frequency, and malformed frequency to compute various support measures on bandwidth access, data support, frequency support, malformed support, route support, and so on. Using all these support measures, the method computes a legitimate weight to conclude the behavior of any source and identify malicious nodes. Identified node details are used in the mitigation of DDoS attacks. The method improves network performance by reducing the power factor through switching controllers according to different factors, which also reduces cost. In the same way, the proposed model improves the accuracy of detecting DDoS attacks by estimating the features of incoming traffic from different corners.
Prediction of stock market value is highly risky; it is based on the concept of a time series forecasting system that can be used for investments in a safe environment with minimized chances of loss. The proposed model uses a real-time dataset of fifteen stocks as input into the system and, based on the data, predicts or forecasts future stock prices of different companies belonging to different sectors. The dataset includes approximately fifteen companies from different sectors; based on the forecast results, the user can decide whether to invest in a particular company or not. The forecasting is done for the next quarter. Our model uses three main concepts for forecasting results. The first, for stocks that show periodic change throughout the season, is Holt-Winters triple exponential smoothing. Three basic things taken into consideration by this algorithm are the base level, trend level, and seasoning factor. The values of all of these are calculated, and then the decomposition of these factors is done by the Holt-Winters algorithm. The second concept is the recurrent neural network. The specific model of recurrent neural network being used is Long Short-Term Memory, which is the same as a normal neural network except that each intermediate cell is a memory cell and retains its value till the next feedback loop. The third concept is a recommendation system, which filters and predicts ratings based on different factors.
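Additive Holt-Winters maintains exactly the three components named above (base level, trend, seasonal factors), each updated by its own smoothing parameter. A self-contained sketch; the initialization from seasonal means is one common convention and the parameter values are illustrative, not the paper's fitted settings:

```python
def holt_winters_additive(series, m, alpha=0.3, beta=0.1, gamma=0.2, horizon=4):
    """Triple exponential smoothing with seasonal period m; returns forecasts."""
    # Initial level/trend from the first two seasons, seasonal from deviations
    level = sum(series[:m]) / m
    trend = (sum(series[m:2 * m]) / m - level) / m
    seasonal = [series[i] - level for i in range(m)]

    for t, y in enumerate(series):
        s = seasonal[t % m]
        new_level = alpha * (y - s) + (1 - alpha) * (level + trend)
        trend = beta * (new_level - level) + (1 - beta) * trend
        seasonal[t % m] = gamma * (y - new_level) + (1 - gamma) * s
        level = new_level

    n = len(series)
    return [level + h * trend + seasonal[(n - 1 + h) % m]
            for h in range(1, horizon + 1)]

# Quarterly toy series: base level 10 plus a repeating seasonal pattern
seas = [2, -1, -2, 1]
series = [10 + seas[t % 4] for t in range(12)]
forecast = holt_winters_additive(series, m=4)
print(forecast)   # recovers the next season: [12.0, 9.0, 8.0, 11.0]
```

Because the toy series is exactly level-plus-seasonal, the smoothed components stay exact and the quarter-ahead forecasts reproduce the pattern; on real stock data the three smoothing parameters would be tuned to the series.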
Current revelations in medical imaging have seen a slew of computer-aided diagnostic (CAD) tools for radiologists developed. Brain tumor classification is essential for radiologists to fully support and better interpret magnetic resonance imaging (MRI). In this work, we report new observations based on binary brain tumor categorization using a hybrid CNN-LSTM. Initially, the collected image is pre-processed and augmented using steps such as rotation, cropping, zooming, CLAHE (Contrast Limited Adaptive Histogram Equalization), and Random Rotation with Panoramic Stitching (RRPS). Then, a method called particle swarm optimization (PSO) is used to segment tumor regions in an MR image. After that, a hybrid CNN-LSTM classifier is applied to classify an image as tumor or normal. In this proposed hybrid model, the CNN is used for generating the feature map and the LSTM is used for the classification process. The effectiveness of the proposed approach is analyzed based on different metrics and outcomes compared to different methods.
Handling service access in a cloud environment has been identified as a critical challenge in the modern internet world due to the increased rate of intrusion attacks. To address such threats towards cloud services, numerous techniques exist that mitigate service threats according to different metrics. Rule-based approaches are unsuitable for new threats, whereas trust-based systems estimate trust values based on behavior, flow, and other features. However, these methods fail to mitigate intrusion attacks at a higher rate. This article presents a novel Multi-Fractal Trust Evaluation Model (MFTEM) to overcome these deficiencies. The method involves analyzing service growth, network growth, and quality-of-service growth. The process estimates the user's trust in various ways: the support of the user in achieving higher service performance is measured by calculating Trusted Service Support (TSS); the user's trust in supporting the network stream is measured by computing Trusted Network Support (TNS); and the user's trust in achieving higher throughput is analyzed by computing Trusted QoS Support (TQS). Using all these measures, the method computes the Trust User Score (TUS) value to decide on the clearance of user requests. The proposed MFTEM model improves intrusion detection accuracy with higher performance.
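The model combines TSS, TNS, and TQS into a single Trust User Score that gates request clearance. A minimal sketch of that aggregation step; the equal weighting and the 0.5 clearance threshold are assumptions for illustration, not values given in the abstract:

```python
def trust_user_score(tss, tns, tqs, weights=(1/3, 1/3, 1/3)):
    """Aggregate the three trust supports into a single TUS in [0, 1]."""
    w1, w2, w3 = weights
    return w1 * tss + w2 * tns + w3 * tqs

def clear_request(tss, tns, tqs, threshold=0.5):
    # Clear the request only when the aggregated trust is high enough
    return trust_user_score(tss, tns, tqs) >= threshold

print(clear_request(0.9, 0.8, 0.7))   # True  -- well-behaved user
print(clear_request(0.2, 0.1, 0.3))   # False -- likely intrusive source
```

In practice the three supports would be recomputed continuously from observed service, network, and QoS behavior, so a user's clearance can be revoked as its traffic profile degrades.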
Deep learning models are identified as having a significant impact on various problems, and the same can be adapted to the problem of brain tumor classification. However, the several deep learning models presented earlier need better classification accuracy. An efficient Multi-Feature Approximation Based Convolution Neural Network (CNN) model (MFACNN) is proposed to handle this issue. The method reads the input 3D Magnetic Resonance Imaging (MRI) images and applies Gabor filters at multiple levels. The noise-removed image is equalized for quality by using histogram equalization. Further, features like white mass, grey mass, texture, and shape are extracted from the images. The extracted features are trained with a deep learning Convolution Neural Network (CNN). The network is designed with a single convolution layer for dimensionality reduction. The texture features obtained from the brain image are transformed into a multi-dimensional feature matrix, which is transformed into a single-dimensional feature vector at the convolution layer. The neurons of the intermediate layer are designed to measure White Mass Texture Support (WMTS), Gray Mass Texture Support (GMTS), White Mass Covariance Support (WMCS), Gray Mass Covariance Support (GMCS), and Class Texture Adhesive Support (CTAS). In the test phase, the neurons at the intermediate layer compute the above-mentioned support values towards various classes of images. Based on that, the method applies a Multi-Variate Feature Similarity Measure (MVFSM). Based on the importance of MVFSM, the process finds the class of the given brain image and produces an efficient result.
In healthcare systems, Internet of Things (IoT) innovation and development have opened new ways to evaluate patient data. A cloud-based platform tends to process data generated by IoT medical devices instead of requiring high-capacity storage and computational hardware. In this paper, an intelligent healthcare system is proposed for the prediction and severity analysis of lung disease from chest computed tomography (CT) images of patients with pneumonia, COVID-19, tuberculosis (TB), and cancer. Firstly, the CT images are captured and transmitted to the fog node through IoT devices. In the fog node, the image is modified into a convenient and efficient format for further processing. The Advanced Encryption Standard (AES) algorithm serves a substantial role in the IoT and fog nodes by preventing data from being accessed by other operating systems. Finally, the preprocessed image is classified automatically in the cloud using various transfer and ensemble learning models. Herein, transfer learning with different pre-trained deep learning architectures (Inception-ResNet-v2, VGG-19, ResNet-50) is adopted for feature extraction. The softmax outputs of the heterogeneous base classifiers provide the individual predictions. As a meta-classifier, the ensemble approach is employed to obtain the final optimal results. The disease-predicted image is consigned to a recurrent neural network with long short-term memory (RNN-LSTM) for severity analysis, and the patient is directed to seek therapy based on the outcome. The proposed method achieved 98.6% accuracy, 0.978 precision, 0.982 recall, and 0.974 F1-score on five-class classification. The experimental findings reveal that the proposed framework assists medical experts with lung disease screening and provides a valuable second perspective.
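The ensemble step averages the softmax outputs of the heterogeneous base classifiers and takes the argmax as the final prediction (soft voting). A numpy sketch with made-up base-classifier logits for the five classes (the logit values and network attributions are hypothetical, for illustration only):

```python
import numpy as np

def softmax(logits):
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

CLASSES = ["normal", "pneumonia", "covid-19", "tb", "cancer"]

# Hypothetical logits from three base networks for one CT image
logits = np.array([
    [0.2, 2.5, 1.0, 0.1, 0.0],   # e.g. Inception-ResNet-v2 head
    [0.1, 1.8, 2.0, 0.3, 0.2],   # e.g. VGG-19 head
    [0.0, 2.2, 1.1, 0.1, 0.1],   # e.g. ResNet-50 head
])
avg_prob = softmax(logits).mean(axis=0)        # soft-voting ensemble
prediction = CLASSES[int(avg_prob.argmax())]
print(prediction)   # "pneumonia"
```

Note that soft voting lets a confident majority override a single dissenting network: here the second head slightly prefers covid-19, but the averaged probabilities still favour pneumonia.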
Recently,energy harvesting wireless sensor networks(EHWSN)have increased significant attention among research communities.By harvesting energy from the neighboring environment,the sensors in EHWSN resolve the energy c...Recently,energy harvesting wireless sensor networks(EHWSN)have increased significant attention among research communities.By harvesting energy from the neighboring environment,the sensors in EHWSN resolve the energy constraint problem and offers lengthened network lifetime.Clustering is one of the proficient ways for accomplishing even improved lifetime in EHWSN.The clustering process intends to appropriately elect the cluster heads(CHs)and construct clusters.Though several models are available in the literature,it is still needed to accomplish energy efficiency and security in EHWSN.In this view,this study develops a novel Chaotic Rider Optimization Based Clustering Protocol for Secure Energy Harvesting Wireless Sensor Networks(CROC-SEHWSN)model.The presented CROC-SEHWSN model aims to accomplish energy efficiency by clustering the node in EHWSN.The CROC-SEHWSN model is based on the integration of chaotic concepts with traditional rider optimization(RO)algorithm.Besides,the CROC-SEHWSN model derives a fitness function(FF)involving seven distinct parameters connected to WSN.To accomplish security,trust factor and link quality metrics are considered in the FF.The design of RO algorithm for secure clustering process shows the novelty of the work.In order to demonstrate the enhanced performance of the CROC-SEHWSN approach,a wide range of simulations are carried out and the outcomes are inspected in distinct aspects.The experimental outcome demonstrated the superior performance of the CROC-SEHWSN technique on the recent approaches with maximum network lifetime of 387.40 and 393.30 s under two scenarios.展开更多
Abstract: Since COVID-19 infections are increasing all over the world, there is a need for solutions that enable its early and accurate diagnosis. Detection methods for COVID-19 include screening methods such as chest X-rays and Computed Tomography (CT) scans. More work must be done on preprocessing the datasets, such as eliminating the diaphragm portions, enhancing the image intensity, and minimizing noise. In addition to the detection of COVID-19, the severity of the infection needs to be estimated. The HSDC model is proposed to solve these problems; it detects and classifies the severity of COVID-19 from X-ray and CT-scan images. For CT-scan images, the histogram threshold of the input image is adaptively determined using the ICH Swarm Optimization Segmentation (ICHSeg) algorithm. Based on the statistical and shape-based feature vectors (FVs), the extracted regions are classified using a hybrid model for CT images (HSDCCT) algorithm. When infections are detected, they are classified as Normal, Moderate, or Severe. For X-ray images, a fused FHI is formed by extracting Histogram of Oriented Gradients (HOG) and Image Profile (IP) features. The FHI features of X-ray images are classified by the hybrid Support Vector Machine (SVM) and Deep Convolutional Neural Network (DCNN) HSDCX algorithm into COVID-19, Pneumonia, or Normal. Experimental results show that the accuracy of the HSDC model reaches a highest of 94.6 for CT-scan images and 95.6 for X-ray images when compared to SVM and DCNN. This study thus significantly helps medical professionals and doctors diagnose COVID-19 infections quickly, which is urgently needed at present.
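The ICHSeg algorithm itself is not specified in the abstract. As a rough illustration of what "adaptively determining a histogram threshold" means in segmentation, the classic Otsu method picks the threshold that maximizes the between-class variance of the intensity histogram; this is a stand-in for the idea, not the authors' algorithm:

```python
def otsu_threshold(pixels, levels=256):
    """Adaptive histogram threshold: maximize between-class variance."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    sum_bg = 0.0   # weighted intensity sum of the background class
    w_bg = 0       # background pixel count
    best_t, best_var = 0, -1.0
    for t in range(levels):
        w_bg += hist[t]
        if w_bg == 0:
            continue
        w_fg = total - w_bg
        if w_fg == 0:
            break
        sum_bg += t * hist[t]
        mean_bg = sum_bg / w_bg
        mean_fg = (sum_all - sum_bg) / w_fg
        var_between = w_bg * w_fg * (mean_bg - mean_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```

Pixels at or below the returned threshold fall into one class (e.g. background), the rest into the other; swarm-optimization variants search the same histogram for a threshold by a different criterion.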
Abstract: Efficient routing protocols are crucial for enabling secure communication among the highly mobile and self-configurable nodes in Vehicular Ad-Hoc Networks (VANETs). In this work, we present a performance evaluation of different routing protocols in VANETs based on the currently available research. Our study focuses on analyzing the strengths and weaknesses of the routing protocols Ad-Hoc On-demand Distance Vector (AODV), Dynamic Source Routing (DSR), and Destination-Sequenced Distance-Vector (DSDV) under varying network conditions. We examine the protocols' performance based on key metrics such as throughput, delay, and energy consumption. We also highlight the advantages and limitations of each protocol in different scenarios, such as varying vehicular densities and mobility patterns. Our results show that AODV outperforms DSR and DSDV in terms of throughput and delay, while DSR consumes the least energy. We also observe that the performance of the routing protocols varies with the density of vehicles and the mobility patterns of the nodes. Our study highlights the importance of conducting real-world experiments to evaluate the performance of routing protocols in VANETs, as they provide more realistic and accurate results than simulation-based studies. Our findings can help in the selection and design of efficient and secure routing protocols for VANETs.
Abstract: The introduction of new technologies has increased communication network coverage and the number of associating nodes in dynamic communication networks (DCN). Because the network is decentralized and dynamic, a few nodes may not associate with other nodes. These uncooperative nodes, also known as selfish nodes, degrade the performance of the cooperative nodes: they cause congestion, high delay, security concerns, and resource depletion. This study presents an effective selfish node detection method to address these problems. The Price of Anarchy (PoA) and the Price of Stability (PoS) in game theory, in the presence of a Nash Equilibrium (NE), are discussed for selfish node detection. This is a novel experiment to detect selfish nodes in a network using the PoA. Moreover, a least-response-dynamics-based Capacitated Selfish Resource Allocation (CSRA) game is introduced to improve resource usage among the nodes. The suggested strategy is simulated using the SolarWinds simulator, and the simulation results show that, compared to earlier methods, the new scheme offers promising performance in terms of delivery rate, delay, and throughput.
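The Price of Anarchy and Price of Stability have standard game-theoretic definitions: the ratio of the worst (respectively best) Nash-equilibrium social cost to the optimal social cost. A minimal sketch of those two ratios; the equilibrium and optimum costs below are illustrative numbers, not values from the paper:

```python
def price_of_anarchy(ne_costs, optimal_cost):
    """PoA: worst Nash equilibrium cost relative to the social optimum."""
    return max(ne_costs) / optimal_cost

def price_of_stability(ne_costs, optimal_cost):
    """PoS: best Nash equilibrium cost relative to the social optimum."""
    return min(ne_costs) / optimal_cost

# A game with two equilibria of social cost 4 and 6, optimum cost 2:
poa = price_of_anarchy([4.0, 6.0], 2.0)   # 3.0
pos = price_of_stability([4.0, 6.0], 2.0) # 2.0
```

A PoA far above 1 signals that selfish behavior (here, selfish nodes) can push the network well away from its cooperative optimum, which is what makes the ratio usable as a detection signal.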
Abstract: Fog computing is a rapidly growing technology that aids in pipelining the possibility of mitigating breaches between the cloud and edge servers. It facilitates the benefits of the network edge with the maximized probability of offering interaction with the cloud. However, the characteristics of fog computing are susceptible to security challenges. The issues with the Physical Layer Security (PLS) aspect of fog computing, which include authentication, integrity, and confidentiality, have been considered a source of the potential issues leading to security breaches. In this work, the Octonion Algebra-inspired Non-Commutative Ring-based Fully Homomorphic Encryption Scheme (NCR-FHE) is proposed as a secrecy improvement technique to overcome the impersonation attack in cloud computing. The proposed approach is derived from the benefits of octonion algebra to facilitate maximum security for big-data-based applications. The major issues in physical layer security that may potentially lead to security breaches were identified, as were the potential issues causing the impersonation attack in the fog computing environment. The proposed approach was compared with existing encryption approaches and shown to be a robust approach to identifying the impersonation attack for fog and edge networks. The computation cost of the proposed NCR-FHE is significantly reduced, by 7.18%, 8.64%, 9.42%, and 10.36% in terms of communication overhead for varying packet sizes, when compared to the benchmarked ECDH-DH, LHPPS, BF-PHE, and SHE-PABF schemes.
Abstract: The creation of a 3D rendering model involves the prediction of an accurate depth map for the input images. A modified semi-global block matching algorithm with variable window size and gradient assessment of objects is proposed to predict the depth map. 3D modeling and view synthesis algorithms can effectively handle the obtained disparity maps. This work uses the consistency check method to find an accurate depth map by identifying occluded pixels. The prediction of the disparity map by semi-global block matching is evaluated on the Middlebury stereo benchmark dataset. The improved depth map quality within a reasonable processing time outperforms other existing depth map prediction algorithms. The experimental results show that the proposed depth map prediction can identify inter-object boundaries even in the presence of occlusion, with less detection error and runtime. We observed that the Middlebury stereo dataset has very few images with occluded objects, which made the attainment of gain cumbersome. Considering this, we created our own dataset with occlusion using the structured lighting technique. The proposed regularization term, as an optimization step in the graph cut algorithm, handles occlusion for different smoothing coefficients. The experimental results demonstrate that our dataset outperforms the Tsukuba dataset regarding the percentage of occluded pixels.
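The consistency check mentioned above is the standard left-right check in stereo matching: a pixel is marked occluded when the disparity assigned in the left image disagrees with the disparity at the corresponding pixel of the right image. A one-scanline sketch, assuming integer disparities:

```python
def consistency_check(disp_left, disp_right, tol=1):
    """Mark pixels occluded where left and right disparities disagree."""
    width = len(disp_left)
    occluded = [False] * width
    for x in range(width):
        d = disp_left[x]
        xr = x - d  # matching pixel position in the right image
        if xr < 0 or xr >= len(disp_right) or abs(disp_right[xr] - d) > tol:
            occluded[x] = True
    return occluded
```

Pixels flagged here are exactly the ones the paper's pipeline would exclude or in-fill before handing the disparity map to view synthesis.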
Funding: This work was supported by the Spanish National Plan of R+D+i from the Spanish Ministry of Education and Science (Ministerio de Educación y Ciencia) for the project Local Labour Markets: New Methods of Delineation and Analysis (No. 8EJ2007-67767-C04-02), the European Social Fund (ESF), and the University of Alicante.
Abstract: This paper presents a new approach to the delineation of local labor markets based on evolutionary computation. The aim of the exercise is the division of a given territory into functional regions based on travel-to-work flows. Such regions are defined so that a high degree of inter-regional separation and of intra-regional integration (in both cases in terms of commuting flows) is guaranteed. Additional requirements include the absence of overlap between delineated regions and the exhaustive coverage of the whole territory. The procedure is based on the maximization of a fitness function that measures aggregate intra-region interaction under constraints of inter-region separation and minimum size. In the experimentation stage, two variations of the fitness function are used, and the process is also applied as a final stage for the optimization of the results from one of the most successful existing methods, which is used by the British authorities for the delineation of travel-to-work areas (TTWAs). The empirical exercise is conducted using real data for a sufficiently large territory that is considered representative given the density and variety of travel-to-work patterns it embraces. The paper includes a quantitative comparison with alternative traditional methods, an assessment of the performance of the set of operators specifically designed to handle the regionalization problem, and an evaluation of the convergence process. The robustness of the solutions, crucial in a research and policy-making context, is also discussed.
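The exact fitness function is not reproduced in the abstract. The sketch below only illustrates the core idea of "aggregate intra-region interaction", measured here as the share of total commuting flow retained inside regions; the separation and minimum-size constraints are omitted:

```python
def intra_region_fitness(flows, regions):
    """Share of total commuting flow contained within regions.

    flows[i][j] is the commuter count from zone i to zone j;
    regions is a partition of the zone indices.
    """
    total = sum(sum(row) for row in flows)
    intra = 0
    for region in regions:
        for i in region:
            for j in region:
                intra += flows[i][j]
    return intra / total

# Three zones; zones 0 and 1 exchange heavy flows, zone 2 is peripheral.
flows = [[0, 10, 1],
         [10, 0, 1],
         [1, 1, 0]]
score = intra_region_fitness(flows, [[0, 1], [2]])  # 20/24
```

An evolutionary search would mutate the partition and keep candidates that raise this score while still satisfying the separation and minimum-size constraints.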
Funding: The authors extend their appreciation to King Saud University for funding this work through Researchers Supporting Project number (RSP-2020/133), King Saud University, Riyadh, Saudi Arabia.
Abstract: Clustering algorithm optimization can minimize the topology maintenance overhead that results from dynamic topology, limited resources, and non-centralized architecture in large-scale vehicular ad hoc networks (VANETs) for smart transportation. The performance of a clustering algorithm varies with the underlying mobility model. To design a robust clustering algorithm, careful attention must be paid to components like mobility models and performance objectives, since a clustering algorithm may not perform well with every mobility pattern. Therefore, we propose a supervisory protocol (SP) that observes the mobility pattern of vehicles and identifies the realistic mobility model through microscopic features. An analytical model can be used to determine an efficient clustering algorithm for a specific mobility model (MM). The SP selects the best clustering scheme according to the mobility model and guarantees consistent performance throughout VANET operations. The simulation is performed in three parts: the central part sets up the clustering environment; in the second part, the clustering algorithms are tested for efficiency in a constrained atmosphere for some time; and the third part represents the proposed scheme. The simulation results show that the proposed scheme outperforms clustering algorithms such as honey-bee-algorithm-based clustering and memetic clustering in terms of cluster count, re-affiliation rate, control overhead, and cluster lifetime.
Funding: Taif University Researchers Supporting Project No. (TURSP-2020/126), Taif University, Taif, Saudi Arabia.
Abstract: The world is rapidly changing with the advance of information technology. The expansion of the Internet of Things (IoT) is a huge step in the development of the smart city. The IoT consists of connected devices that transfer information. The IoT architecture permits on-demand services from a public pool of resources. Cloud computing plays a vital role in developing IoT-enabled smart applications, and its integration enhances the offering of distributed resources in the smart city. Improper management of the security requirements of cloud-assisted IoT systems can bring about risks to availability, security, performance, confidentiality, and privacy. The key reason for the failure of cloud- and IoT-enabled smart city applications is improper security practice at the early stages of development. This article proposes a framework to collect security requirements during the initial development phase of cloud-assisted IoT-enabled smart city applications. Its three-layered architecture includes privacy-preserved stakeholder analysis (PPSA), security requirement modeling and validation (SRMV), and secure cloud-assistance (SCA). A case study highlights the applicability and effectiveness of the proposed framework, and a hybrid survey enables the identification and evaluation of significant challenges.
Abstract: An intrusion detection system (IDS) is an important tool for ensuring security in the network. In recent times, machine learning (ML) and deep learning (DL) models have been applied effectively for the identification of intrusions over the network. To resolve the security issues, this paper presents a new Binary Butterfly Optimization algorithm based Feature Selection with Deep Reinforcement Learning (DRL) technique, called BBOFS-DRL, for intrusion detection. The proposed BBOFS-DRL model mainly accomplishes the recognition of intrusions in the network. To attain this, the BBOFS-DRL model initially designs the BBOFS algorithm, based on the traditional butterfly optimization algorithm (BOA), to elect feature subsets. Besides, the DRL model is employed for the proper identification and classification of intrusions in the network. Furthermore, the beetle antenna search (BAS) technique is applied to tune the DRL parameters for enhanced intrusion detection efficiency. To verify the superior intrusion detection outcomes of the BBOFS-DRL model, a wide-ranging experimental analysis is performed on a benchmark dataset. The simulation results report the supremacy of the BBOFS-DRL model over recent state-of-the-art approaches.
Abstract: A brain tumor (BT) is created by an uncontrollable rise of anomalous cells in brain tissue and comes in two types: malignant and benign. A benign BT does not affect the neighboring healthy and normal tissue; however, a malignant one can affect the adjacent brain tissues, which may result in death. Early recognition of BT is highly significant to protecting the patient's life. Generally, BT can be identified through the magnetic resonance imaging (MRI) scanning technique. But radiotherapists do not achieve effective tumor segmentation in MRI images because of the position and irregular shape of the tumor in the brain. Recently, ML has prevailed against standard image processing techniques, and several studies denote the superiority of machine learning (ML) techniques over standard techniques. Therefore, this study develops a novel brain tumor detection and classification model using metaheuristic optimization with machine learning (BTDC-MOML). To accomplish brain tumor detection effectively, a Computer-Aided Design (CAD) model using a machine learning technique is proposed in this research manuscript. Initially, input image preprocessing is performed using Gabor filtering (GF) based noise removal, contrast enhancement, and skull stripping. Next, mayfly optimization with Kapur's thresholding based segmentation takes place. For feature extraction purposes, local diagonal extreme patterns (LDEP) are exploited. At last, the Extreme Gradient Boosting (XGBoost) model is used for the BT classification process. The accuracy analysis is performed in terms of learning accuracy, and the validation accuracy is used to determine the efficiency of the proposed work. The experimental validation of the proposed model demonstrates its promising performance over other existing methods.
Funding: This research was supported in part by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (NRF-2021R1A6A1A03039493), and in part by the NRF grant funded by the Korea government (MSIT) (NRF-2022R1A2C1004401).
Abstract: Phishing is a type of cybercrime in which cyber-attackers pose as authorized persons or entities and hack the victims' sensitive data. E-mails, instant messages, and phone calls are some of the common modes used in cyberattacks. Though security models are continuously upgraded to prevent cyberattacks, hackers find innovative ways to target victims. Against this background, there has been a drastic increase in the number of phishing emails sent to potential targets, which necessitates an effective classification model. Though numerous conventional models are available in the literature for the classification of phishing emails, Machine Learning (ML) techniques and Deep Learning (DL) models have also been employed. The current study presents an Intelligent Cuckoo Search (CS) Optimization Algorithm with a Deep Learning-based Phishing Email Detection and Classification (ICSOA-DLPEC) model. The aim of the proposed ICSOA-DLPEC model is to effectually distinguish emails as either legitimate or phishing. At the initial stage, pre-processing is performed in three steps: email cleaning, tokenization, and stop-word elimination. Then, the N-gram approach is applied to extract useful feature vectors. Moreover, the CS algorithm is employed with the Gated Recurrent Unit (GRU) model to detect and classify phishing emails, and is also used to fine-tune the parameters involved in the GRU model. The performance of the proposed ICSOA-DLPEC model was experimentally validated on a benchmark dataset, and the results were assessed along several dimensions. Extensive comparative studies confirmed the superior performance of the proposed ICSOA-DLPEC model over other existing approaches; the proposed model achieved a maximum accuracy of 99.72%.
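As a small illustration of the N-gram feature extraction step described above (the paper's exact tokenization and cleaning rules are not given; this sketch assumes simple lowercasing and whitespace tokenization):

```python
def word_ngrams(text, n=2):
    """Extract word n-grams after simple lowercasing and whitespace tokenization."""
    tokens = text.lower().split()
    return [" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

# Bigrams such as "verify your" are typical phishing cues a classifier can weight.
features = word_ngrams("Verify your account now", n=2)
```

In a pipeline like ICSOA-DLPEC, these n-grams would be vectorized (e.g. by frequency) before the optimizer selects the useful ones and the GRU consumes them.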
Funding: Natural Science Basic Research Plan in Shaanxi Province of China under Grant 2022JM-327, and in part the CAAI-Huawei MindSpore Academic Open Fund.
Abstract: Because of their strong problem-solving ability, evolutionary multitask optimization (EMTO) algorithms have been widely studied recently. Evolutionary algorithms have the advantage of fast search for the optimal solution, but they easily fall into local optima and are difficult to generalize. Combining evolutionary multitask algorithms with evolutionary optimization algorithms can be an effective method for solving these problems. Through the implicit parallelism of the tasks themselves and the knowledge transfer between tasks, more promising individuals can be generated in the evolution process, which can escape local optima. How to better combine the two has also been studied more and more. This paper explores the existing evolutionary multitasking theory and improvement schemes in detail, then summarizes the applications of EMTO in different scenarios. Finally, based on the existing research, future research trends and potential exploration directions are revealed.
Abstract: Numerous methods have been analyzed in detail to improve task scheduling and data security performance in the cloud environment. These methods schedule according to factors like makespan, waiting time, cost, deadline, and popularity, but they are inadequate for achieving higher scheduling performance. Regarding data security, existing methods use various encryption schemes but introduce significant service interruption. This article sketches a practical Real-time Application Centric TRS (Throughput-Resource utilization-Success) Scheduling with Data Security (RATRSDS) model that considers all these issues in task scheduling and data security. The method identifies the required resources and their claim times on receiving service requests. Further, for the list of resources as services, the method computes throughput support (Thrs) according to the number of statements executed out of the complete statements of the service. Similarly, it computes resource utilization support (Ruts) according to the idle time on any duty cycle and the total servicing time, and success support (Sus) according to the number of completions over the number of allocations. From these support measures, the method estimates a TRS (Throughput-Resource utilization-Success) score for the different resources; according to the TRS score, the services are ranked and scheduled. On the other side, based on the requirements of service requests, the method computes Requirement Support (RS), and service selection and allocation are performed. Similarly, route security is enforced by choosing the route according to the Route Support Measure (RSM). Finally, data security is implemented with a service-based encryption technique. The RATRSDS scheme achieves higher performance in data security and scheduling.
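The abstract defines Thrs, Ruts, and Sus as ratios but does not give the formula that fuses them into the TRS score. The sketch below assumes a simple weighted sum, purely for illustration; the weights and the combination rule are hypothetical, not from the paper:

```python
def trs_score(executed, total_stmts, idle, service_time,
              completed, allocated, weights=(1.0, 1.0, 1.0)):
    """Combine throughput, resource-utilization, and success supports."""
    thrs = executed / total_stmts       # throughput support: statements executed
    ruts = 1.0 - idle / service_time    # resource utilization: non-idle fraction
    sus = completed / allocated         # success support: completions per allocation
    w1, w2, w3 = weights
    return w1 * thrs + w2 * ruts + w3 * sus

# A service that ran 80/100 statements, idled 2 of 10 time units,
# and completed 9 of 10 allocations:
score = trs_score(80, 100, 2, 10, 9, 10)  # 0.8 + 0.8 + 0.9 = 2.5
```

The scheduler would compute this score for every candidate resource and serve requests in descending score order.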
Abstract: The impact of a Distributed Denial of Service (DDoS) attack on Software Defined Networks (SDN) is briefly analyzed. Many approaches to detecting DDoS attacks exist, varying in the features considered and the methods used; still, these methods remain deficient in detecting and mitigating DDoS attacks. To improve the performance of SDN, an efficient Real-time Multi-Constrained Adaptive Replication and Traffic Approximation Model (RMCARTAM) is sketched in this article. RMCARTAM considers different parameters or constraints in running the different controllers responsible for handling incoming packets. The model is designed with multiple controllers to handle network traffic and can turn controllers on according to requirements. The multi-constraint adaptive replication model monitors different features of the network traffic, like the rate of packet reception, class-based packet reception, and target-specific reception. According to these features, the method estimates the Replication Turning Weight (RTW), based on which controllers are triggered. Similarly, the method applies Traffic Approximation (TA) to the detection of DDoS attacks. Detection is performed by approximating the incoming traffic to any service and using various features, like hop count, payload, service frequency, and malformed frequency, to compute various support measures on bandwidth access, data support, frequency support, malformed support, route support, and so on. Using all these support measures, the method computes a legitimate weight to conclude the behavior of any source and identify malicious nodes; the identified node details are used in the mitigation of DDoS attacks. The method improves network performance by reducing the power factor through switching controllers according to different factors, which also reduces cost. In the same way, the proposed model improves the accuracy of detecting DDoS attacks by estimating the features of incoming traffic in different corners.
Abstract: Prediction of stock market value is highly risky; it is based on the concept of time series forecasting, which can be used for investments in a safe environment with minimized chances of loss. The proposed model uses a real-time dataset of fifteen stocks as input into the system and, based on the data, predicts or forecasts future stock prices of different companies belonging to different sectors. The dataset includes approximately fifteen companies from different sectors; the model forecasts their results, based on which the user can decide whether or not to invest in a particular company. The forecasting is done for the next quarter. Our model uses three main concepts for forecasting. The first, for stocks that show periodic change throughout the season, is Holt-Winters triple exponential smoothing. Three basic quantities are taken into account by this algorithm: base level, trend level, and seasonal factor; their values are calculated, and the decomposition of these factors is done by the Holt-Winters algorithm. The second concept is the recurrent neural network. The specific model being used is Long Short-Term Memory; it is the same as a normal neural network, with the only difference that each intermediate cell is a memory cell and retains its value until the next feedback loop. The third concept is a recommendation system, which filters and predicts ratings based on different factors.
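Additive Holt-Winters triple exponential smoothing maintains exactly the three components named above: a base level, a trend, and a per-period seasonal factor. A compact sketch; the initialization scheme and smoothing constants are illustrative choices, not the paper's settings:

```python
def holt_winters_additive(series, season_len, alpha=0.5, beta=0.3,
                          gamma=0.2, horizon=4):
    """Additive Holt-Winters forecast: level + trend + seasonal component."""
    # Initialize level from the first season and trend from season-over-season change.
    level = sum(series[:season_len]) / season_len
    trend = (sum(series[season_len:2 * season_len])
             - sum(series[:season_len])) / season_len ** 2
    seasonals = [series[i] - level for i in range(season_len)]
    for i, y in enumerate(series):
        s = seasonals[i % season_len]
        last_level = level
        level = alpha * (y - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        seasonals[i % season_len] = gamma * (y - level) + (1 - gamma) * s
    return [level + (h + 1) * trend + seasonals[(len(series) + h) % season_len]
            for h in range(horizon)]
```

On a flat series the forecast stays flat, which is a quick sanity check; on seasonal price data the seasonal terms carry the periodic pattern into the next quarter's forecast.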
Abstract: Current revelations in medical imaging have seen a slew of computer-aided diagnostic (CAD) tools for radiologists developed. Brain tumor classification is essential for radiologists to fully support and better interpret magnetic resonance imaging (MRI). In this work, we report new observations based on binary brain tumor categorization using a hybrid CNN-LSTM. Initially, the collected image is pre-processed and augmented using rotation, cropping, zooming, CLAHE (Contrast Limited Adaptive Histogram Equalization), and Random Rotation with Panoramic Stitching (RRPS). Then, particle swarm optimization (PSO) is used to segment tumor regions in an MR image. After that, a hybrid CNN-LSTM classifier is applied to classify an image as tumor or normal. In this proposed hybrid model, the CNN is used for generating the feature map and the LSTM is used for the classification process. The effectiveness of the proposed approach is analyzed based on different metrics and outcomes compared to different methods.
Abstract: Handling service access in a cloud environment has been identified as a critical challenge in the modern internet world due to the increased rate of intrusion attacks. To address such threats towards cloud services, numerous techniques exist that mitigate service threats according to different metrics. Rule-based approaches are unsuitable for new threats, whereas trust-based systems estimate trust values based on behavior, flow, and other features; however, both fall short in mitigating intrusion attacks at a higher rate. This article presents a novel Multi Fractal Trust Evaluation Model (MFTEM) to overcome these deficiencies. The method involves analyzing service growth, network growth, and quality of service growth. It estimates the user's trust in several ways: the support of the user in achieving higher service performance is captured by Trusted Service Support (TSS), the user's trust in supporting the network stream by Trusted Network Support (TNS), and the user's trust in achieving higher throughput by Trusted QoS Support (TQS). Using all these measures, the method computes the Trust User Score (TUS) value to decide on the clearance of user requests. The proposed MFTEM model improves intrusion detection accuracy with higher performance.
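The abstract does not state how TSS, TNS, and TQS are fused into the Trust User Score. A plausible sketch, assuming a weighted combination and a clearance threshold; both the weights and the threshold are hypothetical, for illustration only:

```python
def trust_user_score(tss, tns, tqs, weights=(0.4, 0.3, 0.3)):
    """Fuse the three trust supports into a single score in [0, 1]."""
    return weights[0] * tss + weights[1] * tns + weights[2] * tqs

def clear_request(tss, tns, tqs, threshold=0.6):
    """Admit the service request only when the fused trust clears the threshold."""
    return trust_user_score(tss, tns, tqs) >= threshold
```

With this shape, a user strong on service support but weak on network behavior can still be rejected, which is the point of combining several trust dimensions before clearance.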
Abstract: Deep learning models have been identified as having a significant impact on various problems, and they can be adapted to the problem of brain tumor classification. However, the deep learning models presented earlier need better classification accuracy. An efficient Multi-Feature Approximation Based Convolution Neural Network (CNN) model (MFACNN) is proposed to handle this issue. The method reads the input 3D Magnetic Resonance Imaging (MRI) images and applies Gabor filters at multiple levels. The noise-removed image is equalized for quality using histogram equalization. Further, features like white mass, gray mass, texture, and shape are extracted from the images. The extracted features are trained with a deep learning Convolution Neural Network (CNN). The network is designed with a single convolution layer for dimensionality reduction: the texture features obtained from the brain image are transformed into a multi-dimensional feature matrix, which is reduced to a single-dimensional feature vector at the convolution layer. The neurons of the intermediate layer are designed to measure White Mass Texture Support (WMTS), Gray Mass Texture Support (GMTS), White Mass Covariance Support (WMCS), Gray Mass Covariance Support (GMCS), and Class Texture Adhesive Support (CTAS). In the test phase, the neurons at the intermediate layer compute the above-mentioned support values towards the various classes of images. Based on these, the method computes a Multi-Variate Feature Similarity Measure (MVFSM), by which it finds the class of the given brain image and produces an efficient result.
Abstract: In healthcare systems, Internet of Things (IoT) innovation and development have opened new ways to evaluate patient data. A cloud-based platform tends to process the data generated by IoT medical devices instead of requiring high-capacity storage and computational hardware. In this paper, an intelligent healthcare system is proposed for the prediction and severity analysis of lung disease from chest computed tomography (CT) images of patients with pneumonia, Covid-19, tuberculosis (TB), and cancer. Firstly, the CT images are captured and transmitted to the fog node through IoT devices. In the fog node, the image is modified into a convenient and efficient format for further processing. The Advanced Encryption Standard (AES) algorithm serves a substantial role in the IoT and fog nodes in preventing data from being accessed by other operating systems. Finally, the preprocessed image is classified automatically in the cloud using various transfer and ensemble learning models. Herein, transfer learning with different pre-trained deep learning architectures (Inception-ResNet-v2, VGG-19, ResNet-50) is adopted for feature extraction. The softmax outputs of the heterogeneous base classifiers provide individual predictions, and an ensemble approach is employed as a meta-classifier to obtain the final optimal results. The disease-predicted image is consigned to a recurrent neural network with long short-term memory (RNN-LSTM) for severity analysis, and the patient is directed to seek therapy based on the outcome. The proposed method achieved 98.6% accuracy, 0.978 precision, 0.982 recall, and 0.974 F1-score on five-class classification. The experimental findings reveal that the proposed framework assists medical experts with lung disease screening and provides a valuable second perspective.
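The ensemble step described above (combining the softmax outputs of heterogeneous base classifiers into one prediction) can be sketched as simple soft voting; whether the paper's meta-classifier is exactly an average is not stated, and the probability values below are illustrative:

```python
def soft_vote(prob_lists):
    """Average the softmax outputs of several base classifiers, return the argmax class."""
    n = len(prob_lists)         # number of base classifiers
    k = len(prob_lists[0])      # number of classes
    avg = [sum(probs[j] for probs in prob_lists) / n for j in range(k)]
    return max(range(k), key=avg.__getitem__)

# Three base models (e.g. Inception-ResNet-v2, VGG-19, ResNet-50 heads)
# voting over three classes; class 0 wins on average probability.
winner = soft_vote([[0.7, 0.2, 0.1],
                    [0.1, 0.6, 0.3],
                    [0.5, 0.4, 0.1]])
```

Soft voting lets a confident minority model outweigh two lukewarm ones, which is why averaging probabilities often beats majority voting on hard labels.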
Funding: This research was supported by the Deanship of Scientific Research Project (RGP.2/162/43), King Khalid University, Kingdom of Saudi Arabia.
Abstract: Recently, energy harvesting wireless sensor networks (EHWSN) have attracted significant attention among research communities. By harvesting energy from the neighboring environment, the sensors in an EHWSN resolve the energy constraint problem and offer a lengthened network lifetime. Clustering is one of the proficient ways of accomplishing an even better lifetime in EHWSN: the clustering process appropriately elects the cluster heads (CHs) and constructs clusters. Though several models are available in the literature, energy efficiency and security in EHWSN still need to be accomplished. In this view, this study develops a novel Chaotic Rider Optimization Based Clustering Protocol for Secure Energy Harvesting Wireless Sensor Networks (CROC-SEHWSN). The presented CROC-SEHWSN model aims to accomplish energy efficiency by clustering the nodes in the EHWSN, and is based on the integration of chaotic concepts with the traditional rider optimization (RO) algorithm. Besides, the CROC-SEHWSN model derives a fitness function (FF) involving seven distinct parameters connected to the WSN. To accomplish security, trust factor and link quality metrics are considered in the FF. The design of the RO algorithm for the secure clustering process shows the novelty of the work. To demonstrate the enhanced performance of the CROC-SEHWSN approach, a wide range of simulations was carried out and the outcomes inspected in distinct aspects. The experimental outcomes demonstrate the superior performance of the CROC-SEHWSN technique over recent approaches, with maximum network lifetimes of 387.40 and 393.30 s under two scenarios.