Cloud storage has gained increasing popularity, as it helps cloud users arbitrarily store and access their outsourced data. Numerous public auditing schemes have been presented to ensure data transparency. However, most modern constructions are built on the public key infrastructure: to verify data integrity, the auditor must first authenticate the legality of the public key certificate, which adds an immense workload for the auditor. To maintain the intactness of data stored on a remote server, the quality of the stored data should be tracked regularly so that disruption to the saved data is minimized. A central problem for individuals, though, is how to check data integrity when no local backup of the files is kept. Meanwhile, it is often infeasible for a resource-limited user to perform a data integrity inspection by retrieving the entire data file. In this work, a stable and effective ID-based auditing scheme that uses machine learning techniques is proposed to improve productivity and enhance the protection of ID-based audit protocols. The study tackles the issues of confidentiality and reliability in the identity-based public audit framework. The scheme is proven secure; its security reduces to the standard Computational Diffie-Hellman (CDH) assumption.
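The paper's protocol is identity-based and rests on the CDH assumption; as a much simpler illustration of the challenge-response pattern that underlies remote data integrity auditing in general (not the paper's scheme), the following sketch has the owner keep only per-block HMAC tags while the auditor spot-checks randomly challenged blocks:

```python
import hmac, hashlib, secrets

def make_tags(key: bytes, blocks: list) -> list:
    """Owner: compute an HMAC tag per data block before outsourcing."""
    return [hmac.new(key, bytes([i % 256]) + b, hashlib.sha256).digest()
            for i, b in enumerate(blocks)]

def server_respond(blocks, challenge):
    """Server: return the challenged blocks (it must still hold them)."""
    return {i: blocks[i] for i in challenge}

def audit(key, tags, response):
    """Auditor: recompute tags for the returned blocks and compare."""
    return all(
        hmac.compare_digest(
            tags[i],
            hmac.new(key, bytes([i % 256]) + blk, hashlib.sha256).digest())
        for i, blk in response.items())

blocks = [b"block-%d" % i for i in range(8)]
key = secrets.token_bytes(32)
tags = make_tags(key, blocks)
challenge = [1, 4, 6]                          # randomly chosen block indices
ok = audit(key, tags, server_respond(blocks, challenge))      # intact data
tampered = blocks[:]
tampered[4] = b"corrupted"
bad = audit(key, tags, server_respond(tampered, challenge))   # corrupted block
```

Unlike the ID-based scheme in the paper, this toy version requires the auditor to hold the secret key and to download the challenged blocks in full; it only illustrates the audit flow.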
This analysis investigates the widespread use of solar drying methods and designs in developing countries, particularly for agricultural products like fruits, vegetables, and bee pollen. Traditional techniques like hot air oven drying and open sun drying have drawbacks, including nutrient loss and exposure to harmful particles. Solar and thermal drying are viewed as sustainable solutions because they rely on renewable resources. The article highlights the advantages of solar drying, including waste reduction, increased productivity, and improved pricing; it is also cost-effective and energy-efficient. The review provides an overview of the different solar drying systems and technologies used in developing nations, aiming to identify the most effective and efficient designs, with a focus on comparing current solar dryer models for optimal performance. The review underscores the importance of solar drying as a long-term, eco-friendly approach to drying food in developing countries, and evaluates how solar-powered drying techniques can enhance food preservation, minimize waste, and improve the quality and marketability of agricultural goods. The paper focuses specifically on the efficacy of these methods for drying bee pollen and on pinpointing where improvements can be made.
Cyber defense is becoming a major issue for every organization seeking to keep business continuity intact. This paper explores the effectiveness of a meta-heuristic optimization algorithm, the Artificial Bee Colony (ABC) algorithm, as a nature-inspired cyber security mechanism for achieving adaptive defense. It experiments on Denial-of-Service attack scenarios, which involve limiting the traffic flow for each node. Businesses today have adapted their service distribution models to include the use of the Internet, allowing them to effectively manage and interact with their customer data. This shift has created an increased reliance on online services to store vast amounts of confidential customer data, so any disruption or outage of these services could be disastrous for the business, leaving it without the knowledge needed to serve its customers. Adversaries can exploit such an event to gain unauthorized access to confidential customer data. The proposed algorithm utilizes an adaptive defense approach to continuously select nodes that present characteristics of a probable malicious entity. For any change in network parameters, a cluster of nodes is selected into the prepared solution set as probably malicious, and the traffic rate and packet delivery ratio are managed with respect to the properties of normal nodes to deliver a disaster recovery plan for affected businesses.
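The ABC metaheuristic named above can be sketched compactly. This is a generic minimal implementation on a toy objective (the sphere function), not the paper's DoS-defense experiment; the colony size, abandonment limit, and iteration count are illustrative choices:

```python
import random

def abc_minimize(f, dim, bounds, n_food=10, limit=20, iters=200, seed=1):
    """Minimal Artificial Bee Colony: employed, onlooker, and scout phases."""
    rng = random.Random(seed)
    lo, hi = bounds
    foods = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_food)]
    fit = [f(x) for x in foods]
    trials = [0] * n_food

    def neighbor(i):
        # Perturb one coordinate toward/away from a random other food source.
        k, j = rng.randrange(n_food), rng.randrange(dim)
        x = foods[i][:]
        x[j] += rng.uniform(-1, 1) * (foods[i][j] - foods[k][j])
        x[j] = min(max(x[j], lo), hi)
        return x

    def try_improve(i):
        x = neighbor(i)
        fx = f(x)
        if fx < fit[i]:
            foods[i], fit[i], trials[i] = x, fx, 0
        else:
            trials[i] += 1

    for _ in range(iters):
        for i in range(n_food):                    # employed bees
            try_improve(i)
        worst = max(fit)                           # onlookers favor better sources
        weights = [worst - fi + 1e-12 for fi in fit]
        for _ in range(n_food):
            try_improve(rng.choices(range(n_food), weights=weights)[0])
        for i in range(n_food):                    # scouts abandon stale sources
            if trials[i] > limit:
                foods[i] = [rng.uniform(lo, hi) for _ in range(dim)]
                fit[i], trials[i] = f(foods[i]), 0
    best = min(range(n_food), key=lambda i: fit[i])
    return foods[best], fit[best]

sphere = lambda x: sum(v * v for v in x)
best_x, best_f = abc_minimize(sphere, dim=3, bounds=(-5.0, 5.0))
```

In the paper's setting, the objective would instead score candidate node clusters by how anomalous their traffic characteristics are.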
Phishing is a type of cybercrime in which cyber-attackers pose as authorized persons or entities and steal the victims' sensitive data. E-mails, instant messages, and phone calls are some of the common modes used in cyberattacks. Though security models are continuously upgraded to prevent cyberattacks, hackers find innovative ways to target victims, and a drastic increase has been observed in the number of phishing emails sent to potential targets. This scenario necessitates an effective classification model. Though numerous conventional models are available in the literature for proficient classification of phishing emails, Machine Learning (ML) techniques and Deep Learning (DL) models have also been employed. The current study presents an Intelligent Cuckoo Search (CS) Optimization Algorithm with a Deep Learning-based Phishing Email Detection and Classification (ICSOA-DLPEC) model. The aim of the proposed ICSOA-DLPEC model is to effectually distinguish emails as either legitimate or phishing. At the initial stage, pre-processing is performed in three steps: email cleaning, tokenization, and stop-word elimination. Then, the N-gram approach is applied to extract useful feature vectors. Moreover, the Gated Recurrent Unit (GRU) model is employed to detect and classify phishing emails, and the CS algorithm is used to fine-tune the parameters involved in the GRU model. The performance of the proposed ICSOA-DLPEC model was experimentally validated using a benchmark dataset, and the results were assessed along several dimensions. Extensive comparative studies confirmed the superior performance of the proposed ICSOA-DLPEC model over other existing approaches; the model achieved a maximum accuracy of 99.72%.
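The three pre-processing steps and the N-gram feature extraction described above can be sketched with the standard library. The stop-word list and sample email are invented for illustration; a real pipeline would use a full stop-word lexicon and feed the features to the GRU classifier:

```python
import re
from collections import Counter

STOP_WORDS = {"the", "a", "an", "is", "to", "of", "and", "in", "your", "you"}

def clean(email: str) -> str:
    """Email cleaning: strip HTML tags, URLs, and non-letters; lowercase."""
    email = re.sub(r"<[^>]+>", " ", email)
    email = re.sub(r"https?://\S+", " ", email)
    return re.sub(r"[^a-zA-Z\s]", " ", email).lower()

def tokenize(text: str) -> list:
    return text.split()

def remove_stop_words(tokens):
    return [t for t in tokens if t not in STOP_WORDS]

def ngrams(tokens, n=2):
    """N-gram features counted over the token stream."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

email = "<p>Verify your account at http://phish.example now to claim the prize</p>"
tokens = remove_stop_words(tokenize(clean(email)))
features = ngrams(tokens, n=2)
```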
The brains of humans and other organisms are affected in various ways by the electromagnetic field (EMF) radiation generated by mobile phones and cell phone towers. Morphological variations in the brain are caused by the neurological changes that follow EMF exposure. Cellular-level analysis is used to measure and detect the effect of mobile radiation, but it is expensive and tedious, as the analysis requires the preparation of a cell suspension. In this regard, this research article proposes optimal broadcasting learning to detect changes in brain morphology due to EMF exposure. Here, Drosophila melanogaster acts as a specimen exposed to EMF. Automatic segmentation of the brain is performed on the microscopic images, and discriminative geometrical characteristics are extracted to detect the effect of exposure. The geometrical characteristics of the segmented microscopic brain images are analyzed, and the analysis reveals several discriminative characteristics that can be processed by machine learning techniques. The important characteristics, chosen by a feature selection method, are given to four classifiers, naïve Bayes, artificial neural network, support vector machine, and random forest, for the classification of open versus non-open microscopic images of the D. melanogaster brain. Across various experimental evaluations, the classifiers perform well, achieving 96.44% using the selected characteristics. The proposed system is an optimal approach that automatically identifies the effect of EMF exposure with minimal time complexity, with the machine learning techniques providing an effective framework for image processing.
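Naïve Bayes, the first of the four classifiers listed, is simple enough to sketch from scratch. This is a generic Gaussian naïve Bayes on invented two-feature data, not the paper's geometrical brain-image features:

```python
import math
from collections import defaultdict

def fit_gnb(X, y):
    """Gaussian naive Bayes: per-class feature means/variances plus log priors."""
    by_class = defaultdict(list)
    for xi, yi in zip(X, y):
        by_class[yi].append(xi)
    model = {}
    for c, rows in by_class.items():
        n = len(rows)
        means = [sum(col) / n for col in zip(*rows)]
        vars_ = [sum((v - m) ** 2 for v in col) / n + 1e-9
                 for col, m in zip(zip(*rows), means)]
        model[c] = (math.log(n / len(X)), means, vars_)
    return model

def predict_gnb(model, x):
    def log_post(c):
        prior, means, vars_ = model[c]
        return prior + sum(
            -0.5 * math.log(2 * math.pi * v) - (xi - m) ** 2 / (2 * v)
            for xi, m, v in zip(x, means, vars_))
    return max(model, key=log_post)

# Hypothetical 2-feature data: class 1 clusters high, class 0 clusters low.
X = [[1.0, 1.2], [0.9, 0.8], [1.1, 1.0], [5.0, 5.2], [4.8, 5.1], [5.2, 4.9]]
y = [0, 0, 0, 1, 1, 1]
model = fit_gnb(X, y)
pred_low = predict_gnb(model, [1.0, 1.0])
pred_high = predict_gnb(model, [5.0, 5.0])
```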
With the development of technology in various fields, such as big data analysis, data mining, cloud computing, and blockchain, security has become more constrained. Blockchain provides security by encrypting shared information; it is applied in peer-to-peer (P2P) networks and maintains a decentralized ledger. Security against unauthorized breaches in the distributed network is required. Numerous techniques have been developed to detect unauthorized breaches, but they are inefficient and have poor data integrity. Hence, a novel technique needs to be implemented to tackle new breaches in the distributed network. This paper proposes a hybrid technique combining the Twofish cipher with a Ripple consensus algorithm (TF-RC). To improve detection time and security, the paper uses efficient transmission of data in the distributed network. Experimental analysis of TF-RC using the performance metrics of latency, throughput, and energy efficiency shows that it produces better performance.
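The consensus half of the hybrid can be illustrated with a toy quorum vote in the Ripple style, where a transaction is validated once a supermajority of validators propose it. The validator names, transactions, and 80% threshold are illustrative assumptions, and the Twofish encryption step is omitted (it is not in the standard library):

```python
def consensus_round(proposals, quorum=0.8):
    """Ripple-style check: a transaction is validated once the fraction of
    validators proposing it reaches the quorum threshold (80% here)."""
    votes = {}
    for node_votes in proposals.values():
        for tx in node_votes:
            votes[tx] = votes.get(tx, 0) + 1
    n = len(proposals)
    return {tx for tx, c in votes.items() if c / n >= quorum}

# Hypothetical validator set: five nodes, some out of sync.
proposals = {
    "n1": {"tx_a", "tx_b"},
    "n2": {"tx_a", "tx_b"},
    "n3": {"tx_a", "tx_b"},
    "n4": {"tx_a"},
    "n5": {"tx_a", "tx_c"},
}
validated = consensus_round(proposals)
```

Here tx_a is proposed by all five nodes and passes the quorum; tx_b (3/5) and tx_c (1/5) are deferred to a later round.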
The adoption of the Sustainable Development Goals (SDGs) under Agenda 2030 is one of the most ambitious ventures by the United Nations for the betterment of humanity. The SDGs are a comprehensive and holistic approach to making the lives of humans, not only the current population but also future generations, worth living and celebrating. The SDGs aim to end poverty, hunger, and inequities; provide everyone with clean water, energy, and a healthy environment; and make the planet more peaceful, just, and habitable[1].
Integrated CloudIoT is an emerging field of study that integrates the Cloud and the Internet of Things (IoT) to make machines smarter and deal with real-world objects in a distributed manner. It collects data from various devices and analyses the data to increase efficiency and productivity. Because Cloud and IoT are complementary technologies with distinct areas of application, integrating them is difficult. This paper identifies various CloudIoT issues and analyses them to build a relational model. The Interpretive Structural Modeling (ISM) approach establishes the interrelationships among the identified problems. The issues are categorised by driving and dependence power, and a hierarchical model is presented. The ISM analysis shows that scheduling is an important aspect, having both driving and dependence power to improve the performance of the CloudIoT model. Therefore, existing CloudIoT job scheduling algorithms are analysed, and a cloud-centric scheduling mechanism is proposed to execute IoT jobs on a suitable cloud. A cloud implementation using CloudSim, an open-source framework for simulating Cloud Computing, based on the jobs' workload is presented. Simulation results of the proposed scheduling model indicate better performance in terms of Average Waiting Time (AWT) and makespan than existing cloud-based scheduling approaches.
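The two metrics reported, makespan and AWT, can be demonstrated with a minimal greedy earliest-completion-time scheduler. The job lengths and cloud count are invented, and this is a generic baseline rather than the paper's mechanism:

```python
def schedule(jobs, n_clouds):
    """Greedy earliest-completion-time assignment of job lengths to clouds."""
    finish = [0.0] * n_clouds
    waits, assignment = [], []
    for length in jobs:
        c = min(range(n_clouds), key=lambda i: finish[i])
        waits.append(finish[c])          # time the job waits before starting
        finish[c] += length
        assignment.append(c)
    makespan = max(finish)               # completion time of the busiest cloud
    awt = sum(waits) / len(jobs)         # Average Waiting Time
    return assignment, makespan, awt

jobs = [4, 2, 8, 3, 1]                   # hypothetical job lengths
assignment, makespan, awt = schedule(jobs, n_clouds=2)
```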
Offloading data in the network with less time and reduced energy consumption is highly important for every technology. Smart applications process data very quickly with less power consumption. As technology grows towards the 5G communication architecture, identifying a solution for QoS in 5G through energy-efficient computing is important. In the proposed model, we perform data offloading in 5G using the fuzzification concept. Mobile IoT devices create tasks in the network, which are offloaded to the cloud or to mobile edge nodes based on energy consumption. Two base stations, small (SB) and macro (MB), are initialized, and the first tasks are computed randomly. Then, the tasks are processed using a fuzzification algorithm to select SB or MB in the central server. Optimization is performed using a grasshopper algorithm to improve the QoS of the 5G network. The result is compared with existing algorithms and indicates that the proposed system improves system performance, at a cost of 44.64 J for computing 250 benchmark tasks.
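The fuzzification step for the SB/MB choice can be sketched with triangular membership functions. The rule base, membership breakpoints, and input scales below are invented for illustration; the paper's actual rule set may differ:

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership function with support [a, c] and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def choose_station(energy_cost, load):
    """Hypothetical fuzzy rule: low energy cost AND low load -> small base
    station (SB); otherwise the macro station (MB) is preferred."""
    low_energy = tri(energy_cost, -0.1, 0.0, 0.5)
    low_load = tri(load, -0.1, 0.0, 0.6)
    sb_score = min(low_energy, low_load)     # AND = min in Mamdani fuzzy logic
    mb_score = 1.0 - sb_score                # complement favors MB
    return "SB" if sb_score >= mb_score else "MB"

light = choose_station(energy_cost=0.1, load=0.1)
heavy = choose_station(energy_cost=0.8, load=0.9)
```

A light task stays at the small station; a heavy, energy-hungry task is pushed to the macro station.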
In this article, mathematical modeling for the evaluation of reliability is studied using two methods. One of the methods is developed based on possibility theory. The reliability performance of the system is of prime concern, so the outcomes for failure must be evaluated with utmost care. In possibility theory, the reliability information data determined from decision-making experts are subjective; the method is likewise related to survival possibilities, as against survival probabilities. The other method is developed using the concept of closed-interval approximation of piecewise quadratic fuzzy numbers; in this method, a decision-making expert is not sure of his or her estimates of the reliability parameters. Numerical experiments are performed to illustrate the efficiency of the suggested methods. The paper concludes with some future research directions to be explored for the proposed approach.
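The interval-approximation idea, where an expert gives each component reliability as a closed interval rather than a point estimate, can be illustrated with plain interval arithmetic on a small series-parallel system. The component intervals below are invented, and this sketch uses simple intervals rather than the paper's piecewise quadratic fuzzy numbers:

```python
def series(*comps):
    """Series system: interval product of component reliability intervals."""
    lo, hi = 1.0, 1.0
    for a, b in comps:
        lo, hi = lo * a, hi * b
    return lo, hi

def parallel(*comps):
    """Parallel system: 1 minus the interval product of unreliabilities."""
    lo, hi = 1.0, 1.0
    for a, b in comps:
        lo, hi = lo * (1 - b), hi * (1 - a)
    return 1 - hi, 1 - lo

# Hypothetical expert estimates as closed reliability intervals.
r1, r2, r3 = (0.90, 0.95), (0.85, 0.92), (0.80, 0.90)
sys_lo, sys_hi = series(r1, parallel(r2, r3))
```

The system reliability is itself an interval, [0.873, 0.9424] here, reflecting the expert's uncertainty instead of hiding it in a point value.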
Optimizing the performance of composite structures is a real-world application with significant benefits. In this paper, a high-fidelity finite element method (FEM) is combined with the iterative improvement capability of metaheuristic optimization algorithms to obtain optimized composite plates. The FEM module comprises a nine-node isoparametric plate bending element in conjunction with first-order shear deformation theory (FSDT). A recently proposed memetic version of particle swarm optimization, called RPSOLC, is modified in the current research to carry out multi-objective Pareto optimization. The performance of the MO-RPSOLC is found to be comparable to that of NSGA-III. This work successfully highlights the use of FEM-MO-RPSOLC in obtaining high-fidelity Pareto solutions that simultaneously maximize the fundamental frequency and the frequency separation in laminated composites by optimizing the stacking sequence.
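The particle swarm core that RPSOLC builds on can be sketched in its basic single-objective, global-best form. The toy objective below stands in for the FEM frequency evaluation and is an assumption for illustration; the memetic local search and Pareto handling of RPSOLC are omitted:

```python
import random

def pso_maximize(f, dim, bounds, n=15, iters=100, seed=7):
    """Minimal particle swarm optimization with a global-best topology."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    pbest_f = [f(p) for p in pos]
    g = max(range(n), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    w, c1, c2 = 0.7, 1.5, 1.5            # inertia and acceleration weights
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            fi = f(pos[i])
            if fi > pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi > gbest_f:
                    gbest, gbest_f = pos[i][:], fi
    return gbest, gbest_f

# Toy stand-in objective peaking at x = (1, 1); not a composite-plate model.
toy = lambda x: -sum((v - 1.0) ** 2 for v in x)
best_x, best_f = pso_maximize(toy, dim=2, bounds=(-5.0, 5.0))
```

In the paper, each fitness evaluation is a full FEM modal analysis of a candidate stacking sequence, and two objectives (fundamental frequency and frequency separation) are traded off on a Pareto front.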
Major fields such as military applications, medical fields, weather forecasting, and environmental applications use wireless sensor networks for major computing processes, and sensors play a vital role in emerging technologies. Localizing sensors at the needed locations is a very serious problem. The environment is home to every living being in the world, and the growth of industries after the industrial revolution increased pollution across the environment. Owing to recent uncontrolled growth and development, sensors are needed to measure pollution levels across industries and their surroundings, and choosing where to place the sensors is an interesting and challenging task. Many meta-heuristic techniques have been introduced for node localization, and swarm intelligence algorithms have proven their efficiency in many studies on localization problems. In this article, we introduce an industry-centric approach to solve the node localization problem in sensor networks. First, our work selects industrial areas within the sensed location, using random forest regression to identify the polluted area. Then, the elephant herding algorithm is used for sensor node localization. These two algorithms are combined to produce the best standard result in localizing the sensor nodes. To evaluate the proposed approach, experiments are conducted with data from the KDD Cup 2018, which contains 35 stations with concentrations of air pollutants such as PM, SO₂, CO, NO₂, and O₃. These data are normalized and tested with the algorithms. The results are comparatively analyzed against other swarm intelligence algorithms, such as the elephant herding algorithm and particle swarm optimization, and machine learning algorithms such as decision tree regression and the multi-layer perceptron. The results indicate that our proposed algorithm suggests more meaningful locations for the sensors in the topology, achieving lower root-mean-square values of 0.06 to 0.08 when localizing with Stations 1 to 5.
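The root-mean-square value reported above is the standard localization-quality metric: the RMS distance between proposed and reference node positions. A minimal sketch, with coordinates invented for illustration (not the KDD Cup 2018 station data):

```python
import math

def rmse(estimated, actual):
    """Root-mean-square error between estimated and true 2-D positions."""
    sq = [(ex - ax) ** 2 + (ey - ay) ** 2
          for (ex, ey), (ax, ay) in zip(estimated, actual)]
    return math.sqrt(sum(sq) / len(sq))

# Hypothetical normalized positions for three stations.
true_pos = [(0.00, 0.00), (1.00, 0.50), (0.50, 1.00)]
est_pos = [(0.05, 0.02), (0.98, 0.55), (0.47, 1.04)]
err = rmse(est_pos, true_pos)        # about 0.05 on this toy data
```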
Urban living in large modern cities exerts considerable adverse effects on health and thus increases the risk of contracting several chronic kidney diseases (CKD). The prediction of CKD has become a major task in urbanized countries. The primary objective of this work is to introduce and develop predictive analytics for predicting CKD. However, prediction over huge samples is becoming increasingly difficult. Meanwhile, MapReduce provides a feasible framework for programming predictive algorithms with map and reduce functions, and its relatively simple programming interface helps solve problems in the scalability and efficiency of predictive learning algorithms. In the proposed work, an iterative weighted MapReduce framework is introduced for the effective management of large dataset samples. A binary classification problem is formulated using an ensemble of nonlinear support vector machines and random forests. Thus, instead of using the normal linear combination of kernel activations, the proposed work creates nonlinear combinations of kernel activations on prototype examples. Furthermore, different descriptors are combined in an ensemble of deep support vector machines, where the product rule is used to combine the probability estimates of the different classifiers. Performance is evaluated in terms of prediction accuracy and the interpretability of the model and results.
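The map/reduce programming pattern the framework relies on can be shown in miniature: map scores each data partition independently, and reduce aggregates the emitted pairs. The linear scoring model and sample partitions are invented stand-ins for the paper's SVM ensemble:

```python
from functools import reduce

def map_phase(chunk, weights, bias):
    """Map: score each sample in one partition with a toy linear model and
    emit (predicted_label, 1) pairs."""
    out = []
    for features, _label in chunk:
        score = sum(w * x for w, x in zip(weights, features)) + bias
        out.append((1 if score > 0 else 0, 1))
    return out

def reduce_phase(pairs):
    """Reduce: sum the emitted counts per predicted label."""
    def step(acc, pair):
        label, count = pair
        acc[label] = acc.get(label, 0) + count
        return acc
    return reduce(step, pairs, {})

# Hypothetical partitions of (features, label) samples and a fixed model.
chunks = [
    [([2.0, 1.0], 1), ([-1.5, 0.2], 0)],
    [([0.5, 3.0], 1), ([-2.0, -1.0], 0)],
]
weights, bias = [1.0, 1.0], -0.5
mapped = [p for chunk in chunks for p in map_phase(chunk, weights, bias)]
counts = reduce_phase(mapped)
```

In a real deployment, each chunk would live on a different worker and the shuffle between the two phases would be handled by the MapReduce runtime.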
In contemporary medicine, cardiovascular disease is a major public health concern. Cardiovascular diseases are one of the leading causes of death worldwide; they are classified as vascular, ischemic, or hypertensive. Clinical information contained in patients' Electronic Health Records (EHR) enables clinicians to identify and monitor heart illness. Heart failure rates have risen dramatically in recent years as a result of changes in modern lifestyles, and heart diseases are becoming more prevalent in today's medical setting. Each year, a substantial number of people die as a result of cardiac pain; the primary causes of these deaths are the improper use of pharmaceuticals without the supervision of a physician and the late detection of disease. To improve the efficiency of the classification algorithms, we construct a data pre-processing stage using feature selection. Experiments with unidirectional and bidirectional neural network models found that a Deep Learning Modified Neural Network (DLMNN) model combined with the Pet Dog-Smell Sensing (PD-SS) algorithm achieved the highest classification performance on the UCI Machine Learning Heart Disease dataset. The DLMNN-based PD-SS achieved an accuracy of 94.21%, an F-score of 92.38%, a recall of 94.62%, and a precision of 93.86%. These results are competitive and promising for a heart disease dataset. We demonstrated that a DLMNN framework based on deep models may be used to solve the classification problem for an unbalanced heart disease dataset. Our proposed approach can yield exceptionally accurate models that can be utilized to analyze and diagnose clinical real-world data.
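The four figures reported (accuracy, F-score, recall, precision) all derive from confusion-matrix counts. A minimal sketch with invented counts (not the paper's confusion matrix):

```python
def classification_metrics(tp, fp, fn, tn):
    """Accuracy, precision, recall, and F1 from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, f1

# Hypothetical counts for a binary heart-disease classifier.
acc, prec, rec, f1 = classification_metrics(tp=88, fp=6, fn=5, tn=101)
```

On an unbalanced dataset like the one discussed, precision, recall, and F1 are more informative than accuracy alone, which is why the paper reports all four.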
This research work applies novel state-of-the-art technology to a multi-cylinder SI engine fueled with compressed natural gas, emulsified fuel, and hydrogen as dual fuel. The work predicts the overall performance, combustion, and exhaust emission features of the individual fuels using AVL Boost simulation technology. Three types of alternative fuels are compared and analyzed. The results show that hydrogen produces 20% more brake power than CNG and 25% more than micro-emulsion fuel at 1500 r/min; at higher engine speeds of 2500-4000 r/min, the brake power of hydrogen, CNG, and micro-emulsions increases further, in the range of 25%, 20%, and 15%, respectively. In addition, brake-specific fuel consumption is lowest for 100% hydrogen, followed by 100% CNG and then the micro-emulsions at 1500 r/min. At 2500-5000 r/min, there is a significant drop in brake-specific fuel consumption due to the lean mixture at higher engine speeds. The CO, HC, and NOx emissions improve significantly for hydrogen, CNG, and micro-emulsion fuel. Hydrogen shows zero CO and HC emissions, and producing 0% carbon-based emissions, with only a slight increase in NOx emissions, is the main objective of this research; CNG shows 30% lower CO emissions and 21.5% lower hydrocarbon emissions than micro-emulsion fuel at the stoichiometric air/fuel ratio.
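The two quantities compared above, brake power and brake-specific fuel consumption (BSFC), follow from the standard textbook relations P = 2πNT/60 and BSFC = fuel mass flow / brake power. The torque and fuel-flow values below are illustrative, not the paper's simulation outputs:

```python
import math

def brake_power_kw(torque_nm, speed_rpm):
    """Brake power P = 2*pi*N*T/60 in watts, returned in kW."""
    return 2 * math.pi * speed_rpm * torque_nm / 60 / 1000

def bsfc_g_per_kwh(fuel_flow_kg_h, power_kw):
    """Brake-specific fuel consumption in g/kWh."""
    return fuel_flow_kg_h * 1000 / power_kw

# Hypothetical operating point: 120 N*m at 1500 r/min, 4.5 kg/h fuel flow.
p = brake_power_kw(torque_nm=120.0, speed_rpm=1500)    # about 18.85 kW
bsfc = bsfc_g_per_kwh(fuel_flow_kg_h=4.5, power_kw=p)  # about 239 g/kWh
```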
New atypical pneumonia caused by a virus called coronavirus (COVID-19) appeared in Wuhan, China in December 2019. Unlike previous epidemics due to the severe acute respiratory syndrome (SARS) and the Middle East respiratory syndrome coronavirus (MERS-CoV), COVID-19 has the particularity of being more contagious. In this paper, we try to predict the COVID-19 epidemic peak in Japan with the help of real-time data from January 15 to February 29, 2020, using fractional derivatives, namely Caputo derivatives, Caputo–Fabrizio derivatives, and Atangana–Baleanu derivatives in the Caputo sense. Fixed point theory and the Picard–Lindelöf approach provide the proof of existence and uniqueness of solutions to the noninteger-order models under investigation. For each fractional model, we propose a numerical scheme and prove its stability. Using parameter values estimated from the real Japan COVID-19 epidemic data, we perform numerical simulations to confirm the effectiveness of the approximation methods for different values of the fractional order γ, and to predict the COVID-19 epidemic peaks in Japan over specific time intervals.
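The fractional-order schemes themselves are beyond a short sketch, but the γ = 1 special case they generalize is the classical SIR system, whose peak can be located with a forward Euler step. The parameters below are invented normalized values, not the fitted Japan estimates:

```python
def sir_euler(beta, gamma, s0, i0, r0, dt=0.1, steps=1000):
    """Forward-Euler integration of the classical SIR model (the
    integer-order special case of the fractional models); returns the
    epidemic peak of the infected fraction and the time it occurs."""
    s, i, r = s0, i0, r0
    peak_i, peak_t = i0, 0.0
    for k in range(steps):
        ds = -beta * s * i
        di = beta * s * i - gamma * i
        dr = gamma * i
        s, i, r = s + dt * ds, i + dt * di, r + dt * dr
        if i > peak_i:
            peak_i, peak_t = i, (k + 1) * dt
    return peak_i, peak_t

# Hypothetical normalized parameters: basic reproduction number ~2.5.
peak_i, peak_t = sir_euler(beta=0.5, gamma=0.2, s0=0.99, i0=0.01, r0=0.0)
```

The fractional models replace the left-hand derivatives with Caputo-type operators of order γ, which introduces memory effects; with γ = 1 they reduce to the system integrated here.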
A stochastic SIR influenza vertical transmission model with vaccination and a nonlinear incidence rate is examined in this paper. To determine whether testosterone regulates HPA axis function in males, we used a stochastic SIR epidemic procedure with divergent influences on ACTH and cortisol; the suppressive effects on cortisol can be attributed to a peripheral (adrenal) locus. We conclude that experimental solutions have been found and that the requisite statistical findings have been examined. Finally, we deduce that the given mathematical model and results are relevant to medical research; in the future, this work can be extended to simulate more results in the medical field.
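What makes such a model stochastic is a Brownian perturbation of the transmission term, typically simulated with the Euler–Maruyama scheme. The following is a generic sketch with invented parameters, not the paper's vertical-transmission formulation with vaccination:

```python
import math, random

def stochastic_sir(beta, gamma, sigma, s0, i0, dt=0.01, steps=5000, seed=3):
    """Euler-Maruyama simulation of an SIR model whose transmission term
    is perturbed by Gaussian white noise of intensity sigma."""
    rng = random.Random(seed)
    s, i = s0, i0
    path = [i]
    for _ in range(steps):
        dw = rng.gauss(0.0, math.sqrt(dt))       # Brownian increment
        infection = beta * s * i * dt + sigma * s * i * dw
        recovery = gamma * i * dt
        s = max(s - infection, 0.0)              # clamp to keep fractions valid
        i = max(i + infection - recovery, 0.0)
        path.append(i)
    return path

# Hypothetical normalized parameters with mild noise.
path = stochastic_sir(beta=0.5, gamma=0.2, sigma=0.1, s0=0.99, i0=0.01)
peak = max(path)
```

Each run produces a different sample path; in practice many paths are averaged, and the noise intensity σ controls how far the epidemic curve can wander from its deterministic counterpart.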
Funding: This research was supported in part by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (NRF-2021R1A6A1A03039493), and in part by an NRF grant funded by the Korea government (MSIT) (NRF-2022R1A2C1004401).
文摘Phishing is a type of cybercrime in which cyber-attackers pose themselves as authorized persons or entities and hack the victims’sensitive data.E-mails,instant messages and phone calls are some of the common modes used in cyberattacks.Though the security models are continuously upgraded to prevent cyberattacks,hackers find innovative ways to target the victims.In this background,there is a drastic increase observed in the number of phishing emails sent to potential targets.This scenario necessitates the importance of designing an effective classification model.Though numerous conventional models are available in the literature for proficient classification of phishing emails,the Machine Learning(ML)techniques and the Deep Learning(DL)models have been employed in the literature.The current study presents an Intelligent Cuckoo Search(CS)Optimization Algorithm with a Deep Learning-based Phishing Email Detection and Classification(ICSOA-DLPEC)model.The aim of the proposed ICSOA-DLPEC model is to effectually distinguish the emails as either legitimate or phishing ones.At the initial stage,the pre-processing is performed through three stages such as email cleaning,tokenization and stop-word elimination.Then,the N-gram approach is;moreover,the CS algorithm is applied to extract the useful feature vectors.Moreover,the CS algorithm is employed with the Gated Recurrent Unit(GRU)model to detect and classify phishing emails.Furthermore,the CS algorithm is used to fine-tune the parameters involved in the GRU model.The performance of the proposed ICSOA-DLPEC model was experimentally validated using a benchmark dataset,and the results were assessed under several dimensions.Extensive comparative studies were conducted,and the results confirmed the superior performance of the proposed ICSOA-DLPEC model over other existing approaches.The proposed model achieved a maximum accuracy of 99.72%.
Abstract: The brains of humans and other organisms are affected in various ways by the electromagnetic field (EMF) radiation generated by mobile phones and cell-phone towers. Morphological variations in the brain are caused by neurological changes due to EMF exposure. Cellular-level analysis can measure and detect the effect of mobile radiation, but it is expensive and tedious, since it requires the preparation of cell suspensions. This article therefore proposes optimal broadcasting learning to detect changes in brain morphology due to EMF exposure, with Drosophila melanogaster serving as the specimen. Automatic segmentation of microscopic brain images is performed, and discriminative geometrical features are extracted to detect the effect of EMF exposure. Analysis of the segmented microscopic brain images reveals several discriminative features that can be processed by machine-learning techniques. The most important features, chosen by a feature-selection method, are given to four classifiers, namely naïve Bayes, artificial neural network, support vector machine, and random forest, to classify a microscopic image of the D. melanogaster brain as open or non-open. Experimental evaluations show that the classifiers perform well, achieving 96.44% using the selected features. The proposed system automatically identifies the effect of EMF exposure with minimal time complexity, with the machine-learning techniques providing an effective framework for image processing.
Abstract: With the development of technology in fields like big data analysis, data mining, cloud computing, and blockchain technology, security has become more constrained. Blockchain provides security by encrypting shared information; it is applied in peer-to-peer (P2P) networks and maintains a decentralized ledger. Protection against unauthorized breaches in the distributed network is required. Numerous techniques have been developed to detect unauthorized breaches, but they are inefficient and have poor data integrity. Hence, a novel technique is needed to tackle new breaches in the distributed network. This paper proposes a hybrid technique combining the Twofish cipher with a ripple consensus algorithm (TF-RC). To improve detection time and security, the technique uses efficient transmission of data in the distributed network. Experimental analysis of TF-RC in terms of latency, throughput, and energy efficiency shows better performance.
Abstract: The adoption of the Sustainable Development Goals (SDGs) under Agenda 2030 is one of the most ambitious ventures by the United Nations for the betterment of humanity. The SDGs are a comprehensive and holistic approach to making the lives of humans, not only the current population but also future generations, worth living and celebrating. The SDGs aim to end poverty, hunger, and inequity; provide everyone with clean water, energy, and a clean environment; and make the planet more peaceful, just, and habitable [1].
Abstract: Integrated CloudIoT is an emerging field of study that integrates the Cloud and the Internet of Things (IoT) to make machines smarter and to deal with real-world objects in a distributed manner. It collects data from various devices and analyzes it to increase efficiency and productivity. Because Cloud and IoT are complementary technologies with distinct areas of application, integrating them is difficult. This paper identifies various CloudIoT issues and analyzes them to build a relational model. The Interpretive Structural Modeling (ISM) approach establishes the interrelationships among the identified problems. The issues are categorized by driving and dependence power, and a hierarchical model is presented. The ISM analysis shows that scheduling is an important aspect with both driving and dependence power for improving the performance of the CloudIoT model. Therefore, existing CloudIoT job-scheduling algorithms are analyzed, and a cloud-centric scheduling mechanism is proposed to execute IoT jobs on a suitable cloud. A cloud implementation based on the jobs' workload, using CloudSim, an open-source framework for simulating Cloud Computing, is presented. Simulation results of the proposed scheduling model indicate better performance in terms of Average Waiting Time (AWT) and makespan than existing cloud-based scheduling approaches.
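To make the two reported metrics concrete, the following stdlib-only Python sketch computes AWT and makespan for a toy greedy scheduler that sends each job to the currently least-loaded cloud. This is a hypothetical baseline for illustration, not the paper's scheduling mechanism:

```python
def schedule(jobs, n_clouds=2):
    """Greedy cloud-centric scheduler: each IoT job goes to the currently
    least-loaded cloud; returns Average Waiting Time (AWT) and makespan."""
    loads = [0.0] * n_clouds
    waits = []
    for length in jobs:
        k = loads.index(min(loads))  # pick the least-loaded cloud
        waits.append(loads[k])       # the job waits behind queued work there
        loads[k] += length
    awt = sum(waits) / len(waits)
    makespan = max(loads)            # finish time of the busiest cloud
    return awt, makespan

awt, makespan = schedule([4, 2, 3, 1], n_clouds=2)
```

AWT rewards keeping queues short, while makespan rewards balancing total load; a scheduler can trade one against the other, which is why both metrics are reported.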
Abstract: Data offloading with less time and reduced energy consumption is highly important for every technology, and smart applications must process data quickly with low power consumption. As technology moves toward the 5G communication architecture, identifying a solution for QoS in 5G through energy-efficient computing is important. In the proposed model, data offloading at 5G is performed using the fuzzification concept. Mobile IoT devices create tasks in the network, which are offloaded to the cloud or to mobile edge nodes based on energy consumption. Two base stations, small (SB) and macro (MB), are initialized, and the first tasks are computed randomly. The tasks are then processed using a fuzzification algorithm to select SB or MB at the central server. Optimization is performed using a grasshopper algorithm to improve the QoS of the 5G network. The results are compared with existing algorithms and indicate that the proposed system improves performance, with a cost of 44.64 J for computing 250 benchmark tasks.
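As a loose illustration of how a fuzzification step can drive the SB-versus-MB choice, the sketch below uses two toy triangular-style membership functions over task size plus a battery bias. The membership shapes, thresholds, and weights are entirely hypothetical and are not the paper's rule base:

```python
def offload_decision(task_size, battery, sb_threshold=50.0):
    """Toy fuzzification-based offloading (hypothetical rules): small tasks,
    especially on low battery, go to the small base station (SB); large
    tasks go to the macro base station (MB)."""
    # Triangular-style memberships over task size in [0, 2 * sb_threshold]
    mu_small = max(0.0, 1.0 - task_size / sb_threshold)
    mu_large = max(0.0, (task_size - sb_threshold) / sb_threshold)
    # Low battery biases toward the nearer, cheaper SB
    mu_small += (1.0 - battery) * 0.2
    return "SB" if mu_small >= mu_large else "MB"

choice = offload_decision(task_size=20, battery=0.3)
```

A real system would defuzzify over many inputs (queue length, channel quality, residual energy); this fragment only shows the membership-then-compare pattern.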
Abstract: In this article, mathematical modeling for the evaluation of reliability is studied using two methods. One method is developed based on possibility theory. The performance of the reliability of the system is of prime concern, so the outcomes for failure must be evaluated with utmost care. In possibility theory, the reliability information determined from decision-making experts is subjective, and the method deals with survival possibilities rather than survival probabilities. The other method is developed using the closed-interval approximation of piecewise quadratic fuzzy numbers; here, a decision-making expert is not sure of his or her estimates of the reliability parameters. Numerical experiments illustrate the efficiency of the suggested methods. The paper concludes with some future research directions for the proposed approach.
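When expert estimates are uncertain, one common simplification is to carry reliability as a closed interval per component. The stdlib-only sketch below propagates such intervals through a series system by interval multiplication; it is a generic interval-arithmetic illustration, not the paper's piecewise quadratic fuzzy-number method:

```python
def series_reliability_interval(components):
    """Interval reliability of a series system: the system works only if
    every component works, so multiply the lower bounds together and the
    upper bounds together (valid since all values lie in [0, 1])."""
    lo, hi = 1.0, 1.0
    for a, b in components:
        lo *= a
        hi *= b
    return lo, hi

# Two components whose reliabilities the expert can only bound
bounds = series_reliability_interval([(0.90, 0.95), (0.80, 0.90)])
```

A fuzzy-number treatment refines this by attaching a membership grade to every value inside the interval rather than treating all of them as equally plausible.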
Abstract: Optimizing the performance of composite structures is a real-world application with significant benefits. In this paper, a high-fidelity finite element method (FEM) is combined with the iterative-improvement capability of metaheuristic optimization algorithms to obtain optimized composite plates. The FEM module comprises a nine-node isoparametric plate-bending element in conjunction with the first-order shear deformation theory (FSDT). A recently proposed memetic version of particle swarm optimization, RPSOLC, is modified in the current research to carry out multi-objective Pareto optimization. The performance of the MO-RPSOLC is found to be comparable with that of NSGA-III. This work successfully highlights the use of FEM-MO-RPSOLC in obtaining high-fidelity Pareto solutions that simultaneously maximize the fundamental frequency and the frequency separation in laminated composites by optimizing the stacking sequence.
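The core of any multi-objective Pareto optimizer is non-dominance filtering. The stdlib-only sketch below keeps the non-dominated designs for two maximization objectives (e.g., fundamental frequency and frequency separation); the candidate tuples are invented for illustration and the O(n^2) scan is the textbook formulation, not the MO-RPSOLC internals:

```python
def pareto_front(points):
    """Return the non-dominated points among (f1, f2) pairs, both to be
    maximized: p is dominated if some other q is at least as good in both
    objectives."""
    return [
        p for p in points
        if not any(q[0] >= p[0] and q[1] >= p[1] and q != p for q in points)
    ]

# Hypothetical (fundamental frequency, frequency separation) candidates
designs = [(100, 5), (120, 3), (90, 8), (95, 4)]
front = pareto_front(designs)
```

Here (95, 4) is dropped because (100, 5) beats it in both objectives; the remaining designs each trade one objective against the other.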
Abstract: Major fields such as military applications, medical fields, weather forecasting, and environmental applications use wireless sensor networks for major computing processes, and sensors play a vital role in emerging technologies of the 20th century. Localizing sensors at the needed locations is a serious problem. The growth of industry since the industrial revolution has increased pollution across the environment, and owing to recent uncontrolled growth and development, sensors are needed to measure pollution levels across industries and their surroundings. Choosing where to place the sensors is an interesting and challenging task. Many meta-heuristic techniques have been introduced for node localization, and swarm-intelligence algorithms have proven their efficiency in many studies on localization problems. In this article, we introduce an industry-centric approach to the node-localization problem in sensor networks. First, our work selects industrial areas in the sensed location, using random forest regression to identify polluted areas. Then, the elephant herding algorithm is used for sensor-node localization. These two algorithms are combined to produce the best standard result for localizing the sensor nodes. To evaluate the proposed approach, experiments are conducted with data from the KDD Cup 2018, which contain 35 stations with concentrations of air pollutants such as PM, SO_(2), CO, NO_(2), and O_(3). These data are normalized and tested with the algorithms. The results are comparatively analyzed against other swarm-intelligence algorithms, such as the elephant herding algorithm and particle swarm optimization, and machine-learning algorithms, such as decision tree regression and the multi-layer perceptron. The results indicate that the proposed algorithm suggests more meaningful locations for the sensors in the topology, achieving a lower root mean square error of 0.06 to 0.08 for localizing Stations 1 to 5.
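The reported 0.06-0.08 figure is a root mean square error, which can be computed as below. The sample coordinate values are invented for illustration; only the metric itself is from the abstract:

```python
import math

def rmse(predicted, actual):
    """Root mean square error between predicted and true sensor
    coordinates, the localization metric reported for Stations 1 to 5."""
    sq = [(p - a) ** 2 for p, a in zip(predicted, actual)]
    return math.sqrt(sum(sq) / len(sq))

err = rmse([1.05, 2.02, 2.95], [1.00, 2.00, 3.00])
```

Lower RMSE means the optimizer's proposed sensor positions sit closer to the reference positions, so the 0.06-0.08 range indicates a small average placement error after normalization.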
Abstract: Urban living in large modern cities exerts considerable adverse effects on health and thus increases the risk of contracting several chronic kidney diseases (CKD). The prediction of CKD has become a major task in urbanized countries. The primary objective of this work is to introduce and develop predictive analytics for CKD. However, prediction over huge samples is becoming increasingly difficult. MapReduce provides a feasible framework for programming predictive algorithms with map and reduce functions; its relatively simple programming interface helps solve problems in the scalability and efficiency of predictive learning algorithms. In the proposed work, an iterative weighted MapReduce framework is introduced for the effective management of large dataset samples. A binary classification problem is formulated using an ensemble of nonlinear support vector machines and random forests. Thus, instead of using the normal linear combination of kernel activations, the proposed work creates nonlinear combinations of kernel activations on prototype examples. Furthermore, different descriptors are combined in an ensemble of deep support vector machines, where the product rule is used to combine the probability estimates of the different classifiers. Performance is evaluated in terms of prediction accuracy and the interpretability of the model and results.
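The product rule mentioned above fuses per-class probability estimates from several classifiers by multiplying them and renormalizing. A minimal stdlib-only sketch, with invented probability values for two classifiers on a binary CKD/non-CKD problem:

```python
def product_rule(prob_lists):
    """Combine per-class probability estimates from several classifiers by
    multiplying class-wise, then renormalizing so the result sums to 1."""
    n_classes = len(prob_lists[0])
    combined = [1.0] * n_classes
    for probs in prob_lists:
        for c in range(n_classes):
            combined[c] *= probs[c]
    total = sum(combined)
    return [p / total for p in combined]

# Two classifiers' hypothetical [P(CKD), P(not CKD)] estimates
fused = product_rule([[0.8, 0.2], [0.6, 0.4]])
```

Because the rule multiplies, a single classifier assigning near-zero probability to a class can veto it, which makes the product rule sharper but less forgiving than averaging.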
Abstract: In contemporary medicine, cardiovascular disease is a major public health concern. Cardiovascular diseases are among the leading causes of death worldwide and are classified as vascular, ischemic, or hypertensive. Clinical information contained in patients' Electronic Health Records (EHR) enables clinicians to identify and monitor heart illness. Heart-failure rates have risen dramatically in recent years as a result of changes in modern lifestyles, and heart diseases are becoming more prevalent in today's medical setting. Each year, a substantial number of people die of cardiac causes, primarily because of the improper use of pharmaceuticals without the supervision of a physician and the late detection of disease. To improve the efficiency of the classification algorithms, we construct a data pre-processing stage using feature selection. Experiments using unidirectional and bidirectional neural network models found that a Deep Learning Modified Neural Network (DLMNN) model combined with the Pet Dog-Smell Sensing (PD-SS) algorithm achieved the highest classification performance on the UCI Machine Learning Heart Disease dataset. The DLMNN-based PD-SS achieved an accuracy of 94.21%, an F-score of 92.38%, a recall of 94.62%, and a precision of 93.86%. These results are competitive and promising for a heart-disease dataset. We demonstrated that a DLMNN framework based on deep models may be used to solve the classification problem for an unbalanced heart-disease dataset. Our proposed approach can produce exceptionally accurate models that can be utilized to analyze and diagnose clinical real-world data.
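The four figures reported above (accuracy, F-score, recall, precision) all derive from confusion-matrix counts. A stdlib-only sketch with invented counts, showing the standard definitions rather than the paper's experiment:

```python
def classification_metrics(tp, fp, fn, tn):
    """Accuracy, precision, recall, and F-score from confusion-matrix
    counts (true/false positives and negatives)."""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)          # of predicted positives, how many are real
    recall = tp / (tp + fn)             # of real positives, how many were found
    f_score = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, f_score

acc, prec, rec, f1 = classification_metrics(tp=90, fp=10, fn=5, tn=95)
```

On an unbalanced dataset such as the one discussed, precision, recall, and F-score are more informative than accuracy alone, which is why all four are reported together.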
Abstract: This research work presents a novel state-of-the-art study performed on a multi-cylinder SI engine fueled with compressed natural gas, emulsified fuel, and hydrogen as dual fuel. The work predicts the overall performance, combustion, and exhaust-emission characteristics of the individual fuels based on AVL Boost simulation technology, comparing and analyzing the three alternative fuels. The results show that hydrogen produces 20% more brake power than CNG and 25% more than micro-emulsion fuel at 1500 r/min; at higher engine speeds of 2500-4000 r/min, the brake power of hydrogen, CNG, and micro-emulsions increases further, in the range of 25%, 20%, and 15%, respectively. In addition, the brake-specific fuel consumption at 1500 r/min is lowest for 100% hydrogen, followed by 100% CNG and then micro-emulsions. At 2500-5000 r/min, there is a significant drop in brake-specific fuel consumption due to the lean mixture at higher engine speeds. The CO, HC, and NOx emissions improve significantly for hydrogen, CNG, and micro-emulsion fuel. Hydrogen shows zero CO and HC emissions, and the main objective of this research is to produce 0% carbon-based emissions with only a slight increase in NOx emissions; CNG shows 30% lower CO emissions and 21.5% lower hydrocarbon emissions than micro-emulsion fuel at the stoichiometric air/fuel ratio.
Abstract: A new atypical pneumonia caused by a virus called Coronavirus (COVID-19) appeared in Wuhan, China in December 2019. Unlike previous epidemics due to the severe acute respiratory syndrome (SARS) and the Middle East respiratory syndrome coronavirus (MERS-CoV), COVID-19 has the particularity of being more contagious. In this paper, we try to predict the COVID-19 epidemic peak in Japan with the help of real-time data from January 15 to February 29, 2020, using fractional derivatives, namely Caputo derivatives, Caputo–Fabrizio derivatives, and Atangana–Baleanu derivatives in the Caputo sense. Fixed point theory and the Picard–Lindelöf approach provide the proof of the existence and uniqueness of the solutions to the noninteger-order models under investigation. For each fractional model, we propose a numerical scheme and prove its stability. Using parameter values estimated from the Japan COVID-19 epidemic real data, we perform numerical simulations to confirm the effectiveness of the approximation methods for different values of the fractional order γ, and to give predictions of COVID-19 epidemic peaks in Japan in a specific range of time intervals.
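For reference, the Caputo fractional derivative of order γ that underlies models of this kind is commonly defined, for 0 < γ < 1, as:

```latex
{}^{C}D_{t}^{\gamma} f(t)
  = \frac{1}{\Gamma(1-\gamma)} \int_{0}^{t} \frac{f'(\tau)}{(t-\tau)^{\gamma}}\, d\tau,
  \qquad 0 < \gamma < 1,
```

where Γ is the gamma function. The Caputo–Fabrizio and Atangana–Baleanu operators replace the singular kernel (t-τ)^{-γ} with non-singular exponential and Mittag-Leffler kernels, respectively, which is what distinguishes the three model families compared in the paper.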
Abstract: A stochastic SIR influenza vertical-transmission model with vaccination and a nonlinear incidence rate is examined in this paper. To determine whether testosterone regulates HPA axis function in males, we used a stochastic SIR epidemic procedure with divergent influences on ACTH and cortisol; the suppressive effects on cortisol can be attributed to a peripheral (adrenal) locus. Following that, experimental solutions are derived and the requisite statistical findings are examined. Finally, we deduce that the given mathematical model and its results are relevant to medical research. In the future, this research can be extended to simulate more results in the medical field.
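Stochastic SIR models of this kind are typically simulated with an Euler–Maruyama scheme. The stdlib-only sketch below is a generic toy version with multiplicative noise on the transmission term; the noise placement, parameter values, and step count are hypothetical and are not the paper's exact formulation:

```python
import random

def sir_em_step(S, I, R, beta, gamma, sigma, dt, rng):
    """One Euler-Maruyama step of a toy stochastic SIR model: Brownian
    noise perturbs the transmission term, recovery stays deterministic."""
    N = S + I + R
    dW = rng.gauss(0.0, dt ** 0.5)  # Brownian increment over dt
    infection = beta * S * I / N * dt + sigma * S * I / N * dW
    recovery = gamma * I * dt
    return S - infection, I + infection - recovery, R + recovery

rng = random.Random(42)             # fixed seed for reproducibility
S, I, R = 990.0, 10.0, 0.0
for _ in range(100):
    S, I, R = sir_em_step(S, I, R, beta=0.5, gamma=0.2, sigma=0.05, dt=0.1, rng=rng)
```

Note that each step moves mass between compartments without creating or destroying it, so the total population S + I + R is conserved along the whole sample path.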