Journal Articles
18 articles found
1. A Novel Approach to Breast Tumor Detection: Enhanced Speckle Reduction and Hybrid Classification in Ultrasound Imaging
Authors: K. Umapathi, S. Shobana, Anand Nayyar, Judith Justin, R. Vanithamani, Miguel Villagómez Galindo, Mushtaq Ahmad Ansari, Hitesh Panchal. Computers, Materials & Continua (SCIE, EI), 2024, Issue 5, pp. 1875-1901 (27 pages).
Breast cancer detection heavily relies on medical imaging, particularly ultrasound, for early diagnosis and effective treatment. This research addresses the challenges associated with computer-aided diagnosis (CAD) of breast cancer from ultrasound images. The primary challenge is accurately distinguishing between malignant and benign tumors, complicated by factors such as speckle noise, variable image quality, and the need for precise segmentation and classification. The main objective of the paper is to develop an advanced methodology for breast ultrasound image classification, focusing on speckle noise reduction, precise segmentation, feature extraction, and machine learning-based classification. A unique approach is introduced that combines Enhanced Speckle Reducing Anisotropic Diffusion (SRAD) filters for speckle noise reduction, U-NET-based segmentation, Genetic Algorithm (GA)-based feature selection, and Random Forest and Bagging Tree classifiers, resulting in a novel and efficient model. To test and validate the hybrid model, rigorous experiments were performed; the results show that the proposed hybrid model achieved an accuracy rate of 99.9%, outperforming other existing techniques while significantly reducing computational time. This enhanced accuracy, along with improved sensitivity and specificity, makes the proposed hybrid model a valuable addition to CAD systems in breast cancer diagnosis, ultimately enhancing diagnostic accuracy in clinical applications.
Keywords: ultrasound images; breast cancer; tumor classification; segmentation; deep learning; lesion detection
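As a rough illustration of the diffusion idea behind SRAD-style speckle reduction (not the paper's Enhanced SRAD, whose diffusion coefficient is driven by local speckle statistics), a minimal Perona-Malik-type filter in Python/NumPy might look like this; the iteration count, kappa, and step size are assumed values:

```python
import numpy as np

def diffusion_denoise(img, n_iter=20, kappa=30.0, gamma=0.15):
    """Simplified anisotropic diffusion: smooths speckled flat regions
    while the edge-stopping term preserves tumor boundaries."""
    u = img.astype(np.float64).copy()
    for _ in range(n_iter):
        # Four-neighbour gradients (periodic boundaries for brevity).
        dN = np.roll(u, 1, axis=0) - u
        dS = np.roll(u, -1, axis=0) - u
        dE = np.roll(u, -1, axis=1) - u
        dW = np.roll(u, 1, axis=1) - u
        # Diffuse strongly where gradients are small (likely speckle).
        g = lambda d: np.exp(-(d / kappa) ** 2)
        u += gamma * (g(dN) * dN + g(dS) * dS + g(dE) * dE + g(dW) * dW)
    return u
```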
2. MADCR: Mobility Aware Dynamic Clustering-Based Routing Protocol in Internet of Vehicles (Cited by 9)
Authors: Sankar Sennan, Somula Ramasubbareddy, Sathiyabhama Balasubramaniyam, Anand Nayyar, Chaker Abdelaziz Kerrache, Muhammad Bilal. China Communications (SCIE, CSCD), 2021, Issue 7, pp. 69-85 (17 pages).
Internet of Vehicles (IoV) is an evolution of the Internet of Things (IoT) to improve the capabilities of vehicular ad-hoc networks (VANETs) in intelligent transport systems. The network topology in the IoV paradigm is highly dynamic. Clustering is one of the promising solutions to maintain route stability in the dynamic network. However, existing algorithms consume a considerable amount of time in the cluster head (CH) selection process. Thus, this study proposes a mobility aware dynamic clustering-based routing (MADCR) protocol in IoV to maximize the lifespan of networks and reduce the end-to-end delay of vehicles. The MADCR protocol consists of cluster formation and CH selection processes. A cluster is formed on the basis of Euclidean distance. The CH is then chosen using the mayfly optimization algorithm (MOA). The CH subsequently receives vehicle data and forwards it to the Road Side Unit (RSU). The performance of the MADCR protocol is compared with that of Ant Colony Optimization (ACO), Comprehensive Learning Particle Swarm Optimization (CLPSO), and the Clustering Algorithm for Internet of Vehicles based on Dragonfly Optimizer (CAVDO). The proposed MADCR protocol decreases the end-to-end delay by 5-80 ms and increases the packet delivery ratio by 5%-15%.
Keywords: clustering protocol; Internet of Things; Internet of Vehicles; optimization algorithm; mayfly algorithm
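The Euclidean cluster-formation step lends itself to a compact sketch. Below is a minimal greedy grouping of vehicle positions, assuming a communication radius the abstract does not specify; the mayfly-based CH selection would then score each member inside every cluster (e.g., by centrality and relative mobility):

```python
import numpy as np

def form_clusters(positions, radius=300.0):
    """Group vehicles whose Euclidean distance to a seed vehicle is
    within `radius` (an assumed value, in meters)."""
    unassigned = set(range(len(positions)))
    clusters = []
    while unassigned:
        seed = unassigned.pop()
        members = [seed]
        for v in sorted(unassigned):
            if np.linalg.norm(positions[seed] - positions[v]) <= radius:
                members.append(v)
                unassigned.remove(v)
        clusters.append(members)
    return clusters

# form_clusters(np.array([[0, 0], [120, 50], [900, 40]])) -> [[0, 1], [2]]
```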
3. System Performance of Wireless Sensor Network Using LoRa–Zigbee Hybrid Communication (Cited by 4)
Authors: Van-Truong Truong, Anand Nayyar, Showkat Ahmad Lone. Computers, Materials & Continua (SCIE, EI), 2021, Issue 8, pp. 1615-1635 (21 pages).
Wireless sensor networks (WSNs) are considered the fastest growing technology pattern in recent years because of their applicability in varied domains. Many sensor nodes with different sensing functionalities are deployed in the monitoring area to collect suitable data and transmit it to the gateway. Ensuring communication in heterogeneous WSNs is a critical issue that needs to be studied. In this research paper, we study the system performance of a heterogeneous WSN using LoRa–Zigbee hybrid communication. Specifically, two Zigbee sensor clusters and two LoRa sensor clusters are combined with two Zigbee-to-LoRa converters to communicate in a network managed by a LoRa gateway. The overall system integrates many different sensors in terms of types, communication protocols, and accuracy, and can be used in many applications in realistic environments such as on land, under water, or in the air. In addition, synchronous management software on the ThingSpeak Web server and the Blynk app is designed. In the proposed system, a token ring protocol is used in the Zigbee network and a polling mechanism in the LoRa network. The system can operate with a packet loss rate of less than 0.5% when the communication range of the Zigbee network is 630 m and the communication range of the LoRa network is 3.7 km. On the basis of the digital results collected in the management software, this study demonstrates substantial improvements in system performance.
Keywords: wireless sensor network; Zigbee; LoRa; communication protocol; converter; packet loss
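The medium-access discipline described here (token ring on the Zigbee side, gateway-driven polling on the LoRa side) can be sketched as follows; `node.send_pending` and `gateway.poll` are hypothetical stand-ins for the real radio calls:

```python
from itertools import cycle

def token_ring(nodes):
    """Zigbee side: the token circulates and only its holder transmits,
    so cluster members never collide."""
    for node in cycle(nodes):
        node.send_pending()   # hypothetical transmit call
        yield node            # token handed to the next node

def poll_converters(gateway, converters):
    """LoRa side: the gateway queries each Zigbee-to-LoRa converter in
    turn, so only one uplink is active at a time."""
    return {c: gateway.poll(c, timeout=2.0) for c in converters}
```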
4. Role of Fuzzy Approach towards Fault Detection for Distributed Components (Cited by 3)
Authors: Yaser Hafeez, Sadia Ali, NZ Jhanjhi, Mamoona Humayun, Anand Nayyar, Mehedi Masud. Computers, Materials & Continua (SCIE, EI), 2021, Issue 5, pp. 1979-1996 (18 pages).
Component-based software development is rapidly introducing numerous new paradigms and possibilities to deliver highly customized software in a distributed environment. Among other communication, teamwork, and coordination problems in global software development, the detection of faults is seen as the key challenge. Thus, there is a need to ensure the reliability of component-based application requirements. Distributed fault detection must track components drawn from various sources, and existing approaches fail to keep track of the large number of components from different locations. In this study, we propose an approach for fault detection from component-based system requirements using fuzzy logic and historical information during acceptance testing. The approach identifies error-prone components for test case extraction and prioritizes test cases to validate components in acceptance testing. For the evaluation, we used an empirical study, and the results show that the proposed approach significantly outperforms conventional procedures (requirement criteria and communication coverage criteria) in component selection and acceptance testing, without irrelevancy and redundancy. Consequently, the F-measures of the proposed approach confirm the accurate selection of components: fault identification in components using the proposed approach was higher (more than 80 percent) than with requirement criteria and code coverage criteria procedures (less than 80 percent). Similarly, the rate of fault detection with the proposed approach increased to 92.80 percent, compared with less than 80 percent for existing methods. The proposed approach provides a comprehensive guideline and roadmap for practitioners and researchers.
Keywords: component-based software; selection; acceptance testing; fault detection
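A toy version of the fuzzy scoring idea: grade each component's change activity and fault history through membership functions, then combine them with a Mamdani-style AND so the most error-prone components are selected first for acceptance testing. The membership breakpoints below are assumptions, not the paper's rule base:

```python
def ramp_up(x, a, b):
    """Increasing shoulder membership: 0 below a, rising linearly to 1 at b."""
    return min(max((x - a) / (b - a), 0.0), 1.0)

def fault_proneness(change_rate, fault_density):
    """Fuzzy AND (min) of 'changes often' and 'failed often'."""
    return min(ramp_up(change_rate, 0.3, 0.7),
               ramp_up(fault_density, 0.2, 0.6))

components = {"auth": (0.8, 0.5), "billing": (0.4, 0.1)}
ranked = sorted(components, key=lambda c: fault_proneness(*components[c]),
                reverse=True)   # test the riskiest components first
```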
5. P-ACOHONEYBEE: A Novel Load Balancer for Cloud Computing Using Mathematical Approach (Cited by 1)
Authors: Sunday Adeola Ajagbe, Mayowa O. Oyediran, Anand Nayyar, Jinmisayo A. Awokola, Jehad F. Al-Amri. Computers, Materials & Continua (SCIE, EI), 2022, Issue 10, pp. 1943-1959 (17 pages).
Cloud computing is a collection of disparate resources or services, a web of massive infrastructures, aimed at achieving maximum utilization with higher availability at a minimized cost. One of the most attractive applications for cloud computing is the concept of distributed information processing. Security, privacy, energy saving, reliability, and load balancing are the major challenges facing cloud computing and most information technology innovations. Load balancing is the process of redistributing workload among all nodes in a network to improve resource utilization and job response time while avoiding overloading some nodes when others are underloaded or idle; it remains a major challenge. Thus, this research aims to design a novel load balancing system for a cloud computing environment. The research is based on the modification of existing approaches, namely particle swarm optimization (PSO), honeybee, and ant colony optimization (ACO), with mathematical expressions, to form a novel approach called P-ACOHONEYBEE. Experiments were conducted on response time and throughput. The response times of honeybee, PSO, SASOS, round-robin, PSO-ACO, and P-ACOHONEYBEE are 2791, 2780, 2784, 2767, 2727, and 2599 ms, respectively. The throughputs of honeybee, PSO, SASOS, round-robin, PSO-ACO, and P-ACOHONEYBEE are 7451, 7425, 7398, 7357, 7387, and 7482 bps, respectively. It is observed that the P-ACOHONEYBEE approach produces the lowest response time, the highest throughput, and overall improved performance for the 10 nodes. The research helps manage the imbalance drawback by maximizing throughput and reducing response time, with scalability and reliability.
Keywords: ACO; cloud computing; load balancing; swarm intelligence; PSO; P-ACOHONEYBEE; honeybee swarm
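The foraging intuition behind the honeybee component reduces, in its simplest form, to dispatching each incoming task to the node currently advertising the lightest load; P-ACOHONEYBEE layers ACO/PSO heuristics on top of this base behaviour. A minimal sketch:

```python
import heapq

def least_loaded_dispatch(task_sizes, n_nodes):
    """Send each task (forager) to the least-loaded node (richest food
    source); returns a task -> node placement."""
    heap = [(0.0, node) for node in range(n_nodes)]
    heapq.heapify(heap)
    placement = {}
    for task, size in enumerate(task_sizes):
        load, node = heapq.heappop(heap)
        placement[task] = node
        heapq.heappush(heap, (load + size, node))
    return placement

print(least_loaded_dispatch([5, 3, 8, 2, 7], n_nodes=3))
```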
6. Exploration of IoT Nodes Communication Using LoRaWAN in Forest Environment (Cited by 1)
Authors: Anshul Sharma, Divneet Singh Kapoor, Anand Nayyar, Basit Qureshi, Kiran Jot Singh, Khushal Thakur. Computers, Materials & Continua (SCIE, EI), 2022, Issue 6, pp. 6239-6256 (18 pages).
The simultaneous advances in the Internet of Things (IoT), artificial intelligence (AI), and robotics are going to revolutionize our world in the near future. In recent years, LoRa (Long Range) wireless, powered by the LoRaWAN (LoRa Wide Area Network) protocol, has attracted the attention of researchers for numerous applications in the IoT domain. LoRa is a low-power wireless technology in the unlicensed Industrial, Scientific, and Medical (ISM) band that utilizes a wide area network protocol, i.e., LoRaWAN, to incorporate itself into the network infrastructure. In this paper, we have evaluated the LoRaWAN communication protocol for the implementation of IoT node communication in a forest scenario. The outdoor performance of LoRa wireless in LoRaWAN, i.e., the physical layer, has been evaluated in the forest area of Kashirampur, Uttarakhand, India. The paper analyzes the performance of LoRaWAN technology by observing the changes in Signal to Noise Ratio (SNR), Packet Reception Ratio (PRR), and Received Signal Strength Indicator (RSSI) with respect to the distance between IoT nodes. The article focuses on estimating network lifetime for a specific set of LoRa configuration parameters, hardware selection, and power constraints. From the experimental results, it has been observed that transmissions can propagate to a distance of 300 m in the forest environment, while consuming approximately 63% less energy for spreading factor 7 at 2 dBm, without incurring significant packet loss (PRR greater than 80%).
Keywords: LoRa; LoRaWAN; IoT; communication protocol; wireless sensor networks; packet reception ratio
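The RSSI-versus-distance behaviour measured in such studies is commonly modelled with a log-distance path-loss law. The sketch below uses an assumed exponent of 3.5 for dense foliage and an assumed 40 dB reference loss; only the 2 dBm transmit power comes from the abstract:

```python
import math

def rssi_estimate(d_m, tx_dbm=2.0, pl0_db=40.0, n=3.5, d0_m=1.0):
    """Log-distance model: RSSI = Ptx - (PL0 + 10*n*log10(d/d0))."""
    return tx_dbm - (pl0_db + 10.0 * n * math.log10(d_m / d0_m))

for d in (50, 150, 300):
    print(f"{d:>4} m -> {rssi_estimate(d):6.1f} dBm")
```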
7. Predicting Resource Availability in Local Mobile Crowd Computing Using Convolutional GRU
Authors: Pijush Kanti Dutta Pramanik, Nilanjan Sinhababu, Anand Nayyar, Mehedi Masud, Prasenjit Choudhury. Computers, Materials & Continua (SCIE, EI), 2022, Issue 3, pp. 5199-5212 (14 pages).
In mobile crowd computing (MCC), people's smart mobile devices (SMDs) are utilized as computing resources. Considering the ever-growing computing capabilities of today's SMDs, a collection of them can offer significantly high-performance computing services. In a local MCC, the SMDs are typically connected to a local Wi-Fi network. Organizations and institutions can leverage the SMDs available within the campus to form local MCCs that cater to their computing needs without any financial and operational burden. Though it offers an economical and sustainable computing solution, users' mobility poses a serious issue for the QoS of MCC. To address this, before submitting a job to an SMD, we suggest estimating that particular SMD's availability in the network until the job is finished. For this, we propose a convolutional GRU-based prediction model to assess how long an SMD is likely to be available in the network from any given point in time. For experimental purposes, we collected real users' mobility data (in-time and out-time) with respect to a Wi-Fi access point. To build the prediction model, we present a novel feature extraction method applied to the time-series data. The experimental results prove that the proposed convolutional GRU model outperforms the conventional GRU model.
Keywords: resource selection; resource availability; mobile grid; mobile cloud; ad-hoc cloud; crowd computing; deep learning; GRU; CNN; RNN
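A minimal Keras sketch of the model family the abstract describes: a Conv1D layer extracts local patterns from windows of the extracted mobility features, a GRU captures longer temporal structure, and a regression head estimates remaining availability time. Layer sizes and the window length are assumptions:

```python
import tensorflow as tf

def build_conv_gru(window=24, n_features=4):
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(window, n_features)),
        tf.keras.layers.Conv1D(32, kernel_size=3, activation="relu"),
        tf.keras.layers.GRU(64),
        tf.keras.layers.Dense(1),   # predicted remaining availability
    ])
    model.compile(optimizer="adam", loss="mse")
    return model
```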
8. LOA-RPL: Novel Energy-Efficient Routing Protocol for the Internet of Things Using Lion Optimization Algorithm to Maximize Network Lifetime
Authors: Sankar Sennan, Somula Ramasubbareddy, Anand Nayyar, Yunyoung Nam, Mohamed Abouhawwash. Computers, Materials & Continua (SCIE, EI), 2021, Issue 10, pp. 351-371 (21 pages).
Energy conservation is a significant task in the Internet of Things (IoT) because IoT involves highly resource-constrained devices. Clustering is an effective technique for saving energy by reducing duplicate data. In a clustering protocol, the selection of a cluster head (CH) plays a key role in prolonging the lifetime of a network. However, most cluster-based protocols, including routing protocols for low-power and lossy networks (RPLs), have used fuzzy logic and probabilistic approaches to select the CH node. Consequently, early battery depletion occurs near the sink. To overcome this issue, a lion optimization algorithm (LOA) for selecting the CH in RPL is proposed in this study. LOA-RPL comprises three processes: cluster formation, CH selection, and route establishment. A cluster is formed using the Euclidean distance. CH selection is performed using LOA. Route establishment is implemented using residual energy information. An extensive simulation is conducted in the network simulator ns-3 on various parameters, such as network lifetime, power consumption, packet delivery ratio (PDR), and throughput. The performance of LOA-RPL is also compared with those of RPL, fuzzy rule-based energy-efficient clustering and immune-inspired routing (FEEC-IIR), and the routing scheme for IoT that uses the shuffled frog-leaping optimization algorithm (RISA-RPL). The proposed LOA-RPL increases network lifetime by 20% and PDR by 5%-10% compared with RPL, FEEC-IIR, and RISA-RPL, and is also highly energy-efficient compared with other similar routing protocols.
Keywords: Internet of Things; cluster head; clustering protocol; optimization algorithm; lion optimization algorithm; network lifetime; routing protocol; wireless sensor networks; energy consumption; low-power and lossy networks
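The route-establishment rule (forward via the neighbour with the most residual energy, which spreads load and delays battery depletion near the sink) is simple enough to state directly; a minimal sketch, with energy values in assumed units:

```python
def next_hop(candidates):
    """Pick the feasible neighbour with the highest residual energy.
    `candidates` maps node id -> residual energy (joules, assumed)."""
    return max(candidates, key=candidates.get)

print(next_hop({"n3": 0.81, "n7": 0.42, "n9": 0.77}))   # -> n3
```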
9. Generating A New Shilling Attack for Recommendation Systems
Authors: Pradeep Kumar Singh, Pijush Kanti Dutta Pramanik, Madhumita Sardar, Anand Nayyar, Mehedi Masud, Prasenjit Choudhury. Computers, Materials & Continua (SCIE, EI), 2022, Issue 5, pp. 2827-2846 (20 pages).
Collaborative filtering-based recommendation systems have become an integral part of e-commerce and e-servicing. To keep recommendation systems reliable, authentic, and superior, their security is crucial. Though existing shilling attack detection methods in collaborative filtering are able to detect standard attacks, in this paper, we prove that they fail to detect a new or unknown attack. We develop a new attack model, named the Obscure attack, with unknown features, and observe that it succeeds in biasing the overall top-N list of the target users as intended. The Obscure attack is able to push target items into the top-N list as well as remove actually rated items from the list. Our proposed attack is more effective at a smaller number k of top-k similar users compared with other existing attacks. The effectiveness of the proposed attack model is tested on the MovieLens dataset, where various classifiers like SVM, J48, random forest, and naïve Bayes are utilized.
Keywords: shilling attack; recommendation system; collaborative filtering; top-N recommendation; biasing; shuffling; hit ratio
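For background, a generic push-style shilling profile (the well-known average-attack shape) looks like the sketch below: each fake user gives the target item the maximum rating plus a few plausible filler ratings. The paper's Obscure attack goes further by disguising exactly these detectable profile features; the profile and filler sizes here are assumptions:

```python
import numpy as np

def inject_push_profiles(ratings, target_item, n_fake=50, n_filler=20,
                         r_max=5.0, seed=0):
    """Append fake user rows that push `target_item` into top-N lists."""
    rng = np.random.default_rng(seed)
    n_items = ratings.shape[1]
    means = np.nan_to_num(
        np.nanmean(np.where(ratings > 0, ratings, np.nan), axis=0), nan=3.0)
    fakes = np.zeros((n_fake, n_items))
    for row in fakes:
        fillers = rng.choice(n_items, size=n_filler, replace=False)
        row[fillers] = np.clip(rng.normal(means[fillers], 1.0), 1.0, r_max)
        row[target_item] = r_max            # the pushed item
    return np.vstack([ratings, fakes])
```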
10. Using DEMATEL for Contextual Learner Modeling in Personalized and Ubiquitous Learning
Authors: Saurabh Pal, Pijush Kanti Dutta Pramanik, Musleh Alsulami, Anand Nayyar, Mohammad Zarour, Prasenjit Choudhury. Computers, Materials & Continua (SCIE, EI), 2021, Issue 12, pp. 3981-4001 (21 pages).
With the popularity of e-learning, personalization and ubiquity have become important aspects of online learning. To make learning more personalized and ubiquitous, we propose a learner model for a query-based personalized learning recommendation system. Several contextual attributes characterize a learner, but considering all of them is costly for a ubiquitous learning system. In this paper, a set of optimal intrinsic and extrinsic contexts of a learner are identified for learner modeling. A total of 208 students are surveyed. The DEMATEL (Decision Making Trial and Evaluation Laboratory) technique is used to establish the validity and importance of the identified contexts and to find the interdependencies among them. The methods for acquiring these contexts are also defined. On the basis of these contexts, the learner model is designed. A layered architecture is presented for interfacing the learner model with a query-based personalized learning recommendation system. In a ubiquitous learning scenario, the necessary adaptive decisions are identified to make personalized recommendations to a learner.
Keywords: personalized e-learning; DEMATEL; learner model; ontology; learner context; personalized recommendation; adaptive decisions
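DEMATEL itself is a short matrix computation, which makes the context-selection step easy to illustrate. Given a direct-influence matrix built from the survey (entry (i, j) = how strongly context i influences context j), the standard procedure is:

```python
import numpy as np

def dematel(direct_influence):
    """Normalise A, compute the total-relation matrix T = N(I - N)^-1,
    and return per-factor prominence (D + R) and relation (D - R);
    contexts with high prominence are the ones worth modelling."""
    A = np.asarray(direct_influence, dtype=float)
    s = max(A.sum(axis=1).max(), A.sum(axis=0).max())
    N = A / s
    T = N @ np.linalg.inv(np.eye(A.shape[0]) - N)
    D, R = T.sum(axis=1), T.sum(axis=0)
    return T, D + R, D - R
```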
11. Low Area PRESENT Cryptography in FPGA Using TRNG-PRNG Key Generation
Authors: T. Kowsalya, R. Ganesh Babu, B. D. Parameshachari, Anand Nayyar, Raja Majid Mehmood. Computers, Materials & Continua (SCIE, EI), 2021, Issue 8, pp. 1447-1465 (19 pages).
Lightweight cryptography (LWC) is widely used to provide integrity, secrecy, and authentication for sensitive applications. However, LWC is subject to various constraints, such as high power consumption, time consumption, and hardware utilization, and is susceptible to malicious attackers. To overcome this, a lightweight block cipher, namely the PRESENT architecture, is proposed to provide security against malicious attacks. True Random Number Generator-Pseudo Random Number Generator (TRNG-PRNG)-based key generation is proposed to generate unpredictable keys that are highly difficult for attackers to predict. Moreover, the hardware utilization of the PRESENT architecture is optimized using Dual-port Read Only Memory (DROM). The proposed PRESENT-TRNG-PRNG architecture supports 64-bit input with an 80-bit key value. The performance of the PRESENT-TRNG-PRNG architecture is evaluated by means of the number of slice registers, flip-flops, number of slice Look-Up Tables (LUTs), number of logical elements, slices, bonded input/output blocks (IOBs), frequency, power, and delay. The input retrieval performance analyzed for this architecture includes Peak Signal to Noise Ratio (PSNR), Structural Similarity Index (SSIM), and Mean Square Error (MSE). The PRESENT-TRNG-PRNG architecture is compared with three different existing PRESENT architectures: PRESENT On-The-Fly (PRESENT-OTF), PRESENT Self-Test Structure (PRESENT-STS), and PRESENT-Round Keys (PRESENT-RK). The operating frequency of PRESENT-TRNG-PRNG is 612.208 MHz on Virtex 5, which is high compared with PRESENT-RK.
Keywords: dual-port read only memory; hardware utilization; lightweight cryptography; malicious attackers; PRESENT block cipher; pseudo random number generator; true random number generator
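The key-generation idea (a hardware TRNG seeding a deterministic PRNG that expands the seed to the 80-bit PRESENT key) can be sketched in software; os.urandom stands in for the hardware entropy source, and the xorshift parameters are illustrative rather than the paper's generator:

```python
import os

def generate_present_key():
    """Expand a true-random seed into an 80-bit key via a 64-bit
    xorshift PRNG, emitting one byte per step (10 steps = 80 bits)."""
    state = int.from_bytes(os.urandom(8), "big") | 1   # TRNG seed, nonzero
    mask64 = (1 << 64) - 1
    key = 0
    for _ in range(10):
        state = (state ^ (state << 13)) & mask64       # xorshift64
        state ^= state >> 7
        state = (state ^ (state << 17)) & mask64
        key = (key << 8) | (state & 0xFF)
    return key   # feed to the PRESENT-80 key schedule

print(f"{generate_present_key():020x}")   # 20 hex digits = 80 bits
```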
12. Stock Prediction Based on Technical Indicators Using Deep Learning Model
Authors: Manish Agrawal, Piyush Kumar Shukla, Rajit Nair, Anand Nayyar, Mehedi Masud. Computers, Materials & Continua (SCIE, EI), 2022, Issue 1, pp. 287-304 (18 pages).
Stock market trend forecasting is one of the most current topics and a significant research challenge due to its dynamic and unstable nature. Stock data are usually non-stationary, and attributes are non-correlative with each other. Several traditional Stock Technical Indicators (STIs) may incorrectly predict stock market trends. To study stock market characteristics using STIs and make efficient trading decisions, a robust model is built. This paper aims to build an Evolutionary Deep Learning Model (EDLM) to identify stock price trends using STIs. The proposed model implements a deep learning (DL) model to establish the concept of a correlation tensor. A Long Short-Term Memory (LSTM) network is used to analyze the datasets of the three most popular banking organizations, obtained from the live stock market of the National Stock Exchange (NSE), India. The datasets encompass the trading days from the 17th of Nov 2008 to the 15th of Nov 2018. This work also conducted exhaustive experiments to study the correlation of various STIs with stock price trends. The model built with an EDLM shows significant improvements over two benchmark ML models and a deep learning one. The proposed model aids investors in making profitable investment decisions, as it presents trend-based forecasting and achieved prediction accuracies of 63.59%, 56.25%, and 57.95% on the datasets of HDFC, Yes Bank, and SBI, respectively. Results indicate that the proposed EDLM with a combination of STIs can often provide better results than other state-of-the-art algorithms.
Keywords: long short-term memory; evolutionary deep learning model; National Stock Exchange; stock technical indicators; predictive modelling; prediction accuracy
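Two common indicators from the STI family the paper correlates with price trends can be computed as below (the 14-day window is a conventional choice, assumed here); windows of such indicator values would then form the input to the LSTM:

```python
import pandas as pd

def add_indicators(close: pd.Series) -> pd.DataFrame:
    """Simple moving average and a simple-average RSI over 14 days."""
    sma14 = close.rolling(14).mean()
    delta = close.diff()
    gain = delta.clip(lower=0).rolling(14).mean()
    loss = (-delta.clip(upper=0)).rolling(14).mean()
    rsi14 = 100.0 - 100.0 / (1.0 + gain / loss)
    return pd.DataFrame({"close": close, "sma14": sma14, "rsi14": rsi14})
```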
13. Multiway Relay Based Framework for Network Coding in Multi-Hop WSNs
Authors: Vinod Kumar Menaria, Anand Nayyar, Sandeep Kumar, Ketan Kotecha. Computers, Materials & Continua (SCIE, EI), 2023, Issue 1, pp. 1199-1216 (18 pages).
In today's information technology (IT) world, multi-hop wireless sensor networks (MHWSNs) are considered the building block for Internet of Things (IoT)-enabled communication systems that control the everyday tasks of organizations and industry, providing quality of service (QoS) in a stipulated time slot to end users over the Internet. A smart city (SC) is one such application, automating a group of civil services in our daily life, such as automatic control of traffic lights, weather prediction, and surveillance. These IoT-based networks with multi-hop communication and multiple sink nodes provide efficient communication in terms of performance parameters such as throughput, energy efficiency, and end-to-end delay, wherein low latency is considered a challenging issue in next-generation networks (NGN). This paper introduces single and parallel stable-server queuing models with multiple classes of packets and native and coded packet flows to illustrate a simple chain topology and a complex multiway relay (MWR) node with a specific neighbor topology. Further, to improve data transmission capacity in MHWSNs, an analytical framework for packet transmission using network coding at the MWR node in the network layer with opportunistic listening is developed, considering bidirectional network flow at the MWR node. Finally, the accuracy of the proposed multi-server multi-class queuing model is evaluated with and without network coding at the network layer by transmitting data packets. The results of the proposed analytical framework are validated and proved effective by comparing the analytical results to simulation results.
Keywords: multi-hop wireless sensor networks; network coding; multiway relay node; throughput; multi-server multi-class queuing models
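The network-coding gain at the MWR node comes from the classic two-flow XOR trick: instead of relaying each packet separately, the relay broadcasts their XOR once, and each endpoint decodes the other's packet using its own copy (with opportunistic listening supplying overheard packets). A minimal sketch assuming equal-length packets:

```python
def xor_packets(pkt_a: bytes, pkt_b: bytes) -> bytes:
    """Relay operation: one coded broadcast replaces two unicasts."""
    return bytes(x ^ y for x, y in zip(pkt_a, pkt_b))

a, b = b"hello", b"world"
coded = xor_packets(a, b)          # broadcast by the MWR node
assert xor_packets(coded, a) == b  # endpoint A recovers B's packet
assert xor_packets(coded, b) == a  # endpoint B recovers A's packet
```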
14. Mutation Prediction for Coronaviruses Using Genome Sequence and Recurrent Neural Networks
Authors: Pranav Pushkar, Christo Ananth, Preeti Nagrath, Jehad F. Al-Amri, Vividha, Anand Nayyar. Computers, Materials & Continua (SCIE, EI), 2022, Issue 10, pp. 1601-1619 (19 pages).
The study of viruses and their genetics has been an opportunity as well as a challenge for the scientific community. The recent ongoing SARS-CoV-2 (Severe Acute Respiratory Syndrome) pandemic proved our unpreparedness for such situations. Not only do the effects caused by the virus need to be tackled, but the mutations taking place in the very genome of the virus need to be checked frequently. One major way to find out more about such pathogens is by extracting their genetic data. Though the genetic data of viruses have been cultured, stored, and isolated in the form of genome sequences, there are still limited methods for predicting what new viruses can be generated in the future due to mutation. This research proposes a deep learning model to predict the genome sequences of the SARS-CoV-2 virus, using only the previous viruses of the coronaviridae family, with the help of RNN-LSTM (Recurrent Neural Network-Long Short-Term Memory) and RNN-GRU (Gated Recurrent Unit), so that in the future several countermeasures can be taken by predicting possible changes in the genome with the help of existing mutations in the virus. After testing the model, the F1-recall came out to be more than 0.95. The mutation detection accuracy of both models is about 98.5%, which shows the capability of recurrent neural networks to predict future changes in the genome of a virus.
Keywords: COVID-19; genome sequence; coronaviridae; RNN-LSTM; RNN-GRU
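A minimal Keras sketch of the RNN-LSTM variant: one-hot encoded nucleotide windows in, a distribution over the next base out, with mutations flagged where the observed base diverges from the prediction. The window length and layer width are assumptions; the RNN-GRU variant would simply swap the LSTM layer for tf.keras.layers.GRU:

```python
import numpy as np
import tensorflow as tf

BASES = "ACGT"

def one_hot(seq):
    """Encode a nucleotide string as a (len, 4) one-hot array."""
    return np.eye(len(BASES))[[BASES.index(b) for b in seq]]

def build_next_base_model(window=100):
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(window, len(BASES))),
        tf.keras.layers.LSTM(128),
        tf.keras.layers.Dense(len(BASES), activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="categorical_crossentropy")
    return model
```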
15. Big Data Analytics Using Swarm-Based Long Short-Term Memory for Temperature Forecasting
Authors: Malini M. Patil, P. M. Rekha, Arun Solanki, Anand Nayyar, Basit Qureshi. Computers, Materials & Continua (SCIE, EI), 2022, Issue 5, pp. 2347-2361 (15 pages).
In the past few decades, climatic changes led by environmental pollution, the emission of greenhouse gases, and brown energy utilization have led to global warming. Global warming increases the Earth's temperature, thereby causing severe effects on human and environmental conditions and threatening the livelihoods of millions of people. Rising global temperatures lead to heat strokes and high-temperature-related diseases during the summer, causing the untimely death of thousands of people. To forecast weather conditions, researchers have utilized machine learning algorithms such as the autoregressive integrated moving average, ensemble learning, and long short-term memory networks. These techniques have been widely used for the prediction of temperature. In this paper, we present a swarm-based approach called Cauchy particle swarm optimization (CPSO) to find the hyperparameters of the long short-term memory (LSTM) network. The hyperparameters were determined by minimizing the LSTM validation mean square error rate. The optimized hyperparameters of the LSTM were used to forecast the temperature of Chennai City. The proposed CPSO-LSTM model was tested on the openly available 25-year Chennai temperature dataset. The experimental evaluation in MATLAB R2020a analyzed the root mean square error rate and mean absolute error of the forecasted output. The proposed CPSO-LSTM outperforms the traditional LSTM algorithm by reducing its computational time to 25 min under 200 epochs and 150 hidden neurons during training. The proposed hyperparameter-tuned LSTM can predict the temperature accurately, with a root mean square error (RMSE) of 0.250 compared with 0.35 for the traditional LSTM.
Keywords: climatic change; big data; temperature; forecasting; swarm intelligence; deep learning
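The distinguishing piece of CPSO is the heavy-tailed Cauchy draw in the velocity update, which helps particles escape local minima of the LSTM validation error. One update step over hyperparameter vectors might be sketched like this; the inertia and acceleration coefficients are assumed values:

```python
import numpy as np

def cpso_step(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5, rng=None):
    """One Cauchy-PSO velocity/position update; x encodes LSTM
    hyperparameters (e.g., hidden units, learning rate)."""
    rng = rng or np.random.default_rng()
    r1 = np.abs(rng.standard_cauchy(x.shape))   # heavy-tailed draws
    r2 = np.abs(rng.standard_cauchy(x.shape))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    return x + v, v
```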
16. Medical Image Analysis Using Deep Learning and Distribution Pattern Matching Algorithm
Authors: Mustafa Musa Jaber, Salman Yussof, Amer S. Elameer, Leong Yeng Weng, Sura Khalil Abd, Anand Nayyar. Computers, Materials & Continua (SCIE, EI), 2022, Issue 8, pp. 2175-2190 (16 pages).
Artificial intelligence plays an essential role in the medical and health industries. Deep convolution networks offer valuable services and help create automated systems to perform medical image analysis. However, while convolution networks examine medical images effectively, such systems require high computational complexity when recognizing the same disease-affected region. Therefore, an optimized deep convolution network is utilized for analyzing disease-affected regions in this work. Different disease-related medical images are selected and examined pixel by pixel using a gray wolf optimized deep learning network. This method identifies affected pixels via the gray wolf hunting process. The convolution network uses an automatic learning function that predicts the disease based on previous imaging analysis. The regions selected by the optimized algorithm are further examined using a distribution pattern-matching rule. The pattern-matching process recognizes the disease effectively, and the system's efficiency is evaluated using a MATLAB implementation. This process ensures high accuracy of 99.02% to 99.37% and reduces computational complexity.
Keywords: artificial intelligence; medical field; gray wolf-optimized deep convolution networks; distribution pattern-matching rule
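For reference, the grey-wolf "hunting" update moves every candidate solution toward the three current leaders (alpha, beta, delta), with the exploration parameter a decaying from 2 to 0 over iterations. The sketch below follows the standard GWO equations rather than any paper-specific encoding of pixel regions:

```python
import numpy as np

def gwo_update(wolf, leaders, a, rng=None):
    """X' = mean of three leader-guided moves, where for each leader:
    A = 2*a*r1 - a, C = 2*r2, D = |C*X_leader - X|, move = X_leader - A*D."""
    rng = rng or np.random.default_rng()
    moves = []
    for leader in leaders:                      # (alpha, beta, delta)
        A = 2 * a * rng.random(wolf.shape) - a
        C = 2 * rng.random(wolf.shape)
        D = np.abs(C * leader - wolf)
        moves.append(leader - A * D)
    return sum(moves) / len(moves)
```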
17. An Improved Ensemble Learning Approach for Heart Disease Prediction Using Boosting Algorithms
Authors: Shahid Mohammad Ganie, Pijush Kanti Dutta Pramanik, Majid Bashir Malik, Anand Nayyar, Kyung Sup Kwak. Computer Systems Science & Engineering (SCIE, EI), 2023, Issue 9, pp. 3993-4006 (14 pages).
Cardiovascular disease is among the top five fatal diseases affecting lives worldwide. Therefore, its early prediction and detection are crucial, allowing one to take proper and necessary measures at earlier stages. Machine learning (ML) techniques are used to assist healthcare providers in better diagnosing heart disease. This study employed three boosting algorithms, namely gradient boost, XGBoost, and AdaBoost, to predict heart disease. The dataset contained heart disease-related clinical features and was sourced from the publicly available UCI ML repository. Exploratory data analysis was performed to find the characteristics of the data samples in terms of descriptive and inferential statistics. Specifically, it was carried out to identify and replace outliers using the interquartile range and to detect and replace missing values using the imputation method. Results were recorded before and after the data preprocessing techniques were applied. Out of all the algorithms, gradient boosting achieved the highest accuracy rate of 92.20% for the proposed model. The proposed model yielded better results with gradient boosting in terms of precision, recall, and F1-score. It attained better prediction performance than existing works and can be applied to other diseases that share common features using transfer learning.
Keywords: heart disease prediction; machine learning classifiers; ensemble approach; XGBoost; AdaBoost; gradient boost
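A minimal scikit-learn sketch of the described pipeline: IQR-based outlier treatment (clipping is one reasonable reading of "replace"), then a gradient boosting classifier. Column handling and split sizes are assumptions:

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

def iqr_treat(X: pd.DataFrame) -> pd.DataFrame:
    """Clip values outside [Q1 - 1.5*IQR, Q3 + 1.5*IQR] per column."""
    q1, q3 = X.quantile(0.25), X.quantile(0.75)
    iqr = q3 - q1
    return X.clip(q1 - 1.5 * iqr, q3 + 1.5 * iqr, axis=1)

def train_heart_model(X: pd.DataFrame, y):
    X_tr, X_te, y_tr, y_te = train_test_split(
        iqr_treat(X), y, test_size=0.2, random_state=42, stratify=y)
    model = GradientBoostingClassifier(random_state=42).fit(X_tr, y_tr)
    return model, model.score(X_te, y_te)   # accuracy on held-out data
```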
18. Convergence of blockchain and Internet of Things: integration, security, and use cases
Authors: Robertas Damasevicius, Sanjay Misra, Rytis Maskeliunas, Anand Nayyar. Frontiers of Information Technology & Electronic Engineering (SCIE, EI, CSCD), 2024, Issue 10, pp. 1295-1321 (27 pages).
Internet of Things (IoT) devices are becoming increasingly ubiquitous, and their adoption is growing at an exponential rate. However, they are vulnerable to security breaches, and traditional security mechanisms are not enough to protect them. The massive amounts of data generated by IoT devices can be easily manipulated or stolen, posing significant privacy concerns. This paper provides a comprehensive overview of the integration of blockchain and IoT technologies and their potential to enhance the security and privacy of IoT systems. The paper examines various security issues and vulnerabilities in IoT and explores how blockchain-based solutions can be used to address them. It also discusses the potential applications of blockchain-based IoT (B-IoT) systems in various sectors, such as healthcare, transportation, and supply chain management. The paper reveals that the integration of blockchain and IoT has the potential to enhance the security, privacy, and trustworthiness of IoT systems. The multi-layered architecture of B-IoT, consisting of perception, network, data processing, and application layers, provides a comprehensive framework for the integration of blockchain and IoT technologies. The study identifies various security solutions for B-IoT, including smart contracts, decentralized control, immutable data storage, identity and access management (IAM), and consensus mechanisms. It also discusses the challenges and future research directions in the field of B-IoT.
Keywords: blockchain; Internet of Things (IoT); blockchain-based IoT (B-IoT); security; scalability; privacy
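The immutable-storage idea the survey attributes to B-IoT's data-processing layer reduces to hash-chained blocks. The minimal sketch below (no consensus, no smart contracts) shows why tampering with any stored sensor reading invalidates every later hash:

```python
import hashlib, json, time

def make_block(prev_hash: str, readings: dict) -> dict:
    """Seal IoT sensor readings into a block chained to its parent."""
    block = {"time": time.time(), "prev": prev_hash, "data": readings}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

genesis = make_block("0" * 64, {"temp_c": 21.4})
nxt = make_block(genesis["hash"], {"temp_c": 21.6})
# Altering genesis["data"] changes its hash on re-computation,
# breaking nxt["prev"] and every block after it.
```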