Journal Articles
4 articles found
1. Automated File Labeling for Heterogeneous Files Organization Using Machine Learning
Authors: Sagheer Abbas, Syed Ali Raza, M. A. Khan, Muhammad Adnan Khan, Atta-ur-Rahman, Kiran Sultan, Amir Mosavi. Computers, Materials & Continua (SCIE, EI), 2023, Issue 2, pp. 3263-3278 (16 pages).
File labeling techniques have a long history in analyzing anthological trends in computational linguistics. The situation becomes worse in the case of files downloaded onto systems from the Internet. Currently, most users either have to change file names manually or leave files with meaningless names, which increases the time needed to search for required files and results in redundancy and duplication of user files. No significant work has yet been done on automated file labeling during the organization of heterogeneous user files. A few attempts have been made using topic modeling; however, a major drawback of current topic modeling approaches is that better results depend on specific language types and domain similarity of the data. In this research, machine learning approaches are employed to analyze and extract information from a heterogeneous corpus. A different file labeling technique has also been used to obtain meaningful and cohesive topics for the files. The results show that the proposed methodology can generate relevant and context-sensitive names for heterogeneous data files and provide additional insight into automated file labeling in operating systems.
Keywords: automated file labeling; file organization; machine learning; topic modeling
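To make the idea in the abstract above concrete, here is a minimal sketch, assuming a scikit-learn pipeline, of labeling files from the dominant topic of their text. The corpus, topic count, and naming scheme are hypothetical illustrations, not the authors' implementation.

```python
# Sketch: derive a suggested file name from the dominant topic of each file's text.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF

# Hypothetical file contents standing in for a heterogeneous corpus.
documents = {
    "doc_001.txt": "invoice payment total amount due billing customer",
    "doc_002.txt": "neural network training loss gradient descent epochs",
    "doc_003.txt": "recipe flour sugar butter oven bake minutes",
}

vectorizer = TfidfVectorizer(stop_words="english")
tfidf = vectorizer.fit_transform(documents.values())

# Factorize into a small number of topics; the topic count is an assumption.
nmf = NMF(n_components=3, random_state=0)
doc_topics = nmf.fit_transform(tfidf)
terms = vectorizer.get_feature_names_out()

for (old_name, _), topic_weights in zip(documents.items(), doc_topics):
    top_topic = topic_weights.argmax()
    # Use the three strongest terms of the dominant topic as a candidate label.
    top_terms = terms[nmf.components_[top_topic].argsort()[::-1][:3]]
    print(f"{old_name} -> {'_'.join(top_terms)}.txt")
```

In practice the label could be filtered against existing file names to avoid the redundancy and duplication the abstract mentions.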
2. A Fused Machine Learning Approach for Intrusion Detection System
Authors: Muhammad Sajid Farooq, Sagheer Abbas, Atta-ur-Rahman, Kiran Sultan, Muhammad Adnan Khan, Amir Mosavi. Computers, Materials & Continua (SCIE, EI), 2023, Issue 2, pp. 2607-2623 (17 pages).
The rapid growth in data generation and the increased use of computer network devices have expanded the infrastructure of the Internet. The interconnectivity of networks has brought various complexities in maintaining network availability, consistency, and discretion. Machine learning based intrusion detection systems have become essential for monitoring network traffic for malicious and illicit activities. An intrusion detection system controls the flow of network traffic with the help of computer systems. Various deep learning algorithms in intrusion detection systems have played a prominent role in identifying and analyzing intrusions in network traffic. For this purpose, when network traffic encounters known or unknown intrusions, a machine learning framework is needed to identify and/or verify the network intrusion. An intrusion detection scheme empowered with a fused machine learning technique (IDS-FMLT) is proposed to detect intrusion in a heterogeneous network that consists of different source networks and to protect the network from malicious attacks. The proposed IDS-FMLT system model obtained 95.18% validation accuracy and a 4.82% miss rate in intrusion detection.
Keywords: fused machine learning; heterogeneous network; intrusion detection
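As one possible reading of "fused machine learning" for intrusion detection, the sketch below combines two base classifiers by soft voting over synthetic flow features. The features, labels, and fusion choice are assumptions for illustration, not the IDS-FMLT design.

```python
# Sketch: fuse two classifiers for intrusion detection on synthetic flow features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Hypothetical flow features (e.g., duration, bytes, packets, flags); label 1 = intrusion.
X = rng.normal(size=(1000, 4))
y = (X[:, 1] + 0.5 * X[:, 3] > 0.8).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# "Fusion" here is soft voting over two base learners.
fused = VotingClassifier(
    estimators=[("rf", RandomForestClassifier(random_state=0)),
                ("lr", LogisticRegression(max_iter=1000))],
    voting="soft",
)
fused.fit(X_train, y_train)

accuracy = fused.score(X_test, y_test)
# The abstract reports miss rate as the complement of validation accuracy.
print(f"validation accuracy: {accuracy:.2%}, miss rate: {1.0 - accuracy:.2%}")
```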
3. Transmitter-Receiver Path Selection for Cell Range Extension Using Multi-Hop D2D
Authors: Farah Akif, Kiran Sultan, Aqdas N. Malik, Ijaz M. Qureshi, Saba Mahmood. Computers, Materials & Continua (SCIE, EI), 2021, Issue 8, pp. 2075-2093 (19 pages).
The conventional approach to serving more users per coverage area in cellular networks implies densifying the number of access points (APs), which eventually results in a larger carbon footprint. In this paper, we propose a base station off-loading and cell range extension (CRE) scheme based on multi-hop device-to-device (MHD2D) path selection between transmitter and receiver nodes. The paper also derives upper and lower bounds for energy efficiency, capacity, and transmit power. The proposed path selection scheme is inspired by the foraging behavior of honey bees, and the algorithm is presented as a modified variant of the artificial bee colony algorithm (MVABC). The optimization problem is modeled as a minimization problem in which the energy efficiency (EE) is optimized. The proposed MVABC path selection is compared with the genetic algorithm (GA) and with the classical artificial bee colony (ABC) through simulations and statistical analysis. Student's t-test, p-values, and the standard error of the mean (SEM) clearly show that MVABC-based path selection outperforms the GA and classical ABC schemes: the MVABC-based approach is 66% more efficient than classic ABC and about 62% more efficient than the GA-based scheme.
Keywords: multi-hop; ultra-dense; D2D; path selection; ABC; GA
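The path selection described in the abstract above builds on the artificial bee colony (ABC) metaheuristic. The sketch below is a stripped-down ABC loop (employed-bee and scout phases only, onlooker phase omitted for brevity) minimizing a toy cost; in the paper the cost would encode the energy efficiency of a candidate multi-hop D2D path, and the modified variant (MVABC) differs in details not shown here.

```python
# Sketch: basic artificial-bee-colony-style search over a continuous toy objective.
import numpy as np

rng = np.random.default_rng(1)

def cost(x):
    # Hypothetical stand-in objective; lower is better.
    return np.sum((x - 0.3) ** 2)

n_food, dim, limit, iters = 10, 4, 20, 100
foods = rng.uniform(0, 1, size=(n_food, dim))   # candidate solutions ("food sources")
fitness = np.array([cost(f) for f in foods])
trials = np.zeros(n_food)

for _ in range(iters):
    # Employed-bee phase: perturb each food source toward a random partner.
    for i in range(n_food):
        k = (i + rng.integers(1, n_food)) % n_food   # partner index, k != i
        phi = rng.uniform(-1, 1, size=dim)
        candidate = np.clip(foods[i] + phi * (foods[i] - foods[k]), 0, 1)
        c = cost(candidate)
        if c < fitness[i]:
            foods[i], fitness[i], trials[i] = candidate, c, 0
        else:
            trials[i] += 1
    # Scout phase: abandon exhausted sources and re-seed them randomly.
    for i in np.where(trials > limit)[0]:
        foods[i] = rng.uniform(0, 1, size=dim)
        fitness[i] = cost(foods[i])
        trials[i] = 0

print("best solution:", foods[fitness.argmin()], "cost:", fitness.min())
```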
4. Supervised Machine Learning-Based Prediction of COVID-19
Authors: Atta-ur-Rahman, Kiran Sultan, Iftikhar Naseer, Rizwan Majeed, Dhiaa Musleh, Mohammed Abdul Salam Gollapalli, Sghaier Chabani, Nehad Ibrahim, Shahan Yamin Siddiqui, Muhammad Adnan Khan. Computers, Materials & Continua (SCIE, EI), 2021, Issue 10, pp. 21-34 (14 pages).
COVID-19 turned out to be an infectious and life-threatening viral disease, and its swift and overwhelming spread has become one of the greatest challenges for the world. As yet, no satisfactory vaccine or medication has been developed that could guarantee its mitigation, though several efforts and trials are underway. Countries around the globe are striving to overcome the spread of COVID-19 while finding ways for early detection and timely treatment. In this regard, healthcare experts, researchers, and scientists have delved into the investigation of existing as well as new technologies. The situation demands the development of a clinical decision support system that equips medical staff with ways to detect this disease in time. State-of-the-art research in artificial intelligence (AI), machine learning (ML), and cloud computing has encouraged healthcare experts to find effective detection schemes. This study aims to provide a comprehensive review of the role of AI and ML in investigating prediction techniques for COVID-19. A mathematical model has been formulated to analyze and detect its potential threat. The proposed model is a cloud-based smart detection algorithm using a support vector machine (CSDC-SVM) with cross-fold validation testing. The experimental results achieved an accuracy of 98.4% with a 15-fold cross-validation strategy. A comparison with similar state-of-the-art methods reveals that the proposed CSDC-SVM model possesses better accuracy and efficiency.
Keywords: COVID-19; CSDC-SVM; artificial intelligence; machine learning; cloud computing; support vector machine
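The abstract above reports results for a support vector machine evaluated with 15-fold cross-validation. Below is a minimal sketch of that validation strategy, assuming scikit-learn and synthetic features in place of the study's clinical, cloud-hosted data; it is not the CSDC-SVM pipeline itself.

```python
# Sketch: SVM evaluated with 15-fold stratified cross-validation.
from sklearn.datasets import make_classification
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical diagnostic features; the real study would use patient data
# served through a cloud-based pipeline.
X, y = make_classification(n_samples=600, n_features=12, n_informative=6, random_state=0)

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
cv = StratifiedKFold(n_splits=15, shuffle=True, random_state=0)
scores = cross_val_score(model, X, y, cv=cv)

print(f"mean 15-fold accuracy: {scores.mean():.3f} (+/- {scores.std():.3f})")
```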