Journal Articles
196 articles found
Challenges and Limitations in Speech Recognition Technology: A Critical Review of Speech Signal Processing Algorithms, Tools and Systems
1
Authors: Sneha Basak, Himanshi Agrawal, Shreya Jena, Shilpa Gite, Mrinal Bachute, Biswajeet Pradhan, Mazen Assiri. 《Computer Modeling in Engineering & Sciences》, SCIE EI, 2023, Issue 5, pp. 1053-1089 (37 pages)
Speech recognition systems have become a unique human-computer interaction (HCI) family. Speech is one of the most naturally developed human abilities; speech signal processing opens up a transparent and hands-free computation experience. This paper presents a retrospective yet modern approach to the world of speech recognition systems. The development journey of ASR (Automatic Speech Recognition) has seen quite a few milestones and breakthrough technologies, which are highlighted in this paper. A step-by-step rundown of the fundamental stages in developing speech recognition systems is presented, along with a brief discussion of various modern-day developments and applications in this domain. This review aims to summarize the field and provide a starting point for those new to the vast field of speech signal processing. Since speech recognition has vast potential in industries such as telecommunication, emotion recognition, and healthcare, this review should be helpful to researchers exploring further applications that society can quickly adopt in the coming years.
Keywords: speech recognition; automatic speech recognition (ASR); mel-frequency cepstral coefficients (MFCC); hidden Markov model (HMM); artificial neural network (ANN)
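The MFCC features named in the keywords are built on the mel frequency scale. Below is a minimal sketch of the standard mapping m = 2595·log10(1 + f/700) and of how filterbank center frequencies are spaced uniformly on it; the helper names are ours, not the paper's:

```python
import math

def hz_to_mel(f):
    # Standard mel-scale mapping used in MFCC front ends.
    return 2595.0 * math.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    # Inverse of hz_to_mel.
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mel_filter_centers(low_hz, high_hz, n_filters):
    # Filter centers are equally spaced on the mel scale, so low
    # frequencies get denser coverage than high ones.
    lo, hi = hz_to_mel(low_hz), hz_to_mel(high_hz)
    step = (hi - lo) / (n_filters + 1)
    return [mel_to_hz(lo + step * (i + 1)) for i in range(n_filters)]
```

Spacing between consecutive centers grows in Hz even though it is constant in mels, which is the point of the scale.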
THAPE: A Tunable Hybrid Associative Predictive Engine Approach for Enhancing Rule Interpretability in Association Rule Learning for the Retail Sector
2
Authors: Monerah Alawadh, Ahmed Barnawi. 《Computers, Materials & Continua》, SCIE EI, 2024, Issue 6, pp. 4995-5015 (21 pages)
Association rule learning (ARL) is a widely used technique for discovering relationships within datasets. However, it often generates excessive irrelevant or ambiguous rules. Post-processing is therefore crucial, not only for removing irrelevant or redundant rules but also for uncovering hidden associations that affect other factors. Several post-processing methods have been proposed recently, each with its own strengths and weaknesses. In this paper, we propose THAPE (Tunable Hybrid Associative Predictive Engine), which combines descriptive and predictive techniques. By leveraging both, we aim to enhance the quality of the generated rules: removing irrelevant or redundant rules, uncovering interesting and useful ones, exploring hidden association rules that may affect other factors, and providing backtracking ability for a given product. The proposed approach offers retailers a tailored method suited to their specific goals, enabling a better understanding of customer behavior based on factual transactions in the target market. We applied THAPE to a real dataset as a case study to demonstrate its effectiveness, successfully mining a concise set of highly interesting and useful association rules. Out of the 11,265 rules generated, we identified 125 that are particularly relevant to the business context, significantly improving the interpretability and usefulness of association rules for decision-making.
Keywords: association rule learning; post-processing; predictive; machine learning; rule interpretability
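The rule filtering the abstract describes starts from support, confidence, and lift. A toy sketch on invented transactions (THAPE's actual scoring is richer than this):

```python
# Invented toy transaction database for illustration only.
transactions = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"butter", "milk"},
    {"bread", "butter"},
]

def support(itemset):
    # Fraction of transactions containing every item in the set.
    return sum(itemset <= t for t in transactions) / len(transactions)

def rule_metrics(lhs, rhs):
    # confidence = P(rhs | lhs); lift > 1 flags a positive association,
    # a common post-processing filter for pruning irrelevant rules.
    conf = support(lhs | rhs) / support(lhs)
    lift = conf / support(rhs)
    return conf, lift
```

Rules with lift near (or below) 1 would be among the redundant candidates a post-processor discards.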
A Machine Learning Approach to User Profiling for Data Annotation of Online Behavior
3
Authors: Moona Kanwal, Najeed A. Khan, Aftab A. Khan. 《Computers, Materials & Continua》, SCIE EI, 2024, Issue 2, pp. 2419-2440 (22 pages)
The user's intent to seek online information has been an active area of research in user profiling. User profiling considers user characteristics, behaviors, activities, and preferences to sketch user intentions, interests, and motivations. Determining user characteristics can help capture implicit and explicit preferences and intentions for effective user-centric and customized content presentation. The user's complete online experience in seeking information is a blend of activities such as searching, verifying, and sharing it on social platforms. However, a combination of multiple behaviors in profiling users has yet to be considered. This research takes a novel approach and explores user intent types based on multidimensional online behavior in information acquisition. It examines information search, verification, and dissemination behavior and identifies diverse types of users based on their online engagement using machine learning. The research proposes a generic user-profile template that explains user characteristics based on internet experience and uses it as ground truth for data annotation. User feedback on online behavior and practices was collected via a survey; participants include males and females from different occupation sectors and age groups. The collected data undergo feature engineering, and the significant features are presented to unsupervised machine learning methods to identify user intent classes and their characteristics. Among the techniques evaluated, the K-means clustering method successfully generates five user groups with distinct characteristics, achieving an average silhouette of 0.36 and a distortion score of 1136. Feature averages are computed to identify user intent type characteristics. The user intent classes are then generalized into a user intent template with an inter-rater reliability of 75%. This research successfully extracts different user types based on their preferences in online content, platforms, criteria, and frequency, and validates the proposed template on user feedback data through an inter-rater agreement process using an external human rater.
Keywords: user intent; cluster; user profile; online search; information sharing; user behavior; search reasons
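The profiling step relies on K-means clustering. A minimal 1-D sketch of the assign/update loop (the paper clusters multi-dimensional survey features; this toy version only illustrates the mechanics):

```python
import random

def kmeans(points, k, iters=20, seed=0):
    # Minimal 1-D k-means: alternate assignment and mean-update steps.
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign each point to its nearest center.
            idx = min(range(k), key=lambda i: abs(p - centers[i]))
            clusters[idx].append(p)
        # Move each center to the mean of its assigned points.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)
```

On well-separated data the centers converge to the cluster means after a few iterations.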
Fine-Tuning Cyber Security Defenses: Evaluating Supervised Machine Learning Classifiers for Windows Malware Detection
4
Authors: Islam Zada, Mohammed Naif Alatawi, Syed Muhammad Saqlain, Abdullah Alshahrani, Adel Alshamran, Kanwal Imran, Hessa Alfraihi. 《Computers, Materials & Continua》, SCIE EI, 2024, Issue 8, pp. 2917-2939 (23 pages)
Malware attacks on Windows machines pose significant cybersecurity threats, necessitating effective detection and prevention mechanisms. Supervised machine learning classifiers have emerged as promising tools for malware detection, yet there remains a need for comprehensive studies that compare the performance of different classifiers specifically for Windows malware detection. While numerous studies have explored malware detection using machine learning, a systematic comparison of supervised classifiers for Windows malware is lacking; understanding their relative effectiveness can inform the selection of optimal detection methods and improve overall security measures. This study bridges that gap through a comparative analysis of supervised machine learning classifiers for detecting malware on Windows systems. The objectives are: investigating the performance of classifiers such as Gaussian Naïve Bayes, K Nearest Neighbors (KNN), Stochastic Gradient Descent Classifier (SGDC), and Decision Tree in detecting Windows malware; evaluating the accuracy, efficiency, and suitability of each classifier for real-world detection scenarios; identifying the strengths and limitations of the classifiers to provide insights for cybersecurity practitioners and researchers; and offering evidence-based recommendations for selecting the most effective classifier. The study employs a structured methodology consisting of several phases: exploratory data analysis, data preprocessing, model training, and evaluation. Exploratory data analysis characterizes the dataset and identifies preprocessing requirements. Preprocessing includes cleaning, feature encoding, dimensionality reduction, and optimization to prepare the data for training. Model training uses the supervised classifiers above, and performance is evaluated using accuracy, precision, recall, and F1 score. The results reveal the effectiveness and efficiency of each classifier in detecting different types of malware, and insights into their strengths and limitations provide practical guidance for enhancing cybersecurity defenses. Overall, this research advances malware detection techniques and bolsters the security posture of Windows systems against evolving cyber threats.
Keywords: security and privacy challenges in the context of requirements engineering; supervised machine learning; malware detection; Windows systems; comparative analysis; Gaussian Naive Bayes; K Nearest Neighbors; Stochastic Gradient Descent Classifier; Decision Tree
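The evaluation phase compares classifiers on accuracy, precision, recall, and F1. A small sketch computing them from confusion-matrix counts (the counts in the test are invented):

```python
def classifier_report(tp, fp, fn, tn):
    # Standard metrics used to compare classifiers on a test split:
    # tp/fp/fn/tn are the four confusion-matrix counts.
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1}
```

Comparing such reports per classifier is exactly the kind of side-by-side analysis the abstract describes.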
A comparative in vitro study on the effect of SGLT2 inhibitors on chemosensitivity to doxorubicin in MCF-7 breast cancer cells
5
Authors: SHAHID KARIM, ALANOUD NAHER ALGHANMI, MAHA JAMAL, HUDA ALKREATHY, ALAM JAMAL, HIND A. ALKHATABI, MOHAMMED BAZUHAIR, AFTAB AHMAD. 《Oncology Research》, SCIE, 2024, Issue 5, pp. 817-830 (14 pages)
Cancer frequently develops resistance to the majority of chemotherapy treatments. This study examined the synergistic cytotoxic and antitumor effects of the SGLT2 inhibitors Canagliflozin (CAN), Dapagliflozin (DAP), and Empagliflozin (EMP) combined with Doxorubicin (DOX) using in vitro experimentation. The combination CAN+DOX was found to greatly enhance the cytotoxic effects of DOX in MCF-7 cells. Cancer cells exhibit an increased demand for glucose and ATP to support their growth; notably, when these medications were combined with DOX, there was considerable inhibition of glucose consumption, as well as reductions in intracellular ATP and lactate levels, and this effect was dose-dependent. In addition to effectively inhibiting the cell cycle, the CAN+DOX combination induces substantial changes in both cell-cycle and apoptotic gene expression. This work is the first report of the beneficial impact of the SGLT2 inhibitors CAN, DAP, and EMP on responsiveness to the anticancer properties of DOX; the underlying molecular mechanism potentially involves suppression of SGLT2 function.
Keywords: SGLT2; cancer; cytotoxicity; ATP; cell cycle
Ensemble Deep Learning Based Air Pollution Prediction for Sustainable Smart Cities
6
Authors: Maha Farouk Sabir, Mahmoud Ragab, Adil O. Khadidos, Khaled H. Alyoubi, Alaa O. Khadidos. 《Computer Systems Science & Engineering》, 2024, Issue 3, pp. 627-643 (17 pages)
Big data and information and communication technologies can be important to the effectiveness of smart cities. With growing attention on smart-city sustainability, developing data-driven smart cities has recently gained attention as a vital technology for addressing sustainability problems. Real-time monitoring of pollution allows local authorities to analyze the current traffic condition of cities and make decisions; air pollution is a major environmental problem in smart-city environments. The influence of deep learning (DL) has quickly increased and penetrated almost every domain, including air pollution forecasting. Therefore, this article develops a new Coot Optimization Algorithm with an Ensemble Deep Learning based Air Pollution Prediction (COAEDL-APP) system for sustainable smart cities. The proposed COAEDL-APP algorithm accurately forecasts air quality in the sustainable smart-city environment. To achieve this, it first applies linear scaling normalization (LSN) to pre-process the input data. For air-quality prediction, an ensemble of three DL models is used: an autoencoder (AE), long short-term memory (LSTM), and a deep belief network (DBN). Furthermore, a COA-based hyperparameter tuning procedure adjusts the hyperparameter values of the DL models. The COAEDL-APP algorithm was tested on an air-quality database, and the outcomes show improved performance over other existing systems, with a maximum accuracy of 98.34%.
Keywords: sustainability; smart cities; air pollution prediction; ensemble learning; coot optimization algorithm
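Two steps of the pipeline are easy to sketch: linear scaling normalization (LSN) and the combining of ensemble members. The abstract does not give the exact fusion rule of the AE/LSTM/DBN ensemble, so simple averaging here is an assumption:

```python
def linear_scale(values, lo=0.0, hi=1.0):
    # Linear scaling (min-max) normalization of the input series.
    vmin, vmax = min(values), max(values)
    return [lo + (v - vmin) * (hi - lo) / (vmax - vmin) for v in values]

def ensemble_predict(models, x):
    # Average the member predictions; the members stand in for the
    # trained AE, LSTM, and DBN forecasters.
    preds = [m(x) for m in models]
    return sum(preds) / len(preds)
```

In the paper, the ensemble members' hyperparameters would additionally be tuned by the coot optimization algorithm before fusion.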
Security and Privacy Challenges in SDN-Enabled IoT Systems: Causes, Proposed Solutions, and Future Directions
7
Authors: Ahmad Rahdari, Ahmad Jalili, Mehdi Esnaashari, Mehdi Gheisari, Alisa A. Vorobeva, Zhaoxi Fang, Panjun Sun, Viktoriia M. Korzhuk, Ilya Popov, Zongda Wu, Hamid Tahaei. 《Computers, Materials & Continua》, SCIE EI, 2024, Issue 8, pp. 2511-2533 (23 pages)
Software-Defined Networking (SDN) represents a significant paradigm shift in network architecture, separating network logic from the underlying forwarding devices to enhance flexibility and centralize deployment. Concurrently, the Internet of Things (IoT) connects numerous devices to the Internet, enabling autonomous interactions with minimal human intervention. However, implementing and managing an SDN-IoT system is inherently complex, particularly with limited resources, as the dynamic and distributed nature of IoT infrastructures creates security and privacy challenges during SDN integration. The findings of this study underscore the primary security and privacy challenges across the application, control, and data planes. A comprehensive review evaluates the root causes of these challenges and the defense techniques employed in prior work to establish sufficient secrecy and privacy protection. Recent investigations have explored cutting-edge methods, such as leveraging blockchain for transaction recording to enhance security and privacy, along with applying machine learning and deep learning to identify and mitigate Denial of Service (DoS) and Distributed DoS (DDoS) attacks. The analysis indicates that encryption and hashing techniques are prevalent in the data plane, access control and certificate authorization are prominent in the control plane, and authentication is commonly employed within the application plane. Finally, this paper outlines future directions, offering insights into potential strategies and technological advancements aimed at fostering a more secure and privacy-conscious SDN-based IoT ecosystem.
Keywords: security; privacy-preserving; software-defined network; Internet of Things
Optimizing Spatial Pattern Analysis in Serial Remote Sensing Images through Empirical Mode Decomposition and Ant Colony Optimization
8
Authors: J. Srinivasan, S. Uma, Saleem Raja Abdul Samad, Jayabrabu Ramakrishnan. 《Journal of Harbin Institute of Technology (New Series)》, CAS, 2024, Issue 4, pp. 52-60 (9 pages)
Serial remote sensing images offer a valuable means of tracking the evolutionary changes and growth of a specific geographical area over time. Although the original images may provide limited insights, they harbor considerable potential for identifying clusters and patterns. The aggregation of these serial remote sensing images (SRSI) becomes increasingly viable as distinct patterns emerge in diverse scenarios, such as suburbanization, the expansion of native flora, and agricultural activities. We propose an innovative method for extracting sequential patterns by combining Ant Colony Optimization (ACO) and Empirical Mode Decomposition (EMD). This integration of EMD and ACO proves remarkably effective in identifying the most significant characteristic features within serial remote sensing images, guided by specific criteria. Our findings highlight a substantial improvement in the efficiency of sequential pattern mining through this hybrid method, which seamlessly integrates EMD and ACO for feature selection. The study demonstrates the potential of the methodology, particularly in the realms of urbanization, native vegetation expansion, and agricultural activity.
Keywords: spatial pattern analysis; EMD; ACO
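A toy version of ACO-driven feature selection, with pheromone-weighted sampling and reinforcement of the best subset found per round; the per-feature `scores` stand in for the paper's EMD-derived criteria, and all names and parameters here are assumptions:

```python
import random

def aco_select(scores, n_pick, n_ants=30, rounds=15, rho=0.3, seed=1):
    # Ants sample feature subsets with pheromone-weighted probabilities;
    # the best-scoring subset deposits pheromone, evaporation (rho)
    # forgets old trails. `scores[i]` is a per-feature utility.
    rng = random.Random(seed)
    n = len(scores)
    tau = [1.0] * n  # pheromone per feature
    for _ in range(rounds):
        best, best_val = None, -1.0
        for _ in range(n_ants):
            weights = [tau[i] * scores[i] for i in range(n)]
            subset = set()
            while len(subset) < n_pick:
                subset.add(rng.choices(range(n), weights=weights)[0])
            val = sum(scores[i] for i in subset)
            if val > best_val:
                best, best_val = subset, val
        tau = [(1 - rho) * t for t in tau]  # evaporation
        for i in best:
            tau[i] += best_val  # reinforcement
    return sorted(best)
```

With clearly dominant features, the pheromone quickly concentrates on the best subset.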
Robust Machine Learning Technique to Classify COVID-19 Using Fusion of Texture and Vesselness of X-Ray Images
9
Authors: Shaik Mahaboob Basha, Victor Hugo C. de Albuquerque, Samia Allaoua Chelloug, Mohamed Abd Elaziz, Shaik Hashmitha Mohisin, Suhail Parvaze Pathan. 《Computer Modeling in Engineering & Sciences》, SCIE EI, 2024, Issue 2, pp. 1981-2004 (24 pages)
Manual investigation of chest radiography (CXR) images by physicians is crucial for effective decision-making in COVID-19 diagnosis. However, the high demand during the pandemic necessitates auxiliary help through image analysis and machine learning techniques. This study presents a multi-threshold-based segmentation technique to probe high-pixel-intensity regions in CXR images of various pathologies, including normal cases. Texture information is extracted using gray-level co-occurrence matrix (GLCM) based features, while vessel-like features are obtained using Frangi, Sato, and Meijering filters. Machine learning models employing Decision Tree (DT) and Random Forest (RF) approaches are designed to categorize CXR images into common lung infections, lung opacity (LO), COVID-19, and viral pneumonia (VP). The results demonstrate that the fusion of texture and vessel-based features provides an effective ML model for aiding diagnosis. Model validation using performance measures, including an accuracy of approximately 91.8% with an RF-based classifier, supports the usefulness of the feature set and classifier model in categorizing the four pathologies. Furthermore, the study investigates the importance of the devised features in identifying the underlying pathology and incorporates histogram-based analysis. This analysis reveals varying natural pixel distributions in CXR images belonging to the normal, COVID-19, LO, and VP groups, motivating the incorporation of additional features such as mean, standard deviation, skewness, and percentiles computed on the filtered images. Notably, the study achieves a considerable improvement in distinguishing COVID-19 from LO, with a true positive rate of 97%, further substantiating the effectiveness of the methodology.
Keywords: chest radiography (CXR) image; COVID-19; classifier; machine learning; random forest; texture analysis
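The GLCM texture features can be sketched directly: co-occurrence counts of gray-level pairs at a fixed offset, then a feature such as contrast over the normalized matrix. A toy 4-level image is used below; the paper's feature set is larger:

```python
def glcm(img, dx=1, dy=0, levels=4):
    # Co-occurrence counts of gray-level pairs at offset (dx, dy).
    m = [[0] * levels for _ in range(levels)]
    h, w = len(img), len(img[0])
    for y in range(h):
        for x in range(w):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                m[img[y][x]][img[ny][nx]] += 1
    return m

def glcm_contrast(m):
    # Contrast feature: sum of (i - j)^2 * p(i, j) over the
    # normalized co-occurrence matrix.
    total = sum(sum(row) for row in m)
    return sum((i - j) ** 2 * m[i][j] / total
               for i in range(len(m)) for j in range(len(m)))
```

High contrast indicates many neighboring pixels with very different gray levels, i.e. coarse texture.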
Enhanced Steganalysis for Color Images Using Curvelet Features and Support Vector Machine
10
Authors: Arslan Akram, Imran Khan, Javed Rashid, Mubbashar Saddique, Muhammad Idrees, Yazeed Yasin Ghadi, Abdulmohsen Algarni. 《Computers, Materials & Continua》, SCIE EI, 2024, Issue 1, pp. 1311-1328 (18 pages)
Steganography algorithms are methods of hiding data transfers in media files. Several machine learning architectures have been presented recently to improve stego-image identification performance by using spatial information, and these methods have made it feasible to handle a wide range of problems associated with image analysis. Information-embedding methods use images with little information or low payload, but contemporary research aims to employ high-payload images for classification. To address the need for both low- and high-payload images, this work provides a machine learning approach to steganographic image classification that uses the Curvelet transform to efficiently extract characteristics from both types of images. A Support Vector Machine (SVM), a commonplace classification technique, is employed to determine whether an image is stego or cover. The Wavelet Obtained Weights (WOW), Spatial Universal Wavelet Relative Distortion (S-UNIWARD), Highly Undetectable Steganography (HUGO), and Minimizing the Power of Optimal Detector (MiPOD) steganography techniques are used in a variety of experimental scenarios to evaluate the performance of the proposed method. Using WOW at several payloads, the proposed approach achieves a classification accuracy of 98.60%, exhibiting its superiority over state-of-the-art methods.
Keywords: curvelets; fast Fourier transform; support vector machine; high-pass filters; steganography
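Steganalysis pipelines commonly start from a high-pass residual that exposes embedding noise (high-pass filters appear in the keywords). A minimal horizontal [-1, 2, -1] filter as an illustration; the kernel choice is ours, not the paper's:

```python
def highpass_residual(img):
    # Second-difference residual along each row: flat regions and
    # linear ramps map to zero, leaving only high-frequency content
    # where embedding noise tends to live.
    h, w = len(img), len(img[0])
    out = [[0] * (w - 2) for _ in range(h)]
    for y in range(h):
        for x in range(1, w - 1):
            out[y][x - 1] = -img[y][x - 1] + 2 * img[y][x] - img[y][x + 1]
    return out
```

Features (here Curvelet-based, elsewhere statistical moments) are then computed on such residuals rather than on raw pixels.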
The User Analysis of Amazon Using Artificial Intelligence at Customer Churn
11
Authors: Mohammed Ali Alzahrani. 《Journal of Data Analysis and Information Processing》, 2024, Issue 1, pp. 40-48 (9 pages)
Customer churn remains the key focus of this research, which uses the artificial intelligence technique of machine learning. The work is based on feature-based analysis: four main features were selected on the basis of customer churn to deduce a meaningful analysis of the dataset. The dataset, taken from Kaggle, covers Amazon fine food reviews and contains more than half a million records. The feature-based analysis is concluded using a confusion matrix to derive the customer churn results. Such analysis helps e-commerce businesses achieve real-time growth in specific products by focusing sales and identifying which products are falling out of favor. After applying the techniques, Support Vector Machine and K-Nearest Neighbour perform better than Random Forest in this particular scenario. From the confusion matrix, three measures are obtained: precision, recall, and accuracy. The results of the feature-based analysis on Amazon fine food reviews show that, for customer churn, the Support Vector Machine performed best in the overall comparison.
Keywords: customer churn; machine learning; Amazon fine food reviews; data science; artificial intelligence
An IoT-Aware System for Managing Patients’ Waiting Time Using Bluetooth Low-Energy Technology
12
Authors: Reham Alabduljabbar. 《Computer Systems Science & Engineering》, SCIE EI, 2022, Issue 1, pp. 1-16 (16 pages)
It is a common observation that whenever patients arrive at the front desk of a hospital, outpatient clinic, or other health-associated center, they first have to queue up in a line and wait to fill in a registration form to get admitted. The long waiting time without any status updates is the most common complaint concerning health officials. In this paper, UrNext, a location-aware mobile solution using Bluetooth low-energy (BLE) technology, is presented to solve the problem. Recently, the Internet of Things (IoT) has been gaining popularity in helping to solve some of the healthcare sector's problems. The solution can be illustrated with a simple example: when a patient arrives at a clinic for a consultation, instead of having to wait in a long line, the patient is greeted automatically and receives a push notification of admittance along with an estimated waiting time for the consultation session. This not only provides patients with a sense of freedom but also reduces the uncertainty levels generally observed, saving both time and money. This work aims to improve clinics' quality of service, organize queues, and minimize waiting times, leading to patient comfort while reducing the burden on nurses and receptionists. The results demonstrate that the presented system performs successfully and helps achieve a pleasant and conducive clinic visitation process with higher productivity.
Keywords: Internet of Things (IoT); location-aware; Bluetooth low energy; beacon
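A sketch of the kind of waiting-time estimate such a system might push to a patient on BLE check-in. The paper does not publish its formula, so this ceiling-division rule (patients ahead, spread across rooms, times average consultation length) is purely an assumption for illustration:

```python
import math

def estimated_wait(position, avg_consult_min, active_rooms):
    # position: patient's place in the queue (1 = next up).
    # Patients ahead are served in parallel across active rooms,
    # each consultation taking avg_consult_min on average.
    ahead = max(position - 1, 0)
    return math.ceil(ahead / active_rooms) * avg_consult_min
```

A real deployment would refine this continuously as consultations finish and beacons report patient positions.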
Analysis of a Two Category Confidentiality Modeling Information Security
13
Authors: Marn-Ling Shing, Chen-Chi Shing, Lee-Pin Shing. 《Journal of Communication and Computer》, 2012, Issue 5, pp. 538-546 (9 pages)
Keywords: confidentiality; modeling and analysis; information security; network provider; Markov chain; optimal combination; related information; system intrusion
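The keywords point to a Markov-chain confidentiality model. One step of a discrete-time chain is a row-vector/transition-matrix product, sketched below; the two-state (secure/compromised) matrix in the test is invented for illustration:

```python
def step_distribution(dist, P):
    # One step of a discrete-time Markov chain: new_j = sum_i d_i * P[i][j].
    # dist is the current state distribution, P the transition matrix.
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]
```

Iterating the step converges to the stationary distribution, which in such models gives the long-run fraction of time the system stays confidential.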
Fuzzy VIKOR Approach to Evaluate the Information Security Policies and Analyze the Content of Press Agencies in Gulf Countries
14
Authors: Amir Mohamed Talib. 《Journal of Information Security》, 2020, Issue 4, pp. 189-200 (12 pages)
A news agency is an organization that gathers news reports and sells them to subscribing news organizations, such as newspapers, magazines, and radio and television broadcasters. A news agency may also be referred to as a wire service, newswire, or news service. The main purpose of this paper is to evaluate the security policies and analyze the content of five press agencies in the Gulf countries (Kuwait News Agency (KUNA), Emirates News Agency (WAM), Saudi Press Agency (SPA), Bahrain News Agency (BNA), and Oman News Agency (OMA)) using a fuzzy VIKOR approach, in which linguistic variables are applied to resolve the uncertainties and subjectivities in expert decision making. Fuzzy VIKOR is one of the best Multi-Criteria Decision Making (MCDM) techniques for working in a fuzzy environment. This study helps security and content-analysis experts know which press agency has the mandate and the competence to educate the public on news agencies. Besides, this paper contributes to Gulf agencies by helping them ensure the quality of content information and information security policies over the internet.
Keywords: content analysis; fuzzy VIKOR approach; Gulf countries; information security policy; press agencies; Multi-Criteria Decision Making (MCDM); online information quality
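The crisp core of VIKOR (the paper uses a fuzzy variant on linguistic ratings) computes group utility S, individual regret R, and the compromise index Q per alternative; lower Q ranks better. A sketch assuming benefit criteria and alternatives that actually differ:

```python
def vikor_rank(matrix, weights, v=0.5):
    # matrix[a][j]: score of alternative a on criterion j (benefit).
    # v weighs group utility against individual regret.
    m = len(matrix[0])
    best = [max(row[j] for row in matrix) for j in range(m)]
    worst = [min(row[j] for row in matrix) for j in range(m)]
    S, R = [], []
    for row in matrix:
        # Weighted normalized distance from the ideal per criterion.
        d = [weights[j] * (best[j] - row[j]) / (best[j] - worst[j])
             for j in range(m)]
        S.append(sum(d))
        R.append(max(d))
    Q = [v * (s - min(S)) / (max(S) - min(S))
         + (1 - v) * (r - min(R)) / (max(R) - min(R))
         for s, r in zip(S, R)]
    return Q
```

In the fuzzy variant, crisp scores are replaced by triangular fuzzy numbers aggregated from expert linguistic ratings before S, R, and Q are computed.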
Optimal Machine Learning Driven Sentiment Analysis on COVID-19 Twitter Data (cited: 1)
15
Authors: Bahjat Fakieh, Abdullah S. AL-Malaise AL-Ghamdi, Farrukh Saleem, Mahmoud Ragab. 《Computers, Materials & Continua》, SCIE EI, 2023, Issue 4, pp. 81-97 (17 pages)
The outbreak of the pandemic caused by Coronavirus Disease 2019 (COVID-19) affected the daily activities of people across the globe. During the COVID-19 outbreak and the successive lockdowns, Twitter was heavily used and the number of tweets regarding COVID-19 increased tremendously. Several studies used Sentiment Analysis (SA) to analyze the emotions expressed through tweets about COVID-19. Therefore, in the current study, a new Artificial Bee Colony (ABC) with Machine Learning-driven SA (ABCML-SA) model is developed for sentiment analysis of COVID-19 Twitter data. The prime focus of the presented ABCML-SA model is to recognize the sentiments expressed in tweets about COVID-19. It involves data pre-processing at the initial stage, followed by n-gram based feature extraction to derive the feature vectors. For identification and classification of the sentiments, a Support Vector Machine (SVM) model is exploited. Finally, the ABC algorithm is applied to fine-tune the parameters of the SVM. To demonstrate the improved performance of the proposed ABCML-SA model, a sequence of simulations was conducted; the comparative assessment results confirmed its effectual performance over other approaches.
Keywords: sentiment analysis; Twitter data; data mining; COVID-19; machine learning; artificial bee colony
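The n-gram feature extraction step before the SVM can be sketched simply: token-level n-grams with raw counts. The paper's exact representation (n, weighting) is not specified, so this is an illustrative assumption:

```python
def ngrams(tokens, n):
    # All contiguous n-grams of a token list.
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def ngram_counts(text, n=2):
    # Lowercased whitespace tokenization, then bigram counts by default;
    # these sparse counts would be the SVM's feature vector entries.
    toks = text.lower().split()
    feats = {}
    for g in ngrams(toks, n):
        feats[g] = feats.get(g, 0) + 1
    return feats
```

In the full pipeline, the ABC optimizer would then tune the SVM hyperparameters (e.g. C, kernel width) over these features.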
Latency-Aware Dynamic Second Offloading Service in SDN-Based Fog Architecture
16
Authors: Samah Ibrahim AlShathri, Dina S. M. Hassan, Samia Allaoua Chelloug. 《Computers, Materials & Continua》, SCIE EI, 2023, Issue 4, pp. 1501-1526 (26 pages)
Task offloading is a key strategy in Fog Computing (FC). The definition of resource-constrained devices no longer applies to sensors and Internet of Things (IoT) embedded devices alone: smart and mobile units can also be viewed as resource-constrained if power, cloud applications, and the data cloud are included in the set of required resources. In a cloud-fog-based architecture, a task instance running on an end device may need to be offloaded to a fog node to complete its execution. However, in a busy network, a second offloading decision is required when the fog node becomes overloaded. The possibility of offloading a task a second time, to a fog or a cloud node, depends to a great extent on task importance, latency constraints, and required resources. This paper presents a dynamic service that determines which tasks can endure a second offloading. The task type, latency constraints, and amount of required resources are used to select the offloading destination node. The study proposes three heuristic offloading algorithms, each targeting a specific task type. An overloaded fog node can only issue one offloading request to execute one of these algorithms, according to the task offloading priority. Offloading requests are sent to a Software-Defined Networking (SDN) controller; the fog node and controller determine the number of offloaded tasks. Simulation results show that the average time required to select offloading nodes improved by 33% compared to the dynamic fog-to-fog offloading algorithm. The distribution of workload converges to a uniform distribution when offloading latency-sensitive non-urgent tasks. The lowest offloading priority is assigned to latency-sensitive tasks with hard deadlines; at least 70% of these tasks are offloaded to fog nodes that are one to three hops away from the overloaded node.
Keywords: fog computing; offloading algorithm; latency-aware; software-defined networking (SDN)
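The second-offloading decision the abstract describes (destination chosen by task type, latency constraint, and required resources, with latency-sensitive hard-deadline tasks given the lowest offloading priority) can be illustrated with a minimal sketch. All task attributes, thresholds, and destination names here are hypothetical, not the paper's actual heuristics:

```python
from dataclasses import dataclass

@dataclass
class Task:
    latency_sensitive: bool
    hard_deadline: bool
    required_cpu: float  # fraction of a fog node's capacity


def offload_destination(task: Task, fog_free_capacity: float) -> str:
    """Pick a destination for a task evicted from an overloaded fog node.

    Latency-sensitive tasks with hard deadlines get the lowest offloading
    priority (they stay near the edge); other latency-sensitive tasks go to
    a nearby fog node when capacity allows; latency-tolerant tasks may be
    pushed all the way to the cloud.
    """
    if task.latency_sensitive and task.hard_deadline:
        return "local"  # lowest offloading priority: keep at the edge
    if task.latency_sensitive:
        # Non-urgent but latency-sensitive: prefer a fog node 1-3 hops away
        # if it has room, otherwise fall back to the cloud.
        return "nearby_fog" if task.required_cpu <= fog_free_capacity else "cloud"
    return "cloud"  # latency-tolerant work tolerates the extra hop
```

In the paper, this decision is additionally coordinated through the SDN controller; the sketch only captures the per-task destination logic.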
Solar Energy Harvesting Using a Timer-Based Relay Selection
17
Authors: Raed Alhamad, Hatem Boujemaa 《Computers, Materials & Continua》 SCIE EI 2023, Issue 1, pp. 2149-2159 (11 pages)
In this paper, the throughput and delay of cooperative communications are derived when solar energy is used and the relay node is selected using a timer. The source and relays harvest energy from the sun using a photovoltaic system. The harvested power is used by the source to transmit data to the relays. Then, a selected relay amplifies the signal to the destination. Opportunistic, partial, and reactive relay selection are used. The relay transmits when its timer elapses. The timer is set to a value proportional to the inverse of its Signal-to-Noise Ratio (SNR). Therefore, the relay with the largest SNR will transmit first, and its signal will be detected by the other relays, which will remain idle to avoid collisions. The harvesting duration is optimized to maximize the throughput. The packet waiting time and total delay are also computed. We also derive the statistics of the SNR when solar energy is used. The harvested power from the sun is proportional to the sum of a deterministic radiation intensity and a random attenuation due to weather effects and cloud occlusion. The fixed radiation intensity depends on the season, month, and time of day. The throughput of cooperative communications with energy harvesting from the sun has not previously been studied.
Keywords: solar energy harvesting; timer-based relay selection; relaying techniques; throughput and delay analysis
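The timer-based selection rule from the abstract (each relay sets a timer proportional to the inverse of its SNR, so the relay with the largest SNR fires first while the others stay idle) reduces to a simple arg-min. A minimal sketch, with the proportionality constant `c` as an assumed free parameter:

```python
def select_relay(snr_by_relay, c=1.0):
    """Timer-based relay selection: each relay's timer is c / SNR, so the
    relay with the largest SNR has the earliest-expiring timer and wins.

    Returns (winning_relay, its_timer_value).
    """
    timers = {relay: c / snr for relay, snr in snr_by_relay.items()}
    winner = min(timers, key=timers.get)  # earliest timer to elapse
    return winner, timers[winner]
```

The distributed property the paper relies on is that no central coordinator is needed: each relay only watches its own timer and backs off when it overhears the winner's transmission.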
Stochastic Models to Mitigate Sparse Sensor Attacks in Continuous-Time Non-Linear Cyber-Physical Systems
18
Authors: Borja Bordel Sánchez, Ramón Alcarria, Tomás Robles 《Computers, Materials & Continua》 SCIE EI 2023, Issue 9, pp. 3189-3218 (30 pages)
Cyber-Physical Systems are very vulnerable to sparse sensor attacks. However, current protection mechanisms employ linear and deterministic models which cannot detect attacks precisely. Therefore, in this paper, we propose a new non-linear generalized model to describe Cyber-Physical Systems. This model includes unknown multivariable discrete and continuous-time functions and different multiplicative noises to represent the evolution of physical processes and random effects in the physical and computational worlds. Besides, the digitalization stage in hardware devices is represented too. Attackers and the most critical sparse sensor attacks are described through a stochastic process. The reconstruction and protection mechanisms are based on a weighted stochastic model. Error probability in data samples is estimated through different indicators commonly employed in non-linear dynamics (such as the Fourier transform, first-return maps, or the probability density function). A decision algorithm calculates the final reconstructed value considering the previous error probability. An experimental validation based on simulation tools and real deployments is also carried out. Both the performance and scalability of the new technology are studied. Results prove that the proposed solution protects Cyber-Physical Systems against up to 92% of attacks and perturbations, with a computational delay below 2.5 s. The proposed model shows linear complexity, as recursive or iterative structures are not employed, just algebraic and probabilistic functions. In conclusion, the new model and reconstruction mechanism can successfully protect Cyber-Physical Systems against sparse sensor attacks, even in dense or pervasive deployments and scenarios.
Keywords: cyber-physical systems; sparse sensor attack; non-linear models; stochastic models; security
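The core idea of the weighted stochastic reconstruction described in the abstract (a decision algorithm that blends the raw sample with a model-based value according to the estimated error probability) can be sketched as a single convex combination. The specific weighting below is an assumption for illustration; the paper's decision algorithm combines several non-linear-dynamics indicators:

```python
def reconstruct(raw_sample, model_prediction, error_prob):
    """Weighted stochastic reconstruction sketch: the more likely the raw
    sensor sample is erroneous (attacked), the more weight is shifted to
    the model prediction.

    error_prob must lie in [0, 1]; 0 trusts the sensor fully, 1 trusts
    the model fully.
    """
    if not 0.0 <= error_prob <= 1.0:
        raise ValueError("error_prob must be in [0, 1]")
    return (1.0 - error_prob) * raw_sample + error_prob * model_prediction
```

With `error_prob = 0.25`, a raw sample of 10.0 and a prediction of 2.0 blend to 8.0, i.e. the sensor still dominates when the estimated error probability is low.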
Quantum Cat Swarm Optimization Based Clustering with Intrusion Detection Technique for Future Internet of Things Environment
19
Authors: Mohammed Basheri, Mahmoud Ragab 《Computer Systems Science & Engineering》 SCIE EI 2023, Issue 9, pp. 3783-3798 (16 pages)
The Internet of Things (IoT) is one of the emergent technologies with advanced developments in several applications, like creating smart environments, enabling Industry 4.0, etc. As IoT devices operate via an inbuilt and limited power supply, the effective utilization of available energy plays a vital role in designing the IoT environment. At the same time, the communication of IoT devices over wireless mediums poses security as a challenging issue. Recently, intrusion detection systems (IDS) have paved the way to detect the presence of intrusions in the IoT environment. With this motivation, this article introduces a novel Quantum Cat Swarm Optimization based Clustering with Intrusion Detection Technique (QCSOBC-IDT) for the IoT environment. The QCSOBC-IDT model aims to achieve energy efficiency by clustering the nodes and security by intrusion detection. Primarily, the QCSOBC-IDT model presents a new QCSO algorithm for effectively choosing cluster heads (CHs) and organizing a set of clusters in the IoT environment. Besides, the QCSO algorithm computes a fitness function involving four parameters, namely energy efficiency, inter-cluster distance, intra-cluster distance, and node density. A harmony search algorithm (HSA) with a cascaded recurrent neural network (CRNN) model is used for an effective intrusion detection process. The design of the HSA assists in the optimal selection of hyperparameters related to the CRNN model. A detailed experimental analysis of the QCSOBC-IDT model ensured its promising efficiency compared to existing models.
Keywords: Internet of Things; energy efficiency; clustering; intrusion detection; deep learning; security
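A cluster-head fitness function over the four parameters the abstract names (energy efficiency, inter-cluster distance, intra-cluster distance, node density) is typically a weighted sum. The weights and sign convention below are assumptions for illustration only; the paper does not publish them in the abstract:

```python
def ch_fitness(residual_energy, inter_cluster_dist, intra_cluster_dist,
               node_density, weights=(0.4, 0.2, 0.2, 0.2)):
    """Weighted fitness for cluster-head (CH) candidates.

    Rewards residual energy, separation from other clusters, and local node
    density; penalizes intra-cluster distance (compact clusters are cheaper
    to serve). All inputs are assumed normalized to [0, 1].
    """
    w1, w2, w3, w4 = weights
    return (w1 * residual_energy
            + w2 * inter_cluster_dist
            - w3 * intra_cluster_dist
            + w4 * node_density)
```

The QCSO algorithm would then search for the node maximizing this fitness in each neighborhood; an energy-rich, well-separated node in a dense, compact cluster scores highest.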
Latent Space Representational Learning of Deep Features for Acute Lymphoblastic Leukemia Diagnosis
20
Authors: Ghada Emam Atteia 《Computer Systems Science & Engineering》 SCIE EI 2023, Issue 4, pp. 361-376 (16 pages)
Acute Lymphoblastic Leukemia (ALL) is a fatal malignancy that is featured by the abnormal increase of immature lymphocytes in blood or bone marrow. Early prognosis of ALL is indispensable for the effectual remediation of this disease. Initial screening of ALL is conducted through manual examination of stained blood smear microscopic images, a process which is time-consuming and prone to errors. Therefore, many deep learning-based computer-aided diagnosis (CAD) systems have been established to automatically diagnose ALL. This paper proposes a novel hybrid deep learning system for ALL diagnosis in blood smear images. The introduced system integrates the proficiency of autoencoder networks in feature representational learning in latent space with the superior feature extraction capability of standard pretrained convolutional neural networks (CNNs) to identify the existence of ALL in blood smears. An augmented set of deep image features is formed from the features extracted by the GoogleNet and Inception-v3 CNNs from a hybrid dataset of microscopic blood smear images. A sparse autoencoder network is designed to create an abstract set of significant latent features from the enlarged image feature set. The latent features are used to perform image classification using a Support Vector Machine (SVM) classifier. The obtained results show that the latent features improve the classification performance of the proposed ALL diagnosis system over the original image features. Moreover, the classification performance of the system with various sizes of the latent feature set is evaluated. The retrieved results reveal that the introduced ALL diagnosis system outperforms the state of the art.
Keywords: autoencoder; deep learning; CNN; leukemia diagnosis; computer-aided diagnosis
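The abstract's pipeline is: concatenate deep features from two pretrained CNNs, compress them to latent features with a sparse autoencoder, then classify with an SVM. The data flow can be sketched with placeholder callables; every extractor, encoder, and classifier below is a stand-in for the trained models the paper uses:

```python
def diagnosis_pipeline(image, extractors, encoder, classifier):
    """Sketch of the hybrid ALL diagnosis data flow.

    extractors: feature extractors (e.g. GoogleNet- and Inception-v3-style
                backbones), each mapping an image to a feature list.
    encoder:    sparse-autoencoder bottleneck mapping the concatenated
                feature set to a smaller latent feature set.
    classifier: final decision function (SVM in the paper).
    """
    features = []
    for extract in extractors:
        features.extend(extract(image))  # concatenate per-CNN features
    latent = encoder(features)           # compress to latent space
    return classifier(latent)            # classify from latent features
```

A toy run with dummy callables shows the flow: two extractors produce a combined feature list, the encoder truncates it, and the classifier thresholds the first latent value.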