Speech recognition systems have become a unique human-computer interaction (HCI) family. Speech is one of the most naturally developed human abilities; speech signal processing opens up a transparent and hands-free computing experience. This paper presents a retrospective yet modern view of the world of speech recognition systems. The development of automatic speech recognition (ASR) has seen quite a few milestones and breakthrough technologies, which are highlighted in this paper. A step-by-step rundown of the fundamental stages in developing speech recognition systems is presented, along with a brief discussion of various modern developments and applications in this domain. This review aims to summarize the field and provide a starting point for those entering the vast area of speech signal processing. Since speech recognition has vast potential in industries such as telecommunication, emotion recognition, and healthcare, this review will be helpful to researchers who aim to explore further applications that society can readily adopt in the coming years.
Association rule learning (ARL) is a widely used technique for discovering relationships within datasets. However, it often generates an excess of irrelevant or ambiguous rules. Post-processing is therefore crucial, not only for removing irrelevant or redundant rules but also for uncovering hidden associations that affect other factors. Several post-processing methods have been proposed recently, each with its own strengths and weaknesses. In this paper, we propose THAPE (Tunable Hybrid Associative Predictive Engine), which combines descriptive and predictive techniques. By leveraging both, we aim to improve the quality of the analysis of the generated rules: removing irrelevant or redundant rules, uncovering interesting and useful rules, exploring hidden association rules that may affect other factors, and providing backtracking ability for a given product. The proposed approach offers a tailored method suited to retailers' specific goals, enabling them to better understand customer behavior based on actual transactions in the target market. We applied THAPE to a real dataset as a case study to demonstrate its effectiveness. Through this application, we mined a concise set of highly interesting and useful association rules: of the 11,265 rules generated, we identified 125 that are particularly relevant to the business context. These rules significantly improve the interpretability and usefulness of association rules for decision-making.
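THAPE itself is not reproduced here; the descriptive half of such a pipeline, mining candidate rules and then pruning them with interestingness thresholds, can be sketched with the mlxtend library. The toy basket matrix, the support and confidence thresholds, and the lift-based pruning below are illustrative assumptions, not the paper's settings.

```python
# Sketch: generate association rules and prune them by simple interestingness
# thresholds, as a stand-in for the descriptive stage of a THAPE-like pipeline.
# Thresholds and the toy data are illustrative assumptions, not the paper's values.
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

# Toy one-hot encoded transactions (rows = baskets, columns = products).
baskets = pd.DataFrame(
    [[1, 1, 0, 1], [1, 0, 1, 1], [0, 1, 1, 0], [1, 1, 1, 1], [1, 0, 0, 1]],
    columns=["bread", "milk", "eggs", "butter"],
).astype(bool)

# Descriptive stage: mine frequent itemsets and derive candidate rules.
itemsets = apriori(baskets, min_support=0.4, use_colnames=True)
rules = association_rules(itemsets, metric="confidence", min_threshold=0.6)

# Post-processing: keep only "interesting" rules (lift above 1 and a
# single-item consequent), mimicking the pruning step in spirit.
pruned = rules[(rules["lift"] > 1.0) & (rules["consequents"].apply(len) == 1)]
print(pruned[["antecedents", "consequents", "support", "confidence", "lift"]])
```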
The user's intent when seeking online information has been an active area of research in user profiling. User profiling considers user characteristics, behaviors, activities, and preferences to sketch user intentions, interests, and motivations. Determining user characteristics can help capture implicit and explicit preferences and intentions for effective user-centric and customized content presentation. The user's complete online experience in seeking information is a blend of activities such as searching for information, verifying it, and sharing it on social platforms. However, a combination of multiple behaviors has yet to be considered in profiling users. This research takes a novel approach and explores user intent types based on multidimensional online behavior in information acquisition. It examines information search, verification, and dissemination behavior and identifies diverse types of users based on their online engagement using machine learning. The research proposes a generic user profile template that explains user characteristics based on internet experience and uses it as ground truth for data annotation. User feedback on online behavior and practices was collected using a survey; the participants include both males and females of different ages and from different occupation sectors. The collected data is subjected to feature engineering, and the significant features are presented to unsupervised machine learning methods to identify user intent classes, or profiles, and their characteristics. Different techniques are evaluated, and the K-Means clustering method successfully generates five user groups exhibiting different user characteristics, with an average silhouette of 0.36 and a distortion score of 1136. Feature averages are computed to identify the characteristics of each user intent type. The user intent classes are then further generalized to create a user intent template with an inter-rater reliability of 75%. This research successfully extracts different user types based on their preferences in online content, platforms, criteria, and frequency. The study also validates the proposed template on user feedback data through an inter-rater agreement process using an external human rater.
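As a minimal sketch of the clustering step described above, the snippet below groups a synthetic stand-in for the engineered survey features into five clusters with scikit-learn and reports the silhouette and distortion (inertia) scores; the feature matrix and the k=5 initialization details are assumptions, not the study's data.

```python
# Sketch of the unsupervised profiling step: cluster engineered behavior
# features into five groups and report silhouette and distortion (inertia).
# The synthetic feature matrix is a stand-in for the survey-derived features.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = StandardScaler().fit_transform(rng.normal(size=(300, 12)))  # 300 users, 12 features

kmeans = KMeans(n_clusters=5, n_init=10, random_state=0).fit(X)
labels = kmeans.labels_

print("silhouette:", round(silhouette_score(X, labels), 3))
print("distortion (inertia):", round(kmeans.inertia_, 1))

# Per-cluster feature averages, used to characterize each intent profile.
for c in range(5):
    print(c, X[labels == c].mean(axis=0).round(2))
```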
Malware attacks on Windows machines pose significant cybersecurity threats, necessitating effective detection and prevention mechanisms. Supervised machine learning classifiers have emerged as promising tools for malware detection. However, there remains a need for comprehensive studies that compare the performance of different classifiers specifically for Windows malware detection; addressing this gap can provide valuable insights for enhancing cybersecurity strategies. While numerous studies have explored malware detection using machine learning techniques, there is a lack of systematic comparison of supervised classifiers for Windows malware detection. Understanding the relative effectiveness of these classifiers can inform the selection of optimal detection methods and improve overall security measures. This study aims to bridge that gap by conducting a comparative analysis of supervised machine learning classifiers for detecting malware on Windows systems. The objectives are: investigating the performance of various classifiers, such as Gaussian Naïve Bayes, K-Nearest Neighbors (KNN), the Stochastic Gradient Descent Classifier (SGDC), and Decision Tree, in detecting Windows malware; evaluating the accuracy, efficiency, and suitability of each classifier for real-world malware detection scenarios; identifying the strengths and limitations of different classifiers to provide insights for cybersecurity practitioners and researchers; and offering recommendations for selecting the most effective classifier for Windows malware detection based on empirical evidence. The study employs a structured methodology consisting of several phases: exploratory data analysis, data preprocessing, model training, and evaluation. Exploratory data analysis involves understanding the dataset's characteristics and identifying preprocessing requirements. Data preprocessing includes cleaning, feature encoding, dimensionality reduction, and optimization to prepare the data for training. Model training utilizes the supervised classifiers, and their performance is evaluated using metrics such as accuracy, precision, recall, and F1 score. The study's outcome is a comparative analysis of supervised machine learning classifiers for Windows malware detection. The results reveal the effectiveness and efficiency of each classifier in detecting different types of malware, and insights into their strengths and limitations provide practical guidance for enhancing cybersecurity defenses. Overall, this research contributes to advancing malware detection techniques and bolstering the security posture of Windows systems against evolving cyber threats.
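A minimal sketch of the comparison loop the abstract describes: train the four classifier families on a synthetic stand-in for the Windows malware feature matrix and score them with accuracy, precision, recall, and F1. The split ratio and the default hyperparameters are assumptions.

```python
# Sketch: compare the four supervised classifiers on a synthetic stand-in for
# the malware feature matrix, reporting accuracy, precision, recall, and F1.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import SGDClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

X, y = make_classification(n_samples=2000, n_features=30, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "GaussianNB": GaussianNB(),
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "SGDC": SGDClassifier(random_state=0),
    "DecisionTree": DecisionTreeClassifier(random_state=0),
}
for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    print(name,
          round(accuracy_score(y_te, pred), 3),
          round(precision_score(y_te, pred), 3),
          round(recall_score(y_te, pred), 3),
          round(f1_score(y_te, pred), 3))
```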
Cancer frequently develops resistance to the majority of chemotherapy treatments. This study aimed to examine, through in vitro experimentation, the synergistic cytotoxic and antitumor effects of the SGLT2 inhibitors Canagliflozin (CAN), Dapagliflozin (DAP), and Empagliflozin (EMP) in combination with Doxorubicin (DOX). The combination of CAN+DOX was found to greatly enhance the cytotoxic effects of DOX in MCF-7 cells. Interestingly, cancer cells exhibit an increased demand for glucose and ATP to support their growth. Notably, when these medications were combined with DOX, there was considerable inhibition of glucose consumption, as well as reductions in intracellular ATP and lactate levels, and this effect was dependent on the drug dosages. In addition to effectively inhibiting the cell cycle, the combination of CAN+DOX induces substantial changes in both cell cycle and apoptotic gene expression. This work represents the first report of the beneficial impact of the SGLT2 inhibitor medications CAN, DAP, and EMP on responsiveness to the anticancer properties of DOX. The underlying molecular mechanisms potentially involve suppression of SGLT2 function.
Big data and information and communication technologies can be important to the effectiveness of smart cities. With growing attention on smart city sustainability, developing data-driven smart cities has recently gained recognition as a vital approach to addressing sustainability problems. Real-time monitoring of pollution allows local authorities to analyze the current traffic condition of cities and make decisions. Air pollution, in particular, is a major environmental problem in smart city environments. Deep learning (DL) approaches have rapidly grown in impact and penetrated almost every domain, including air pollution forecasting. Therefore, this article develops a new Coot Optimization Algorithm with an Ensemble Deep Learning based Air Pollution Prediction (COAEDL-APP) system for sustainable smart cities. The proposed COAEDL-APP algorithm accurately forecasts air quality in the sustainable smart city environment. To achieve this, the COAEDL-APP technique initially performs linear scaling normalization (LSN) to pre-process the input data. For air quality prediction, an ensemble of three DL models is involved, namely an autoencoder (AE), long short-term memory (LSTM), and a deep belief network (DBN). Furthermore, a COA-based hyperparameter tuning procedure is designed to adjust the hyperparameter values of the DL models. The simulation outcome of the COAEDL-APP algorithm was tested on an air quality database, and the results demonstrated improved performance over other existing systems, with a maximum accuracy of 98.34%.
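The exact COAEDL-APP models are not reproduced here; the sketch below only illustrates the shape of the pipeline: a linear scaling normalization step (implemented as min-max scaling, an assumption) followed by a simple averaged ensemble, with three scikit-learn regressors standing in for the AE, LSTM, and DBN members and no COA tuning.

```python
# Sketch: linear scaling normalization (min-max, an assumption for LSN) followed
# by a simple averaged ensemble of three regressors standing in for AE/LSTM/DBN.
import numpy as np
from sklearn.preprocessing import MinMaxScaler
from sklearn.neural_network import MLPRegressor
from sklearn.linear_model import Ridge
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))                        # stand-in pollutant/weather features
y = X[:, 0] * 2 + rng.normal(scale=0.1, size=500)    # stand-in air-quality target

X_scaled = MinMaxScaler().fit_transform(X)           # LSN-style pre-processing step

# Three heterogeneous base learners; the real system uses AE, LSTM and DBN.
members = [MLPRegressor(hidden_layer_sizes=(32,), max_iter=500, random_state=0),
           Ridge(alpha=1.0),
           GradientBoostingRegressor(random_state=0)]
preds = np.mean([m.fit(X_scaled, y).predict(X_scaled) for m in members], axis=0)
print("ensemble MAE (on training data, for illustration only):",
      round(float(np.mean(np.abs(preds - y))), 4))
```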
Software-Defined Networking (SDN) represents a significant paradigm shift in network architecture, separating network logic from the underlying forwarding devices to enhance flexibility and centralize deployment. Concurrently, the Internet of Things (IoT) connects numerous devices to the Internet, enabling autonomous interactions with minimal human intervention. However, implementing and managing an SDN-IoT system is inherently complex, particularly for those with limited resources, as the dynamic and distributed nature of IoT infrastructures creates security and privacy challenges during SDN integration. The findings of this study underscore the primary security and privacy challenges across the application, control, and data planes. A comprehensive review evaluates the root causes of these challenges and the defense techniques employed in prior works to establish sufficient secrecy and privacy protection. Recent investigations have explored cutting-edge methods, such as leveraging blockchain for transaction recording to enhance security and privacy, along with applying machine learning and deep learning approaches to identify and mitigate the impacts of Denial of Service (DoS) and Distributed DoS (DDoS) attacks. Moreover, the analysis indicates that encryption and hashing techniques are prevalent in the data plane, whereas access control and certificate authorization are prominently considered in the control plane, and authentication is commonly employed within the application plane. Additionally, this paper outlines future directions, offering insights into potential strategies and technological advancements aimed at fostering a more secure and privacy-conscious SDN-based IoT ecosystem.
Serial remote sensing images offer a valuable means of tracking the evolutionary changes and growth of a specific geographical area over time. Although the original images may provide limited insight, they harbor considerable potential for identifying clusters and patterns. The aggregation of these serial remote sensing images (SRSI) becomes increasingly viable as distinct patterns emerge in diverse scenarios, such as suburbanization, the expansion of native flora, and agricultural activities. We propose an innovative method for extracting sequential patterns by combining Ant Colony Optimization (ACO) and Empirical Mode Decomposition (EMD). This integration of EMD and ACO techniques proves remarkably effective in identifying the most significant characteristic features within serial remote sensing images, guided by specific criteria. Our findings highlight a substantial improvement in the efficiency of sequential pattern mining through this hybrid method, which seamlessly integrates EMD and ACO for feature selection. The study demonstrates the potential of this methodology, particularly in the realms of urbanization, native vegetation expansion, and agricultural activities.
Manual investigation of chest radiography (CXR) images by physicians is crucial for effective decision-making in COVID-19 diagnosis. However, the high demand during the pandemic necessitates auxiliary help through image analysis and machine learning techniques. This study presents a multi-threshold-based segmentation technique to probe high-pixel-intensity regions in CXR images of various pathologies, including normal cases. Texture information is extracted using gray-level co-occurrence matrix (GLCM)-based features, while vessel-like features are obtained using Frangi, Sato, and Meijering filters. Machine learning models employing Decision Tree (DT) and Random Forest (RF) approaches are designed to categorize CXR images into normal cases and the common lung infection classes of lung opacity (LO), COVID-19, and viral pneumonia (VP). The results demonstrate that the fusion of texture and vessel-based features provides an effective ML model for aiding diagnosis. Model validation using performance measures, including an accuracy of approximately 91.8% with an RF-based classifier, supports the usefulness of the feature set and classifier model in categorizing the four different pathologies. Furthermore, the study investigates the importance of the devised features in identifying the underlying pathology and incorporates histogram-based analysis. This analysis reveals differing natural pixel distributions in CXR images belonging to the normal, COVID-19, LO, and VP groups, motivating the incorporation of additional features such as the mean, standard deviation, skewness, and percentiles computed on the filtered images. Notably, the study achieves a considerable improvement in distinguishing COVID-19 from LO, with a true positive rate of 97%, further substantiating the effectiveness of the implemented methodology.
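The sketch below illustrates the kind of feature pipeline the abstract describes: GLCM texture properties and vessel-filter statistics computed with scikit-image and fed to a Random Forest. The random stand-in images, the chosen GLCM properties, and the summary statistics are assumptions rather than the paper's exact configuration.

```python
# Sketch: GLCM texture features plus vessel-filter statistics from a CXR-like
# image, combined into one feature vector for a Random Forest classifier.
# The random images and chosen properties are illustrative assumptions.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from skimage.filters import frangi, sato, meijering
from sklearn.ensemble import RandomForestClassifier

def extract_features(img_uint8):
    glcm = graycomatrix(img_uint8, distances=[1], angles=[0],
                        levels=256, symmetric=True, normed=True)
    texture = [graycoprops(glcm, p)[0, 0]
               for p in ("contrast", "homogeneity", "energy", "correlation")]
    img = img_uint8 / 255.0
    vessel = []
    for filt in (frangi, sato, meijering):   # vessel-enhancing filters
        response = filt(img)
        vessel += [response.mean(), response.std()]
    return np.array(texture + vessel)

rng = np.random.default_rng(0)
images = rng.integers(0, 256, size=(40, 64, 64), dtype=np.uint8)  # stand-in CXRs
labels = rng.integers(0, 4, size=40)       # 4 classes: normal, LO, COVID-19, VP

X = np.stack([extract_features(im) for im in images])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```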
Algorithms for steganography are methods of hiding data transfers in media files. Several machine learning architectures have been presented recently to improve stego image identification performance by using spatial information, and these methods have made it feasible to handle a wide range of problems associated with image analysis. Images with little information, or low payload, are used by information embedding methods, but the goal of all contemporary research is to employ high-payload images for classification. To address the need for both low- and high-payload images, this work provides a machine learning approach to steganography image classification that uses the Curvelet transform to efficiently extract characteristics from both types of images. A Support Vector Machine (SVM), a commonplace classification technique, is employed to determine whether an image is a stego or a cover image. The Wavelet Obtained Weights (WOW), Spatial Universal Wavelet Relative Distortion (S-UNIWARD), Highly Undetectable Steganography (HUGO), and Minimizing the Power of Optimal Detector (MiPOD) steganography techniques are used in a variety of experimental scenarios to evaluate the performance of the proposed method. Using WOW at several payloads, the proposed approach achieves a classification accuracy of 98.60%, demonstrating its superiority over state-of-the-art methods.
Customer churn remains the key focus of this research, which uses the artificial-intelligence-based technique of machine learning. The work is based on feature-based analysis: four main features, selected on the basis of customer churn, were used to derive a meaningful analysis of the dataset. The dataset is taken from Kaggle and consists of fine food reviews with more than half a million records. The feature-based analysis is evaluated using a confusion matrix to summarize the customer churn results. Such analysis helps e-commerce businesses grow in real time by focusing sales on specific products and identifying which products are declining. Moreover, after applying the techniques, the Support Vector Machine and K-Nearest Neighbour perform better than the Random Forest in this particular scenario. From the confusion matrix, three measures are obtained: precision, recall, and accuracy. In the feature-based analysis of the Amazon fine food reviews for customer churn, the Support Vector Machine performed best in the overall comparison.
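A minimal sketch of the evaluation strategy used above: train SVM, KNN, and Random Forest on a synthetic stand-in for the four churn features and read precision, recall, and accuracy off each model's confusion matrix. The data and the default hyperparameters are assumptions.

```python
# Sketch: compare SVM, KNN and Random Forest on synthetic churn-like features
# and read precision, recall and accuracy off each model's confusion matrix.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import confusion_matrix

X, y = make_classification(n_samples=1000, n_features=4, n_informative=4,
                           n_redundant=0, random_state=0)   # four churn features
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

for name, model in [("SVM", SVC()), ("KNN", KNeighborsClassifier()),
                    ("RandomForest", RandomForestClassifier(random_state=0))]:
    tn, fp, fn, tp = confusion_matrix(y_te, model.fit(X_tr, y_tr).predict(X_te)).ravel()
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    print(f"{name}: precision={precision:.3f} recall={recall:.3f} accuracy={accuracy:.3f}")
```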
It is a common observation that whenever patients arrive at the front desk of a hospital, outpatient clinic, or other health-associated center, they first have to queue up in a line and wait to fill in their registration form to get admitted. The long waiting time without any status updates is the most common complaint and a concern for health officials. In this paper, UrNext, a location-aware mobile-based solution using Bluetooth Low Energy (BLE) technology, is presented to solve this problem. Recently, a technology-oriented approach, the Internet of Things (IoT), has been gaining popularity in helping to solve some of the healthcare sector's problems. The implementation of this solution can be illustrated through a simple example of a patient arriving at a clinic for a consultation. Instead of having to wait in long lines, the patient is greeted automatically and receives a push notification of admittance along with an estimated waiting time for the consultation session. This not only provides patients with a sense of freedom but also reduces the uncertainty levels that are generally observed, thus saving both time and money. This work aims to improve clinics' quality of service, organize queues, and minimize waiting times, leading to patient comfort while reducing the burden on nurses and receptionists. The results demonstrate that the presented system performs successfully and helps achieve a pleasant and conducive clinic visitation process with higher productivity.
A news agency is an organization that gathers news reports and sells them to subscribing news organizations, such as newspapers, magazines, and radio and television broadcasters. A news agency may also be referred to as a wire service, newswire, or news service. The main purpose of this paper is to evaluate the security policies and analyze the content of five press agencies in the Gulf countries, namely the Kuwait News Agency (KUNA), Emirates News Agency (WAM), Saudi Press Agency (SPA), Bahrain News Agency (BNA), and Oman News Agency (OMA), using a fuzzy VIKOR approach in which linguistic variables are applied to resolve the uncertainties and subjectivities in expert decision making. The fuzzy VIKOR approach is one of the best Multi-Criteria Decision Making (MCDM) techniques for working in a fuzzy environment. This study helps security and content analysis experts know which press agency has the mandate and the competence to educate the public on news agencies. In addition, this paper contributes to the Gulf agencies by helping them in their resolve to ensure the quality of content information and of information security policies over the internet.
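For readers unfamiliar with VIKOR, the sketch below shows the crisp core of the ranking step on a made-up decision matrix: the weighted group utility S, the individual regret R, and the compromise index Q. The paper uses the fuzzy variant with linguistic variables; the weights, scores, and v = 0.5 here are purely illustrative assumptions.

```python
# Sketch: crisp VIKOR ranking on an illustrative decision matrix
# (rows = alternatives such as agencies, columns = benefit criteria).
import numpy as np

scores = np.array([[7.0, 8.0, 6.0],    # alternative A
                   [6.0, 9.0, 7.0],    # alternative B
                   [8.0, 6.0, 8.0]])   # alternative C
weights = np.array([0.5, 0.3, 0.2])    # assumed criterion weights
v = 0.5                                # weight of the "majority rule" strategy

f_best, f_worst = scores.max(axis=0), scores.min(axis=0)
norm = (f_best - scores) / (f_best - f_worst)      # normalized distance to the ideal

S = (weights * norm).sum(axis=1)                   # group utility
R = (weights * norm).max(axis=1)                   # individual regret
Q = (v * (S - S.min()) / (S.max() - S.min())
     + (1 - v) * (R - R.min()) / (R.max() - R.min()))

for i in np.argsort(Q):                            # lower Q = better compromise
    print(f"alternative {i}: S={S[i]:.3f} R={R[i]:.3f} Q={Q[i]:.3f}")
```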
The outbreak of the pandemic caused by Coronavirus Disease 2019 (COVID-19) has affected the daily activities of people across the globe. During the COVID-19 outbreak and the successive lockdowns, Twitter was heavily used, and the number of tweets regarding COVID-19 increased tremendously. Several studies have used Sentiment Analysis (SA) to analyze the emotions expressed through tweets about COVID-19. Therefore, in the current study, a new Artificial Bee Colony (ABC) with Machine Learning-driven SA (ABCML-SA) model is developed for conducting sentiment analysis of COVID-19 Twitter data. The prime focus of the presented ABCML-SA model is to recognize the sentiments expressed in tweets made about COVID-19. It involves data pre-processing at the initial stage, followed by n-gram-based feature extraction to derive the feature vectors. For identification and classification of the sentiments, the Support Vector Machine (SVM) model is exploited. Finally, the ABC algorithm is applied to fine-tune the parameters involved in the SVM. To demonstrate the improved performance of the proposed ABCML-SA model, a sequence of simulations was conducted. The comparative assessment results confirmed the effectual performance of the proposed ABCML-SA model over other approaches.
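As a minimal sketch of the core classification stage (n-gram features plus an SVM), the snippet below uses scikit-learn on a few toy tweets; the SVM parameters that the paper tunes with the ABC algorithm are simply fixed by hand here, which is an assumption, not the paper's optimizer.

```python
# Sketch: n-gram feature extraction and SVM sentiment classification on toy
# tweets; the SVM parameters that the paper tunes with ABC are fixed here.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline

tweets = ["stay safe everyone, proud of our health workers",
          "lockdown again, this is exhausting and sad",
          "vaccines bring real hope to all of us",
          "cases rising, worried about my family"]
labels = ["positive", "negative", "positive", "negative"]

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), lowercase=True),  # unigrams + bigrams
    SVC(kernel="rbf", C=10.0, gamma="scale"),              # ABC would tune C, gamma
)
model.fit(tweets, labels)
print(model.predict(["feeling hopeful after the vaccine news"]))
```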
Task offloading is a key strategy in Fog Computing (FC). The definition of resource-constrained devices no longer applies to sensors and Internet of Things (IoT) embedded system devices alone. Smart and mobile units can also be viewed as resource-constrained devices if the power, cloud applications, and data cloud are included in the set of required resources. In a cloud-fog-based architecture, a task instance running on an end device may need to be offloaded to a fog node to complete its execution. However, in a busy network, a second offloading decision is required when the fog node becomes overloaded. The possibility of offloading a task, for the second time, to a fog or a cloud node depends to a great extent on task importance, latency constraints, and required resources. This paper presents a dynamic service that determines which tasks can endure a second offloading. The task type, latency constraints, and amount of required resources are used to select the offloading destination node. This study proposes three heuristic offloading algorithms, each targeting a specific task type. An overloaded fog node can only issue one offloading request to execute one of these algorithms, according to the task offloading priority. Offloading requests are sent to a Software-Defined Networking (SDN) controller. The fog node and controller determine the number of offloaded tasks. Simulation results show that the average time required to select offloading nodes was improved by 33% when compared to the dynamic fog-to-fog offloading algorithm. The distribution of workload converges to a uniform distribution when offloading latency-sensitive, non-urgent tasks. The lowest offloading priority is assigned to latency-sensitive tasks with hard deadlines. At least 70% of these tasks are offloaded to fog nodes that are one to three hops away from the overloaded node.
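The toy sketch below illustrates the kind of second-offloading decision such a service makes: each task carries a type, a latency constraint, and a resource demand, and the overloaded node picks a destination among candidate fog nodes or the cloud. The node attributes, thresholds, and tie-breaking rules are assumptions made for illustration, not the paper's three algorithms.

```python
# Toy sketch of a second-offloading decision: choose a destination for each
# task based on its type, latency constraint and resource demand.
# Priorities, thresholds and node attributes are illustrative assumptions.
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    task_type: str          # e.g. "urgent", "latency_sensitive", "best_effort"
    max_latency_ms: float
    cpu_demand: float

@dataclass
class Node:
    name: str
    hops: int               # distance from the overloaded fog node
    free_cpu: float
    latency_ms: float

def pick_destination(task: Task, fog_nodes: list[Node], cloud: Node) -> Node:
    # Prefer nearby fog nodes that satisfy both latency and CPU constraints;
    # fall back to the cloud only for tasks that can tolerate its latency.
    candidates = [n for n in fog_nodes
                  if n.free_cpu >= task.cpu_demand and n.latency_ms <= task.max_latency_ms]
    if candidates:
        return min(candidates, key=lambda n: (n.hops, n.latency_ms))
    if cloud.latency_ms <= task.max_latency_ms:
        return cloud
    return min(fog_nodes, key=lambda n: n.latency_ms)   # best effort

fog = [Node("fog-1", 1, 0.2, 8.0), Node("fog-2", 2, 0.6, 12.0), Node("fog-3", 3, 0.9, 20.0)]
cloud = Node("cloud", 10, 100.0, 80.0)

for t in [Task("video-frame", "latency_sensitive", 15.0, 0.5),
          Task("log-batch", "best_effort", 500.0, 0.8),
          Task("alarm", "urgent", 10.0, 0.1)]:
    print(t.name, "->", pick_destination(t, fog, cloud).name)
```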
In this paper, the throughput and delay of cooperative communications are derived when solar energy is used and the relay node is selected using a timer. The source and relays harvest energy from the sun using a photovoltaic system. The harvested power is used by the source to transmit data to the relays; a selected relay then amplifies the signal to the destination. Opportunistic, partial, and reactive relay selection are used. The relay transmits when its timer elapses. The timer is set to a value proportional to the inverse of its Signal-to-Noise Ratio (SNR). Therefore, the relay with the largest SNR transmits first, and its signal is detected by the other relays, which remain idle to avoid collisions. The harvesting duration is optimized to maximize the throughput. The packet waiting time and total delay are also computed. We also derive the statistics of the SNR when solar energy is used. The power harvested from the sun is proportional to the sum of a deterministic radiation intensity and a random attenuation due to weather effects and cloud occlusion. The fixed radiation intensity depends on the season, the month, and the time of day t in hours. The throughput of cooperative communications with solar energy harvesting has not been studied previously.
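A small numerical sketch of the timer rule described above: each relay sets a timer inversely proportional to its SNR, so the relay with the largest SNR fires first and the others stay idle. The exponential (Rayleigh-fading) SNR model and the timer constant are assumptions for illustration.

```python
# Sketch: timer-based relay selection. Each relay's timer is proportional to
# the inverse of its SNR, so the best-SNR relay transmits first.
# The exponential (Rayleigh-fading) SNR model and constants are assumptions.
import numpy as np

rng = np.random.default_rng(1)
n_relays = 5
mean_snr = 10.0                          # average SNR per relay (linear scale)
snr = rng.exponential(mean_snr, n_relays)

timer_constant = 1.0                     # illustrative proportionality constant
timers = timer_constant / snr            # timer ~ 1 / SNR

winner = int(np.argmin(timers))          # first timer to expire wins
print("relay SNRs:", np.round(snr, 2))
print("timers    :", np.round(timers, 4))
print(f"relay {winner} (largest SNR) transmits first; the others stay idle")
```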
Cyber-Physical Systems are very vulnerable to sparse sensor attacks, but current protection mechanisms employ linear and deterministic models which cannot detect attacks precisely. Therefore, in this paper, we propose a new non-linear generalized model to describe Cyber-Physical Systems. This model includes unknown multivariable discrete and continuous-time functions and different multiplicative noises to represent the evolution of physical processes and random effects in the physical and computational worlds. The digitalization stage in hardware devices is also represented. Attackers and the most critical sparse sensor attacks are described through a stochastic process. The reconstruction and protection mechanisms are based on a weighted stochastic model. The error probability in data samples is estimated through different indicators commonly employed in non-linear dynamics (such as the Fourier transform, first-return maps, or the probability density function). A decision algorithm calculates the final reconstructed value considering the previous error probability. An experimental validation based on simulation tools and real deployments is also carried out, and both the performance and the scalability of the new technology are studied. The results prove that the proposed solution protects Cyber-Physical Systems against up to 92% of attacks and perturbations, with a computational delay below 2.5 s. The proposed model shows linear complexity, as recursive or iterative structures are not employed, just algebraic and probabilistic functions. In conclusion, the new model and reconstruction mechanism can successfully protect Cyber-Physical Systems against sparse sensor attacks, even in dense or pervasive deployments and scenarios.
The Internet of Things (IoT) is one of the emergent technologies with advanced developments in several applications, such as creating smart environments and enabling Industry 4.0. As IoT devices operate via an inbuilt and limited power supply, the effective utilization of available energy plays a vital role in designing the IoT environment. At the same time, the communication of IoT devices over wireless media poses security as a challenging issue. Recently, intrusion detection systems (IDS) have paved the way to detect the presence of intrusions in the IoT environment. With this motivation, this article introduces a novel Quantum Cat Swarm Optimization based Clustering with Intrusion Detection Technique (QCSOBC-IDT) for the IoT environment. The QCSOBC-IDT model aims to achieve energy efficiency by clustering the nodes and security by intrusion detection. Primarily, the QCSOBC-IDT model presents a new QCSO algorithm for effectively choosing cluster heads (CHs) and organizing a set of clusters in the IoT environment. Besides, the QCSO algorithm computes a fitness function involving four parameters, namely energy efficiency, inter-cluster distance, intra-cluster distance, and node density. A harmony search algorithm (HSA) with a cascaded recurrent neural network (CRNN) model is used for an effective intrusion detection process. The design of the HSA assists in the optimal selection of the hyperparameters related to the CRNN model. A detailed experimental analysis of the QCSOBC-IDT model confirmed its promising efficiency compared to existing models.
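The abstract names the four fitness parameters but not their exact combination, so the sketch below shows one plausible weighted scoring of cluster-head candidates; the weights, normalization, and direction of each term are assumptions.

```python
# Illustrative cluster-head fitness over the four parameters the abstract names.
# Weights, normalization and term directions are assumptions, not the paper's formula.
import numpy as np

def ch_fitness(residual_energy, inter_cluster_dist, intra_cluster_dist, node_density,
               weights=(0.4, 0.2, 0.2, 0.2)):
    # Higher residual energy, larger separation between clusters, tighter clusters
    # and higher local density are all treated as favorable here.
    w1, w2, w3, w4 = weights
    return (w1 * residual_energy               # energy efficiency (normalized to [0, 1])
            + w2 * inter_cluster_dist          # normalized inter-cluster distance
            + w3 * (1.0 - intra_cluster_dist)  # penalize spread within a cluster
            + w4 * node_density)               # normalized neighborhood density

rng = np.random.default_rng(0)
candidates = rng.random((10, 4))               # 10 candidate nodes, 4 normalized metrics
scores = np.array([ch_fitness(*c) for c in candidates])
print("best cluster-head candidate:", int(np.argmax(scores)),
      "score:", round(float(scores.max()), 3))
```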
Acute Lymphoblastic Leukemia (ALL) is a fatal malignancy characterized by an abnormal increase of immature lymphocytes in the blood or bone marrow. Early prognosis of ALL is indispensable for the effectual remediation of this disease. Initial screening of ALL is conducted through manual examination of stained blood smear microscopic images, a process which is time-consuming and prone to errors. Therefore, many deep learning-based computer-aided diagnosis (CAD) systems have been established to automatically diagnose ALL. This paper proposes a novel hybrid deep learning system for ALL diagnosis in blood smear images. The introduced system integrates the proficiency of autoencoder networks in representational feature learning in a latent space with the superior feature extraction capability of standard pretrained convolutional neural networks (CNNs) to identify the existence of ALL in blood smears. An augmented set of deep image features is formed from the features extracted by the GoogleNet and Inception-v3 CNNs from a hybrid dataset of microscopic blood smear images. A sparse autoencoder network is designed to create an abstract set of significant latent features from the enlarged image feature set. The latent features are used to perform image classification using a Support Vector Machine (SVM) classifier. The obtained results show that the latent features improve the classification performance of the proposed ALL diagnosis system over the original image features. Moreover, the classification performance of the system is evaluated with various sizes of the latent feature set. The retrieved results reveal that the introduced ALL diagnosis system compares favorably with the state of the art.
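As a rough sketch of the latent-feature idea described above, the snippet below compresses stand-in CNN descriptors with a small PyTorch autoencoder and classifies the latent codes with an SVM. The random features, layer sizes, and training loop are assumptions, and the sparsity penalty used in the paper's sparse autoencoder is omitted for brevity.

```python
# Sketch: compress pre-extracted CNN features with a small autoencoder and
# classify the latent codes with an SVM. The random "deep features" stand in
# for GoogleNet/Inception-v3 descriptors; sizes and training setup are assumptions.
import numpy as np
import torch
import torch.nn as nn
from sklearn.svm import SVC

rng = np.random.default_rng(0)
features = rng.normal(size=(200, 512)).astype(np.float32)  # stand-in CNN features
labels = rng.integers(0, 2, size=200)                      # ALL vs. healthy

class AutoEncoder(nn.Module):
    def __init__(self, dim_in=512, dim_latent=64):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(dim_in, 128), nn.ReLU(),
                                     nn.Linear(128, dim_latent))
        self.decoder = nn.Sequential(nn.Linear(dim_latent, 128), nn.ReLU(),
                                     nn.Linear(128, dim_in))
    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

x = torch.from_numpy(features)
model = AutoEncoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for _ in range(200):                      # short reconstruction training loop
    opt.zero_grad()
    recon, _ = model(x)
    loss = loss_fn(recon, x)
    loss.backward()
    opt.step()

with torch.no_grad():
    _, latent = model(x)                  # abstract latent features
clf = SVC().fit(latent.numpy(), labels)
print("training accuracy on latent features:", round(clf.score(latent.numpy(), labels), 3))
```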
Funding: This research work is supported by Princess Nourah bint Abdulrahman University Researchers Supporting Project Number (PNURSP2024R411), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.
Funding: Funded by the Deanship of Scientific Research (DSR), King Abdulaziz University, Jeddah, Saudi Arabia, under Grant No. KEP-1-166-41. The authors therefore acknowledge DSR with thanks for their technical and financial support.
Funding: Funded by the Deanship of Scientific Research (DSR), King Abdulaziz University (KAU), Jeddah, Saudi Arabia, under Grant No. (IFPIP: 631-612-1443).
Funding: This work was supported by the National Natural Science Foundation of China (Grant No. 62341208) and the Natural Science Foundation of Zhejiang Province (Grant Nos. LY23F020006 and LR23F020001). It has also been supported by Islamic Azad University under Grant No. 133713281361.
Funding: Financially supported by the Deanship of Scientific Research at King Khalid University under Research Grant Number (R.G.P.2/549/44).
Funding: The author extends her appreciation to the Deanship of Scientific Research at King Saud University for funding this work through the Undergraduate Research Support Program, Project No. (URSP-3-18-89).
Funding: The Deanship of Scientific Research (DSR) at King Abdulaziz University, Jeddah, Saudi Arabia, has funded this project under Grant No. (FP-205-43).
Funding: Funded by the Deanship of Scientific Research, Princess Nourah bint Abdulrahman University, through the Program of Research Funding after Publication, Grant No. (PRFA–P–42–10).
Funding: The Deanship of Scientific Research at Saudi Electronic University funded this research work through project number 8092.
Abstract: In this paper, the throughput and delay of cooperative communications are derived when solar energy is used and the relay node is selected using a timer. The source and relays harvest energy from the sun using a photovoltaic system. The harvested power is used by the source to transmit data to the relays. Then, a selected relay amplifies the signal to the destination. Opportunistic, partial, and reactive relay selection are used. The relay transmits when its timer elapses. The timer is set to a value proportional to the inverse of its Signal to Noise Ratio (SNR). Therefore, the relay with the largest SNR transmits first, and its signal is detected by the other relays, which remain idle to avoid collisions. The harvesting duration is optimized to maximize the throughput. The packet's waiting time and total delay are also computed. We also derive the statistics of the SNR when solar energy is used. The power harvested from the sun is proportional to the sum of a deterministic radiation intensity and a random attenuation due to weather effects and cloud occlusion. The fixed radiation intensity depends on the season, the month, and the time t in hours. The throughput of cooperative communications with solar energy harvesting has not yet been studied.
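The short simulation below illustrates the timer-based relay selection and the solar harvesting model described above: each relay's timer is inversely proportional to its SNR, so the best relay fires first. The SNR distribution, timer constant, panel efficiency, and weather term are assumptions for illustration only.

```python
# Toy simulation of timer-based relay selection plus a simple solar harvest model.
import numpy as np

rng = np.random.default_rng(1)
N_RELAYS, LAMBDA = 4, 1e-3          # LAMBDA scales the timer duration (seconds)

snr = rng.exponential(scale=10.0, size=N_RELAYS)   # toy fading SNRs per relay
timers = LAMBDA / snr                              # timer inversely proportional to SNR
winner = int(np.argmin(timers))                    # first timer to expire transmits

print("Relay SNRs:", np.round(snr, 2))
print("Selected relay:", winner, "(highest SNR = %.2f)" % snr[winner])

# Harvested power sketch: deterministic radiation plus a random weather term.
radiation = 800.0                                   # W/m^2, depends on month and hour (assumed)
weather = rng.normal(loc=0.0, scale=50.0)           # clouds / attenuation (assumed Gaussian)
harvested = max(0.0, 0.15 * (radiation + weather))  # 15% panel efficiency (assumed)
print("Harvested power: %.1f W/m^2" % harvested)
```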
Funding: Supported by Comunidad de Madrid within the framework of the Multiannual Agreement with Universidad Politécnica de Madrid to encourage research by young doctors (PRINCE).
Abstract: Cyber-Physical Systems are very vulnerable to sparse sensor attacks, but current protection mechanisms employ linear and deterministic models which cannot detect attacks precisely. Therefore, in this paper, we propose a new non-linear generalized model to describe Cyber-Physical Systems. This model includes unknown multivariable discrete and continuous-time functions and different multiplicative noises to represent the evolution of physical processes and random effects in the physical and computational worlds. Besides, the digitalization stage in hardware devices is represented too. Attackers and the most critical sparse sensor attacks are described through a stochastic process. The reconstruction and protection mechanisms are based on a weighted stochastic model. The error probability in data samples is estimated through different indicators commonly employed in non-linear dynamics (such as the Fourier transform, first-return maps, or the probability density function). A decision algorithm calculates the final reconstructed value considering this error probability. An experimental validation based on simulation tools and real deployments is also carried out. Both the performance and the scalability of the new technology are studied. Results prove that the proposed solution protects Cyber-Physical Systems against up to 92% of attacks and perturbations, with a computational delay below 2.5 s. The proposed model shows a linear complexity, as recursive or iterative structures are not employed, just algebraic and probabilistic functions. In conclusion, the new model and reconstruction mechanism can successfully protect Cyber-Physical Systems against sparse sensor attacks, even in dense or pervasive deployments and scenarios.
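The sketch below illustrates the general idea of a weighted reconstruction driven by a per-sample error probability; the error-probability estimator and the process model used here are simplified placeholders, not the paper's indicators.

```python
# Sketch of probability-weighted reconstruction: the more a sample looks wrong,
# the more weight the model prediction receives over the raw reading.
import numpy as np

def error_probability(sample, history):
    """Crude indicator: how far the sample sits from the empirical distribution."""
    mu, sigma = np.mean(history), np.std(history) + 1e-9
    z = abs(sample - mu) / sigma
    return min(1.0, z / 4.0)          # map |z| in [0, 4] onto a pseudo-probability

def reconstruct(sample, prediction, history):
    """Weighted reconstruction of the final value from sample and prediction."""
    p_err = error_probability(sample, history)
    return (1.0 - p_err) * sample + p_err * prediction, p_err

history = np.sin(np.linspace(0, 6, 60))          # benign process history (placeholder)
value, p = reconstruct(sample=7.5, prediction=np.sin(6.1), history=history)
print("error probability %.2f, reconstructed value %.3f" % (p, value))
```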
Funding: This research work was funded by Institutional Fund Projects under grant no. (IFPIP: 333-611-1443). Therefore, the authors gratefully acknowledge the technical and financial support provided by the Ministry of Education and the Deanship of Scientific Research (DSR), King Abdulaziz University (KAU), Jeddah, Saudi Arabia.
Abstract: The Internet of Things (IoT) is one of the emerging technologies with advanced developments in several applications, such as creating smart environments and enabling Industry 4.0. As IoT devices operate via an inbuilt and limited power supply, the effective utilization of available energy plays a vital role in designing the IoT environment. At the same time, the communication of IoT devices over wireless media poses security as a challenging issue. Recently, intrusion detection systems (IDS) have paved the way to detect the presence of intrusions in the IoT environment. With this motivation, this article introduces a novel Quantum Cat Swarm Optimization-Based Clustering with Intrusion Detection Technique (QCSOBC-IDT) for the IoT environment. The QCSOBC-IDT model aims to achieve energy efficiency by clustering the nodes and security by intrusion detection. Primarily, the QCSOBC-IDT model presents a new QCSO algorithm for effectively choosing cluster heads (CHs) and organizing a set of clusters in the IoT environment. Besides, the QCSO algorithm computes a fitness function involving four parameters, namely energy efficiency, inter-cluster distance, intra-cluster distance, and node density. A harmony search algorithm (HSA) with a cascaded recurrent neural network (CRNN) model is used for an effective intrusion detection process. The design of the HSA assists in the optimal selection of hyperparameters related to the CRNN model. A detailed experimental analysis of the QCSOBC-IDT model ensured its promising efficiency compared to existing models.
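A minimal sketch of a four-term cluster-head fitness function of the kind described above; only the combination of energy efficiency, inter-cluster distance, intra-cluster distance, and node density follows the description, while the weights and normalizations are assumptions.

```python
# Illustrative cluster-head fitness combining the four stated parameters.
def ch_fitness(residual_energy, initial_energy,
               inter_cluster_dist, intra_cluster_dist, node_density,
               weights=(0.35, 0.25, 0.25, 0.15)):
    a, b, c, d = weights
    energy_term = residual_energy / initial_energy   # higher residual energy is better
    inter_term = inter_cluster_dist                  # spread cluster heads apart
    intra_term = 1.0 / (1.0 + intra_cluster_dist)    # keep cluster members close
    density_term = node_density                      # prefer well-covered candidates
    return a * energy_term + b * inter_term + c * intra_term + d * density_term

# Two hypothetical candidate cluster heads (inputs normalised to [0, 1]).
print(ch_fitness(0.8, 1.0, 0.7, 0.2, 0.6))
print(ch_fitness(0.5, 1.0, 0.9, 0.5, 0.4))
```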
Abstract: Acute Lymphoblastic Leukemia (ALL) is a fatal malignancy that is characterized by the abnormal increase of immature lymphocytes in blood or bone marrow. Early prognosis of ALL is indispensable for the effective remediation of this disease. Initial screening for ALL is conducted through manual examination of stained blood smear microscopic images, a process that is time-consuming and prone to errors. Therefore, many deep learning-based computer-aided diagnosis (CAD) systems have been established to automatically diagnose ALL. This paper proposes a novel hybrid deep learning system for ALL diagnosis in blood smear images. The introduced system integrates the proficiency of autoencoder networks in representational feature learning in latent space with the superior feature extraction capability of standard pretrained convolutional neural networks (CNNs) to identify the existence of ALL in blood smears. An augmented set of deep image features is formed from the features extracted by the GoogleNet and Inception-v3 CNNs from a hybrid dataset of microscopic blood smear images. A sparse autoencoder network is designed to create an abstract set of significant latent features from the enlarged image feature set. The latent features are used to perform image classification using a Support Vector Machine (SVM) classifier. The obtained results show that the latent features improve the classification performance of the proposed ALL diagnosis system over the original image features. Moreover, the classification performance of the system with various sizes of the latent feature set is evaluated. The retrieved results reveal that the introduced ALL diagnosis system compares favorably with the state of the art.
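The sketch below outlines the described feature pipeline with the pretrained-CNN features assumed to be precomputed: a sparse autoencoder (here a Keras model with an l1 activity penalty) compresses them, and an SVM classifies the latent codes. The feature dimension, layer sizes, penalty weight, and random stand-in data are assumptions.

```python
# Sketch: precomputed deep features -> sparse autoencoder latent codes -> SVM.
import numpy as np
from tensorflow.keras import Input, Model, layers, regularizers
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2048)).astype("float32")   # stand-in for GoogleNet + Inception-v3 features
y = rng.integers(0, 2, size=200)                     # toy ALL / normal labels

inp = Input(shape=(2048,))
latent = layers.Dense(128, activation="relu",
                      activity_regularizer=regularizers.l1(1e-4))(inp)  # sparsity penalty
out = layers.Dense(2048, activation="linear")(latent)

autoencoder = Model(inp, out)
encoder = Model(inp, latent)                         # exposes the latent features
autoencoder.compile(optimizer="adam", loss="mse")
autoencoder.fit(X, X, epochs=5, batch_size=32, verbose=0)

clf = SVC(kernel="rbf").fit(encoder.predict(X, verbose=0), y)
print("training accuracy:", clf.score(encoder.predict(X, verbose=0), y))
```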