Abstract: The application of deep learning techniques in the medical field, specifically for Atrial Fibrillation (AFib) detection through Electrocardiogram (ECG) signals, has attracted significant interest. Accurate and timely diagnosis increases the patient's chances of recovery. However, issues such as overfitting and inconsistent accuracy across datasets remain challenges. To address these challenges, this study evaluates two prominent deep learning architectures, ResNet-50 and DenseNet-121, for their effectiveness in AFib detection. The aim was to create a robust detection mechanism that performs consistently well. Metrics such as loss, accuracy, precision, sensitivity, and Area Under the Curve (AUC) were used for evaluation. The findings revealed that ResNet-50 surpassed DenseNet-121 in all evaluated categories: it demonstrated lower loss (0.0315 and 0.0305), higher accuracy (98.77% and 98.88%), precision (98.78% and 98.89%), and sensitivity (98.76% and 98.86%) for training and validation, respectively, indicating its stronger capability for AFib detection. These insights contribute to the existing literature on deep learning applications for AFib detection from ECG signals, and the comparative performance data can assist future researchers in selecting suitable deep-learning architectures. Moreover, the outcomes of this study are expected to stimulate the development of more advanced and efficient ECG-based AFib detection methods for more accurate and earlier detection of AFib, thereby fostering improved patient care and outcomes.
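The abstract does not state which framework or ECG input representation was used, so the following is only a minimal sketch of how the two backbones could be compared behind an identical training interface. It assumes PyTorch, ECG segments preprocessed into 3-channel 224x224 images (e.g., spectrograms), and binary AFib/non-AFib labels; none of these choices are confirmed by the study.

```python
# Sketch: comparing ResNet-50 and DenseNet-121 for binary AFib detection.
# Assumptions (not stated in the abstract): PyTorch, image-like ECG inputs
# of shape (B, 3, 224, 224), and binary cross-entropy on AFib vs. non-AFib.
import torch
import torch.nn as nn
from torchvision import models


def build_backbone(name: str) -> nn.Module:
    """Return a backbone with a single-logit AFib classification head."""
    if name == "resnet50":
        net = models.resnet50(weights=None)  # pretrained weights could be used instead
        net.fc = nn.Linear(net.fc.in_features, 1)
    elif name == "densenet121":
        net = models.densenet121(weights=None)
        net.classifier = nn.Linear(net.classifier.in_features, 1)
    else:
        raise ValueError(f"unknown backbone: {name}")
    return net


def train_one_epoch(model, loader, optimizer, device="cpu"):
    """One epoch of binary-cross-entropy training; returns the mean loss."""
    criterion = nn.BCEWithLogitsLoss()
    model.train()
    total, n = 0.0, 0
    for x, y in loader:                       # x: (B, 3, 224, 224), y: (B,)
        x, y = x.to(device), y.float().to(device)
        optimizer.zero_grad()
        loss = criterion(model(x).squeeze(1), y)
        loss.backward()
        optimizer.step()
        total += loss.item() * x.size(0)
        n += x.size(0)
    return total / max(n, 1)


# Both architectures share the same loop, so loss, accuracy, precision,
# sensitivity, and AUC can be logged identically for a side-by-side comparison.
for name in ("resnet50", "densenet121"):
    model = build_backbone(name)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    # train_one_epoch(model, train_loader, optimizer)  # train_loader: user-supplied ECG data
```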
Abstract: Automation has recently become vital in most fields, since computing methods play a significant role in facilitating work such as automatic text summarization. Most of the computing methods used in real systems are based on graph models, which are characterized by their simplicity and stability. This paper therefore proposes an improved extractive text summarization algorithm based on both topic and graph models. The methodology consists of two stages. First, the well-known TextRank algorithm is analyzed and its shortcomings are investigated. Then, an improved method is proposed with a new computational model of sentence weights. Experiments were carried out on the standard DUC2004 and DUC2006 datasets, and the proposed improved graph-model algorithm, TG-SMR (Topic Graph-Summarizer), was compared against four other text summarization systems. The experimental results show that TG-SMR achieves higher ROUGE scores. It is foreseen that the TG-SMR algorithm will open a new horizon for performance as measured by the ROUGE evaluation indicators.
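The abstract does not give TG-SMR's actual sentence-weight formula, so the sketch below only illustrates the general idea it builds on: TextRank-style ranking over a sentence-similarity graph, with a topic term blended into the final sentence score. The topic term (similarity to the document centroid) and the blend weight `alpha` are illustrative assumptions, not the paper's method.

```python
# Sketch: TextRank-style extractive summarization with a topic-weighted
# sentence score. Assumptions: TF-IDF cosine similarity defines graph edges,
# and the "topic" term is similarity to the document centroid; the real
# TG-SMR weighting is not specified in the abstract.
import networkx as nx
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity


def summarize(sentences, num_sentences=3, alpha=0.7):
    """Return the top-ranked sentences in their original order."""
    tfidf = TfidfVectorizer().fit_transform(sentences)        # (n, vocab)
    sim = cosine_similarity(tfidf)                             # sentence-graph weights
    np.fill_diagonal(sim, 0.0)

    # Graph component: PageRank over the similarity graph (classic TextRank).
    graph_scores = nx.pagerank(nx.from_numpy_array(sim))

    # Topic component (illustrative): similarity to the document centroid.
    centroid = np.asarray(tfidf.mean(axis=0))
    topic_scores = cosine_similarity(tfidf, centroid).ravel()

    # Blend graph and topic evidence into a single sentence weight.
    final = {i: alpha * graph_scores[i] + (1 - alpha) * topic_scores[i]
             for i in range(len(sentences))}
    top = sorted(final, key=final.get, reverse=True)[:num_sentences]
    return [sentences[i] for i in sorted(top)]


doc = ["Graph models are widely used for extractive summarization.",
       "TextRank ranks sentences by similarity-graph centrality.",
       "Topic information can refine sentence weights.",
       "ROUGE compares system summaries against references."]
print(summarize(doc, num_sentences=2))
```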
Abstract: In this paper, we consider the problem of minimizing the total tardiness in a deterministic two-machine permutation flowshop scheduling problem subject to job release dates and known machine unavailability periods. The theoretical and practical importance of minimizing tardiness in flowshop scheduling environments motivated us to investigate and solve this two-machine scheduling problem. Methods that address this optimality criterion in flowshop environments are mainly heuristics. In fact, despite the strong NP-hardness of the studied problem, to the best of our knowledge no approximate algorithms (constructive heuristics or metaheuristics), nor any algorithm with worst-case behavior bounds, have been proposed to solve it. Thus, the design of new promising algorithms is desirable. We develop five metaheuristics for the problem under consideration: Particle Swarm Optimization (PSO), Differential Evolution (DE), the Genetic Algorithm (GA), Ant Colony Optimization (ACO), and the Imperialist Competitive Algorithm (ICA). All the proposed metaheuristics are population-based approaches, and they are improved by integrating different local search procedures in order to provide more satisfactory solutions, especially in terms of quality. Computational experiments carried out on a large set of randomly generated instances provide evidence that the Imperialist Competitive Algorithm (ICA) records the best performance.
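The abstract leaves the exact scheduling semantics unspecified, so the sketch below only illustrates the core evaluation such metaheuristics rely on: computing total tardiness of a permutation in a two-machine flowshop with release dates and machine unavailability windows, plus a simple swap-based local search of the kind embedded in the proposed algorithms. The assumption that operations are non-preemptive and are pushed to start after any overlapping unavailability window is one common convention and may differ from the paper's.

```python
# Sketch: objective evaluation and a basic local-search step for the
# two-machine permutation flowshop with release dates, unavailability
# periods, and total-tardiness objective. Assumptions: non-preemptive
# operations that cannot overlap an unavailability window (they are
# shifted to start after it); the paper's exact semantics may differ.
import random


def earliest_start(ready, duration, unavailability):
    """Shift a start time forward until [start, start+duration) avoids all windows."""
    start = ready
    for (u_start, u_end) in sorted(unavailability):
        if start < u_end and start + duration > u_start:   # overlaps this window
            start = u_end                                    # start after it instead
    return start


def total_tardiness(perm, p1, p2, release, due, unavail1=(), unavail2=()):
    """Total tardiness of permutation `perm` on machines M1 then M2."""
    t1 = t2 = 0.0
    tardiness = 0.0
    for j in perm:
        s1 = earliest_start(max(t1, release[j]), p1[j], unavail1)
        t1 = s1 + p1[j]                                      # completion on M1
        s2 = earliest_start(max(t2, t1), p2[j], unavail2)
        t2 = s2 + p2[j]                                      # completion on M2
        tardiness += max(0.0, t2 - due[j])
    return tardiness


def swap_local_search(perm, evaluate, iters=200):
    """Pairwise-swap improvement, the kind of local search embedded in the metaheuristics."""
    best, best_val = list(perm), evaluate(perm)
    for _ in range(iters):
        i, k = random.sample(range(len(best)), 2)
        cand = list(best)
        cand[i], cand[k] = cand[k], cand[i]
        val = evaluate(cand)
        if val < best_val:
            best, best_val = cand, val
    return best, best_val


# Tiny illustrative instance (invented data): 4 jobs, one unavailability window on M1.
p1, p2 = [3, 2, 4, 1], [2, 3, 1, 2]
release, due = [0, 1, 0, 2], [6, 7, 9, 8]
ev = lambda perm: total_tardiness(perm, p1, p2, release, due, unavail1=[(4, 6)])
print(swap_local_search([0, 1, 2, 3], ev))
```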