This article delves into the analysis of performance and utilization of Support Vector Machines (SVMs) for the critical task of forest fire detection using image datasets. With the increasing threat of forest fires to ecosystems and human settlements, the need for rapid and accurate detection systems is of utmost importance. SVMs, renowned for their strong classification capabilities, exhibit proficiency in recognizing patterns associated with fire within images. By training on labeled data, SVMs acquire the ability to identify distinctive attributes associated with fire, such as flames, smoke, or alterations in the visual characteristics of the forest area. The document thoroughly examines the use of SVMs, covering crucial elements like data preprocessing, feature extraction, and model training. It rigorously evaluates parameters such as accuracy, efficiency, and practical applicability. The knowledge gained from this study aids in the development of efficient forest fire detection systems, enabling prompt responses and improving disaster management. Moreover, the correlation between SVM accuracy and the difficulties presented by high-dimensional datasets is carefully investigated, demonstrated through a revealing case study. The relationship between accuracy scores and the different resolutions used for resizing the training datasets has also been discussed in this article. These comprehensive studies result in a definitive overview of the difficulties faced and the potential sectors requiring further improvement and focus.
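The pipeline the abstract describes (labeled image features fed to an SVM) can be sketched with a minimal pure-Python linear SVM trained by Pegasos-style subgradient descent on the hinge loss. The two-dimensional "features" and labels below are invented for illustration and are not the study's dataset.

```python
# Minimal linear-SVM sketch for fire/no-fire classification.
# Pegasos-style subgradient descent on the hinge loss; the feature
# values (e.g. mean red intensity, smoke-like texture score) are
# illustrative assumptions, not data from the study.
import random

def train_linear_svm(X, y, lam=0.01, epochs=200, seed=0):
    """Train weights w and bias b on labels y in {-1, +1}."""
    rng = random.Random(seed)
    n, d = len(X), len(X[0])
    w, b = [0.0] * d, 0.0
    t = 0
    for _ in range(epochs):
        for i in rng.sample(range(n), n):
            t += 1
            eta = 1.0 / (lam * t)  # decaying step size
            margin = y[i] * (sum(wj * xj for wj, xj in zip(w, X[i])) + b)
            # subgradient step: shrink w, then correct on margin violations
            w = [wj * (1 - eta * lam) for wj in w]
            if margin < 1:
                w = [wj + eta * y[i] * xj for wj, xj in zip(w, X[i])]
                b += eta * y[i]
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

fire    = [[0.9, 0.8], [0.8, 0.7], [0.95, 0.6]]   # toy fire features
no_fire = [[0.2, 0.1], [0.1, 0.3], [0.3, 0.2]]    # toy non-fire features
X = fire + no_fire
y = [1, 1, 1, -1, -1, -1]
w, b = train_linear_svm(X, y)
print([predict(w, b, x) for x in X])
```

In practice the study would extract far higher-dimensional features from resized images, which is exactly where the high-dimensionality difficulties it investigates arise.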
BACKGROUND Sodium glucose cotransporter-2 inhibitors (SGLT-2i) are a class of drugs with modest antidiabetic efficacy, weight loss effect, and cardiovascular benefits as proven by multiple randomised controlled trials (RCTs). However, real-world data on the comparative efficacy and safety of individual SGLT-2i medications is sparse. AIM To study the comparative efficacy and safety of SGLT-2i using real-world clinical data. METHODS We evaluated the comparative efficacy data of 3 SGLT-2i drugs (dapagliflozin, canagliflozin, and empagliflozin) used for treating patients with type 2 diabetes mellitus. Data on the reduction of glycated hemoglobin (HbA1c), body weight, blood pressure (BP), urine albumin creatinine ratio (ACR), and adverse effects were recorded retrospectively. RESULTS Data from 467 patients with a median age of 64 (14.8) years, 294 (62.96%) males and 375 (80.5%) Caucasians were analysed. Median diabetes duration was 16.0 (9.0) years, and the duration of SGLT-2i use was 3.6 (2.1) years. SGLT-2i molecules used were dapagliflozin 10 mg (n = 227; 48.6%), canagliflozin 300 mg (n = 160; 34.3%), and empagliflozin 25 mg (n = 80; 17.1%). Baseline median (interquartile range) HbA1c values in mmol/mol were: dapagliflozin 78.0 (25.3), canagliflozin 80.0 (25.5), and empagliflozin 75.0 (23.5). The respective median HbA1c values at 12 months and at the latest review (just prior to the study) were: 66.5 (22.8) & 69.0 (24.0), 67.0 (16.3) & 66.0 (28.0), and 67.0 (22.5) & 66.5 (25.8) (P < 0.001 for all comparisons from baseline). Significant improvements in body weight (in kilograms) from baseline to study end were noticed with dapagliflozin, 101 (29.5) to 92.2 (25.6), and canagliflozin, 100 (28.3) to 95.3 (27.5), only. Significant reductions in median systolic and diastolic BP, from 144 (21) mmHg to 139 (23) mmHg (P = 0.015), and from 82 (16) mmHg to 78 (19) mmHg (P < 0.001) respectively, were also observed. A significant reduction of microalbuminuria was observed with canagliflozin only [ACR 14.6 (42.6) at baseline to 8.9 (23.7) at the study end; P = 0.043]. Adverse effects of SGLT-2i were as follows: genital thrush and urinary infection, 20 (8.8%) & 17 (7.5%) with dapagliflozin; 9 (5.6%) & 5 (3.13%) with canagliflozin; and 4 (5%) & 4 (5%) with empagliflozin. Diabetic ketoacidosis was observed in 4 (1.8%) with dapagliflozin and 1 (0.63%) with canagliflozin. CONCLUSION Treatment of patients with SGLT-2i is associated with statistically significant reductions in HbA1c and body weight, better than those reported in RCTs, with low side effect profiles. A review of large-scale real-world data is needed to inform better clinical practice decision making.
Amid the landscape of Cloud Computing (CC), the Cloud Datacenter (DC) stands as a conglomerate of physical servers, whose performance can be hindered by bottlenecks within the realm of proliferating CC services. A linchpin in CC's performance, the Cloud Service Broker (CSB), orchestrates DC selection. Failure to adroitly route user requests to suitable DCs transforms the CSB into a bottleneck, endangering service quality. To tackle this, deploying an efficient CSB policy becomes imperative, optimizing DC selection to meet stringent Quality-of-Service (QoS) demands. Amidst numerous CSB policies, their implementation grapples with challenges like costs and availability. This article undertakes a holistic review of diverse CSB policies, concurrently surveying the predicaments confronted by current policies. The foremost objective is to pinpoint research gaps and remedies to invigorate future policy development. Additionally, it extensively clarifies various DC selection methodologies employed in CC, enriching practitioners and researchers alike. Employing synthetic analysis, the article systematically assesses and compares myriad DC selection techniques. These analytical insights equip decision-makers with a pragmatic framework to discern the apt technique for their needs. In summation, this discourse resoundingly underscores the paramount importance of adept CSB policies in DC selection, highlighting the imperative role of efficient CSB policies in optimizing CC performance. By emphasizing the significance of these policies and their modeling implications, the article contributes to both the general modeling discourse and its practical applications in the CC domain.
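One family of CSB policies surveyed in reviews like this routes each request to the datacenter with the lowest estimated response time. A minimal sketch of such a rule follows; the DC names, base latencies, and the load-inflation formula are illustrative assumptions, not a policy from the article.

```python
# Minimal Cloud Service Broker (CSB) policy sketch: pick the datacenter
# with the lowest load-adjusted response-time estimate. DC names,
# latencies, and loads are invented for illustration.
def select_datacenter(dcs):
    """dcs maps name -> (network_latency_ms, current_load in [0, 1))."""
    def est_response(name):
        latency, load = dcs[name]
        return latency / (1.0 - load)  # inflate latency as load rises
    return min(dcs, key=est_response)

dcs = {
    "dc-eu":   (40.0, 0.50),   # 40 ms base latency, half loaded
    "dc-us":   (90.0, 0.10),
    "dc-asia": (60.0, 0.75),
}
print(select_datacenter(dcs))
```

Real policies weigh additional QoS factors (cost, availability, data locality), which is precisely the design space the review compares.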
Regular physical activity (PA) is known to enhance multifaceted health benefits, including both physical and mental health. However, traditional in-person physical activity programs have drawbacks, including time constraints for busy people. Although evidence suggests positive impacts on mental health through mobile-based physical activity, the effects of accumulated short bouts of physical activity using mobile devices are unexplored. Thus, this study aims to investigate these effects, focusing on depression, perceived stress, and negative affectivity among South Korean college students. Forty-six healthy college students were divided into the accumulated group (n = 23, female = 47.8%) and control group (n = 23, female = 47.6%). The accumulated group engaged in mobile-based physical activity, following guidelines to accumulate a minimum of two sessions per day, three times a week. Sessions were divided into short bouts, ensuring each bout lasted at least 10 min. The control group did not engage in any specific physical activity. The data analysis involved comparing the scores of the intervention and control groups using several statistical techniques, such as independent sample t-tests, paired sample t-tests, and 2 (time) × 2 (group) repeated measures analysis of variance. The demographic characteristics at the pre-test showed no statistically significant differences between the groups. The accumulated group had significant decreases in depression (t(40) = 2.59, p = 0.013, Cohen's d = 0.84) and perceived stress (t(40) = 2.06, p = 0.046, Cohen's d = 0.56) from the pre- to post-test. The control group exhibited no statistically significant differences in any variables. Furthermore, there was a significant effect of time on depression scores (F(1,36) = 4.77, p = 0.036, partial η² = 0.12), while a significant interaction effect was also observed for depression (F(1,36) = 6.59, p = 0.015, partial η² = 0.16). This study offers informative insights into the potential advantages of mobile-based physical activity programs with accumulated periods for enhancing mental health, specifically in relation to depression. This study illuminates the current ongoing discussions on efficient approaches to encourage mobile-based physical activity and improve mental well-being, addressing various lifestyles and busy schedules.
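The pre-to-post comparisons above rest on paired t-tests with an accompanying Cohen's d effect size. A minimal sketch of that computation follows; the pre/post score lists are invented, not the study's data.

```python
# Paired t-test statistic and Cohen's d for pre/post scores, the
# within-group comparison used for depression above. Scores are
# made up for illustration.
import math

def paired_t_and_d(pre, post):
    n = len(pre)
    diffs = [a - b for a, b in zip(pre, post)]
    mean_d = sum(diffs) / n
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)
    sd_d = math.sqrt(var_d)
    t = mean_d / (sd_d / math.sqrt(n))  # t with n-1 degrees of freedom
    cohens_d = mean_d / sd_d            # standardized mean difference
    return t, cohens_d

pre  = [14, 12, 15, 11, 13, 16, 12, 14]   # hypothetical pre-test scores
post = [11, 10, 13, 10, 11, 13, 11, 12]   # hypothetical post-test scores
t, d = paired_t_and_d(pre, post)
print(round(t, 2), round(d, 2))  # → 7.48 2.65
```

The p-value would then be looked up from the t distribution with n-1 degrees of freedom, which is the step a statistics package automates.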
BACKGROUND Diabesity (diabetes as a consequence of obesity) has emerged as a huge healthcare challenge across the globe due to the obesity pandemic. Judicious use of antidiabetic medications including semaglutide is important for optimal management of diabesity, as proven by multiple randomized controlled trials. However, more real-world data is needed to further improve clinical practice. AIM To study the real-world benefits and side effects of using semaglutide to manage patients with diabesity. METHODS We evaluated the efficacy and safety of semaglutide use in managing patients with diabesity in a large academic hospital in the United States. Several parameters were analyzed, including demographic information and data on the improvement of glycated hemoglobin (HbA1c), body weight reduction, and insulin dose adjustments at 6 and 12 months, as well as at the latest follow-up period. The data was obtained from the electronic patient records between January 2019 and May 2023. RESULTS 106 patients (56 males) with type 2 diabetes mellitus (T2DM), mean age 60.8 ± 11.2 years, mean duration of T2DM 12.4 ± 7.2 years and mean semaglutide treatment of 2.6 ± 1.1 years were included. Semaglutide treatment was associated with significant improvement in diabesity outcomes, such as mean weight reduction from a baseline of 110.4 ± 24.6 kg to 99.9 ± 24.9 kg at 12 months and 96.8 ± 22.9 kg at the latest follow-up, and HbA1c improvement from a baseline of 82 ± 21 mmol/mol to 67 ± 20 mmol/mol at 12 months and 71 ± 23 mmol/mol at the latest follow-up. An insulin dose reduction from a mean baseline of 95 ± 74 units to 76.5 ± 56.2 units was also observed at the latest follow-up. Side effects were mild and mainly gastrointestinal, such as bloating and nausea, improving with prolonged use of semaglutide. CONCLUSION Semaglutide treatment is associated with significant improvement in diabesity outcomes, such as reductions in body weight, HbA1c and insulin doses, without major adverse effects. Reviews of large-scale real-world data are expected to inform better clinical practice decision making to improve the care of patients with diabesity.
This paper presents a detailed statistical exploration of crime trends in Chicago from 2001 to 2023, employing data from the Chicago Police Department’s publicly available crime database. The study aims to elucidate the patterns, distribution, and variations in crime across different types and locations, providing a comprehensive picture of the city’s crime landscape through advanced data analytics and visualization techniques. Using exploratory data analysis (EDA), we identified significant insights into crime trends, including the prevalence of theft and battery, the impact of seasonal changes on crime rates, and spatial concentrations of criminal activities. The research leveraged a Power BI dashboard to visually represent crime data, facilitating an intuitive understanding of complex patterns and enabling dynamic interaction with the dataset. Key findings highlight notable disparities in crime occurrences by type, location, and time, offering a granular view of crime hotspots and temporal trends. Additionally, the study examines clearance rates, revealing variations in the resolution of cases across different crime categories. This analysis not only sheds light on the current state of urban safety but also serves as a critical tool for policymakers and law enforcement agencies to develop targeted interventions. The paper concludes with recommendations for enhancing public safety strategies and suggests directions for future research, emphasizing the need for continuous data-driven approaches to effectively address and mitigate urban crime. This study contributes to the broader discourse on urban safety, crime prevention, and the role of data analytics in public policy and community well-being.
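The clearance-rate breakdown by crime category described above reduces to grouped counts. A small sketch on invented records follows; the field names are modeled loosely on the public Chicago crime data (crime type plus an arrest flag) and the values are illustrative.

```python
# EDA sketch: per-type counts and clearance rates (arrest / total),
# the kind of grouped summary the dashboard visualizes. Records are
# invented; fields loosely mirror the public dataset's type/arrest columns.
from collections import Counter

records = [
    {"type": "THEFT",   "arrest": False},
    {"type": "THEFT",   "arrest": True},
    {"type": "THEFT",   "arrest": False},
    {"type": "THEFT",   "arrest": False},
    {"type": "BATTERY", "arrest": True},
    {"type": "BATTERY", "arrest": False},
]

totals  = Counter(r["type"] for r in records)
cleared = Counter(r["type"] for r in records if r["arrest"])
rates = {t: cleared[t] / totals[t] for t in totals}
print(totals.most_common(1)[0][0], rates)
```

On the real dataset the same grouping would additionally be broken down by year, month, and community area to surface the seasonal and spatial patterns the paper reports.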
Customer churn poses a significant challenge for the banking and finance industry in the United States, directly affecting profitability and market share. This study conducts a comprehensive comparative analysis of machine learning models for customer churn prediction, focusing on the U.S. context. The research evaluates the performance of logistic regression, random forest, and neural networks using industry-specific datasets, considering the economic impact and practical implications of the findings. The exploratory data analysis reveals unique patterns and trends in the U.S. banking and finance industry, such as the age distribution of customers and the prevalence of dormant accounts. The study incorporates macroeconomic factors to capture the potential influence of external conditions on customer churn behavior. The findings highlight the importance of leveraging advanced machine learning techniques and comprehensive customer data to develop effective churn prevention strategies in the U.S. context. By accurately predicting customer churn, financial institutions can proactively identify at-risk customers, implement targeted retention strategies, and optimize resource allocation. The study discusses the limitations and potential future improvements, serving as a roadmap for researchers and practitioners to further advance the field of customer churn prediction in the evolving landscape of the U.S. banking and finance industry.
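Logistic regression, the simplest of the three models compared, can be sketched in a few lines of pure Python fit by stochastic gradient descent on the log loss. The features (e.g. a dormancy score and scaled age, echoing the dormant-account and age patterns noted above) and labels are synthetic, not industry data.

```python
# Hedged logistic-regression churn sketch, fit by SGD on the log loss.
# Features [dormancy_scaled, age_scaled] and churn labels are invented.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logreg(X, y, lr=0.5, epochs=2000):
    d = len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi  # gradient of the log loss w.r.t. the logit
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

X = [[0.9, 0.6], [0.8, 0.8], [0.1, 0.3], [0.2, 0.5], [0.7, 0.9], [0.1, 0.2]]
y = [1, 1, 0, 0, 1, 0]  # 1 = churned, 0 = retained
w, b = fit_logreg(X, y)
preds = [1 if sigmoid(sum(wj * xj for wj, xj in zip(w, x)) + b) > 0.5 else 0
         for x in X]
print(preds)
```

The random-forest and neural-network baselines trade this model's interpretability for capacity, which is the comparison axis the study evaluates.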
Mobile Edge Computing (MEC) assists clouds to handle enormous tasks from mobile devices in close proximity. The edge servers are not allocated efficiently according to the dynamic nature of the network. This leads to processing delay, and tasks are dropped due to time limitations. Researchers find it difficult and complex to determine the offloading decision because of uncertain, dynamic load conditions over the edge nodes. The challenge lies in making the offloading decision, i.e., selecting edge nodes for offloading in a centralized manner. This study focuses on minimizing task-processing time while simultaneously increasing the success rate of the service provided by edge servers. Initially, a task-offloading problem is formulated based on the communication and processing. Then the offloading decision problem is solved by deep analysis of task flow in the network and feedback from the devices on edge services. The significance of the model is improved with the modelling of the Deep Mobile-X architecture and bi-directional Long Short Term Memory (b-LSTM). The simulation is done in the EdgeCloudSim environment, and the outcomes show the significance of the proposed idea. The processing time of the anticipated model is 6.6 s. The following performance metrics are evaluated and compared with existing learning approaches: server utilization, the ratio of dropped tasks, and the number of offloaded tasks. The proposed model shows a better trade-off compared to existing approaches.
With the rapid evolution of Internet technology, fog computing has taken a major role in managing large amounts of data. The major concerns in this domain are security and privacy. Therefore, attaining a reliable level of confidentiality in the fog computing environment is a pivotal task. Among different types of data stored in the fog, 3D point and mesh fog data have become increasingly popular in recent days, due to the growth of 3D modelling and 3D printing technologies. Hence, in this research, we propose a novel scheme for preserving the privacy of 3D point and mesh fog data. Chaotic Cat map-based data encryption is a recently trending research area due to its unique properties like pseudo-randomness, deterministic nature, sensitivity to initial conditions, ergodicity, etc. To boost encryption efficiency significantly, in this work, we propose a novel Chaotic Cat map. The sequence generated by this map is used to transform the coordinates of the fog data. The improved range of the proposed map is depicted using bifurcation analysis. The quality of the proposed Chaotic Cat map is also analyzed using metrics like the Lyapunov exponent and approximate entropy. We also demonstrate the performance of the proposed encryption framework against attacks like brute-force and statistical attacks. The experimental results clearly depict that the proposed framework produces the best results compared to previous works in the literature.
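The abstract does not define its novel map, but the classical Arnold cat map it extends shows the core idea: an invertible modular transform of integer coordinates, which is what makes such maps usable for scrambling (and later restoring) point coordinates.

```python
# Classical Arnold cat map on integer coordinates (the paper's novel
# variant is not specified here; this is the textbook map it builds on).
# The map is invertible mod N, so scrambled coordinates can be restored.
def cat_map(x, y, n):
    # matrix [[1, 1], [1, 2]] applied mod n
    return (x + y) % n, (x + 2 * y) % n

def inverse_cat_map(x, y, n):
    # inverse matrix [[2, -1], [-1, 1]] mod n undoes the scrambling
    return (2 * x - y) % n, (-x + y) % n

N = 101                      # illustrative modulus (coordinate range)
points = [(3, 7), (50, 99), (0, 42)]   # invented 2D point coordinates
scrambled = [cat_map(x, y, N) for x, y in points]
restored = [inverse_cat_map(x, y, N) for x, y in scrambled]
print(scrambled, restored == points)
```

Iterating the map many times, and extending it to the third coordinate, is what spreads nearby points apart; the sensitivity and ergodicity properties cited above are measured on exactly such iterated sequences.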
In this article, the rheology of ferrofluid over an axisymmetric heated disc with a variable magnetic field, considering the dispersion of hybrid nanoparticles, is studied. The flow is assumed to be produced by the stretching of a rotating heated disc. The contribution of variable thermophysical properties is taken into account to explore the momentum, mass and thermal transportation. The concept of the boundary layer mechanism is engaged to reduce the complex problem into a simpler one in the form of a coupled partial differential equation system. The complex coupled PDEs are converted into a highly nonlinear coupled ordinary differential equation system (ODEs), and the resulting nonlinear flow problem is handled numerically. The solution is obtained via a finite element procedure (FEP) and convergence is established by conducting a grid-independence survey. The solution of the converted dimensionless problem, containing the fluid velocity, temperature and concentration fields, is plotted against the numerous involved emerging parameters and their impact is noted. From the obtained solution, it is observed that higher values of the magnetic parameter retard the fluid flow, and escalating values of the Eckert number enhance the temperature profile. Ferrofluid flow and heat energy for the Yamada-Ota hybrid model are higher than for the Hamilton-Crosser hybrid model. The developed model is applicable to printing processes, electronic devices, temperature measurements, engineering processes and food-making processes. The amount of mass species is reduced with increasing values of the chemical reaction and Schmidt parameters.
Due to an increase in agricultural mislabeling and careless handling of non-perishable foods in recent years, consumers have been calling for the food sector to be more transparent. Due to information dispersion between divisions and the propensity to record inaccurate data, current traceability solutions typically fail to provide reliable farm-to-fork histories of products. The three most enticing characteristics of blockchain technology are openness, integrity, and traceability, which make it a potentially crucial tool for guaranteeing the integrity and correctness of data. In this paper, we suggest a permissioned blockchain system, run by organizations such as regulatory bodies, to promote the origin-tracking of shelf-stable agricultural products. We propose a four-tiered architecture with parallel side chains, Zero-Knowledge Proofs (ZKPs), and InterPlanetary File Systems (IPFS). These ensure that information about where an item came from is shared, that commercial competitors cannot access it, that large storage problems are handled, and that the system can be scaled to handle many transactions at once. The solution maintains the confidentiality of all transaction flows when provenance data is queried, utilizing smart contracts and a consumer-grade reliance rate. Extensive simulation testing using Ethereum Rinkeby and Polygon demonstrates reduced execution time, latency, and throughput overheads.
Because stress has such a powerful impact on human health, we must be able to identify it automatically in our everyday lives. The human activity recognition (HAR) system uses data from several kinds of sensors to automatically recognize and evaluate human actions. Using the multimodal dataset DEAP (Database for Emotion Analysis using Physiological Signals), this paper presents a deep learning (DL) technique for effectively detecting human stress. Combining vision-based and sensor-based approaches for recognizing human stress will help us improve the efficiency of current stress recognition systems and predict probable actions in advance, before they become fatal. Based on visual and EEG (Electroencephalogram) data, this research aims to enhance the performance and extract the dominating characteristics of stress detection. For the stress identification test, we utilized the DEAP dataset, which included video and EEG data. We also demonstrate that combining video and EEG characteristics may increase overall performance, with the suggested stochastic features providing the most accurate results. In the first step, a CNN (Convolutional Neural Network) extracts feature vectors from video frames and EEG data. Feature Level (FL) fusion then combines the features extracted from the video and EEG data. We use XGBoost as our classifier model to predict stress. The stress recognition accuracy of the proposed method is compared to the existing methods of Decision Tree (DT), Random Forest (RF), AdaBoost, Linear Discriminant Analysis (LDA), and K-Nearest Neighborhood (KNN). When we compared our technique to existing state-of-the-art approaches, we found that the suggested DL methodology combining multimodal and heterogeneous inputs may improve stress identification.
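Feature-level fusion as described above amounts to concatenating the per-modality vectors into one vector before classification. In the sketch below, a simple nearest-centroid classifier stands in for the CNN-plus-XGBoost stack, and all feature values and labels are invented.

```python
# Feature-level (FL) fusion sketch: per-sample video and EEG feature
# vectors are concatenated before classification. A nearest-centroid
# classifier stands in for CNN + XGBoost; all values are invented.
import math

def fuse(video_feats, eeg_feats):
    # concatenate the two modality vectors for each sample
    return [v + e for v, e in zip(video_feats, eeg_feats)]

def centroid(rows):
    return [sum(col) / len(rows) for col in zip(*rows)]

def classify(x, centroids):
    return min(centroids, key=lambda c: math.dist(x, centroids[c]))

video = [[0.9, 0.1], [0.8, 0.2], [0.1, 0.9], [0.2, 0.8]]  # toy CNN video features
eeg   = [[0.7, 0.3], [0.6, 0.2], [0.2, 0.7], [0.3, 0.9]]  # toy CNN EEG features
labels = ["stress", "stress", "calm", "calm"]

fused = fuse(video, eeg)  # 4-dimensional fused vectors
cents = {lab: centroid([f for f, l in zip(fused, labels) if l == lab])
         for lab in set(labels)}
print([classify(f, cents) for f in fused])
```

The fusion step is the part the paper credits for the performance gain; the downstream classifier is interchangeable.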
As the COVID-19 pandemic swept the globe, social media platforms became an essential source of information and communication for many. International students, particularly, turned to Twitter to express their struggles and hardships during this difficult time. To better understand the sentiments and experiences of these international students, we developed the Situational Aspect-Based Annotation and Classification (SABAC) text mining framework. This framework uses a three-layer approach, combining baseline Deep Learning (DL) models with Machine Learning (ML) models as meta-classifiers to accurately predict the sentiments and aspects expressed in tweets from our collected Student-COVID-19 dataset. Using the proposed aspect2class annotation algorithm, we labeled bulk unlabeled tweets according to their contained aspect terms. However, we also recognized the challenges of reducing the data's high dimensionality and sparsity to improve performance and annotation on unlabeled datasets. To address this issue, we proposed the Volatile Stopwords Filtering (VSF) technique to reduce sparsity and enhance classifier performance. The resulting Student-COVID Twitter dataset achieved an accuracy of 93.21% when using random forest as a meta-classifier. Through testing on three benchmark datasets, we found that the SABAC ensemble framework performed exceptionally well. Our findings showed that international students during the pandemic faced various issues, including stress, uncertainty, health concerns, financial stress, and difficulties with online classes and returning to school. By analyzing and summarizing these annotated tweets, decision-makers can better understand and address the real-time problems international students face during the ongoing pandemic.
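The VSF algorithm itself is not detailed in the abstract, but a generic document-frequency filter of the same flavor (dropping tokens too rare to generalize, which is what drives sparsity) can be sketched as follows; the tweets are invented.

```python
# Sparsity-reduction sketch in the spirit of the stopword filtering
# described above: drop tokens appearing in fewer than `min_df`
# documents. This is a generic document-frequency filter, not the
# paper's exact VSF algorithm; the tweets are invented.
from collections import Counter

docs = [
    "online classes are so stressful",
    "stressful semester and financial stress",
    "health concerns about returning to school",
    "online classes and financial stress again",
]

def filter_rare_tokens(docs, min_df=2):
    df = Counter()
    for d in docs:
        df.update(set(d.split()))  # document frequency, not raw counts
    keep = {t for t, c in df.items() if c >= min_df}
    return [[t for t in d.split() if t in keep] for d in docs]

filtered = filter_rare_tokens(docs)
print(filtered[0])
```

Removing rare tokens shrinks the vocabulary and hence the feature dimensionality seen by the meta-classifiers, which is the stated goal of the technique.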
Free-space optical (FSO) communication is of supreme importance for designing next-generation networks. Over the past decades, the radio frequency (RF) spectrum has been the main topic of interest for wireless technology. The RF spectrum is becoming denser and more heavily employed, making its availability tough for additional channels. Optical communication, exploited for messages or indications in historical times, is now becoming famous and useful in combination with error-correcting codes (ECC) to mitigate the effects of fading caused by atmospheric turbulence. We consider a free-space communication system (FSCS) in which the hybrid technology is based on FSO and RF. The FSCS is a capable solution to overcome the downsides of current schemes and enhance overall link reliability and availability. The proposed FSCS with regular low-density parity-check (LDPC) coding is described and evaluated in terms of signal-to-noise ratio (SNR) in this paper. The extrinsic information transfer (EXIT) methodology is a powerful technique employed to investigate the sum-product decoding algorithm of LDPC codes and optimize the EXIT chart by applying curve fitting. In this research work, we also analyze the behavior of the EXIT chart of regular/irregular LDPC codes for the FSCS. We also investigate the error performance of LDPC codes for the proposed FSCS.
As the demand for used books has grown in recent years, various online/offline market platforms have emerged to support the trade in used books. The price of a used book can depend on various factors, such as its state of preservation (i.e., condition), the value of possession, and so on. Therefore, some online platforms provide a reference document to evaluate the condition of used books, but it is still not trivial for individual sellers to determine the price. The lack of a standard quantitative method to assess the condition of a used book would confuse both sellers and consumers, thereby decreasing the user experience of the online secondhand marketplace. Therefore, this paper discusses the automatic examination of the condition of used books based on deep learning approaches. In this work, we present a book damage detection system based on various You Only Look Once (YOLO) object detection models. Using YOLOv5, YOLOR, and YOLOX, we also introduce various training configurations that can be applied to improve performance. Specifically, a combination of different augmentation strategies including flip, rotation, crop, mosaic, and mixup was used for comparison. To train and validate our system, a book damage dataset composed of a total of 620 book images with 3,989 annotations, containing six types of damage (i.e., Wear, Spot, Notch, Barcode, Tag, and Ripped), collected from library books is presented. We evaluated each model trained with different configurations to figure out their detection accuracy as well as training efficiency. The experimental results showed that YOLOX trained with its best training configuration yielded the best performance in terms of detection accuracy, achieving 60.0% (mAP@.5:.95) and 72.9% (mAP@.5) for book damage detection. However, YOLOX performed worst in terms of training efficiency, indicating that there is a trade-off between accuracy and efficiency. Based on the findings from the study, we discuss the feasibility and limitations of our system and future research directions.
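The mAP figures above are built on intersection-over-union (IoU) matching between predicted and ground-truth boxes: a prediction counts as correct at mAP@.5 only if its IoU with a ground-truth box is at least 0.5. The core computation, on made-up box coordinates, is:

```python
# Intersection-over-Union (IoU), the box-matching criterion behind the
# mAP@.5 and mAP@.5:.95 scores reported above. Boxes are (x1, y1, x2, y2);
# the coordinates here are invented.
def iou(a, b):
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)  # overlap area
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)       # overlap / union

pred  = (10, 10, 50, 50)   # hypothetical predicted "Spot" damage box
truth = (20, 20, 60, 60)   # hypothetical annotated ground-truth box
score = iou(pred, truth)
print(round(score, 3), score >= 0.5)  # matched at the mAP@.5 threshold?
```

mAP@.5:.95 averages this matching over thresholds from 0.5 to 0.95, which is why it is always the stricter of the two numbers.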
In the current era of information technology, students need to learn modern programming languages efficiently. The art of teaching and learning programming requires many logical and conceptual skills, so it is a challenging task for instructors and learners to teach and learn these programming languages effectively and efficiently. Mind mapping is a useful visual tool for establishing ideas and connecting them to solve problems. This research proposes an effective way to teach programming languages through visual tools. This experimental study uses a mind mapping tool to teach two programming environments: text-based programming and blocks-based programming. We performed the experiments with one hundred and sixty undergraduate students from two public sector universities in the Asia Pacific region. Four different instructional approaches, including block-based language (BBL), text-based language (TBL), mind mapping with text-based language (MMTBL) and mind mapping with block-based language (MMBBL), are used for this purpose. The results show that instructional approaches using a mind mapping tool to help students solve given tasks requiring critical thinking are more effective than the other instructional techniques.
The extensive utilization of the Internet in everyday life can be attributed to the substantial accessibility of online services and the growing significance of the data transmitted via the Internet. Regrettably, this development has expanded the potential targets that hackers might exploit. Without adequate safeguards, data transmitted on the Internet is significantly more susceptible to unauthorized access, theft, or alteration. The identification of unauthorised access attempts is a critical component of cybersecurity, as it aids in the detection and prevention of malicious attacks. This research paper introduces a novel intrusion detection framework that utilizes Recurrent Neural Networks (RNN) integrated with Long Short-Term Memory (LSTM) units. The proposed model can identify various types of cyberattacks, including conventional and distinctive forms. Recurrent networks, unlike feedforward neural networks, possess an intrinsic memory component. RNNs incorporating LSTM mechanisms have demonstrated greater capabilities in retaining and utilizing data dependencies over extended periods. Metrics such as data types, training duration, accuracy, number of false positives, and number of false negatives are among the parameters employed to assess the effectiveness of these models in identifying both common and unusual cyberattacks. RNNs are utilised in conjunction with LSTM to support human analysts in identifying possible intrusion events, hence enhancing their decision-making capabilities. A potential solution to address the limitations of shallow learning is the introduction of the Eccentric Intrusion Detection Model, which utilises Recurrent Neural Networks, specifically exploiting LSTM techniques. The proposed model achieves a detection accuracy of 99.5%, generalisation of 99%, and a false-positive rate of 0.72%; these findings reveal that it is superior to state-of-the-art techniques.
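The memory mechanism the abstract credits LSTM with lives in the gated cell state. The following is a minimal numpy sketch of a single LSTM step, not the authors' model; the weight shapes and initialisation are illustrative assumptions:

```python
import numpy as np

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM step: gates decide what the cell state keeps, drops, and emits."""
    z = W @ x + U @ h_prev + b            # joint pre-activation for all four gates
    H = h_prev.size
    f = 1 / (1 + np.exp(-z[0:H]))         # forget gate
    i = 1 / (1 + np.exp(-z[H:2*H]))       # input gate
    o = 1 / (1 + np.exp(-z[2*H:3*H]))     # output gate
    g = np.tanh(z[3*H:4*H])               # candidate cell update
    c = f * c_prev + i * g                # new cell state (the long-term memory)
    h = o * np.tanh(c)                    # new hidden state
    return h, c

rng = np.random.default_rng(0)
D, H = 8, 4                               # toy feature size and hidden size
W = rng.normal(0, 0.1, (4 * H, D))
U = rng.normal(0, 0.1, (4 * H, H))
b = np.zeros(4 * H)

h, c = np.zeros(H), np.zeros(H)
for t in range(20):                       # run over a sequence of 20 feature vectors
    h, c = lstm_step(rng.normal(size=D), h, c, W, U, b)
print(h.shape, c.shape)                   # (4,) (4,)
```

Because the forget gate multiplies rather than overwrites `c_prev`, information from early packets in a flow can persist across many steps, which is the property exploited for long-range intrusion patterns.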
Neuroimaging has emerged over the last few decades as a crucial tool in diagnosing Alzheimer's disease (AD). Mild cognitive impairment (MCI) is a condition that falls on the spectrum between normal cognitive function and AD. However, previous studies have mainly used handcrafted features to classify MCI, AD, and normal control (NC) individuals. This paper focuses on using gray matter (GM) scans obtained through magnetic resonance imaging (MRI) for the diagnosis of individuals with MCI, AD, and NC. To improve classification performance, we developed two transfer learning strategies with data augmentation (i.e., shear range, rotation, zoom range, channel shift). The first approach is a deep Siamese network (DSN), and the second involves a cross-domain strategy with a customized VGG-16. We performed experiments on the Alzheimer's Disease Neuroimaging Initiative (ADNI) dataset to evaluate the performance of our proposed models. Our experimental results demonstrate superior performance on the three binary classification tasks, NC vs. AD, NC vs. MCI, and MCI vs. AD, with classification accuracies of 97.68%, 94.25%, and 92.18%, respectively. Our findings provide promising results for future research and clinical applications in the early detection and diagnosis of AD.
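Of the augmentations listed, channel shift is the least standard; a common reading is a random per-channel intensity offset followed by clipping. A minimal numpy sketch under that assumption (the intensity range and clipping bounds are illustrative, not the paper's settings):

```python
import numpy as np

def channel_shift(img, intensity, rng):
    """Shift each channel by a random offset in [-intensity, intensity], then clip."""
    offsets = rng.uniform(-intensity, intensity, size=img.shape[-1])
    return np.clip(img + offsets, 0.0, 1.0)

rng = np.random.default_rng(42)
img = rng.random((64, 64, 3))          # toy image scaled to [0, 1]
aug = channel_shift(img, 0.1, rng)
print(aug.shape)                       # (64, 64, 3)
```

Each call produces a slightly different intensity profile, which is how augmentation multiplies the effective size of a small labeled MRI dataset.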
This paper proposes a blockchain-based system as a secure, efficient, and cost-effective alternative to SWIFT for cross-border remittances. The current SWIFT system faces challenges, including slow settlement times, high transaction costs, and vulnerability to fraud. Leveraging blockchain technology's decentralized, transparent, and immutable nature, the proposed system aims to address these limitations. Key features include a modular architecture, implementation of microservices, and advanced cryptographic protocols. The system incorporates Proof of Stake consensus with BLS signatures, smart contract execution with dynamic pricing, and a decentralized oracle network for currency conversion. A sophisticated risk-based authentication system utilizes Bayesian networks and machine learning for enhanced security. Mathematical models are presented for critical components, including transaction validation, currency conversion, and regulatory compliance. Simulations demonstrate potential improvements in transaction speed and costs. However, challenges such as regulatory hurdles, user adoption, scalability, and integration with legacy systems must be addressed. The paper provides a comparative analysis between the proposed blockchain system and SWIFT, highlighting advantages in transaction speed, costs, and security. Mitigation strategies are proposed for key challenges. Recommendations are made for further research into scaling solutions, regulatory frameworks, and user-centric designs. The adoption of blockchain-based remittances could significantly impact the financial sector, potentially disrupting traditional models and promoting financial inclusion in underserved markets.
However, successful implementation will require collaboration between blockchain innovators, financial institutions, and regulators to create an enabling environment for this transformative system.
Deep learning has recently become a viable approach for classifying Alzheimer's disease (AD) in medical imaging. However, existing models struggle to efficiently extract features from medical images and may squander additional information resources for illness classification. To address these issues, a deep three-dimensional convolutional neural network incorporating multi-task learning and attention mechanisms is proposed. An upgraded primary C3D network is utilised to create rougher low-level feature maps. It introduces a new convolution block that focuses on the structural aspects of the magnetic resonance imaging image, and another block that extracts attention weights unique to certain pixel positions in the feature map and multiplies them with the feature map output. Then, several fully connected layers are used to achieve multi-task learning, generating three outputs, including the primary classification task. The other two outputs employ backpropagation during training to improve the primary classification task. Experimental findings show that the proposed method outperforms current approaches for classifying AD, achieving enhanced classification accuracy and other indicators on the Alzheimer's Disease Neuroimaging Initiative dataset, demonstrating promise for future disease classification studies.
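The "extract attention weights per pixel position and multiply them with the feature map" step can be illustrated in two dimensions. This is a simplified 2D analogue, not the paper's 3D block; the channel-mean squeeze and sigmoid are illustrative assumptions:

```python
import numpy as np

def spatial_attention(feature_map):
    """Per-pixel attention: squeeze channels to one summary value via the mean,
    map it through a sigmoid to (0, 1), and rescale the feature map with it."""
    squeeze = feature_map.mean(axis=-1, keepdims=True)   # (H, W, 1) summary
    weights = 1 / (1 + np.exp(-squeeze))                 # sigmoid attention map
    return feature_map * weights, weights                # broadcast over channels

rng = np.random.default_rng(1)
fmap = rng.normal(size=(8, 8, 16))        # toy low-level feature map
out, w = spatial_attention(fmap)
print(out.shape, w.shape)                 # (8, 8, 16) (8, 8, 1)
```

Positions with stronger average activation get weights nearer 1, so their features dominate the downstream fully connected layers.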
Abstract: This article delves into the analysis of performance and utilization of Support Vector Machines (SVMs) for the critical task of forest fire detection using image datasets. With the increasing threat of forest fires to ecosystems and human settlements, the need for rapid and accurate detection systems is of utmost importance. SVMs, renowned for their strong classification capabilities, exhibit proficiency in recognizing patterns associated with fire within images. By training on labeled data, SVMs acquire the ability to identify distinctive attributes associated with fire, such as flames, smoke, or alterations in the visual characteristics of the forest area. The document thoroughly examines the use of SVMs, covering crucial elements like data preprocessing, feature extraction, and model training. It rigorously evaluates parameters such as accuracy, efficiency, and practical applicability. The knowledge gained from this study aids in the development of efficient forest fire detection systems, enabling prompt responses and improving disaster management. Moreover, the correlation between SVM accuracy and the difficulties presented by high-dimensional datasets is carefully investigated, demonstrated through a revealing case study. The relationship between accuracy scores and the different resolutions used for resizing the training datasets has also been discussed in this article. These comprehensive studies result in a definitive overview of the difficulties faced and the potential sectors requiring further improvement and focus.
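The resize resolution directly sets the SVM's input dimensionality, which is where the high-dimensionality difficulties come from. A minimal numpy sketch of that relationship, using block-mean downsampling as a stand-in for the unspecified resizing method:

```python
import numpy as np

def block_mean_resize(img, factor):
    """Downsample by averaging non-overlapping factor x factor pixel blocks."""
    h, w = img.shape[0] // factor, img.shape[1] // factor
    return img[:h * factor, :w * factor].reshape(h, factor, w, factor, -1).mean(axis=(1, 3))

rng = np.random.default_rng(0)
frame = rng.random((128, 128, 3))              # toy RGB forest image
for factor in (1, 2, 4):
    small = block_mean_resize(frame, factor)
    # flattened length = the SVM feature-vector dimension at this resolution
    print(factor, small.reshape(-1).size)      # 49152, 12288, 3072
```

Halving the resolution quarters the feature count, so the resize choice trades detail (flames, smoke texture) against the curse of dimensionality.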
Abstract: BACKGROUND: Sodium glucose cotransporter-2 inhibitors (SGLT-2i) are a class of drugs with modest antidiabetic efficacy, weight loss effect, and cardiovascular benefits, as proven by multiple randomised controlled trials (RCTs). However, real-world data on the comparative efficacy and safety of individual SGLT-2i medications is sparse. AIM: To study the comparative efficacy and safety of SGLT-2i using real-world clinical data. METHODS: We evaluated the comparative efficacy data of 3 SGLT-2i drugs (dapagliflozin, canagliflozin, and empagliflozin) used for treating patients with type 2 diabetes mellitus. Data on the reduction of glycated hemoglobin (HbA1c), body weight, blood pressure (BP), urine albumin creatinine ratio (ACR), and adverse effects were recorded retrospectively. RESULTS: Data from 467 patients with a median age of 64 (14.8) years, 294 (62.96%) males and 375 (80.5%) Caucasians, were analysed. Median diabetes duration was 16.0 (9.0) years, and the duration of SGLT-2i use was 3.6 (2.1) years. SGLT-2i molecules used were dapagliflozin 10 mg (n=227; 48.6%), canagliflozin 300 mg (n=160; 34.3%), and empagliflozin 25 mg (n=80; 17.1%). Baseline median (interquartile range) HbA1c values in mmol/mol were: dapagliflozin 78.0 (25.3), canagliflozin 80.0 (25.5), and empagliflozin 75.0 (23.5). The respective median HbA1c values at 12 months and at the latest review (just prior to the study) were: 66.5 (22.8) & 69.0 (24.0), 67.0 (16.3) & 66.0 (28.0), and 67.0 (22.5) & 66.5 (25.8) (P<0.001 for all comparisons from baseline). Significant improvements in body weight (in kilograms) from baseline to study end were noticed with dapagliflozin, 101 (29.5) to 92.2 (25.6), and canagliflozin, 100 (28.3) to 95.3 (27.5), only. Significant reductions in median systolic and diastolic BP, from 144 (21) mmHg to 139 (23) mmHg (P=0.015), and from 82 (16) mmHg to 78 (19) mmHg (P<0.001), respectively, were also observed. A significant reduction of microalbuminuria was observed with canagliflozin only [ACR 14.6 (42.6) at baseline to 8.9 (23.7) at the study end; P=0.043]. Adverse effects of SGLT-2i were as follows: genital thrush and urinary infection in 20 (8.8%) & 17 (7.5%) with dapagliflozin; 9 (5.6%) & 5 (3.13%) with canagliflozin; and 4 (5%) & 4 (5%) with empagliflozin. Diabetic ketoacidosis was observed in 4 (1.8%) with dapagliflozin and 1 (0.63%) with canagliflozin. CONCLUSION: Treatment of patients with SGLT-2i is associated with statistically significant reductions in HbA1c and body weight, better than those reported in RCTs, with low side effect profiles. A review of large-scale real-world data is needed to inform better clinical practice decision making.
Abstract: Amid the landscape of Cloud Computing (CC), the Cloud Datacenter (DC) stands as a conglomerate of physical servers whose performance can be hindered by bottlenecks within the realm of proliferating CC services. A linchpin in CC's performance, the Cloud Service Broker (CSB) orchestrates DC selection. Failure to adroitly route user requests to suitable DCs transforms the CSB into a bottleneck, endangering service quality. To tackle this, deploying an efficient CSB policy becomes imperative, optimizing DC selection to meet stringent Quality-of-Service (QoS) demands. Amidst numerous CSB policies, their implementation grapples with challenges like costs and availability. This article undertakes a holistic review of diverse CSB policies, concurrently surveying the predicaments confronted by current policies. The foremost objective is to pinpoint research gaps and remedies to invigorate future policy development. Additionally, it extensively clarifies various DC selection methodologies employed in CC, enriching practitioners and researchers alike. Employing synthetic analysis, the article systematically assesses and compares myriad DC selection techniques. These analytical insights equip decision-makers with a pragmatic framework to discern the apt technique for their needs. In summation, this discourse underscores the paramount importance of adept CSB policies in DC selection, highlighting their imperative role in optimizing CC performance. By emphasizing the significance of these policies and their modeling implications, the article contributes to both the general modeling discourse and its practical applications in the CC domain.
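To make the DC-selection problem concrete: many broker policies reduce to scoring each datacenter on weighted QoS criteria and routing to the best score. The sketch below is a generic simple-additive-weighting scheme, not any specific policy from the survey; the criteria, normalisation ranges, and weights are illustrative assumptions:

```python
def select_datacenter(dcs, weights):
    """Rank DCs by a weighted sum of normalized criteria (lower latency/cost is
    better, higher availability is better); return the best DC's name."""
    def score(dc):
        return (weights["latency"] * (1 - dc["latency_ms"] / 500)
                + weights["cost"] * (1 - dc["cost"] / 10)
                + weights["availability"] * dc["availability"])
    return max(dcs, key=score)["name"]

dcs = [
    {"name": "dc-east", "latency_ms": 40,  "cost": 3.0, "availability": 0.999},
    {"name": "dc-west", "latency_ms": 180, "cost": 1.5, "availability": 0.995},
    {"name": "dc-eu",   "latency_ms": 90,  "cost": 2.2, "availability": 0.990},
]
weights = {"latency": 0.5, "cost": 0.3, "availability": 0.2}
print(select_datacenter(dcs, weights))   # latency-heavy weights favour dc-east
```

Shifting the weights toward cost would instead favour dc-west, which is exactly the policy trade-off the survey's comparisons revolve around.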
Funding: Supported by the Bio & Medical Technology Development Program of the National Research Foundation (NRF) funded by the Korean government (MSIT) (NRF-2021M3A9E4080780), and Hankuk University of Foreign Studies (2023).
Abstract: Regular physical activity (PA) is known to confer multifaceted health benefits, both physical and mental. However, traditional in-person physical activity programs have drawbacks, including time constraints for busy people. Although evidence suggests positive impacts on mental health through mobile-based physical activity, the effects of accumulated short bouts of physical activity using mobile devices are unexplored. Thus, this study aims to investigate these effects, focusing on depression, perceived stress, and negative affectivity among South Korean college students. Forty-six healthy college students were divided into the accumulated group (n=23, female=47.8%) and the control group (n=23, female=47.6%). The accumulated group engaged in mobile-based physical activity, following guidelines to accumulate a minimum of two sessions per day, three times a week. Sessions were divided into short bouts, ensuring each bout lasted at least 10 min. The control group did not engage in any specific physical activity. The data analysis involved comparing the scores of the intervention and control groups using several statistical techniques, such as independent sample t-tests, paired sample t-tests, and 2 (time) × 2 (group) repeated measures analysis of variance. The demographic characteristics at the pre-test showed no statistically significant differences between the groups. The accumulated group had significant decreases in depression (t(40)=2.59, p=0.013, Cohen's d=0.84) and perceived stress (t(40)=2.06, p=0.046, Cohen's d=0.56) from the pre- to post-test. The control group exhibited no statistically significant differences in any variables. Furthermore, there was a significant effect of time on depression scores (F(1,36)=4.77, p=0.036, ηp²=0.12), and a significant interaction effect was also observed for depression (F(1,36)=6.59, p=0.015, ηp²=0.16). This study offers informative insights into the potential advantages of mobile-based physical activity programs with accumulated periods for enhancing mental health, specifically in relation to depression. It illuminates ongoing discussions on efficient approaches to encourage mobile-based physical activity and improve mental well-being, accommodating various lifestyles and busy schedules.
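The pre/post comparison within the accumulated group is a paired-sample t-test with a paired Cohen's d. A minimal scipy sketch on synthetic scores (the data below is fabricated for illustration, not the study's data; scipy is assumed available):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
pre = rng.normal(20, 5, size=23)           # toy pre-test depression scores, n=23
post = pre - rng.normal(3, 2, size=23)     # toy post-test scores, shifted downward

t, p = stats.ttest_rel(pre, post)          # paired-sample t-test
diff = pre - post
d = diff.mean() / diff.std(ddof=1)         # Cohen's d for paired data
print(p < 0.05, d > 0)
```

The paired design matters: each student is their own baseline, so between-subject variability in depression scores does not dilute the pre-to-post effect.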
Abstract: BACKGROUND: Diabesity (diabetes as a consequence of obesity) has emerged as a huge healthcare challenge across the globe due to the obesity pandemic. Judicious use of antidiabetic medications, including semaglutide, is important for optimal management of diabesity, as proven by multiple randomized controlled trials. However, more real-world data is needed to further improve clinical practice. AIM: To study the real-world benefits and side effects of using semaglutide to manage patients with diabesity. METHODS: We evaluated the efficacy and safety of semaglutide use in managing patients with diabesity in a large academic hospital in the United States. Several parameters were analyzed, including demographic information and data on the improvement of glycated hemoglobin (HbA1c), body weight reduction, and insulin dose adjustments at 6 and 12 months, as well as at the latest follow-up. The data was obtained from electronic patient records between January 2019 and May 2023. RESULTS: 106 patients (56 males) with type 2 diabetes mellitus (T2DM), mean age 60.8±11.2 years, mean duration of T2DM 12.4±7.2 years, and mean semaglutide treatment of 2.6±1.1 years were included. Semaglutide treatment was associated with significant improvement in diabesity outcomes, such as mean weight reductions from a baseline of 110.4±24.6 kg to 99.9±24.9 kg at 12 months and 96.8±22.9 kg at the latest follow-up, and HbA1c improvement from a baseline of 82±21 mmol/mol to 67±20 mmol/mol at 12 months and 71±23 mmol/mol at the latest follow-up. An insulin dose reduction from a mean baseline of 95±74 units to 76.5±56.2 units was also observed at the latest follow-up. Side effects were mild, mainly gastrointestinal (bloating and nausea), and improved with prolonged use of semaglutide. CONCLUSION: Semaglutide treatment is associated with significant improvement in diabesity outcomes such as reductions in body weight, HbA1c, and insulin doses, without major adverse effects. Reviews of large-scale real-world data are expected to inform better clinical practice decision making and improve the care of patients with diabesity.
Abstract: This paper presents a detailed statistical exploration of crime trends in Chicago from 2001 to 2023, employing data from the Chicago Police Department’s publicly available crime database. The study aims to elucidate the patterns, distribution, and variations in crime across different types and locations, providing a comprehensive picture of the city’s crime landscape through advanced data analytics and visualization techniques. Using exploratory data analysis (EDA), we identified significant insights into crime trends, including the prevalence of theft and battery, the impact of seasonal changes on crime rates, and spatial concentrations of criminal activities. The research leveraged a Power BI dashboard to visually represent crime data, facilitating an intuitive understanding of complex patterns and enabling dynamic interaction with the dataset. Key findings highlight notable disparities in crime occurrences by type, location, and time, offering a granular view of crime hotspots and temporal trends. Additionally, the study examines clearance rates, revealing variations in the resolution of cases across different crime categories. This analysis not only sheds light on the current state of urban safety but also serves as a critical tool for policymakers and law enforcement agencies to develop targeted interventions. The paper concludes with recommendations for enhancing public safety strategies and suggests directions for future research, emphasizing the need for continuous data-driven approaches to effectively address and mitigate urban crime. This study contributes to the broader discourse on urban safety, crime prevention, and the role of data analytics in public policy and community well-being.
Abstract: Customer churn poses a significant challenge for the banking and finance industry in the United States, directly affecting profitability and market share. This study conducts a comprehensive comparative analysis of machine learning models for customer churn prediction, focusing on the U.S. context. The research evaluates the performance of logistic regression, random forest, and neural networks using industry-specific datasets, considering the economic impact and practical implications of the findings. The exploratory data analysis reveals unique patterns and trends in the U.S. banking and finance industry, such as the age distribution of customers and the prevalence of dormant accounts. The study incorporates macroeconomic factors to capture the potential influence of external conditions on customer churn behavior. The findings highlight the importance of leveraging advanced machine learning techniques and comprehensive customer data to develop effective churn prevention strategies in the U.S. context. By accurately predicting customer churn, financial institutions can proactively identify at-risk customers, implement targeted retention strategies, and optimize resource allocation. The study discusses the limitations and potential future improvements, serving as a roadmap for researchers and practitioners to further advance the field of customer churn prediction in the evolving landscape of the U.S. banking and finance industry.
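Logistic regression, the simplest of the three models compared, can be sketched from scratch in a few lines. This is a toy illustration on fabricated data echoing the abstract's observation about dormant accounts; it is not the study's dataset or implementation:

```python
import numpy as np

def fit_logistic(X, y, lr=0.5, epochs=500):
    """Plain gradient-descent logistic regression; bias folded into the weights."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        p = 1 / (1 + np.exp(-Xb @ w))          # predicted churn probability
        w -= lr * Xb.T @ (p - y) / len(y)      # gradient of the log-loss
    return w

def predict(w, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return (Xb @ w > 0).astype(int)

rng = np.random.default_rng(0)
# Toy churn features [age, months dormant]: dormant accounts churn far more often.
stay = rng.normal([40, 1], 1.0, size=(100, 2))
churn = rng.normal([45, 8], 1.0, size=(100, 2))
X = np.vstack([stay, churn])
X = (X - X.mean(axis=0)) / X.std(axis=0)       # standardize so descent converges
y = np.array([0] * 100 + [1] * 100)

w = fit_logistic(X, y)
acc = (predict(w, X) == y).mean()
print(acc)
```

The learned weight on the dormancy feature dominates, mirroring how such a model surfaces the "dormant accounts" signal the EDA identified.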
Abstract: Mobile Edge Computing (MEC) assists clouds in handling enormous tasks from mobile devices in close proximity. Edge servers are not allocated efficiently according to the dynamic nature of the network; this leads to processing delay, and tasks are dropped due to time limitations. Researchers find it difficult and complex to determine the offloading decision because of uncertain load dynamics over the edge nodes. The challenge lies in making the offloading decision, i.e., the selection of edge nodes for offloading, in a centralized manner. This study focuses on minimizing task-processing time while simultaneously increasing the success rate of the service provided by edge servers. Initially, a task-offloading problem is formulated based on communication and processing. Then the offloading decision problem is solved by deep analysis of task flow in the network and feedback from the devices on edge services. The significance of the model is improved with the modelling of the Deep Mobile-X architecture and bi-directional Long Short-Term Memory (b-LSTM). The simulation is done in the EdgeCloudSim environment, and the outcomes show the significance of the proposed idea. The processing time of the anticipated model is 6.6 s. The following performance metrics are evaluated and compared with existing learning approaches: server utilization, the ratio of dropped tasks, and the number of offloaded tasks. The proposed model shows a better trade-off compared to existing approaches.
Funding: This work was supported by Princess Nourah bint Abdulrahman University Researchers Supporting Project number (PNURSP2022R151), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.
Abstract: With the rapid evolution of Internet technology, fog computing has taken a major role in managing large amounts of data. The major concerns in this domain are security and privacy. Therefore, attaining a reliable level of confidentiality in the fog computing environment is a pivotal task. Among the different types of data stored in the fog, 3D point and mesh fog data have become increasingly popular in recent days due to the growth of 3D modelling and 3D printing technologies. Hence, in this research, we propose a novel scheme for preserving the privacy of 3D point and mesh fog data. Chaotic Cat map-based data encryption is a recently trending research area due to its unique properties like pseudo-randomness, deterministic nature, sensitivity to initial conditions, ergodicity, etc. To boost encryption efficiency significantly, in this work, we propose a novel Chaotic Cat map. The sequence generated by this map is used to transform the coordinates of the fog data. The improved range of the proposed map is depicted using bifurcation analysis. The quality of the proposed Chaotic Cat map is also analyzed using metrics like the Lyapunov exponent and approximate entropy. We also demonstrate the performance of the proposed encryption framework against attacks like brute-force and statistical attacks. The experimental results clearly show that the proposed framework produces the best results compared to previous works in the literature.
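For readers unfamiliar with Cat maps: the classic (unmodified) Arnold cat map scrambles coordinates on an n × n grid while remaining a bijection, so the transformation is fully invertible, which is what makes it usable for encryption. The paper proposes a modified map; below is the standard form only:

```python
def cat_map(x, y, n):
    """Classic Arnold cat map on an n x n integer grid:
    (x, y) -> (x + y, x + 2y) mod n. Determinant 1 => area-preserving bijection."""
    return (x + y) % n, (x + 2 * y) % n

n = 16
grid = {(x, y) for x in range(n) for y in range(n)}
scrambled = {cat_map(x, y, n) for (x, y) in grid}

# Every coordinate lands on a unique new position: the scrambled data is a pure
# permutation of the original, so decryption is just the inverse map.
print(scrambled == grid)   # True
```

Sensitivity to initial conditions shows up as nearby points separating exponentially under iteration, which is what the Lyapunov-exponent analysis in the paper quantifies.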
Abstract: In this article, the rheology of ferrofluid over an axisymmetric heated disc with a variable magnetic field, considering the dispersion of hybrid nanoparticles, is studied. The flow is assumed to be produced by the stretching of a rotating heated disc. The contribution of variable thermophysical properties is taken into account to explore the momentum, mass, and thermal transportation. The concept of the boundary layer mechanism is engaged to reduce the complex problem into a simpler one in the form of a coupled partial differential equation system. The complex coupled PDEs are converted into a highly nonlinear coupled ordinary differential equation system (ODEs), and the resulting nonlinear flow problem is handled numerically. The solution is obtained via a finite element procedure (FEP), and convergence is established by conducting a grid-independence survey. The solution of the converted dimensionless problem, containing the fluid velocity, temperature, and concentration fields, is plotted against numerous emerging parameters and their impact is noted. From the obtained solution, it is observed that higher values of the magnetic parameter retard the fluid flow, and escalating values of the Eckert number enhance the temperature profile. Ferrofluid flow and heat energy for the Yamada-Ota hybrid model are higher than for the Hamilton-Crosser hybrid model. The developed model is applicable to printing processes, electronic devices, temperature measurements, engineering processes, and food-making processes. The amount of mass species is reduced versus inclining impacts of the chemical reaction and Schmidt parameters.
Abstract: Due to an increase in agricultural mislabeling and careless handling of non-perishable foods in recent years, consumers have been calling for the food sector to be more transparent. Because of information dispersion between divisions and the propensity to record inaccurate data, current traceability solutions typically fail to provide reliable farm-to-fork histories of products. The three most enticing characteristics of blockchain technology are openness, integrity, and traceability, which make it a potentially crucial tool for guaranteeing the integrity and correctness of data. In this paper, we suggest a permissioned blockchain system run by organizations, such as regulatory bodies, to promote the origin-tracking of shelf-stable agricultural products. We propose a four-tiered architecture with parallel side chains, Zero-Knowledge Proofs (ZKPs), and the InterPlanetary File System (IPFS). These ensure that information about where an item came from is shared, that commercial competitors cannot access it, that large storage demands are handled, and that the system can scale to handle many transactions at once. The solution maintains the confidentiality of all transaction flows when provenance data is queried, utilizing smart contracts and a consumer-grade reliance rate. Extensive simulation testing using Ethereum Rinkeby and Polygon demonstrates reduced execution time, latency, and throughput overheads.
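The integrity property that blockchain provenance relies on is hash chaining: each record commits to the hash of its predecessor, so any tampering with an earlier entry is detectable. A minimal stdlib sketch of that property alone (it omits consensus, side chains, ZKPs, and IPFS entirely; the record fields are illustrative):

```python
import hashlib
import json

def add_block(chain, record):
    """Append a provenance record linked to the previous block's hash."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    body = json.dumps({"record": record, "prev": prev}, sort_keys=True)
    chain.append({"record": record, "prev": prev,
                  "hash": hashlib.sha256(body.encode()).hexdigest()})

def verify(chain):
    """Recompute every link; any tampered record breaks the chain."""
    for i, block in enumerate(chain):
        prev = chain[i - 1]["hash"] if i else "0" * 64
        body = json.dumps({"record": block["record"], "prev": prev}, sort_keys=True)
        if block["prev"] != prev or block["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
    return True

chain = []
add_block(chain, {"stage": "farm", "batch": "A-17"})
add_block(chain, {"stage": "warehouse", "batch": "A-17"})
ok_before = verify(chain)               # True: untouched chain validates
chain[0]["record"]["stage"] = "forged"  # tamper with the origin entry
ok_after = verify(chain)                # False: the hash no longer matches
print(ok_before, ok_after)
```

In the paper's design, what competitors must not see is handled separately (ZKPs and permissioning); the hash chain only guarantees that what is recorded cannot be silently rewritten.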
Abstract: Because stress has such a powerful impact on human health, we must be able to identify it automatically in our everyday lives. Human activity recognition (HAR) systems use data from several kinds of sensors to automatically recognize and evaluate human actions. Using the multimodal dataset DEAP (Database for Emotion Analysis using Physiological Signals), this paper presents a deep learning (DL) technique for effectively detecting human stress. Combining vision-based and sensor-based approaches for recognizing human stress will help us improve the efficiency of current stress recognition systems and predict probable actions before they become harmful. Based on visual and EEG (electroencephalogram) data, this research aims to enhance performance and extract the dominating characteristics of stress detection. For the stress identification test, we utilized the DEAP dataset, which includes video and EEG data. We also demonstrate that combining video and EEG characteristics may increase overall performance, with the suggested stochastic features providing the most accurate results. In the first step, a CNN (Convolutional Neural Network) extracts feature vectors from video frames and EEG data. Feature-level (FL) fusion then combines the features extracted from the video and EEG data. We use XGBoost as our classifier model to predict stress. The stress recognition accuracy of the proposed method is compared to existing methods: Decision Tree (DT), Random Forest (RF), AdaBoost, Linear Discriminant Analysis (LDA), and K-Nearest Neighbors (KNN). When we compared our technique to existing state-of-the-art approaches, we found that the suggested DL methodology combining multimodal and heterogeneous inputs may improve stress identification.
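Feature-level fusion, in its simplest form, is the concatenation of per-modality feature vectors into one classifier input. The sketch below adds L2 normalisation per modality so neither dominates by scale; that normalisation is our assumption, as the paper does not specify its fusion details:

```python
import numpy as np

def feature_level_fusion(video_feats, eeg_feats):
    """Feature-level (FL) fusion: L2-normalize each modality's vector, then
    concatenate, so neither modality's scale dominates the classifier input."""
    v = video_feats / (np.linalg.norm(video_feats) + 1e-12)
    e = eeg_feats / (np.linalg.norm(eeg_feats) + 1e-12)
    return np.concatenate([v, e])

rng = np.random.default_rng(3)
video = rng.normal(size=128)    # toy CNN embedding of video frames
eeg = rng.normal(size=32)       # toy CNN embedding of EEG windows
fused = feature_level_fusion(video, eeg)
print(fused.size)               # 160 = 128 + 32
```

The fused vector is what a downstream classifier such as XGBoost would consume, one row per sample.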
Funding: Supported by the National Natural Science Foundation of China [Grant Number: 92067106] and the Ministry of Education of the People's Republic of China [Grant Number: E-GCCRC20200309].
Abstract: As the COVID-19 pandemic swept the globe, social media platforms became an essential source of information and communication for many. International students, in particular, turned to Twitter to express their struggles and hardships during this difficult time. To better understand the sentiments and experiences of these international students, we developed the Situational Aspect-Based Annotation and Classification (SABAC) text mining framework. This framework uses a three-layer approach, combining baseline Deep Learning (DL) models with Machine Learning (ML) models as meta-classifiers to accurately predict the sentiments and aspects expressed in tweets from our collected Student-COVID-19 dataset. Using the proposed aspect2class annotation algorithm, we labeled bulk unlabeled tweets according to their contained aspect terms. However, we also recognized the challenge of reducing the data's high dimensionality and sparsity to improve performance and annotation on unlabeled datasets. To address this issue, we proposed the Volatile Stopwords Filtering (VSF) technique to reduce sparsity and enhance classifier performance. On the resulting Student-COVID Twitter dataset, we achieved an accuracy of 93.21% when using random forest as the meta-classifier. Through testing on three benchmark datasets, we found that the SABAC ensemble framework performed exceptionally well. Our findings showed that international students during the pandemic faced various issues, including stress, uncertainty, health concerns, financial stress, and difficulties with online classes and returning to school. By analyzing and summarizing these annotated tweets, decision-makers can better understand and address the real-time problems international students face during the ongoing pandemic.
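Why stopword filtering reduces sparsity: dropping high-frequency, low-information tokens shrinks the vocabulary and hence the width of the document-term matrix. The toy below uses a fixed generic stopword list, not the paper's VSF technique, which selects its filter terms adaptively:

```python
# Fabricated example tweets; vocabulary size = feature-matrix dimensionality.
tweets = [
    "i am so stressed about the online classes",
    "the visa office is closed and i am worried",
    "classes are online and the semester is uncertain",
]
stopwords = {"i", "am", "so", "the", "and", "is", "are", "about"}

vocab_before = {w for t in tweets for w in t.split()}
vocab_after = {w for w in vocab_before if w not in stopwords}
print(len(vocab_before), len(vocab_after))   # 17 9
```

Roughly half the feature dimensions vanish while every remaining token (stressed, visa, worried, uncertain, ...) carries aspect or sentiment signal.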
Abstract: Free-space optical (FSO) communication is of supreme importance for designing next-generation networks. Over the past decades, the radio frequency (RF) spectrum has been the main topic of interest for wireless technology. The RF spectrum is becoming denser and more heavily employed, making its availability tough for additional channels. Optical communication, exploited for messages or signals in historical times, is now becoming famous and useful in combination with error-correcting codes (ECC) to mitigate the effects of fading caused by atmospheric turbulence. We consider a free-space communication system (FSCS) in which hybrid technology is based on FSO and RF. The FSCS is a capable solution to overcome the downsides of current schemes and enhance overall link reliability and availability. The proposed FSCS with regular low-density parity-check (LDPC) coding techniques is deliberated and evaluated in terms of signal-to-noise ratio (SNR) in this paper. The extrinsic information transfer (EXIT) methodology is a powerful technique employed to investigate the sum-product decoding algorithm of LDPC codes and to optimize the EXIT chart by applying curve fitting. In this research work, we also analyze the behavior of the EXIT chart of regular/irregular LDPC codes for the FSCS, and we investigate the error performance of LDPC codes for the proposed FSCS.
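The core object behind LDPC coding is the sparse parity-check matrix H: a received vector c is a valid codeword exactly when H·c = 0 over GF(2), and sum-product decoding iteratively drives the syndrome to zero. A tiny syndrome-check sketch (H below is a toy matrix for illustration, far too small and dense to be a real LDPC code):

```python
import numpy as np

# Toy parity-check matrix: each row is one parity constraint over six bits.
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])

def is_codeword(c):
    """A vector is a codeword iff its syndrome H @ c is zero modulo 2."""
    return not np.any((H @ c) % 2)

c = np.array([1, 0, 1, 1, 1, 0])   # satisfies all three parity checks
clean = is_codeword(c)             # True
corrupted = c.copy()
corrupted[0] ^= 1                  # a single channel-induced bit flip
noisy = is_codeword(corrupted)     # False: the syndrome flags the error
print(clean, noisy)
```

A non-zero syndrome is what triggers the iterative decoder; the EXIT chart in the paper tracks how much mutual information each such iteration gains.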
Abstract: As the demand for used books has grown in recent years, various online/offline market platforms have emerged to support the trade in used books. The price of a used book can depend on various factors, such as its state of preservation (i.e., condition), the value of possession, and so on. Therefore, some online platforms provide a reference document to evaluate the condition of used books, but it is still not trivial for individual sellers to determine the price. The lack of a standard quantitative method to assess the condition of a used book confuses both sellers and consumers, thereby degrading the user experience of the online secondhand marketplace. Therefore, this paper discusses the automatic examination of the condition of used books based on deep learning approaches. In this work, we present a book damage detection system based on various You Only Look Once (YOLO) object detection models. Using YOLOv5, YOLOR, and YOLOX, we also introduce various training configurations that can be applied to improve performance. Specifically, combinations of different augmentation strategies, including flip, rotation, crop, mosaic, and mixup, were compared. To train and validate our system, we present a book damage dataset composed of a total of 620 book images with 3,989 annotations, covering six types of damage (i.e., Wear, Spot, Notch, Barcode, Tag, and Ripped), collected from library books. We evaluated each model trained with different configurations to assess their detection accuracy as well as training efficiency. The experimental results showed that YOLOX trained with its best training configuration yielded the best detection accuracy, achieving 60.0% (mAP@.5:.95) and 72.9% (mAP@.5) for book damage detection. However, YOLOX performed worst in terms of training efficiency, indicating that there is a trade-off between accuracy and efficiency. Based on the findings from the study, we discuss the feasibility and limitations of our system and future research directions.
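The mAP@.5 metric quoted above counts a detection as correct when its intersection-over-union (IoU) with a ground-truth box exceeds 0.5. A minimal, self-contained sketch of that IoU test (illustrative only; the box coordinates are made up):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

# A predicted "Ripped" box vs. its ground-truth annotation (pixel coords).
pred, truth = (10, 10, 50, 50), (20, 20, 60, 60)
assert abs(iou(pred, truth) - 9 / 23) < 1e-9   # 900 / (1600 + 1600 - 900)
assert iou(pred, truth) < 0.5                  # fails the mAP@.5 threshold
```

mAP@.5:.95 averages the same computation over IoU thresholds from 0.5 to 0.95 in steps of 0.05, which is why it is the stricter of the two numbers.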
Abstract: In the current era of information technology, students need to learn modern programming languages efficiently. The art of teaching and learning programming requires many logical and conceptual skills, so it is a challenging task for instructors and learners to teach and learn these programming languages effectively and efficiently. Mind mapping is a useful visual tool for establishing ideas and connecting them to solve problems. This research proposes an effective way to teach programming languages through visual tools. This experimental study uses a mind mapping tool to teach two programming environments: text-based programming and block-based programming. We performed the experiments with one hundred and sixty undergraduate students of two public sector universities in the Asia Pacific region. Four different instructional approaches were used for this purpose: block-based language (BBL), text-based language (TBL), mind mapping with text-based language (MMTBL), and mind mapping with block-based language (MMBBL). The results show that instructional approaches using a mind mapping tool to help students solve given tasks through critical thinking are more effective than the other instructional techniques.
Funding: This work was supported partially by the MSIT (Ministry of Science and ICT), Korea, under the ITRC (Information Technology Research Center) Support Program (IITP-2024-2018-0-01431) supervised by the IITP (Institute for Information & Communications Technology Planning & Evaluation).
Abstract: The extensive utilization of the Internet in everyday life can be attributed to the substantial accessibility of online services and the growing significance of the data transmitted via the Internet. Regrettably, this development has expanded the potential targets that hackers might exploit. Without adequate safeguards, data transmitted on the Internet is significantly more susceptible to unauthorized access, theft, or alteration. The identification of unauthorized access attempts is a critical component of cybersecurity, as it aids in the detection and prevention of malicious attacks. This research paper introduces a novel intrusion detection framework that utilizes Recurrent Neural Networks (RNN) integrated with Long Short-Term Memory (LSTM) units. The proposed model can identify various types of cyberattacks, including conventional and distinctive forms. Recurrent networks, unlike standard feedforward neural networks, possess an intrinsic memory component. RNNs incorporating LSTM mechanisms have demonstrated greater capabilities in retaining and utilizing data dependencies over extended periods. Metrics such as data types, training duration, accuracy, number of false positives, and number of false negatives are among the parameters employed to assess the effectiveness of these models in identifying both common and unusual cyberattacks. RNNs are utilized in conjunction with LSTM to support human analysts in identifying possible intrusion events, hence enhancing their decision-making capabilities. A potential solution to address the limitations of shallow learning is the introduction of the Eccentric Intrusion Detection Model, which utilizes RNNs that specifically exploit LSTM techniques. The proposed model achieves a detection accuracy of 99.5%, generalization of 99%, and a false-positive rate of 0.72%; these findings reveal that it is superior to state-of-the-art techniques.
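The long-range memory the abstract attributes to LSTM comes from its gating mechanism. The following is a minimal NumPy sketch of a single LSTM step (illustrative only, not the paper's model; the feature dimensions and random weights are hypothetical placeholders for traffic features):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step. W: (4*hidden, input), U: (4*hidden, hidden), b: (4*hidden,).
    Computes input/forget/output gates and a candidate cell state."""
    z = W @ x + U @ h + b
    n = h.shape[0]
    i, f, o = sigmoid(z[:n]), sigmoid(z[n:2 * n]), sigmoid(z[2 * n:3 * n])
    g = np.tanh(z[3 * n:])
    c_new = f * c + i * g          # forget old memory, write the new candidate
    h_new = o * np.tanh(c_new)     # expose gated memory as the hidden state
    return h_new, c_new

n_in, n_hid = 8, 4                 # e.g. 8 traffic features per packet window
W = rng.normal(scale=0.1, size=(4 * n_hid, n_in))
U = rng.normal(scale=0.1, size=(4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)

h = c = np.zeros(n_hid)
for _ in range(20):                # a sequence of 20 feature vectors
    h, c = lstm_step(rng.normal(size=n_in), h, c, W, U, b)
assert h.shape == (n_hid,) and np.all(np.abs(h) < 1.0)
```

The forget gate `f` is what lets the cell state `c` carry information across many steps, which is the property the intrusion detector relies on to correlate events separated in time.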
Funding: Research work funded by Zhejiang Normal University Research Fund YS304023947 and YS304023948.
Abstract: Neuroimaging has emerged over the last few decades as a crucial tool in diagnosing Alzheimer's disease (AD). Mild cognitive impairment (MCI) is a condition that falls between normal cognitive function and AD on the spectrum. However, previous studies have mainly used handcrafted features to classify MCI, AD, and normal control (NC) individuals. This paper focuses on using gray matter (GM) scans obtained through magnetic resonance imaging (MRI) for the diagnosis of individuals with MCI, AD, and NC. To improve classification performance, we developed two transfer learning strategies with data augmentation (i.e., shear range, rotation, zoom range, and channel shift). The first approach is a deep Siamese network (DSN), and the second involves a cross-domain strategy with a customized VGG-16. We performed experiments on the Alzheimer's Disease Neuroimaging Initiative (ADNI) dataset to evaluate the performance of our proposed models. Our experimental results demonstrate superior performance on the three binary classification tasks: NC vs. AD, NC vs. MCI, and MCI vs. AD. Specifically, we achieved classification accuracies of 97.68%, 94.25%, and 92.18% for the three cases, respectively. Our study proposes two transfer learning strategies with data augmentation to accurately diagnose MCI, AD, and normal control individuals using GM scans. Our findings provide promising results for future research and clinical applications in the early detection and diagnosis of AD.
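A Siamese network is typically trained with a contrastive objective: embeddings of same-class scans are pulled together and different-class embeddings are pushed at least a margin apart. A minimal NumPy sketch of that loss (illustrative; the two-dimensional "embeddings" below are toy stand-ins for what the authors' DSN would produce from GM scans):

```python
import numpy as np

def contrastive_loss(emb_a, emb_b, same_class, margin=1.0):
    """Contrastive loss for Siamese training: pull same-class embeddings
    together, push different-class embeddings at least `margin` apart."""
    d = np.linalg.norm(emb_a - emb_b)
    if same_class:
        return 0.5 * d ** 2
    return 0.5 * max(0.0, margin - d) ** 2

# Toy embeddings: two AD-like scans close together, one NC scan far away.
ad_1, ad_2, nc = np.array([0.9, 0.1]), np.array([0.8, 0.2]), np.array([0.1, 0.9])
assert contrastive_loss(ad_1, ad_2, same_class=True) < 0.02   # small penalty
assert contrastive_loss(ad_1, nc, same_class=False) == 0.0    # already beyond margin
```

At inference time, classification then reduces to comparing an embedding's distance to reference examples of each class.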
Abstract: This paper proposes a blockchain-based system as a secure, efficient, and cost-effective alternative to SWIFT for cross-border remittances. The current SWIFT system faces challenges, including slow settlement times, high transaction costs, and vulnerability to fraud. Leveraging blockchain technology's decentralized, transparent, and immutable nature, the proposed system aims to address these limitations. Key features include a modular architecture, implementation of microservices, and advanced cryptographic protocols. The system incorporates Proof of Stake consensus with BLS signatures, smart contract execution with dynamic pricing, and a decentralized oracle network for currency conversion. A sophisticated risk-based authentication system utilizes Bayesian networks and machine learning for enhanced security. Mathematical models are presented for critical components, including transaction validation, currency conversion, and regulatory compliance. Simulations demonstrate potential improvements in transaction speed and costs. However, challenges such as regulatory hurdles, user adoption, scalability, and integration with legacy systems must be addressed. The paper provides a comparative analysis between the proposed blockchain system and SWIFT, highlighting advantages in transaction speed, costs, and security. Mitigation strategies are proposed for key challenges. Recommendations are made for further research into scaling solutions, regulatory frameworks, and user-centric designs. The adoption of blockchain-based remittances could significantly impact the financial sector, potentially disrupting traditional models and promoting financial inclusion in underserved markets. However, successful implementation will require collaboration between blockchain innovators, financial institutions, and regulators to create an enabling environment for this transformative system.
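At the heart of a Proof of Stake round is proposer selection with probability proportional to stake. A minimal sketch of that step (illustrative only; the validator names and stake amounts are hypothetical, and the paper's full design adds BLS signature aggregation on top):

```python
import random

# Hypothetical validators and their staked amounts.
stakes = {"validator_a": 500, "validator_b": 300, "validator_c": 200}

def pick_validator(stakes, rng):
    """Choose a block proposer with probability proportional to stake."""
    total = sum(stakes.values())
    ticket = rng.uniform(0, total)
    running = 0.0
    for name, stake in stakes.items():
        running += stake
        if ticket <= running:
            return name
    return name  # guard against floating-point edge cases

rng = random.Random(42)
picks = [pick_validator(stakes, rng) for _ in range(10_000)]
share_a = picks.count("validator_a") / len(picks)
assert 0.45 < share_a < 0.55   # ~half the rounds, matching its 50% stake share
```

Stake-proportional selection is what replaces the energy cost of Proof of Work while still making a takeover economically expensive.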
Funding: The Deanship of Scientific Research at King Khalid University funded this work through the General Research Project under grant number GRP/75/44.
Abstract: Deep learning has recently become a viable approach for classifying Alzheimer's disease (AD) in medical imaging. However, existing models struggle to efficiently extract features from medical images and may squander additional information resources during illness classification. To address these issues, a deep three-dimensional convolutional neural network incorporating multi-task learning and attention mechanisms is proposed. An upgraded primary C3D network is utilised to create coarse low-level feature maps. It introduces a new convolution block that focuses on the structural aspects of the magnetic resonance imaging image, and another block that extracts attention weights unique to certain pixel positions in the feature map and multiplies them with the feature map output. Then, several fully connected layers are used to achieve multi-task learning, generating three outputs, including the primary classification task. The other two outputs employ backpropagation during training to improve the primary classification task. Experimental findings show that the authors' proposed method outperforms current approaches for classifying AD, achieving enhanced classification accuracy and other indicators on the Alzheimer's Disease Neuroimaging Initiative dataset. The approach shows promise for future disease classification studies.
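The per-position attention block described above reduces to: score each spatial location, squash the scores to (0, 1) weights, and multiply those weights back into the feature map. A minimal NumPy sketch of that idea (not the authors' network; the map shape and the mean-over-channels scoring are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

# A toy feature map in (channels, height, width) layout.
feature_map = rng.normal(size=(4, 8, 8))

def spatial_attention(fmap):
    """Score each pixel position, squash to (0, 1), rescale the feature map."""
    scores = fmap.mean(axis=0)                    # (height, width) saliency
    weights = 1.0 / (1.0 + np.exp(-scores))       # sigmoid per pixel position
    return fmap * weights[None, :, :], weights

attended, weights = spatial_attention(feature_map)
assert attended.shape == feature_map.shape
assert np.all((weights > 0.0) & (weights < 1.0))
```

In the real model the attention weights are produced by a learned convolution block rather than a fixed channel mean, so training can emphasise the brain regions most informative for AD.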