Mental health is a significant issue worldwide,and the utilization of technology to assist mental health has seen a growing trend.This aims to alleviate the workload on healthcare professionals and aid individuals.Num...Mental health is a significant issue worldwide,and the utilization of technology to assist mental health has seen a growing trend.This aims to alleviate the workload on healthcare professionals and aid individuals.Numerous applications have been developed to support the challenges in intelligent healthcare systems.However,because mental health data is sensitive,privacy concerns have emerged.Federated learning has gotten some attention.This research reviews the studies on federated learning and mental health related to solving the issue of intelligent healthcare systems.It explores various dimensions of federated learning in mental health,such as datasets(their types and sources),applications categorized based on mental health symptoms,federated mental health frameworks,federated machine learning,federated deep learning,and the benefits of federated learning in mental health applications.This research conducts surveys to evaluate the current state of mental health applications,mainly focusing on the role of Federated Learning(FL)and related privacy and data security concerns.The survey provides valuable insights into how these applications are emerging and evolving,specifically emphasizing FL’s impact.展开更多
In the rapidly evolving landscape of today’s digital economy,Financial Technology(Fintech)emerges as a trans-formative force,propelled by the dynamic synergy between Artificial Intelligence(AI)and Algorithmic Trading...In the rapidly evolving landscape of today’s digital economy,Financial Technology(Fintech)emerges as a trans-formative force,propelled by the dynamic synergy between Artificial Intelligence(AI)and Algorithmic Trading.Our in-depth investigation delves into the intricacies of merging Multi-Agent Reinforcement Learning(MARL)and Explainable AI(XAI)within Fintech,aiming to refine Algorithmic Trading strategies.Through meticulous examination,we uncover the nuanced interactions of AI-driven agents as they collaborate and compete within the financial realm,employing sophisticated deep learning techniques to enhance the clarity and adaptability of trading decisions.These AI-infused Fintech platforms harness collective intelligence to unearth trends,mitigate risks,and provide tailored financial guidance,fostering benefits for individuals and enterprises navigating the digital landscape.Our research holds the potential to revolutionize finance,opening doors to fresh avenues for investment and asset management in the digital age.Additionally,our statistical evaluation yields encouraging results,with metrics such as Accuracy=0.85,Precision=0.88,and F1 Score=0.86,reaffirming the efficacy of our approach within Fintech and emphasizing its reliability and innovative prowess.展开更多
In today’s world,image processing techniques play a crucial role in the prognosis and diagnosis of various diseases due to the development of several precise and accurate methods for medical images.Automated analysis...In today’s world,image processing techniques play a crucial role in the prognosis and diagnosis of various diseases due to the development of several precise and accurate methods for medical images.Automated analysis of medical images is essential for doctors,as manual investigation often leads to inter-observer variability.This research aims to enhance healthcare by enabling the early detection of diabetic retinopathy through an efficient image processing framework.The proposed hybridized method combines Modified Inertia Weight Particle Swarm Optimization(MIWPSO)and Fuzzy C-Means clustering(FCM)algorithms.Traditional FCM does not incorporate spatial neighborhood features,making it highly sensitive to noise,which significantly affects segmentation output.Our method incorporates a modified FCM that includes spatial functions in the fuzzy membership matrix to eliminate noise.The results demonstrate that the proposed FCM-MIWPSO method achieves highly precise and accurate medical image segmentation.Furthermore,segmented images are classified as benign or malignant using the Decision Tree-Based Temporal Association Rule(DT-TAR)Algorithm.Comparative analysis with existing state-of-the-art models indicates that the proposed FCM-MIWPSO segmentation technique achieves a remarkable accuracy of 98.42%on the dataset,highlighting its significant impact on improving diagnostic capabilities in medical imaging.展开更多
Phishing attacks present a persistent and evolving threat in the cybersecurity land-scape,necessitating the development of more sophisticated detection methods.Traditional machine learning approaches to phishing detec...Phishing attacks present a persistent and evolving threat in the cybersecurity land-scape,necessitating the development of more sophisticated detection methods.Traditional machine learning approaches to phishing detection have relied heavily on feature engineering and have often fallen short in adapting to the dynamically changing patterns of phishingUniformResource Locator(URLs).Addressing these challenge,we introduce a framework that integrates the sequential data processing strengths of a Recurrent Neural Network(RNN)with the hyperparameter optimization prowess of theWhale Optimization Algorithm(WOA).Ourmodel capitalizes on an extensive Kaggle dataset,featuring over 11,000 URLs,each delineated by 30 attributes.The WOA’s hyperparameter optimization enhances the RNN’s performance,evidenced by a meticulous validation process.The results,encapsulated in precision,recall,and F1-score metrics,surpass baseline models,achieving an overall accuracy of 92%.This study not only demonstrates the RNN’s proficiency in learning complex patterns but also underscores the WOA’s effectiveness in refining machine learning models for the critical task of phishing detection.展开更多
The visions of Industry 4.0 and 5.0 have reinforced the industrial environment.They have also made artificial intelligence incorporated as a major facilitator.Diagnosing machine faults has become a solid foundation fo...The visions of Industry 4.0 and 5.0 have reinforced the industrial environment.They have also made artificial intelligence incorporated as a major facilitator.Diagnosing machine faults has become a solid foundation for automatically recognizing machine failure,and thus timely maintenance can ensure safe operations.Transfer learning is a promising solution that can enhance the machine fault diagnosis model by borrowing pre-trained knowledge from the source model and applying it to the target model,which typically involves two datasets.In response to the availability of multiple datasets,this paper proposes using selective and adaptive incremental transfer learning(SA-ITL),which fuses three algorithms,namely,the hybrid selective algorithm,the transferability enhancement algorithm,and the incremental transfer learning algorithm.It is a selective algorithm that enables selecting and ordering appropriate datasets for transfer learning and selecting useful knowledge to avoid negative transfer.The algorithm also adaptively adjusts the portion of training data to balance the learning rate and training time.The proposed algorithm is evaluated and analyzed using ten benchmark datasets.Compared with other algorithms from existing works,SA-ITL improves the accuracy of all datasets.Ablation studies present the accuracy enhancements of the SA-ITL,including the hybrid selective algorithm(1.22%-3.82%),transferability enhancement algorithm(1.91%-4.15%),and incremental transfer learning algorithm(0.605%-2.68%).These also show the benefits of enhancing the target model with heterogeneous image datasets that widen the range of domain selection between source and target domains.展开更多
This study introduces a long-short-term memory(LSTM)-based neural network model developed for detecting anomaly events in care-independent smart homes,focusing on the critical application of elderly fall detection.It ...This study introduces a long-short-term memory(LSTM)-based neural network model developed for detecting anomaly events in care-independent smart homes,focusing on the critical application of elderly fall detection.It balances the dataset using the Synthetic Minority Over-sampling Technique(SMOTE),effectively neutralizing bias to address the challenge of unbalanced datasets prevalent in time-series classification tasks.The proposed LSTM model is trained on the enriched dataset,capturing the temporal dependencies essential for anomaly recognition.The model demonstrated a significant improvement in anomaly detection,with an accuracy of 84%.The results,detailed in the comprehensive classification and confusion matrices,showed the model’s proficiency in distinguishing between normal activities and falls.This study contributes to the advancement of smart home safety,presenting a robust framework for real-time anomaly monitoring.展开更多
Phishing attacks present a serious threat to enterprise systems,requiring advanced detection techniques to protect sensitive data.This study introduces a phishing email detection framework that combines Bidirectional ...Phishing attacks present a serious threat to enterprise systems,requiring advanced detection techniques to protect sensitive data.This study introduces a phishing email detection framework that combines Bidirectional Encoder Representations from Transformers(BERT)for feature extraction and CNN for classification,specifically designed for enterprise information systems.BERT’s linguistic capabilities are used to extract key features from email content,which are then processed by a convolutional neural network(CNN)model optimized for phishing detection.Achieving an accuracy of 97.5%,our proposed model demonstrates strong proficiency in identifying phishing emails.This approach represents a significant advancement in applying deep learning to cybersecurity,setting a new benchmark for email security by effectively addressing the increasing complexity of phishing attacks.展开更多
This paper discusses the bathymetric mapping technologies by means of satellite remote sensing (RS) with special emphasis on bathymetry derivation models, methods, accuracies, advantages, limitations, and comparisons....This paper discusses the bathymetric mapping technologies by means of satellite remote sensing (RS) with special emphasis on bathymetry derivation models, methods, accuracies, advantages, limitations, and comparisons. Traditionally, bathymetry can be mapped using echo sounding sounders. However, this method is constrained by its inefficiency in shallow waters and very high operating logistic costs. In comparison, RS technologies present efficient and cost-effective means of mapping bathymetry over remote and broad areas. RS of bathymetry can be categorised into two broad classes: active RS and passive RS. Active RS methods are based on active satellite sensors, which emit artificial radiation to study the earth surface or atmospheric features, e.g. light detection and ranging (LIDAR), polarimetric synthetic aperture radar (SAR), altimeters, etc. Passive RS methods are based on passive satellite sensors, which detect sunlight (natural source of light) radiation reflected from the earth and thermal radiation in the visible and infrared portion of the electromagnetic spectrum, e.g. multispectral or optical satellite sensors. Bathymetric methods can also be categorised as imaging methods and non-imaging methods. The non-imaging method is elucidated by laser scanners or LIDAR, which measures the distance between the sensor and the water surface or the ocean floor using a single wave pulse or double waves. On the other hand, imaging methods approximate the water depth based on the pixel values or digital numbers (DN) (representing reflectance or backscatter) of an image. Imaging methods make use of the visible and/or near infrared (NIR) and microwave radiation. Imaging methods are implemented with either analytical modelling or empirical modelling, or by a blend of both. This paper presents the development of bathymetric mapping technology by using RS, and discusses the state-of-the-art bathymetry derivation methods/algorithms and their implications in practical applications.展开更多
The cryosphere is the frozen part of the Earth’s system. Snow and ice are the main constituents of the cryosphere and may be found in different states, such as snow, freshwater ice, sea ice, perma-frost, and continen...The cryosphere is the frozen part of the Earth’s system. Snow and ice are the main constituents of the cryosphere and may be found in different states, such as snow, freshwater ice, sea ice, perma-frost, and continental ice masses in the form of glaciers and ice sheets. The present review mainly deals with state-of-the-art applications of synthetic aperture radar (SAR) with a special emphasize on cryospheric information extraction. SAR is the most important active microwave remote sensing (RS) instrument for ice monitoring, which provides high-resolution images of the Earth’s surface. SAR is an ideal sensor in RS technology, which works in all-weather and day and night conditions to provide useful unprecedented information, especially in the cryospheric regions which are almost inaccessible areas on Earth. This paper addresses the technological evolution of SAR and its applications in studying the various components of the cryosphere. The arrival of SAR radically changed the capabilities of information extraction related to ice type, new ice formation, and ice thickness. SAR applications can be divided into two broad classes-polarimetric applications and interferometric applications. Polarimetric SAR has been effectively used for mapping calving fronts, crevasses, surface structures, sea ice, detection of icebergs, etc. The paper also summarizes both the operational and climate change research by using SAR for sea ice parameter detection. Digital elevation model (DEM) generation and glacier velocity mapping are the two most important applications used in cryosphere using SAR interferometry or interferometric SAR (InSAR). Space-borne InSAR techniques for measuring ice flow velocity and topography have developed rapidly over the last decade. InSAR is capable of measuring ice motion that has radically changed the science of glaciers and ice sheets. Measurement of temperate glacier velocities and surface characteristics by using airborne and space-borne interferometric satellite images have been the significant application in glaciology and cryospheric studies. Space-borne InSAR has contributed to major evolution in many research areas of glaciological study by measuring ice-stream flow velocity, improving understanding of ice-shelf processes, yielding velocity for flux-gate based mass-balance assessment, and mapping flow of mountain glaciers. The present review summarizes the salient development of SAR applications in cryosphere and glaciology.展开更多
Image classification is one of the most basic operations of digital image processing. The present review focuses on the strengths and weaknesses of traditional pixel-based classification (PBC) and the advances of obje...Image classification is one of the most basic operations of digital image processing. The present review focuses on the strengths and weaknesses of traditional pixel-based classification (PBC) and the advances of object-oriented classification (OOC) algorithms employed for the extraction of information from remotely sensed satellite imageries. The state-of-the-art classifiers are reviewed for their potential usage in urban remote sensing (RS), with a special focus on cryospheric applications. Generally, classifiers for information extraction can be divided into three catalogues: 1) based on the type of learning (supervised and unsupervised), 2) based on assumptions on data distribution (parametric and non-parametric) and, 3) based on the number of outputs for each spatial unit (hard and soft). The classification methods are broadly based on the PBC or the OOC approaches. Both methods have their own advantages and disadvantages depending upon their area of application and most importantly the RS datasets that are used for information extraction. Classification algorithms are variedly explored in the cryosphere for extracting geospatial information for various logistic and scientific applications, such as to understand temporal changes in geographical phenomena. Information extraction in cryospheric regions is challenging, accounting to the very similar and conflicting spectral responses of the features present in the region. The spectral responses of snow and ice, water, and blue ice, rock and shadow are a big challenge for the pixel-based classifiers. Thus, in such cases, OOC approach is superior for extracting information from the cryospheric regions. Also, ensemble classifiers and customized spectral index ratios (CSIR) proved extremely good approaches for information extraction from cryospheric regions. The present review would be beneficial for developing new classifiers in the cryospheric environment for better understanding of spatial-temporal changes over long time scales.展开更多
Epigenetic regulations are heritable changes in gene expression that occur in the absence of alterations in DNA sequences. Various epigenetic mechanisms include histone modifications and DNA methylations. In this revi...Epigenetic regulations are heritable changes in gene expression that occur in the absence of alterations in DNA sequences. Various epigenetic mechanisms include histone modifications and DNA methylations. In this review, we examine methods to study DNA methylations and their contribution to degenerative diseases by mediating the complex gene-by-environment interactions. Such epigenetic modifications despite being heritable and stably maintained are also potentially reversible and there is scope for the development of epigenetic therapies for this disease.展开更多
Background Diabetes mellitus is an obtrusive universal health emergency in developed and developing countries,including India.With the exponential rise of epidemiological conditions,the costs of treating and managing ...Background Diabetes mellitus is an obtrusive universal health emergency in developed and developing countries,including India.With the exponential rise of epidemiological conditions,the costs of treating and managing diabetes are on an upsurge.This study aimed to estimate the cost of diabetes and determine the determinants of the total cost among diabetic patients.Methods This cross-sectional study was executed in the northern state of Punjab,India.It involves the multi-stage area sampling technique and data was collected through a self-structured questionnaire adapted following the“WHO STEPS Surveillance”manual.Mann-Whitney U and Kruskal-Wallis tests were performed to compare the cost differences in socio-demographic variables.Lastly,multiple linear regression was conducted to determine and evaluate the association of the dependent variable with numerous influential determinants.Results The urban respondents’average direct and indirect costs are higher than rural respondents.Age manifests very eccentric results;the highest mean direct outpatient care expenditure of₹52,104 was incurred by the respondents below 20 years of age.Gender,complications,income,history of diabetes and work status were statistically significant determinants of the total cost.Study reports a rapid increase in the median annual direct and indirect cost from₹15,460 and₹3572 in 1999 to₹34,100 and₹4200 in 2021.Conclusions The present study highlights that the economic jeopardy of diabetes can be managed by educating people about diabetes and its associated risk factors.The economic burden of diabetes could be restrained by formulating new health policies and promoting the use of generic medicines.The result of the study directs that expenditure on outpatient care is to be reimbursed under the‘Ayushman Bharat-Sarbat Sehat Bima Yojana’.展开更多
Healthy forest is the vital resource to regulate climate at a regional and global level. Forest fire has been regarded as one of the major reasons for the loss of forest and degradation of the environment. Global warm...Healthy forest is the vital resource to regulate climate at a regional and global level. Forest fire has been regarded as one of the major reasons for the loss of forest and degradation of the environment. Global warming is increasing its intensity at an alarming rate. Real-time fire detection is a necessity to avoid large scale losses. Remote sensing is a quick and cheap technique for detecting and monitoring forest fires on a large scale. Advance Very Radiometer Resolution (AVHRR) has been used already for a long period for fire detection. The use of Moderate Resolution Imaging Radio Spectrometer (MODIS) for fire detection has recently preceded AVHRR and a large number of fire products are being developed. MODIS based forest fire detection and monitoring system can solve the problem of real-time forest fire monitoring. The system facilitates data acquisition, processing, reporting and feedback on the fire location information in an automated manner. It provides location information at 1 × 1 kilometer resolution on the active fires which are present during the satellite overpass twice a day. The users are provided with the information on SMS alert with fire location details, email notification, and online visualization of fire locations on website automatically. The whole processes are automated and provide better accuracy for fire detection.展开更多
In today’s information technology(IT)world,the multi-hop wireless sensor networks(MHWSNs)are considered the building block for the Internet of Things(IoT)enabled communication systems for controlling everyday tasks o...In today’s information technology(IT)world,the multi-hop wireless sensor networks(MHWSNs)are considered the building block for the Internet of Things(IoT)enabled communication systems for controlling everyday tasks of organizations and industry to provide quality of service(QoS)in a stipulated time slot to end-user over the Internet.Smart city(SC)is an example of one such application which can automate a group of civil services like automatic control of traffic lights,weather prediction,surveillance,etc.,in our daily life.These IoT-based networks with multi-hop communication and multiple sink nodes provide efficient communication in terms of performance parameters such as throughput,energy efficiency,and end-to-end delay,wherein low latency is considered a challenging issue in next-generation networks(NGN).This paper introduces a single and parallels stable server queuing model with amulti-class of packets and native and coded packet flowto illustrate the simple chain topology and complexmultiway relay(MWR)node with specific neighbor topology.Further,for improving data transmission capacity inMHWSNs,an analytical framework for packet transmission using network coding at the MWR node in the network layer with opportunistic listening is performed by considering bi-directional network flow at the MWR node.Finally,the accuracy of the proposed multi-server multi-class queuing model is evaluated with and without network coding at the network layer by transmitting data packets.The results of the proposed analytical framework are validated and proved effective by comparing these analytical results to simulation results.展开更多
Nanoparticles are extensively used for various applications in science, engineering and medicine. Synthesis of nanoparticles with high purity is essential to utilize the same in different fields of science and technol...Nanoparticles are extensively used for various applications in science, engineering and medicine. Synthesis of nanoparticles with high purity is essential to utilize the same in different fields of science and technology. In the present study, liquid chromatography is utilized to purify the nanoparticles. Predominantly, gold nanoparticles were synthesized from gold auric cholide and preserved in phosphate or citrate buffer. A method to purify gold nanoparticles is essential because of the possible interference from gold auric chloride and other impurities in buffer. Herein, a method has been developed using high performance liquid chromatography to purify gold nanoparticles with 100 nm in size from gold auric chloride and residues. UV-Vis spectroscopy was also done to ascertain the purity of the nanoparticles.展开更多
Banking institutions all over the world face significant challenge due to the cumulative loss due to defaults of borrowers of different types of loans. The cumulative default loss built up over a period of time could ...Banking institutions all over the world face significant challenge due to the cumulative loss due to defaults of borrowers of different types of loans. The cumulative default loss built up over a period of time could wipe out the capital cushion of the banks. The aim of this paper is to help the banks to forecast the cumulative loss and its volatility. Defaulting amounts are random and defaults occur at random instants of time. A non Markovian time dependent random point process is used to model the cumulative loss. The expected loss and volatility are evaluated analytically. They are functions of probability of default, probability of loss amount, recovery rate and time. Probability of default being the important contributor is evaluated using Hidden Markov modeling. Numerical results obtained validate the model.展开更多
Correction to:Global Health Research and Policy(2023)8:11 https://doi.org/10.1186/s41256-023-00293-3 Following publication of the original article[1],the authors reported that the fifth sentence in the last paragraph ...Correction to:Global Health Research and Policy(2023)8:11 https://doi.org/10.1186/s41256-023-00293-3 Following publication of the original article[1],the authors reported that the fifth sentence in the last paragraph of the Background section needed to be updated.The original sentence was:“Additionally,most prior articles assessing cost-of-diabetes(COD)used secondary data acquired from‘hospital databases’or‘national health surveys’,which possess evident restraints regarding reliability[10-12].”展开更多
Phishing attacks seriously threaten information privacy and security within the Internet of Things(IoT)ecosystem.Numerous phishing attack detection solutions have been developed for IoT;however,many of these are eithe...Phishing attacks seriously threaten information privacy and security within the Internet of Things(IoT)ecosystem.Numerous phishing attack detection solutions have been developed for IoT;however,many of these are either not optimally efficient or lack the lightweight characteristics needed for practical application.This paper proposes and optimizes a lightweight deep-learning model for phishing attack detection.Our model employs a two-fold optimization approach:first,it utilizes the analysis of the variance(ANOVA)F-test to select the optimal features for phishing detection,and second,it applies the Cuckoo Search algorithm to tune the hyperparameters(learning rate and dropout rate)of the deep learning model.Additionally,our model is trained in only five epochs,making it more lightweight than other deep learning(DL)and machine learning(ML)models.The proposed model achieved a phishing detection accuracy of 91%,with a precision of 92%for the’normal’class and 91%for the‘attack’class.Moreover,the model’s recall and F1-score are 91%for both classes.We also compared our approach with traditional DL/ML models and past literature,demonstrating that our model is more accurate.This study enhances the security of sensitive information and IoT devices by offering a novel and effective approach to phishing detection.展开更多
In this article,we have presented the exact solutions of the Couette,Poiseuille and generalized Couette flows of an incompressible magnetohydrodynamic Jeffrey fluid between parallel plates through homogeneous porous m...In this article,we have presented the exact solutions of the Couette,Poiseuille and generalized Couette flows of an incompressible magnetohydrodynamic Jeffrey fluid between parallel plates through homogeneous porous medium.The effects of slip boundary conditions and heat transfer are considered.Viscous dissipation,radiation and Joule heating are also considered in the energy equation.The governing equations of the Jeffrey fluid flow are modeled in Cartesian coordinate system.Analytical solutions for the velocity and temperature the velocity and temperature profiles are studied and the results are presented through graphs.It Temperature behaves as a decreasing function due to the impact of Hartmann number,non-Newtonian parameter and slip parameter in all noted problems.展开更多
The geopolitical construct of the Indo-Pacific has evolved as one of the most important ones of the twenty-first century and more particularly of the last decade.While there is little or no consensus on where the Indo...The geopolitical construct of the Indo-Pacific has evolved as one of the most important ones of the twenty-first century and more particularly of the last decade.While there is little or no consensus on where the Indo-Pacific Region(IPR)begins or ends,it has inadvertently become a space where new convergences,competitions and alignments have emerged.These developments are intrinsically linked with the ascent of China as a global power,the retreat of the American strategic footprint and the emergence of a multi-polar world order.Within the larger Indo-Pacific construct,the Western Indian Ocean region is a space of considerable geopolitical and maritime interactions between states.The United Arab Emirates(UAE)and India are both countries of the Western Indian Ocean region while France is a resident power of the region owing to the presence of two of its overseas departments-Mayotte and Reunion—and its inter services bases in the UAE and Djibouti.The three countries have considerable experience in operationalising bilateral as well as trilateral initiatives.The lack of such initiatives in the Western Indian Ocean region could therefore offer the opportunity for UAE,India and France to come together in a trilateral arrangement to further their strategic interests and uphold the concept of a‘free and open Indo-Pacific’.The paper seeks to explore whether a trilateral partnership between the UAE,India and France could contribute to furthering their respective strategic autonomy in the Indo-Pacific Region.The paper will also endeavour to examine the conflicts and differences that could be expected and the possible areas of convergence.展开更多
文摘Mental health is a significant issue worldwide,and the utilization of technology to assist mental health has seen a growing trend.This aims to alleviate the workload on healthcare professionals and aid individuals.Numerous applications have been developed to support the challenges in intelligent healthcare systems.However,because mental health data is sensitive,privacy concerns have emerged.Federated learning has gotten some attention.This research reviews the studies on federated learning and mental health related to solving the issue of intelligent healthcare systems.It explores various dimensions of federated learning in mental health,such as datasets(their types and sources),applications categorized based on mental health symptoms,federated mental health frameworks,federated machine learning,federated deep learning,and the benefits of federated learning in mental health applications.This research conducts surveys to evaluate the current state of mental health applications,mainly focusing on the role of Federated Learning(FL)and related privacy and data security concerns.The survey provides valuable insights into how these applications are emerging and evolving,specifically emphasizing FL’s impact.
基金This project was funded by Deanship of Scientific Research(DSR)at King Abdulaziz University,Jeddah underGrant No.(IFPIP-1127-611-1443)the authors,therefore,acknowledge with thanks DSR technical and financial support.
文摘In the rapidly evolving landscape of today’s digital economy,Financial Technology(Fintech)emerges as a trans-formative force,propelled by the dynamic synergy between Artificial Intelligence(AI)and Algorithmic Trading.Our in-depth investigation delves into the intricacies of merging Multi-Agent Reinforcement Learning(MARL)and Explainable AI(XAI)within Fintech,aiming to refine Algorithmic Trading strategies.Through meticulous examination,we uncover the nuanced interactions of AI-driven agents as they collaborate and compete within the financial realm,employing sophisticated deep learning techniques to enhance the clarity and adaptability of trading decisions.These AI-infused Fintech platforms harness collective intelligence to unearth trends,mitigate risks,and provide tailored financial guidance,fostering benefits for individuals and enterprises navigating the digital landscape.Our research holds the potential to revolutionize finance,opening doors to fresh avenues for investment and asset management in the digital age.Additionally,our statistical evaluation yields encouraging results,with metrics such as Accuracy=0.85,Precision=0.88,and F1 Score=0.86,reaffirming the efficacy of our approach within Fintech and emphasizing its reliability and innovative prowess.
基金Scientific Research Deanship has funded this project at the University of Ha’il–Saudi Arabia Ha’il–Saudi Arabia through project number RG-21104.
文摘In today’s world,image processing techniques play a crucial role in the prognosis and diagnosis of various diseases due to the development of several precise and accurate methods for medical images.Automated analysis of medical images is essential for doctors,as manual investigation often leads to inter-observer variability.This research aims to enhance healthcare by enabling the early detection of diabetic retinopathy through an efficient image processing framework.The proposed hybridized method combines Modified Inertia Weight Particle Swarm Optimization(MIWPSO)and Fuzzy C-Means clustering(FCM)algorithms.Traditional FCM does not incorporate spatial neighborhood features,making it highly sensitive to noise,which significantly affects segmentation output.Our method incorporates a modified FCM that includes spatial functions in the fuzzy membership matrix to eliminate noise.The results demonstrate that the proposed FCM-MIWPSO method achieves highly precise and accurate medical image segmentation.Furthermore,segmented images are classified as benign or malignant using the Decision Tree-Based Temporal Association Rule(DT-TAR)Algorithm.Comparative analysis with existing state-of-the-art models indicates that the proposed FCM-MIWPSO segmentation technique achieves a remarkable accuracy of 98.42%on the dataset,highlighting its significant impact on improving diagnostic capabilities in medical imaging.
基金Princess Nourah bint Abdulrahman University Researchers Supporting Project number(PNURSP2024R 343)PrincessNourah bint Abdulrahman University,Riyadh,Saudi ArabiaDeanship of Scientific Research at Northern Border University,Arar,Kingdom of Saudi Arabia,for funding this researchwork through the project number“NBU-FFR-2024-1092-02”.
文摘Phishing attacks present a persistent and evolving threat in the cybersecurity land-scape,necessitating the development of more sophisticated detection methods.Traditional machine learning approaches to phishing detection have relied heavily on feature engineering and have often fallen short in adapting to the dynamically changing patterns of phishingUniformResource Locator(URLs).Addressing these challenge,we introduce a framework that integrates the sequential data processing strengths of a Recurrent Neural Network(RNN)with the hyperparameter optimization prowess of theWhale Optimization Algorithm(WOA).Ourmodel capitalizes on an extensive Kaggle dataset,featuring over 11,000 URLs,each delineated by 30 attributes.The WOA’s hyperparameter optimization enhances the RNN’s performance,evidenced by a meticulous validation process.The results,encapsulated in precision,recall,and F1-score metrics,surpass baseline models,achieving an overall accuracy of 92%.This study not only demonstrates the RNN’s proficiency in learning complex patterns but also underscores the WOA’s effectiveness in refining machine learning models for the critical task of phishing detection.
文摘The visions of Industry 4.0 and 5.0 have reinforced the industrial environment.They have also made artificial intelligence incorporated as a major facilitator.Diagnosing machine faults has become a solid foundation for automatically recognizing machine failure,and thus timely maintenance can ensure safe operations.Transfer learning is a promising solution that can enhance the machine fault diagnosis model by borrowing pre-trained knowledge from the source model and applying it to the target model,which typically involves two datasets.In response to the availability of multiple datasets,this paper proposes using selective and adaptive incremental transfer learning(SA-ITL),which fuses three algorithms,namely,the hybrid selective algorithm,the transferability enhancement algorithm,and the incremental transfer learning algorithm.It is a selective algorithm that enables selecting and ordering appropriate datasets for transfer learning and selecting useful knowledge to avoid negative transfer.The algorithm also adaptively adjusts the portion of training data to balance the learning rate and training time.The proposed algorithm is evaluated and analyzed using ten benchmark datasets.Compared with other algorithms from existing works,SA-ITL improves the accuracy of all datasets.Ablation studies present the accuracy enhancements of the SA-ITL,including the hybrid selective algorithm(1.22%-3.82%),transferability enhancement algorithm(1.91%-4.15%),and incremental transfer learning algorithm(0.605%-2.68%).These also show the benefits of enhancing the target model with heterogeneous image datasets that widen the range of domain selection between source and target domains.
基金Princess Nourah bint Abdulrahman University Researchers Supporting Project number(PNURSP2024R 343),Princess Nourah bint Abdulrahman University,Riyadh,Saudi Arabia.The authors extend their appreciation to the Deanship of Scientific Research at Northern Border University,Arar,KSA for funding this research work through the Project Number“NBU-FFR-2024-1092-04”.
文摘This study introduces a long-short-term memory(LSTM)-based neural network model developed for detecting anomaly events in care-independent smart homes,focusing on the critical application of elderly fall detection.It balances the dataset using the Synthetic Minority Over-sampling Technique(SMOTE),effectively neutralizing bias to address the challenge of unbalanced datasets prevalent in time-series classification tasks.The proposed LSTM model is trained on the enriched dataset,capturing the temporal dependencies essential for anomaly recognition.The model demonstrated a significant improvement in anomaly detection,with an accuracy of 84%.The results,detailed in the comprehensive classification and confusion matrices,showed the model’s proficiency in distinguishing between normal activities and falls.This study contributes to the advancement of smart home safety,presenting a robust framework for real-time anomaly monitoring.
基金supported by a grant from Hong Kong Metropolitan University (RD/2023/2.3)supported Princess Nourah bint Abdulrah-man University Researchers Supporting Project number (PNURSP2024R 343)+1 种基金Princess Nourah bint Abdulrahman University,Riyadh,Saudi Arabiathe Deanship of Scientific Research at Northern Border University,Arar,Kingdom of Saudi Arabia for funding this research work through the project number“NBU-FFR-2024-1092-09”.
文摘Phishing attacks present a serious threat to enterprise systems,requiring advanced detection techniques to protect sensitive data.This study introduces a phishing email detection framework that combines Bidirectional Encoder Representations from Transformers(BERT)for feature extraction and CNN for classification,specifically designed for enterprise information systems.BERT’s linguistic capabilities are used to extract key features from email content,which are then processed by a convolutional neural network(CNN)model optimized for phishing detection.Achieving an accuracy of 97.5%,our proposed model demonstrates strong proficiency in identifying phishing emails.This approach represents a significant advancement in applying deep learning to cybersecurity,setting a new benchmark for email security by effectively addressing the increasing complexity of phishing attacks.
文摘This paper discusses the bathymetric mapping technologies by means of satellite remote sensing (RS) with special emphasis on bathymetry derivation models, methods, accuracies, advantages, limitations, and comparisons. Traditionally, bathymetry can be mapped using echo sounding sounders. However, this method is constrained by its inefficiency in shallow waters and very high operating logistic costs. In comparison, RS technologies present efficient and cost-effective means of mapping bathymetry over remote and broad areas. RS of bathymetry can be categorised into two broad classes: active RS and passive RS. Active RS methods are based on active satellite sensors, which emit artificial radiation to study the earth surface or atmospheric features, e.g. light detection and ranging (LIDAR), polarimetric synthetic aperture radar (SAR), altimeters, etc. Passive RS methods are based on passive satellite sensors, which detect sunlight (natural source of light) radiation reflected from the earth and thermal radiation in the visible and infrared portion of the electromagnetic spectrum, e.g. multispectral or optical satellite sensors. Bathymetric methods can also be categorised as imaging methods and non-imaging methods. The non-imaging method is elucidated by laser scanners or LIDAR, which measures the distance between the sensor and the water surface or the ocean floor using a single wave pulse or double waves. On the other hand, imaging methods approximate the water depth based on the pixel values or digital numbers (DN) (representing reflectance or backscatter) of an image. Imaging methods make use of the visible and/or near infrared (NIR) and microwave radiation. Imaging methods are implemented with either analytical modelling or empirical modelling, or by a blend of both. This paper presents the development of bathymetric mapping technology by using RS, and discusses the state-of-the-art bathymetry derivation methods/algorithms and their implications in practical applications.
文摘The cryosphere is the frozen part of the Earth’s system. Snow and ice are the main constituents of the cryosphere and may be found in different states, such as snow, freshwater ice, sea ice, perma-frost, and continental ice masses in the form of glaciers and ice sheets. The present review mainly deals with state-of-the-art applications of synthetic aperture radar (SAR) with a special emphasize on cryospheric information extraction. SAR is the most important active microwave remote sensing (RS) instrument for ice monitoring, which provides high-resolution images of the Earth’s surface. SAR is an ideal sensor in RS technology, which works in all-weather and day and night conditions to provide useful unprecedented information, especially in the cryospheric regions which are almost inaccessible areas on Earth. This paper addresses the technological evolution of SAR and its applications in studying the various components of the cryosphere. The arrival of SAR radically changed the capabilities of information extraction related to ice type, new ice formation, and ice thickness. SAR applications can be divided into two broad classes-polarimetric applications and interferometric applications. Polarimetric SAR has been effectively used for mapping calving fronts, crevasses, surface structures, sea ice, detection of icebergs, etc. The paper also summarizes both the operational and climate change research by using SAR for sea ice parameter detection. Digital elevation model (DEM) generation and glacier velocity mapping are the two most important applications used in cryosphere using SAR interferometry or interferometric SAR (InSAR). Space-borne InSAR techniques for measuring ice flow velocity and topography have developed rapidly over the last decade. InSAR is capable of measuring ice motion that has radically changed the science of glaciers and ice sheets. Measurement of temperate glacier velocities and surface characteristics by using airborne and space-borne interferometric satellite images have been the significant application in glaciology and cryospheric studies. Space-borne InSAR has contributed to major evolution in many research areas of glaciological study by measuring ice-stream flow velocity, improving understanding of ice-shelf processes, yielding velocity for flux-gate based mass-balance assessment, and mapping flow of mountain glaciers. The present review summarizes the salient development of SAR applications in cryosphere and glaciology.
文摘Image classification is one of the most basic operations of digital image processing. The present review focuses on the strengths and weaknesses of traditional pixel-based classification (PBC) and the advances of object-oriented classification (OOC) algorithms employed for the extraction of information from remotely sensed satellite imageries. The state-of-the-art classifiers are reviewed for their potential usage in urban remote sensing (RS), with a special focus on cryospheric applications. Generally, classifiers for information extraction can be divided into three catalogues: 1) based on the type of learning (supervised and unsupervised), 2) based on assumptions on data distribution (parametric and non-parametric) and, 3) based on the number of outputs for each spatial unit (hard and soft). The classification methods are broadly based on the PBC or the OOC approaches. Both methods have their own advantages and disadvantages depending upon their area of application and most importantly the RS datasets that are used for information extraction. Classification algorithms are variedly explored in the cryosphere for extracting geospatial information for various logistic and scientific applications, such as to understand temporal changes in geographical phenomena. Information extraction in cryospheric regions is challenging, accounting to the very similar and conflicting spectral responses of the features present in the region. The spectral responses of snow and ice, water, and blue ice, rock and shadow are a big challenge for the pixel-based classifiers. Thus, in such cases, OOC approach is superior for extracting information from the cryospheric regions. Also, ensemble classifiers and customized spectral index ratios (CSIR) proved extremely good approaches for information extraction from cryospheric regions. The present review would be beneficial for developing new classifiers in the cryospheric environment for better understanding of spatial-temporal changes over long time scales.
文摘Epigenetic regulations are heritable changes in gene expression that occur in the absence of alterations in DNA sequences. Various epigenetic mechanisms include histone modifications and DNA methylations. In this review, we examine methods to study DNA methylations and their contribution to degenerative diseases by mediating the complex gene-by-environment interactions. Such epigenetic modifications despite being heritable and stably maintained are also potentially reversible and there is scope for the development of epigenetic therapies for this disease.
文摘Background Diabetes mellitus is an obtrusive universal health emergency in developed and developing countries,including India.With the exponential rise of epidemiological conditions,the costs of treating and managing diabetes are on an upsurge.This study aimed to estimate the cost of diabetes and determine the determinants of the total cost among diabetic patients.Methods This cross-sectional study was executed in the northern state of Punjab,India.It involves the multi-stage area sampling technique and data was collected through a self-structured questionnaire adapted following the“WHO STEPS Surveillance”manual.Mann-Whitney U and Kruskal-Wallis tests were performed to compare the cost differences in socio-demographic variables.Lastly,multiple linear regression was conducted to determine and evaluate the association of the dependent variable with numerous influential determinants.Results The urban respondents’average direct and indirect costs are higher than rural respondents.Age manifests very eccentric results;the highest mean direct outpatient care expenditure of₹52,104 was incurred by the respondents below 20 years of age.Gender,complications,income,history of diabetes and work status were statistically significant determinants of the total cost.Study reports a rapid increase in the median annual direct and indirect cost from₹15,460 and₹3572 in 1999 to₹34,100 and₹4200 in 2021.Conclusions The present study highlights that the economic jeopardy of diabetes can be managed by educating people about diabetes and its associated risk factors.The economic burden of diabetes could be restrained by formulating new health policies and promoting the use of generic medicines.The result of the study directs that expenditure on outpatient care is to be reimbursed under the‘Ayushman Bharat-Sarbat Sehat Bima Yojana’.
文摘Healthy forest is the vital resource to regulate climate at a regional and global level. Forest fire has been regarded as one of the major reasons for the loss of forest and degradation of the environment. Global warming is increasing its intensity at an alarming rate. Real-time fire detection is a necessity to avoid large scale losses. Remote sensing is a quick and cheap technique for detecting and monitoring forest fires on a large scale. Advance Very Radiometer Resolution (AVHRR) has been used already for a long period for fire detection. The use of Moderate Resolution Imaging Radio Spectrometer (MODIS) for fire detection has recently preceded AVHRR and a large number of fire products are being developed. MODIS based forest fire detection and monitoring system can solve the problem of real-time forest fire monitoring. The system facilitates data acquisition, processing, reporting and feedback on the fire location information in an automated manner. It provides location information at 1 × 1 kilometer resolution on the active fires which are present during the satellite overpass twice a day. The users are provided with the information on SMS alert with fire location details, email notification, and online visualization of fire locations on website automatically. The whole processes are automated and provide better accuracy for fire detection.
文摘In today’s information technology(IT)world,the multi-hop wireless sensor networks(MHWSNs)are considered the building block for the Internet of Things(IoT)enabled communication systems for controlling everyday tasks of organizations and industry to provide quality of service(QoS)in a stipulated time slot to end-user over the Internet.Smart city(SC)is an example of one such application which can automate a group of civil services like automatic control of traffic lights,weather prediction,surveillance,etc.,in our daily life.These IoT-based networks with multi-hop communication and multiple sink nodes provide efficient communication in terms of performance parameters such as throughput,energy efficiency,and end-to-end delay,wherein low latency is considered a challenging issue in next-generation networks(NGN).This paper introduces a single and parallels stable server queuing model with amulti-class of packets and native and coded packet flowto illustrate the simple chain topology and complexmultiway relay(MWR)node with specific neighbor topology.Further,for improving data transmission capacity inMHWSNs,an analytical framework for packet transmission using network coding at the MWR node in the network layer with opportunistic listening is performed by considering bi-directional network flow at the MWR node.Finally,the accuracy of the proposed multi-server multi-class queuing model is evaluated with and without network coding at the network layer by transmitting data packets.The results of the proposed analytical framework are validated and proved effective by comparing these analytical results to simulation results.
文摘Nanoparticles are extensively used for various applications in science, engineering and medicine. Synthesis of nanoparticles with high purity is essential to utilize the same in different fields of science and technology. In the present study, liquid chromatography is utilized to purify the nanoparticles. Predominantly, gold nanoparticles were synthesized from gold auric cholide and preserved in phosphate or citrate buffer. A method to purify gold nanoparticles is essential because of the possible interference from gold auric chloride and other impurities in buffer. Herein, a method has been developed using high performance liquid chromatography to purify gold nanoparticles with 100 nm in size from gold auric chloride and residues. UV-Vis spectroscopy was also done to ascertain the purity of the nanoparticles.
文摘Banking institutions all over the world face significant challenge due to the cumulative loss due to defaults of borrowers of different types of loans. The cumulative default loss built up over a period of time could wipe out the capital cushion of the banks. The aim of this paper is to help the banks to forecast the cumulative loss and its volatility. Defaulting amounts are random and defaults occur at random instants of time. A non Markovian time dependent random point process is used to model the cumulative loss. The expected loss and volatility are evaluated analytically. They are functions of probability of default, probability of loss amount, recovery rate and time. Probability of default being the important contributor is evaluated using Hidden Markov modeling. Numerical results obtained validate the model.
文摘Correction to:Global Health Research and Policy(2023)8:11 https://doi.org/10.1186/s41256-023-00293-3 Following publication of the original article[1],the authors reported that the fifth sentence in the last paragraph of the Background section needed to be updated.The original sentence was:“Additionally,most prior articles assessing cost-of-diabetes(COD)used secondary data acquired from‘hospital databases’or‘national health surveys’,which possess evident restraints regarding reliability[10-12].”
基金supported by Princess Nourah bint Abdulrahman University Researchers Supporting Project number(PNURSP2024R 343),Princess Nourah bint Abdulrahman University,Riyadh,Saudi ArabiaThe authors extend their appreciation to the Deanship of Scientific Research at Northern Border University,Arar,Saudi Arabia for funding this research work through the Project number“NBU-FFR-2024-1092-09”.
文摘Phishing attacks seriously threaten information privacy and security within the Internet of Things(IoT)ecosystem.Numerous phishing attack detection solutions have been developed for IoT;however,many of these are either not optimally efficient or lack the lightweight characteristics needed for practical application.This paper proposes and optimizes a lightweight deep-learning model for phishing attack detection.Our model employs a two-fold optimization approach:first,it utilizes the analysis of the variance(ANOVA)F-test to select the optimal features for phishing detection,and second,it applies the Cuckoo Search algorithm to tune the hyperparameters(learning rate and dropout rate)of the deep learning model.Additionally,our model is trained in only five epochs,making it more lightweight than other deep learning(DL)and machine learning(ML)models.The proposed model achieved a phishing detection accuracy of 91%,with a precision of 92%for the’normal’class and 91%for the‘attack’class.Moreover,the model’s recall and F1-score are 91%for both classes.We also compared our approach with traditional DL/ML models and past literature,demonstrating that our model is more accurate.This study enhances the security of sensitive information and IoT devices by offering a novel and effective approach to phishing detection.
文摘In this article,we have presented the exact solutions of the Couette,Poiseuille and generalized Couette flows of an incompressible magnetohydrodynamic Jeffrey fluid between parallel plates through homogeneous porous medium.The effects of slip boundary conditions and heat transfer are considered.Viscous dissipation,radiation and Joule heating are also considered in the energy equation.The governing equations of the Jeffrey fluid flow are modeled in Cartesian coordinate system.Analytical solutions for the velocity and temperature the velocity and temperature profiles are studied and the results are presented through graphs.It Temperature behaves as a decreasing function due to the impact of Hartmann number,non-Newtonian parameter and slip parameter in all noted problems.
文摘The geopolitical construct of the Indo-Pacific has evolved as one of the most important ones of the twenty-first century and more particularly of the last decade.While there is little or no consensus on where the Indo-Pacific Region(IPR)begins or ends,it has inadvertently become a space where new convergences,competitions and alignments have emerged.These developments are intrinsically linked with the ascent of China as a global power,the retreat of the American strategic footprint and the emergence of a multi-polar world order.Within the larger Indo-Pacific construct,the Western Indian Ocean region is a space of considerable geopolitical and maritime interactions between states.The United Arab Emirates(UAE)and India are both countries of the Western Indian Ocean region while France is a resident power of the region owing to the presence of two of its overseas departments-Mayotte and Reunion—and its inter services bases in the UAE and Djibouti.The three countries have considerable experience in operationalising bilateral as well as trilateral initiatives.The lack of such initiatives in the Western Indian Ocean region could therefore offer the opportunity for UAE,India and France to come together in a trilateral arrangement to further their strategic interests and uphold the concept of a‘free and open Indo-Pacific’.The paper seeks to explore whether a trilateral partnership between the UAE,India and France could contribute to furthering their respective strategic autonomy in the Indo-Pacific Region.The paper will also endeavour to examine the conflicts and differences that could be expected and the possible areas of convergence.