Recently, researchers have shown increasing interest in combining more than one programming model in systems running on high-performance computing (HPC) systems to achieve exascale performance by applying parallelism at multiple levels. Combining different programming paradigms, such as the Message Passing Interface (MPI), Open Multi-Processing (OpenMP), and Open Accelerators (OpenACC), can increase computation speed and improve performance. When multiple models are integrated, however, the probability of runtime errors increases, and they become difficult to detect, especially in the absence of testing techniques designed for them. Numerous studies have been conducted to identify such errors, but no technique exists for detecting errors in three-level programming models. Despite the growing body of research that integrates the three programming models MPI, OpenMP, and OpenACC, no testing technique has been developed to detect the runtime errors, such as deadlocks and race conditions, that can arise from this integration. This paper therefore begins by defining and explaining the runtime errors that result from integrating the three programming models and that compilers cannot detect. For the first time, it presents a classification of the runtime errors that can result from the integration of the three models. The paper also proposes a parallel hybrid testing technique for detecting runtime errors in systems built in C++ that use the triple programming models MPI, OpenMP, and OpenACC. The hybrid technique combines static and dynamic testing, since some errors can be detected statically whereas others can only be detected dynamically; by combining the two distinct techniques, it can detect more errors. The static part detects a wide range of error types in less time, while the potential errors whose occurrence depends on the operating environment are left to the dynamic part, which completes the validation. (CMC, 2023, vol. 74, no. 2)
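As a toy illustration of the dynamic side of such hybrid testing, a lockset-style data-race check (in the spirit of the classic Eraser algorithm, not the paper's own tool) can flag shared variables that are accessed by multiple threads without a common lock. The trace format and names below are our own assumptions, and the sketch ignores the read/write distinction of full lockset algorithms:

```python
# Minimal lockset-style race check over a recorded access trace.
# Each event: (thread_id, variable, set_of_locks_held). A variable whose
# candidate lockset becomes empty after accesses from >1 thread is flagged.
def find_races(trace):
    locksets = {}   # variable -> intersection of locks held at every access
    threads = {}    # variable -> set of accessing threads
    races = set()
    for tid, var, held in trace:
        threads.setdefault(var, set()).add(tid)
        if var not in locksets:
            locksets[var] = set(held)
        else:
            locksets[var] &= set(held)
        if len(threads[var]) > 1 and not locksets[var]:
            races.add(var)
    return races

trace = [
    (0, "x", {"L"}), (1, "x", {"L"}),   # consistently guarded by lock L
    (0, "y", {"L"}), (1, "y", set()),   # y touched without a common lock
]
print(find_races(trace))  # -> {'y'}
```

A static checker would instead inspect the source for suspect patterns before execution; the dynamic pass above only sees errors that manifest in the recorded run, which is why the paper combines both.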
To guarantee a unified response to disasters, humanitarian organizations work together via the United Nations Office for the Coordination of Humanitarian Affairs (OCHA). Although OCHA has made great strides to improve its information management and increase the availability of accurate, real-time data for disaster and humanitarian response teams, significant gaps persist. There are inefficiencies in the emergency management of data at every stage of its lifecycle: collection, processing, analysis, distribution, storage, and retrieval. Disaster risk reduction and disaster risk management are the two main tenets of the United Nations' worldwide plan for disaster management. Information systems are crucial because of the roles they play in capturing, processing, and transmitting data, yet the management of information is seldom discussed in published works. The goal of this study is to employ qualitative research methods to provide insight by facilitating an expanded comprehension of relevant contexts, phenomena, and individual experiences. Humanitarian workers and OCHA staff will take part in the research; study subjects will be chosen using a random selection procedure, and data will be compiled through online surveys with both closed- and open-ended questions. OCHA offers a structure for the handling of information through which all humanitarian actors may contribute to the overall response. This research will enable OCHA to better gather, process, analyze, disseminate, store, and retrieve data in the event of a catastrophe or humanitarian crisis.
Olive trees are susceptible to a variety of diseases that can cause significant crop damage and economic losses, so early detection of these diseases is essential for effective management. We propose a novel transformed-wavelet, feature-fused, pre-trained deep learning model for detecting olive leaf diseases. The proposed model combines wavelet transforms with pre-trained deep learning models to extract discriminative features from olive leaf images. The model has four main phases: preprocessing using data augmentation, three-level wavelet transformation, learning using pre-trained deep learning models, and a fused deep learning model. In the preprocessing phase, the image dataset is augmented using techniques such as resizing, rescaling, flipping, rotation, zooming, and contrast adjustment. In the wavelet transformation phase, the augmented images are decomposed into three frequency levels. Three pre-trained deep learning models, EfficientNet-B7, DenseNet-201, and ResNet-152-V2, are used in the learning phase; they are trained on the approximation images of the third-level sub-band of the wavelet transform. In the fusion phase, the fused model consists of a merge layer, three dense layers, and two dropout layers. The proposed model was evaluated on a dataset of healthy and infected olive leaf images and achieved an accuracy of 99.72% in diagnosing olive leaf diseases, exceeding the accuracy of other methods reported in the literature. This finding suggests that the proposed method is a promising tool for the early detection of olive leaf diseases.
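The decomposition step that feeds the CNNs, taking the third-level approximation (LL) sub-band, can be sketched with a Haar wavelet. This is a minimal stand-in, assuming a grayscale image as a nested list; the paper's pipeline would presumably use a wavelet library and also keep detail sub-bands:

```python
def haar_approx(img):
    """One Haar level: average 2x2 blocks -> approximation (LL) sub-band."""
    h, w = len(img) // 2, len(img[0]) // 2
    return [[(img[2*i][2*j] + img[2*i][2*j+1] +
              img[2*i+1][2*j] + img[2*i+1][2*j+1]) / 4.0
             for j in range(w)] for i in range(h)]

def third_level_approx(img):
    for _ in range(3):          # three decomposition levels, as in the paper
        img = haar_approx(img)
    return img

img = [[float(r * 8 + c) for c in range(8)] for r in range(8)]  # 8x8 ramp
print(third_level_approx(img))  # 8x8 -> 4x4 -> 2x2 -> 1x1, value is the mean
```

Each level halves the spatial resolution while retaining the low-frequency content, which is what the pre-trained models then learn from.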
Recent developments in Computer Vision have presented novel opportunities to tackle complex healthcare issues, particularly in the field of lung disease diagnosis. One promising avenue involves the use of chest X-rays, which are commonly utilized in radiology. To fully exploit their potential, researchers have suggested using deep learning methods to construct computer-aided diagnostic systems. However, constructing and compressing these systems presents a significant challenge, as it relies heavily on the expertise of data scientists. To tackle this issue, we propose an automated approach that utilizes an evolutionary algorithm (EA) to optimize the design and compression of a convolutional neural network (CNN) for X-ray image classification. Our approach accurately classifies radiography images and detects potential chest abnormalities and infections, including COVID-19. Furthermore, it incorporates transfer learning: a CNN model pre-trained on a vast dataset of chest X-ray images is fine-tuned for the specific task of detecting COVID-19. This can reduce the amount of labeled data required for the task and enhance the overall performance of the model. We have validated our method in a series of experiments against state-of-the-art architectures.
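The evolutionary loop behind such architecture search can be sketched as a toy (mu+lambda)-style algorithm over a two-gene "architecture" (block count, filter width). The fitness function here is a purely illustrative stand-in; the paper's fitness would be something like validation accuracy minus a compression penalty:

```python
import random

random.seed(0)

def fitness(genome):
    # Stand-in objective: prefer ~4 blocks and ~32 filters (illustrative only;
    # a real EA would train/evaluate the candidate CNN here).
    blocks, filters = genome
    return -((blocks - 4) ** 2 + ((filters - 32) / 8) ** 2)

def mutate(genome):
    blocks, filters = genome
    return (max(1, blocks + random.choice([-1, 0, 1])),
            max(8, filters + random.choice([-8, 0, 8])))

pop = [(random.randint(1, 8), random.choice([8, 16, 64, 128])) for _ in range(10)]
for _ in range(50):                       # generations
    pop.sort(key=fitness, reverse=True)
    parents = pop[:5]                     # truncation selection (elitism)
    pop = parents + [mutate(random.choice(parents)) for _ in range(5)]
best = max(pop, key=fitness)
print(best)
```

Because the top half of the population is carried over unchanged, the best fitness never decreases, which is the property that makes such searches converge without gradients.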
The Internet of Things (IoT) is a smart networking infrastructure of physical devices, i.e., things, that are embedded with sensors, actuators, software, and other technologies to connect and share data with the respective server module. Although IoT deployments are cornerstones of many application domains, device authenticity, i.e., of both servers and ordinary devices, is the most crucial issue and must be resolved on a priority basis. Various field-proven methodologies have therefore been presented to streamline the verification of communicating devices; however, to our knowledge, location-aware authentication has not been reported, even though location is a crucial metric, especially in scenarios where devices are mobile. This paper presents a lightweight, location-aware device-to-server authentication technique in which a device's membership with the nearest server is subject to its location information along with other measures. Initially, the Media Access Control (MAC) address and the Advanced Encryption Standard (AES), along with a 128-bit secret shared key λ_(i), are utilized by a Trusted Authority (TA) to generate MaskIDs, which are used instead of the original ID for every device, i.e., server and member, and are shared in the offline phase. Secondly, the TA shares a list of authentic devices, i.e., servers S_(j) and members C_(i), with every device in the IoT for the onward verification process, which must be executed before the actual communication begins. Additionally, every device must be located within the coverage area of a server, and this location information is used in the authentication process. A thorough analytical evaluation was carried out to check the susceptibility of the proposed and existing authentication approaches to well-known intruder attacks, i.e., man-in-the-middle, masquerading, and device and server impersonation, especially in the IoT domain. Moreover, the proposed and existing state-of-the-art approaches were simulated in a real IoT environment to verify their performance in terms of processing, communication, and storage overheads. The results verified the superiority of the proposed scheme over existing state-of-the-art approaches, particularly in terms of communication, storage, and processing costs.
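The MaskID idea, deriving a stable pseudonym from a device's MAC address under a secret key, can be sketched as follows. The abstract does not give the exact AES construction, so this stdlib-only sketch substitutes HMAC-SHA256 truncated to 128 bits as the keyed pseudonym function; that substitution, and the function names, are assumptions, not the paper's scheme:

```python
import hmac, hashlib, secrets

def make_mask_id(mac: str, shared_key: bytes) -> str:
    """Derive a stable 128-bit pseudonym (MaskID) from a device MAC address.
    HMAC-SHA256 stands in here for the paper's AES-based construction."""
    tag = hmac.new(shared_key, mac.encode(), hashlib.sha256).digest()
    return tag[:16].hex()                     # 128 bits = 32 hex characters

key = secrets.token_bytes(16)                 # TA's 128-bit secret, lambda_i
dev = make_mask_id("aa:bb:cc:dd:ee:01", key)
srv = make_mask_id("aa:bb:cc:dd:ee:ff", key)
print(dev != srv, len(dev) == 32)             # -> True True
```

The key property being illustrated is that eavesdroppers see only the pseudonym, while the TA (which holds the key) can regenerate and verify it deterministically.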
As the extensive use of cloud computing raises questions about the security of personal data stored there, cryptography is increasingly used as a security tool to protect data confidentiality and privacy in the cloud environment. A hypervisor is virtualization software used in cloud hosting to divide and allocate resources across various pieces of hardware. The choice of hypervisor can significantly impact the performance of cryptographic operations in the cloud. An important issue that must be carefully examined is that no hypervisor is completely superior in terms of performance; each should be evaluated against specific needs. The main objective of this study is to provide accurate results comparing the performance of Hyper-V and the Kernel-based Virtual Machine (KVM) when implementing different cryptographic algorithms, to guide cloud service providers and end users in choosing the most suitable hypervisor for their cryptographic needs. The study evaluated the efficiency of the two hypervisors in implementing six cryptographic algorithms: Rivest-Shamir-Adleman (RSA), the Advanced Encryption Standard (AES), the Triple Data Encryption Standard (TripleDES), Carlisle Adams and Stafford Tavares (CAST-128), Blowfish, and Twofish. The findings show that KVM outperforms Hyper-V, with 12.2% less Central Processing Unit (CPU) use and 12.95% less time overall for encryption and decryption operations across various file sizes. These findings emphasize how crucial it is to pick a hypervisor appropriate for cryptographic needs in a cloud environment, which could assist both cloud service providers and end users. Future research may focus on how various hypervisors perform when handling other cryptographic workloads.
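The measurement logic behind such a comparison is a simple timing harness run identically inside each guest. A sketch of that harness, using SHA-256 hashing as a stand-in CPU-bound workload (stdlib Python has no AES), with illustrative numbers rather than the study's measurements:

```python
import hashlib, time

def time_workload(data: bytes, rounds: int = 20) -> float:
    """Median wall-clock seconds for one pass of a CPU-bound crypto workload."""
    samples = []
    for _ in range(rounds):
        t0 = time.perf_counter()
        hashlib.sha256(data).digest()         # stand-in for AES encryption
        samples.append(time.perf_counter() - t0)
    samples.sort()
    return samples[len(samples) // 2]         # median resists scheduler noise

def percent_faster(t_a: float, t_b: float) -> float:
    """How much faster B is than A, as a percentage of A (cf. the 12.95% figure)."""
    return 100.0 * (t_a - t_b) / t_a

data = b"x" * 1_000_000
t = time_workload(data)
print(round(percent_faster(2.0, 1.741), 2))   # -> 12.95 (illustrative inputs)
```

Taking the median over repeated rounds, rather than a single timing, matters inside virtual machines, where scheduling jitter from the hypervisor can easily dominate a one-shot measurement.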
Chronic kidney disease (CKD) is a major health concern today, requiring early and accurate diagnosis. Machine learning has emerged as a powerful tool for disease detection, and medical professionals are increasingly using ML classifier algorithms to identify CKD early. This study explores the application of advanced machine learning techniques to a CKD dataset obtained from the University of California, Irvine (UCI) Machine Learning Repository. The research introduces TrioNet, an ensemble model combining extreme gradient boosting, random forest, and extra-trees classifiers, which excels at providing highly accurate predictions for CKD. Furthermore, a K-nearest-neighbor (KNN) imputer is utilized to deal with missing values, while synthetic minority oversampling (SMOTE) is used for the class-imbalance problem. To ascertain the efficacy of the proposed model, a comprehensive comparative analysis is conducted against various machine learning models. The proposed TrioNet with the KNN imputer and SMOTE outperformed the other models, reaching 98.97% accuracy for detecting CKD. This in-depth analysis demonstrates the model's capabilities and underscores its potential as a valuable tool in the diagnosis of CKD.
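The KNN-imputation idea, filling each missing value from the rows most similar on the features both rows actually observe, can be sketched from scratch. This is the general idea only, not scikit-learn's exact implementation; the distance (mean squared difference over co-observed features) is our own simplification:

```python
def knn_impute(rows, k=2):
    """Fill None entries with the mean of that column over the k nearest
    donor rows, using distance on co-observed features only."""
    def dist(a, b):
        pairs = [(x, y) for x, y in zip(a, b) if x is not None and y is not None]
        return sum((x - y) ** 2 for x, y in pairs) / max(len(pairs), 1)

    filled = [list(r) for r in rows]
    for i, row in enumerate(rows):
        for j, v in enumerate(row):
            if v is None:
                donors = [r for r in rows if r is not row and r[j] is not None]
                donors.sort(key=lambda r: dist(row, r))
                vals = [r[j] for r in donors[:k]]
                filled[i][j] = sum(vals) / len(vals)
    return filled

data = [[1.0, 2.0], [1.2, 2.2], [9.0, None], [9.1, 10.0]]
print(knn_impute(data, k=1))  # the gap is filled from the nearest row, [9.1, 10.0]
```

Unlike mean imputation, this respects local structure: the missing value in the third row is taken from its close neighbour rather than from the global column average.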
Aim: This study aims to establish an artificial intelligence model, ThyroidNet, to accurately diagnose thyroid nodules using deep learning techniques. Methods: A novel method, ThyroidNet, is introduced and evaluated for the deep-learning-based localization and classification of thyroid nodules. First, we propose the multitask TransUnet, which combines the TransUnet encoder and decoder with multitask learning. Second, we propose the DualLoss function, tailored to the thyroid nodule localization and classification tasks; it balances the learning of the two tasks to help improve the model's generalization ability. Third, we introduce strategies for augmenting the data. Finally, we present the resulting deep learning model, ThyroidNet, for accurately detecting thyroid nodules. Results: ThyroidNet was evaluated on private datasets and compared with existing methods, including U-Net and TransUnet. Experimental results show that ThyroidNet outperformed these methods in localizing and classifying thyroid nodules, improving accuracy by 3.9% and 1.5%, respectively. Conclusion: ThyroidNet significantly improves the clinical diagnosis of thyroid nodules and supports medical image analysis tasks. Future research directions include optimizing the model structure, expanding the dataset, reducing computational complexity and memory requirements, and exploring additional applications of ThyroidNet in medical image analysis.
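The abstract does not give DualLoss's exact form, only that it balances the localization and classification tasks. The generic shape of such a multitask loss is a weighted combination; the fixed weight alpha below is our own assumption for illustration:

```python
def dual_loss(loc_loss: float, cls_loss: float, alpha: float = 0.5) -> float:
    """Weighted sum of localization and classification losses.
    alpha trades off the two tasks (the paper's exact balancing rule is not
    given in the abstract; a fixed weight is assumed here)."""
    return alpha * loc_loss + (1.0 - alpha) * cls_loss

print(dual_loss(0.8, 0.2))              # equal weighting of the two tasks
print(dual_loss(0.8, 0.2, alpha=0.25))  # leans on the classification task
```

In practice such weights are either tuned as hyperparameters or learned (e.g., via task-uncertainty weighting); either way the gradient of each task head is scaled by its weight, which is what "balancing the learning" amounts to.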
Face recognition (FR) technology has numerous applications in artificial intelligence, including biometrics, security, authentication, law enforcement, and surveillance. Deep learning (DL) models, notably convolutional neural networks (CNNs), have shown promising results in FR. However, CNNs are easily fooled since they do not encode position and orientation correlations between features. Hinton et al. envisioned capsule networks as a more robust design capable of retaining pose information and spatial correlations, recognizing objects more the way the brain does. Lower-level capsules hold 8-dimensional vectors of attributes such as position, hue, and texture, which are routed to higher-level capsules via a routing-by-agreement algorithm. This gives capsule networks a viewpoint invariance that has previously evaded CNNs. This research presents an FR model based on capsule networks, tested on the LFW dataset, the COMSATS face dataset, and our own photos acquired at 128 × 128, 40 × 40, and 30 × 30 pixels. The trained model outperforms state-of-the-art algorithms, achieving 95.82% test accuracy and performing well on unseen faces that have been blurred or rotated. The model also outperformed recently released approaches on the COMSATS face dataset, achieving a high accuracy of 92.47%. Based on these results and previous work, capsule networks perform better than deeper CNNs on unobserved altered data because of their special equivariance properties.
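Central to routing by agreement is the squash nonlinearity: a capsule's output is a vector whose direction encodes the entity's pose and whose length encodes the probability it is present, so lengths must stay in [0, 1). A stdlib sketch of squash as defined in the original capsule-network paper:

```python
import math

def squash(s):
    """Capsule squash: v = (|s|^2 / (1 + |s|^2)) * s / |s|.
    Output points the same way as s but has length in [0, 1)."""
    sq = sum(x * x for x in s)
    if sq == 0.0:
        return [0.0] * len(s)
    scale = sq / (1.0 + sq) / math.sqrt(sq)
    return [scale * x for x in s]

def length(v):
    return math.sqrt(sum(x * x for x in v))

print(round(length(squash([3.0, 4.0])), 4))   # |s| = 5 -> 25/26 ~ 0.9615
print(round(length(squash([0.1, 0.0])), 4))   # -> 0.0099: short vectors vanish
```

During routing, coupling coefficients between lower and higher capsules are then increased wherever a lower capsule's squashed prediction agrees (large dot product) with the higher capsule's current output.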
The Internet of Things (IoT) is growing rapidly and impacting almost every aspect of our lives, from wearables and healthcare to security, traffic management, and fleet management systems. This has generated massive volumes of data, and security and data-privacy risks are increasing with the advancement of technology and network connections. Traditional access control solutions are inadequate for establishing access control in IoT systems owing to their vulnerability to single points of failure. Additionally, conventional privacy-preservation methods have high latency costs and overhead for resource-constrained devices, and previous machine learning approaches were unable to detect denial-of-service (DoS) attacks. This study introduces a novel decentralized and secure framework built on blockchain integration. To avoid a single point of failure, an accredited access control scheme is incorporated, combining blockchain with local peers to record each transaction and verify the signature for access. Blockchain-based attribute-based cryptography is implemented to protect data-storage privacy by generating threshold parameters, managing keys, and revoking users on the blockchain. An innovative contract-based DoS-attack mitigation method is also incorporated to validate devices through smart contracts as trusted or untrusted, preventing the server from becoming overwhelmed. The proposed framework effectively controls access, safeguards data privacy, and reduces the risk of cyberattacks. The results show that the suggested framework achieves accuracy, precision, sensitivity, recall, and F-measure of 96.9%, 98.43%, 98.8%, 98.43%, and 98.4%, respectively, outperforming prior results.
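The tamper evidence that makes a blockchain suitable for recording access transactions comes from hash chaining: each block commits to its own contents and to the previous block's hash. A toy sketch of that mechanism (not the paper's platform or data model):

```python
import hashlib, json

def make_block(index, txs, prev_hash):
    body = {"index": index, "txs": txs, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

def chain_valid(chain):
    for i, blk in enumerate(chain):
        body = {k: blk[k] for k in ("index", "txs", "prev")}
        if blk["hash"] != hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest():
            return False                      # block contents were altered
        if i > 0 and blk["prev"] != chain[i - 1]["hash"]:
            return False                      # the chain link is broken
    return True

chain = [make_block(0, ["genesis"], "0" * 64)]
chain.append(make_block(1, ["deviceA granted read"], chain[-1]["hash"]))
print(chain_valid(chain))                     # -> True
chain[1]["txs"] = ["deviceA granted admin"]   # tamper with a recorded grant
print(chain_valid(chain))                     # -> False
```

Rewriting any recorded access decision invalidates every later hash, so local peers can detect tampering by re-verifying the chain, which is the property the framework's decentralized access log relies on.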
End-user computing empowers non-developers to manage data and applications, enhancing collaboration and efficiency. Spreadsheets are a prime example of end-user programming environments widely used in business for data analysis; however, Excel's built-in functionality is limited compared with dedicated programming languages. This paper addresses this gap by proposing a prototype that integrates Python's capabilities into Excel through an on-premises desktop add-in to build custom spreadsheet functions (CSFs) with Python, an approach that avoids the latency issues associated with cloud-based solutions. The prototype utilizes Excel-DNA and IronPython: Excel-DNA allows custom functions to integrate seamlessly with Excel's calculation engine, while IronPython enables the execution of the Python CSFs directly within Excel. C# and VSTO add-ins form the core components, facilitating communication between Python and Excel. This empowers users with a potentially open-ended set of Python CSFs for tasks such as mathematical calculations, statistical analysis, and even predictive modeling, all within the familiar Excel interface. The prototype demonstrates smooth integration, allowing users to call Python CSFs just like standard Excel functions. This research contributes to enhancing spreadsheet capabilities for end-user programmers by leveraging Python's power within Excel. Future work could expand the CSFs to cover complex calculations, statistical analysis, data manipulation, external library integration, and even machine learning models, all within the familiar Excel environment.
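At its core, such a bridge maintains a table mapping spreadsheet function names to Python callables, which the add-in dispatches through when Excel evaluates a formula. A plain-Python sketch of that registration-and-dispatch idea (the decorator, names, and `PY.` prefix are illustrative, not the prototype's actual API):

```python
CSF_REGISTRY = {}

def csf(name):
    """Register a Python callable as a custom spreadsheet function (CSF)."""
    def wrap(fn):
        CSF_REGISTRY[name] = fn
        return fn
    return wrap

@csf("PY.MEAN")
def mean(values):
    return sum(values) / len(values)

@csf("PY.ZSCORE")
def zscore(x, values):
    m = mean(values)
    sd = (sum((v - m) ** 2 for v in values) / len(values)) ** 0.5
    return (x - m) / sd

def call_csf(name, *args):
    """What the add-in would do when Excel evaluates, e.g., =PY.MEAN(A1:A3)."""
    return CSF_REGISTRY[name](*args)

print(call_csf("PY.MEAN", [2, 4, 6]))                 # -> 4.0
print(round(call_csf("PY.ZSCORE", 6, [2, 4, 6]), 4))
```

In the real prototype, Excel-DNA handles the registration with Excel's calculation engine and marshals cell ranges into arguments; the table-plus-dispatch pattern is the part that stays the same.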
In light of the rapid growth and development of social media, it has become a focus of interest in many different scientific fields, which seek to extract useful information, or knowledge, from it: for example, information about people's behaviors and interactions for analyzing sentiment or understanding the behavior of users and groups. This extracted knowledge plays a very important role in decision-making, in creating and improving marketing objectives and competitive advantage, in monitoring political and economic events, and in development across many fields. Extracting this knowledge requires analyzing the vast amount of data found on social media using the most popular data mining techniques and applications related to social media sites.
In the past decade, online peer-to-peer (P2P) lending platforms have transformed the lending industry, which has historically been dominated by commercial banks. Information technology breakthroughs such as big-data-based financial technologies (Fintech) have been identified as important disruptive driving forces for this paradigm shift. In this paper, we take an information-economics perspective to investigate how big data affects the transformation of the lending industry. By identifying how big data analytics reduces signaling and search costs in the credit risk management of P2P lending, we discuss how information asymmetry is reduced in the big data era. Rooted in the lending business, we propose a theory of the economics of big data and outline a number of research opportunities and challenging issues.
This paper deals with the robust control problem for a class of uncertain nonlinear networked systems with stochastic communication delays via sliding mode control (SMC). A sequence of variables obeying the Bernoulli distribution is employed to model the randomly occurring communication delays, which can differ across state variables. A discrete switching function different from those in the existing literature is first proposed. Then, expressed as the feasibility of a linear matrix inequality (LMI) with an equality constraint, sufficient conditions are derived to ensure the globally mean-square asymptotic stability of the system dynamics on the sliding surface. A discrete-time SMC controller is then synthesized to guarantee the discrete-time sliding mode reaching condition for the specified sliding surface. Finally, a simulation example shows the effectiveness of the proposed method.
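The delay model above assigns each state component its own Bernoulli indicator: component i of the measured state is the delayed sample with some probability and the current sample otherwise. A small simulation sketch of that measurement model (a one-step delay and the probabilities are illustrative choices, not the paper's parameters):

```python
import random

random.seed(1)

def delayed_measurement(x_hist, probs):
    """Per-component randomly delayed state: component i uses the delayed
    sample x(k-1) with probability probs[i], else the current x(k)."""
    current, delayed = x_hist[-1], x_hist[-2]
    return [delayed[i] if random.random() < probs[i] else current[i]
            for i in range(len(current))]

x_hist = [[0.0, 0.0], [1.0, 10.0]]        # x(k-1), then x(k)
probs = [0.3, 0.8]                        # Bernoulli parameter per component
y = delayed_measurement(x_hist, probs)
print(y)   # each component is either its current or its delayed value
```

Because the indicators are independent across components, the controller may receive a mixture of fresh and stale state entries in the same step, which is exactly why the stability analysis must hold in the mean-square sense.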
Cyber-physical systems (CPS) are increasingly commonplace, with applications in energy, health, transportation, and many other sectors. One of the major requirements in CPS is that the interaction between the cyber world and the man-made physical world (exchanging and sharing data and information with other physical objects and systems) must be safe, especially in bi-directional communications. In particular, security and privacy concerns must be suitably addressed in this human-in-the-loop CPS ecosystem. However, existing centralized architecture models in CPS, and in the more general IoT systems, have a number of limitations in terms of single points of failure, data privacy, security, and robustness. Such limitations reinforce the importance of designing reliable, secure, and privacy-preserving distributed solutions and other novel approaches, such as those based on blockchain technology, owing to its features (e.g., decentralization, transparency, and immutability of data). This is the focus of this special issue.
Internet of Things (IoT) devices work mainly over wireless media, requiring Intrusion Detection System (IDS) solutions that leverage 802.11 header information for intrusion detection, since wireless-specific traffic features with high information gain are found primarily in the data link layer rather than in the application layers of wired networks. This survey investigates the complexities and challenges of deploying wireless IDSs in terms of data collection methods, IDS techniques, IDS placement strategies, and traffic data analysis techniques. Its main finding highlights the lack of available network traces for training modern machine learning models against IoT-specific intrusions. Specifically, the Knowledge Discovery in Databases (KDD) Cup dataset is reviewed to highlight the design challenges of wireless intrusion detection given its current data attributes, and several guidelines are proposed to future-proof traffic capture methods in wireless networks (WNs). The paper reviews various intrusion detection techniques, data collection methods, and placement methods, and studies the design challenges of deploying an IDS in a wireless environment, which is not as straightforward as in a wired network owing to architectural complexities; it therefore discusses how traditional wired IDS deployment methods could be adopted in the wireless setting and highlights the resulting design challenges. The main wireless environments considered are Wireless Sensor Networks (WSNs), Mobile Ad Hoc Networks (MANETs), and the IoT, as these are the future trends and many attacks have targeted such networks, making it crucial to design IDSs specifically for wireless networks.
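Ranking 802.11 features by information gain, as the survey emphasizes, is a direct entropy calculation over labeled traffic. A stdlib sketch with a toy trace (the feature and labels are invented for illustration):

```python
from collections import Counter
from math import log2

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(samples, feature):
    """IG(label; feature) = H(label) - sum_v p(v) * H(label | feature = v)."""
    labels = [lab for _, lab in samples]
    base = entropy(labels)
    by_value = {}
    for feats, lab in samples:
        by_value.setdefault(feats[feature], []).append(lab)
    rem = sum(len(subset) / len(samples) * entropy(subset)
              for subset in by_value.values())
    return base - rem

# Toy 802.11 trace: the frame subtype perfectly predicts the label here.
traffic = [({"subtype": "deauth"}, "attack"), ({"subtype": "deauth"}, "attack"),
           ({"subtype": "data"}, "normal"), ({"subtype": "data"}, "normal")]
print(information_gain(traffic, "subtype"))  # -> 1.0 (fully informative feature)
```

Features scoring near zero gain carry no class information and can be dropped, which is how header fields like frame subtype or signal strength get selected for wireless IDS models.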
Background: We examine the signaling effect of borrowers' social media behavior, especially self-disclosure behavior, on the default probability of borrowers on a peer-to-peer (P2P) lending site. Method: We use a unique dataset that combines loan data from a large P2P lending site with the borrowers' social media presence data from a popular social media site. Results: Through a natural experiment enabled by an instrumental variable, we identify two forms of social media information that act as signals of borrowers' creditworthiness: (1) borrowers' choice to disclose their social media account to the P2P lending site, and (2) borrowers' social media behavior, such as their social network scope and social media engagement. Conclusion: This study offers new insights for screening borrowers in P2P lending and a novel use of social media information.
Accumulating evidence suggests that the gut microbiota plays an important role in the pathogenesis of inflammatory bowel disease (IBD). Carnosic acid (CA) is a major antioxidant component of rosemary and sage. Herein, we investigated the protective effects of dietary CA in a dextran sodium sulfate (DSS)-induced colitis mouse model, with an emphasis on its impact on the composition and metabolic function of the gut microbiota. We found that CA effectively attenuated DSS-stimulated colitis in mice, as evidenced by a reduced disease activity index (DAI) and reduced systemic and colonic inflammation. Additionally, CA restored microbial diversity and improved the composition of the gut microbiota in DSS-treated mice. Moreover, Spearman's correlation coefficient showed a significant correlation between the fecal metabolites and the gut microbiota species. Changes in the gut microbiota and the correlated metabolites might partially explain CA's anti-inflammatory effects against colitis. Future clinical trials are needed to determine the therapeutic effects and mechanisms of CA on IBD in humans.
One of the most critical objectives of precision farming is to assess the germination quality of seeds. Modern models contribute to this field primarily through artificial intelligence techniques such as machine learning, which face difficulties in feature extraction and optimization, critical factors in predicting accurately with few false alarms; assessing germination quality itself is another significant difficulty. Additionally, the majority of these contributions use benchmark classification methods that are either inept or too complex to train with the supplied features. This manuscript addresses these issues by introducing a novel ensemble classification strategy, dubbed "Assessing Germination Quality of Seed Samples (AGQSS) by Adaptive Boosting Ensemble Classification," that learns from quantitative phase features as well as universal features in grayscale spectroscopic images. The experimental inquiry illustrates the significance of the proposed model, which outperformed the currently available models in the performance analysis.
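The adaptive-boosting loop at the heart of such an ensemble, repeatedly training a weak learner and upweighting the samples it misclassified, can be sketched with threshold stumps on one-dimensional data. This shows the AdaBoost mechanism generically, not AGQSS's feature pipeline:

```python
import math

def train_adaboost(xs, ys, rounds=5):
    """AdaBoost on 1-D data with threshold stumps; labels ys in {-1, +1}."""
    n = len(xs)
    w = [1.0 / n] * n                     # sample weights, initially uniform
    ensemble = []                         # list of (alpha, threshold, polarity)
    for _ in range(rounds):
        best = None
        for thr in sorted(set(xs)):       # pick the lowest-weighted-error stump
            for pol in (1, -1):
                err = sum(wi for wi, x, y in zip(w, xs, ys)
                          if pol * (1 if x > thr else -1) != y)
                if best is None or err < best[0]:
                    best = (err, thr, pol)
        err, thr, pol = best
        err = min(max(err, 1e-10), 1 - 1e-10)
        alpha = 0.5 * math.log((1 - err) / err)   # stump's vote weight
        ensemble.append((alpha, thr, pol))
        # Reweight: misclassified samples gain weight for the next round.
        w = [wi * math.exp(-alpha * y * pol * (1 if x > thr else -1))
             for wi, x, y in zip(w, xs, ys)]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict(ensemble, x):
    score = sum(a * p * (1 if x > thr else -1) for a, thr, p in ensemble)
    return 1 if score > 0 else -1

xs = [1.0, 2.0, 3.0, 6.0, 7.0, 8.0]
ys = [-1, -1, -1, 1, 1, 1]               # separable around x ~ 4.5
model = train_adaboost(xs, ys)
print([predict(model, x) for x in (2.5, 7.5)])  # -> [-1, 1]
```

The reweighting step is what makes the ensemble "adaptive": each new stump is forced to concentrate on the seed samples the previous ones got wrong.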
Mobile edge computing (MEC) provides effective cloud services and functionality at the edge device, improving the quality of service (QoS) of end users by offloading high-computation tasks. Currently, the introduction of deep learning (DL) and hardware technologies paves the way for detecting the current traffic status, offloading data, and countering cyberattacks in MEC. This study introduces an artificial intelligence with metaheuristic-based data offloading technique for secure MEC (AIMDO-SMEC) systems. The proposed AIMDO-SMEC technique incorporates an effective traffic prediction module using Siamese Neural Networks (SNNs) to determine the traffic status in the MEC system. An adaptive sampling cross-entropy (ASCE) technique is utilized for data offloading in MEC systems. Moreover, the modified salp swarm algorithm (MSSA) with extreme gradient boosting (XGBoost) is implemented to identify and classify the cyberattacks that exist in MEC systems. In a comprehensive experimental analysis, the results demonstrated the enhanced outcomes of the AIMDO-SMEC technique, with a minimal completion time of tasks (CTT) of 0.680.
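The cross-entropy method underlying an offloading optimizer like ASCE samples candidate binary offloading vectors, keeps the cheapest ("elite") ones, and re-estimates the per-task offloading probabilities from them. A generic sketch with an invented separable cost (the adaptive-sampling refinements and the cost model of the actual ASCE technique are not given in the abstract):

```python
import random

random.seed(0)

local_cost = [4.0, 1.0, 5.0, 0.5]    # time to run each task on the device
remote_cost = [1.0, 2.0, 1.5, 2.5]   # offloaded time, incl. transmission

def cost(decision):
    # decision[i] == 1 means task i is offloaded to the MEC server.
    return sum(remote_cost[i] if d else local_cost[i]
               for i, d in enumerate(decision))

probs = [0.5] * 4                     # per-task offloading probabilities
for _ in range(30):                   # cross-entropy iterations
    samples = [[1 if random.random() < p else 0 for p in probs]
               for _ in range(50)]
    samples.sort(key=cost)
    elite = samples[:10]              # keep the cheapest 20%
    probs = [sum(s[i] for s in elite) / len(elite) for i in range(4)]
best = [1 if p > 0.5 else 0 for p in probs]
print(best, cost(best))   # should offload only tasks cheaper to run remotely
```

The probabilities concentrate on the decisions that recur in the elite set, so the sampler ends up drawing near-optimal offloading vectors without enumerating all 2^n possibilities.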
Funding: King Abdulaziz University, Deanship of Scientific Research, Grant Number KEP-PHD-20-611-42.
Abstract: Recently, researchers have shown increasing interest in combining more than one programming model in systems running on high-performance computing (HPC) systems to achieve exascale performance by applying parallelism at multiple levels. Combining different programming paradigms, such as the Message Passing Interface (MPI), Open Multi-Processing (OpenMP), and Open Accelerators (OpenACC), can increase computation speed and improve performance. During the integration of multiple models, the probability of runtime errors increases, making their detection difficult, especially in the absence of testing techniques that can detect these errors. Numerous studies have been conducted to identify such errors, but no technique exists for detecting errors in three-level programming models. Despite the growing body of research that integrates the three programming models MPI, OpenMP, and OpenACC, no testing technique has been developed to detect runtime errors, such as deadlocks and race conditions, that can arise from this integration. Therefore, this paper begins with a definition and explanation of the runtime errors resulting from integrating the three programming models that compilers cannot detect. For the first time, this paper presents a classification of the runtime errors that can result from the integration of the three models. This paper also proposes a parallel hybrid testing technique for detecting runtime errors in systems built in the C++ programming language that use the triple programming models MPI, OpenMP, and OpenACC. This hybrid technique combines static and dynamic analysis, given that some errors can be detected using static techniques, whereas others can only be detected dynamically. The hybrid technique can detect more errors because it combines two distinct technologies. The proposed static analysis detects a wide range of error types in less time, whereas the potential errors that may or may not occur depending on the operating environment are left to the dynamic analysis, which completes the validation.
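The race conditions this abstract targets arise when concurrent workers update shared state without synchronization. As a hedged, language-agnostic sketch (Python threads as a stand-in for OpenMP/OpenACC worker teams, not the paper's C++ setting or its actual detector), the toy below shows the error class a dynamic checker must catch: the locked variant is deterministic, while removing the lock makes the final count unpredictable.

```python
import threading

def increment_shared(n_threads=4, n_iters=10000, use_lock=True):
    """Several threads update one shared counter. With the lock the result
    is deterministic (n_threads * n_iters); without it, lost updates from
    the unsynchronized read-modify-write can occur."""
    counter = 0
    lock = threading.Lock()

    def worker():
        nonlocal counter
        for _ in range(n_iters):
            if use_lock:
                with lock:
                    counter += 1
            else:
                counter += 1  # unsynchronized read-modify-write: a data race

    threads = [threading.Thread(target=worker) for _ in range(n_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return counter
```

A static checker could flag the unlocked write at compile time; whether the race actually fires depends on the operating environment, which is exactly the part the abstract leaves to dynamic analysis.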
Abstract: To guarantee a unified response to disasters, humanitarian organizations work together via the United Nations Office for the Coordination of Humanitarian Affairs (OCHA). Although OCHA has made great strides to improve its information management and increase the availability of accurate, real-time data for disaster and humanitarian response teams, significant gaps persist. There are inefficiencies in the emergency management of data at every stage of its lifecycle: collection, processing, analysis, distribution, storage, and retrieval. Disaster risk reduction and disaster risk management are the two main tenets of the United Nations' worldwide plan for disaster management. Information systems are crucial because of the central roles they play in capturing, processing, and transmitting data, yet the management of information is seldom discussed in published works. The goal of this study is to employ qualitative research methods to provide insight by facilitating an expanded comprehension of relevant contexts, phenomena, and individual experiences. Humanitarian workers and OCHA staffers will take part in the research, and the study subjects will be chosen using a random selection procedure. Online surveys with both closed- and open-ended questions will be used to compile the data. OCHA offers a structure for the handling of information via which all humanitarian actors may contribute to the overall response. This research will enable OCHA to better gather, process, analyze, disseminate, store, and retrieve data in the event of a catastrophe or humanitarian crisis.
Abstract: Olive trees are susceptible to a variety of diseases that can cause significant crop damage and economic losses. Early detection of these diseases is essential for effective management. We propose a novel transformed-wavelet, feature-fused, pre-trained deep learning model for detecting olive leaf diseases. The proposed model combines wavelet transforms with pre-trained deep learning models to extract discriminative features from olive leaf images. The model has four main phases: preprocessing using data augmentation, three-level wavelet transformation, learning using pre-trained deep learning models, and a fused deep learning model. In the preprocessing phase, the image dataset is augmented using techniques such as resizing, rescaling, flipping, rotation, zooming, and contrasting. In wavelet transformation, the augmented images are decomposed into three frequency levels. Three pre-trained deep learning models, EfficientNet-B7, DenseNet-201, and ResNet-152-V2, are used in the learning phase. The models were trained using the approximation images of the third-level sub-band of the wavelet transform. In the fused phase, the fused model consists of a merge layer, three dense layers, and two dropout layers. The proposed model was evaluated using a dataset of images of healthy and infected olive leaves. It achieved an accuracy of 99.72% in the diagnosis of olive leaf diseases, which exceeds the accuracy of other methods reported in the literature. This finding suggests that our proposed method is a promising tool for the early detection of olive leaf diseases.
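A multi-level wavelet decomposition like the one described above repeatedly splits an image into an approximation (LL) sub-band plus three detail sub-bands; the approximation of one level feeds the next. As an illustrative sketch only (a hand-rolled Haar transform in NumPy, not necessarily the filter bank the paper used), one level can be written as:

```python
import numpy as np

def haar2d_level(img):
    """One level of a 2-D Haar wavelet transform: returns the approximation
    (LL) sub-band and the three detail sub-bands (LH, HL, HH). Applying
    this to `ll` twice more yields the third-level approximation images
    the abstract mentions."""
    a = img[0::2, 0::2].astype(float)  # top-left pixel of each 2x2 block
    b = img[0::2, 1::2].astype(float)  # top-right
    c = img[1::2, 0::2].astype(float)  # bottom-left
    d = img[1::2, 1::2].astype(float)  # bottom-right
    ll = (a + b + c + d) / 4.0   # approximation: local averages
    lh = (a - b + c - d) / 4.0   # horizontal detail
    hl = (a + b - c - d) / 4.0   # vertical detail
    hh = (a - b - c + d) / 4.0   # diagonal detail
    return ll, lh, hl, hh
```

A flat (constant) image has zero detail everywhere, so the three detail sub-bands vanish and only the approximation carries information.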
Funding: Prince Sattam bin Abdulaziz University, Project Number PSAU/2023/R/1444.
Abstract: Recent developments in computer vision have presented novel opportunities to tackle complex healthcare issues, particularly in the field of lung disease diagnosis. One promising avenue involves the use of chest X-rays, which are commonly utilized in radiology. To fully exploit their potential, researchers have suggested utilizing deep learning methods to construct computer-aided diagnostic systems. However, constructing and compressing these systems presents a significant challenge, as it relies heavily on the expertise of data scientists. To tackle this issue, we propose an automated approach that utilizes an evolutionary algorithm (EA) to optimize the design and compression of a convolutional neural network (CNN) for X-ray image classification. Our approach accurately classifies radiography images and detects potential chest abnormalities and infections, including COVID-19. Furthermore, our approach incorporates transfer learning, where a CNN model pre-trained on a vast dataset of chest X-ray images is fine-tuned for the specific task of detecting COVID-19. This method can help reduce the amount of labeled data required for the task and enhance the overall performance of the model. We have validated our method via a series of experiments against state-of-the-art architectures.
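An evolutionary algorithm of the kind invoked above maintains a population of candidate designs, scores them with a fitness function, and breeds the best. The sketch below is a minimal elitist loop on a toy real-valued fitness; the paper's actual encoding of CNN architectures and its accuracy/size fitness are not reproduced here, so the genome is just a vector of stand-in hyperparameters.

```python
import random

def evolve(fitness, dim=4, pop_size=12, gens=30, seed=0):
    """Minimal (mu+lambda)-style evolutionary loop: truncation selection
    keeps the best half, Gaussian mutation produces children, and elitism
    guarantees the best genome never gets worse."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        scored = sorted(pop, key=fitness)           # lower fitness is better
        parents = scored[: pop_size // 2]           # truncation selection
        children = [[g + rng.gauss(0, 0.3) for g in p] for p in parents]
        pop = parents + children                    # elitist replacement
    return min(pop, key=fitness)

# Toy fitness: squared distance to the origin (lower is better).
best = evolve(lambda g: sum(x * x for x in g))
```

For architecture search, the genome entries would instead encode discrete choices (layer counts, filter sizes, pruning ratios) and the fitness would come from training or proxy evaluation.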
Abstract: The Internet of Things (IoT) is a smart networking infrastructure of physical devices, i.e., things, that are embedded with sensors, actuators, software, and other technologies to connect and share data with the respective server module. Although IoTs are cornerstones in different application domains, device authenticity, i.e., of server(s) and ordinary devices, is the most crucial issue and must be resolved on a priority basis. Therefore, various field-proven methodologies have been presented to streamline the verification process of communicating devices; however, to our knowledge, location-aware authentication has not been reported, although it is a crucial metric, especially in scenarios where devices are mobile. This paper presents a lightweight and location-aware device-to-server authentication technique in which a device's membership with the nearest server is subject to its location information along with other measures. Initially, the Media Access Control (MAC) address and the Advanced Encryption Standard (AES), along with a secret shared key λ_(i) of 128 bits, are utilized by the Trusted Authority (TA) to generate MaskIDs, which are used instead of the original ID for every device, i.e., server and member, and are shared in the offline phase. Secondly, the TA shares a list of authentic devices, i.e., server S_(j) and members C_(i), with every device in the IoT for the onward verification process, which must be executed before the initialization of the actual communication process. Additionally, every device should be located such that it lies within the coverage area of a server, and this location information is used in the authentication process. A thorough analytical analysis was carried out to check the susceptibility of the proposed and existing authentication approaches against well-known intruder attacks, i.e., man-in-the-middle, masquerading, and device and server impersonation, especially in the IoT domain. Moreover, the proposed authentication scheme and existing state-of-the-art approaches have been simulated in a real IoT environment to verify their performance, particularly in terms of various evaluation metrics, i.e., processing, communication, and storage overheads. These results have verified the superiority of the proposed scheme over existing state-of-the-art approaches, particularly in terms of communication, storage, and processing costs.
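To make the offline MaskID step concrete, here is a hedged sketch. The paper derives MaskIDs with AES over the MAC address and the 128-bit shared key λ_(i); Python's standard library has no AES, so HMAC-SHA256 stands in as the keyed one-way function, and the coverage-area check uses a simple Euclidean radius. The names `make_mask_id` and `is_authentic` are illustrative, not from the paper.

```python
import hashlib
import hmac

def make_mask_id(mac_address: str, shared_key: bytes) -> str:
    """Derive a MaskID from a device's MAC address and its shared key.
    The paper uses AES for this step; HMAC-SHA256 is used here only as a
    stdlib stand-in with the same keyed, non-invertible property."""
    return hmac.new(shared_key, mac_address.encode(), hashlib.sha256).hexdigest()[:16]

def is_authentic(mask_id, authorized_masks, device_pos, server_pos, radius):
    """Membership check sketched from the abstract: the MaskID must appear
    on the TA's list AND the device must lie within the server's coverage."""
    dx = device_pos[0] - server_pos[0]
    dy = device_pos[1] - server_pos[1]
    in_range = dx * dx + dy * dy <= radius * radius
    return mask_id in authorized_masks and in_range
```

The point of the combined check is that a valid MaskID replayed from outside the server's coverage area is rejected, which is the location-aware part of the scheme.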
Abstract: As the extensive use of cloud computing raises questions about the security of any personal data stored there, cryptography is being used more frequently as a security tool to protect data confidentiality and privacy in the cloud environment. A hypervisor is virtualization software used in cloud hosting to divide and allocate resources on various pieces of hardware. The choice of hypervisor can significantly impact the performance of cryptographic operations in the cloud environment. An important issue that must be carefully examined is that no hypervisor is completely superior in terms of performance; each hypervisor should be examined against specific needs. The main objective of this study is to provide accurate results comparing the performance of Hyper-V and the Kernel-based Virtual Machine (KVM) while implementing different cryptographic algorithms, to guide cloud service providers and end users in choosing the most suitable hypervisor for their cryptographic needs. This study evaluated the efficiency of the two hypervisors, Hyper-V and KVM, in implementing six cryptographic algorithms: Rivest-Shamir-Adleman (RSA), the Advanced Encryption Standard (AES), the Triple Data Encryption Standard (TripleDES), Carlisle Adams and Stafford Tavares (CAST-128), Blowfish, and Twofish. The study's findings show that KVM outperforms Hyper-V, with 12.2% less central processing unit (CPU) use and 12.95% less time overall for encryption and decryption operations with various file sizes. The findings emphasize how crucial it is to pick a hypervisor that is appropriate for cryptographic needs in a cloud environment, which could assist both cloud service providers and end users. Future research may focus more on how various hypervisors perform while handling cryptographic workloads.
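The measurement loop behind such a comparison can be sketched as below. The cipher here is a deliberately toy XOR stream (Python's standard library ships none of RSA, AES, or Blowfish), so its timings mean nothing; the harness shape is the point: repeat encrypt plus decrypt, verify the round trip, and average wall-clock time, then run the same harness on each hypervisor.

```python
import hashlib
import time

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric stand-in: an XOR stream derived from SHA-256 of the
    key and a counter. Swap in real cipher bindings to reproduce the kind
    of measurements the study reports."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

def benchmark(cipher, data, key, runs=5):
    """Average the time of encrypt+decrypt over several runs, verifying the
    round trip each time (XOR with the same stream is its own inverse)."""
    start = time.perf_counter()
    for _ in range(runs):
        ct = cipher(data, key)
        pt = cipher(ct, key)
        assert pt == data  # sanity: the round trip must succeed
    return (time.perf_counter() - start) / runs
```

With real ciphers, one would also sample CPU utilization alongside wall-clock time, since the study reports both.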
Funding: Princess Nourah bint Abdulrahman University Researchers Supporting Project Number PNURSP2024R333, Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.
Abstract: Chronic kidney disease (CKD) is a major health concern today, requiring early and accurate diagnosis. Machine learning has emerged as a powerful tool for disease detection, and medical professionals are increasingly using ML classifier algorithms to identify CKD early. This study explores the application of advanced machine learning techniques on a CKD dataset obtained from the University of California, Irvine (UCI) Machine Learning Repository. The research introduces TrioNet, an ensemble model combining extreme gradient boosting, random forest, and extra tree classifiers, which excels in providing highly accurate predictions for CKD. Furthermore, a K-nearest neighbor (KNN) imputer is utilized to deal with missing values, while synthetic minority oversampling (SMOTE) is used for class-imbalance problems. To ascertain the efficacy of the proposed model, a comprehensive comparative analysis is conducted with various machine learning models. The proposed TrioNet using the KNN imputer and SMOTE outperformed other models with 98.97% accuracy in detecting CKD. This in-depth analysis demonstrates the model's capabilities and underscores its potential as a valuable tool in the diagnosis of CKD.
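A KNN imputer of the kind used above fills each missing entry from the rows most similar on the columns both rows have observed. This is a from-scratch sketch for illustration (the study presumably used a library implementation such as scikit-learn's `KNNImputer`); SMOTE and the ensemble itself are not reproduced here.

```python
import math

def knn_impute(rows, k=2):
    """Fill each missing value (None) with the mean of that column over the
    k rows nearest in Euclidean distance on the jointly observed columns."""
    def dist(r, s):
        shared = [(a, b) for a, b in zip(r, s) if a is not None and b is not None]
        if not shared:
            return float("inf")
        return math.sqrt(sum((a - b) ** 2 for a, b in shared))

    filled = [list(r) for r in rows]
    for i, row in enumerate(rows):
        for j, v in enumerate(row):
            if v is None:
                # candidate donors: other rows that observed column j
                donors = sorted(
                    (s for s in rows if s is not row and s[j] is not None),
                    key=lambda s: dist(row, s),
                )[:k]
                filled[i][j] = sum(s[j] for s in donors) / len(donors)
    return filled
```

With two well-separated clusters in the data, a missing value is imputed from its own cluster's neighbors rather than the global mean, which is the advantage over mean imputation.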
Funding: Supported by MRC, UK (MC_PC_17171); Royal Society, UK (RP202G0230); BHF, UK (AA/18/3/34220); Hope Foundation for Cancer Research, UK (RM60G0680); GCRF, UK (P202PF11); Sino-UK Industrial Fund, UK (RP202G0289); LIAS, UK (P202ED10, P202RE969); Data Science Enhancement Fund, UK (P202RE237); Fight for Sight, UK (24NN201); Sino-UK Education Fund, UK (OP202006); and BBSRC, UK (RM32G0178B8).
Abstract: Aim: This study aims to establish an artificial intelligence model, ThyroidNet, to accurately diagnose thyroid nodules using deep learning techniques. Methods: A novel method, ThyroidNet, is introduced and evaluated based on deep learning for the localization and classification of thyroid nodules. First, we propose the multitask TransUnet, which combines the TransUnet encoder and decoder with multitask learning. Second, we propose the DualLoss function, tailored to the thyroid nodule localization and classification tasks; it balances the learning of the two tasks to help improve the model's generalization ability. Third, we introduce strategies for augmenting the data. Finally, we present the resulting deep learning model, ThyroidNet, for accurately detecting thyroid nodules. Results: ThyroidNet was evaluated on private datasets and compared with other existing methods, including U-Net and TransUnet. Experimental results show that ThyroidNet outperformed these methods in localizing and classifying thyroid nodules, improving accuracy by 3.9% and 1.5%, respectively. Conclusion: ThyroidNet significantly improves the clinical diagnosis of thyroid nodules and supports medical image analysis tasks. Future research directions include optimization of the model structure, expansion of the dataset size, reduction of computational complexity and memory requirements, and exploration of additional applications of ThyroidNet in medical image analysis.
Funding: Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia, Researchers Supporting Project Number PNURSP2024R234.
Abstract: Face recognition (FR) technology has numerous applications in artificial intelligence, including biometrics, security, authentication, law enforcement, and surveillance. Deep learning (DL) models, notably convolutional neural networks (CNNs), have shown promising results in the field of FR. However, CNNs are easily fooled since they do not encode position and orientation correlations between features. Hinton et al. envisioned capsule networks as a more robust design capable of retaining pose information and spatial correlations to recognize objects more like the brain does. Lower-level capsules hold 8-dimensional vectors of attributes like position, hue, texture, and so on, which are routed to higher-level capsules via a new routing-by-agreement algorithm. This provides capsule networks with viewpoint invariance, which has previously evaded CNNs. This research presents an FR model based on capsule networks that was tested using the LFW dataset, the COMSATS face dataset, and our own photos acquired with cameras at 128 × 128, 40 × 40, and 30 × 30 pixels. The trained model outperforms state-of-the-art algorithms, achieving 95.82% test accuracy, and performs well on unseen faces that have been blurred or rotated. Additionally, the suggested model outperformed recently released approaches on the COMSATS face dataset, achieving a high accuracy of 92.47%. Based on the results of this research as well as previous results, capsule networks perform better than deeper CNNs on unobserved altered data because of their special equivariance properties.
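The routing-by-agreement step mentioned above can be written in a few lines of NumPy. This is a generic sketch of Sabour et al.'s dynamic routing between capsule layers (shapes and iteration count are illustrative), not the paper's trained FR model: coupling coefficients are a softmax over routing logits, and logits grow where a lower capsule's prediction agrees with the upper capsule's output.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def squash(v, axis=-1):
    """Capsule non-linearity: keeps direction, maps length into [0, 1)."""
    n2 = (v ** 2).sum(axis=axis, keepdims=True)
    return (n2 / (1.0 + n2)) * v / np.sqrt(n2 + 1e-9)

def route(u_hat, iterations=3):
    """Dynamic routing by agreement over prediction vectors u_hat with
    shape (n_lower, n_upper, dim). Returns the upper-capsule outputs and
    the final coupling coefficients."""
    b = np.zeros(u_hat.shape[:2])                 # routing logits
    for _ in range(iterations):
        c = softmax(b, axis=1)                    # couplings sum to 1 per lower capsule
        s = (c[..., None] * u_hat).sum(axis=0)    # weighted sum per upper capsule
        v = squash(s)                             # (n_upper, dim)
        b = b + (u_hat * v[None, ...]).sum(-1)    # agreement (dot product) update
    return v, c
```

The squashed output length stays below 1 by construction, so it can be read as the probability that the entity an upper capsule represents is present.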
Abstract: The Internet of Things (IoT) is growing rapidly and impacting almost every aspect of our lives, from wearables and healthcare to security, traffic management, and fleet management systems. This has generated massive volumes of data, and security and data-privacy risks are increasing with the advancement of technology and network connections. Traditional access control solutions are inadequate for establishing access control in IoT systems to provide data protection owing to their vulnerability to single points of failure. Additionally, conventional privacy-preservation methods have high latency costs and overhead for resource-constrained devices. Previous machine learning approaches were also unable to detect denial-of-service (DoS) attacks. This study introduces a novel decentralized and secure framework for blockchain integration. To avoid a single point of failure, an accredited access control scheme is incorporated, combining blockchain with local peers to record each transaction and verify the signature for access. Blockchain-based attribute-based cryptography is implemented to protect data-storage privacy by generating threshold parameters, managing keys, and revoking users on the blockchain. An innovative contract-based DoS attack mitigation method is also incorporated to effectively validate devices as trusted or untrusted with smart contracts, preventing the server from becoming overwhelmed. The proposed framework effectively controls access, safeguards data privacy, and reduces the risk of cyberattacks. The results show that the suggested framework achieves accuracy, precision, sensitivity, recall, and F-measure of 96.9%, 98.43%, 98.8%, 98.43%, and 98.4%, respectively.
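The tamper-evidence that recording each access transaction on a blockchain provides can be seen in a tiny hash-linked ledger. This sketch omits everything the framework above actually adds (consensus among peers, signatures, attribute-based encryption, smart contracts); it only shows why altering a recorded access event is detectable after the fact.

```python
import hashlib
import json

class ToyChain:
    """Minimal hash-linked ledger: each block stores the hash of its
    predecessor, so editing any recorded transaction breaks every
    subsequent link."""
    def __init__(self):
        self.blocks = [{"index": 0, "data": "genesis", "prev": "0" * 64}]

    def _hash(self, block):
        # Canonical JSON so the digest is stable regardless of key order.
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    def add(self, data):
        prev = self._hash(self.blocks[-1])
        self.blocks.append({"index": len(self.blocks), "data": data, "prev": prev})

    def valid(self):
        return all(
            self.blocks[i]["prev"] == self._hash(self.blocks[i - 1])
            for i in range(1, len(self.blocks))
        )
```

In the framework above, the "data" field would hold signed access-control transactions, and validity would be enforced by the peer network rather than a single verifier.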
Abstract: End-user computing empowers non-developers to manage data and applications, enhancing collaboration and efficiency. Spreadsheets are a prime example of end-user programming environments widely used in business for data analysis. However, Excel's functionality has limits compared to dedicated programming languages. This paper addresses this gap by proposing a prototype for integrating Python's capabilities into Excel through an on-premises desktop add-in that builds custom spreadsheet functions (CSFs) with Python; this approach overcomes the potential latency issues associated with cloud-based solutions. The prototype utilizes Excel-DNA and IronPython: Excel-DNA allows creating custom functions that seamlessly integrate with Excel's calculation engine, while IronPython enables executing these Python CSFs directly within Excel. C# and VSTO add-ins form the core components, facilitating communication between Python and Excel. This approach gives users a potentially open-ended set of Python CSFs for tasks such as mathematical calculations, statistical analysis, and even predictive modeling, all within the familiar Excel interface. The prototype demonstrates smooth integration, allowing users to call Python CSFs just like standard Excel functions. This research contributes to enhancing spreadsheet capabilities for end-user programmers by leveraging Python's power within Excel. Future research could expand the CSFs toward complex calculations, statistical analysis, data manipulation, external-library integration, and the integration of machine learning models, all within the familiar Excel environment.
Abstract: In light of the rapid growth and development of social media, it has become a focus of interest in many different scientific fields that seek to extract useful information, i.e., knowledge, from it, such as information about people's behaviors and interactions used to analyze sentiment or to understand the behavior of users or groups, among many others. This extracted knowledge plays a very important role in decision-making, in creating and improving marketing objectives and competitive advantage, in monitoring political and economic events, and in development across all fields. Therefore, to extract this knowledge, we need to analyze the vast amount of data found within social media using the most popular data mining techniques and applications related to social media sites.
Abstract: In the past decade, online peer-to-peer (P2P) lending platforms have transformed the lending industry, which has historically been dominated by commercial banks. Information technology breakthroughs such as big-data-based financial technologies (Fintech) have been identified as important disruptive driving forces for this paradigm shift. In this paper, we take an information economics perspective to investigate how big data affects the transformation of the lending industry. By identifying how signaling and search costs are reduced by big data analytics for credit risk management in P2P lending, we discuss how information asymmetry is reduced in the big data era. Rooted in the lending business, we propose a theory on the economics of big data and outline a number of research opportunities and challenging issues.
Funding: Supported by the Engineering and Physical Sciences Research Council (EPSRC) of the UK (No. GR/S27658/01), the Royal Society of the UK, and the Alexander von Humboldt Foundation of Germany.
Abstract: This paper deals with the robust control problem for a class of uncertain nonlinear networked systems with stochastic communication delays via the sliding mode control (SMC) concept. A sequence of variables obeying the Bernoulli distribution is employed to model the randomly occurring communication delays, which may differ across state variables. A discrete switching function that differs from those in the existing literature is first proposed. Then, expressed as the feasibility of a linear matrix inequality (LMI) with an equality constraint, sufficient conditions are derived to ensure the globally mean-square asymptotic stability of the system dynamics on the sliding surface. A discrete-time SMC controller is then synthesized to guarantee the discrete-time sliding mode reaching condition with the specified sliding surface. Finally, a simulation example is given to show the effectiveness of the proposed method.
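The discrete-time reaching condition referenced above drives the state onto the sliding surface s = 0 in finitely many steps, after which it chatters within a band set by the reaching gain. A one-dimensional toy (scalar state, switching function s(x) = x, fixed gain η; nothing like the paper's LMI machinery, uncertainty model, or Bernoulli delays) illustrates the mechanism:

```python
def slide_to_surface(x0, eta=0.4, steps=60):
    """Discrete reaching law x_{k+1} = x_k - eta * sign(s(x_k)) with the
    simplest switching function s(x) = x. The state decreases by eta per
    step until it reaches a band of width eta around s = 0, then chatters
    inside it."""
    sign = lambda v: (v > 0) - (v < 0)
    x = x0
    traj = [x]
    for _ in range(steps):
        x = x - eta * sign(x)
        traj.append(x)
    return traj
```

The residual chattering band is why practical discrete SMC designs replace the hard sign with a boundary-layer or reaching-law variant; the paper's switching function is a more refined construction for the networked, delayed setting.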
Abstract: Cyber-physical systems (CPS) are increasingly commonplace, with applications in energy, health, transportation, and many other sectors. One of the major requirements in CPS is that the interaction between the cyber world and the man-made physical world (exchanging and sharing data and information with other physical objects and systems) must be safe, especially in bi-directional communications. In particular, there is a need to suitably address security and privacy concerns in this human-in-the-loop CPS ecosystem. However, existing centralized architecture models in CPS, and in the more general IoT systems, have a number of associated limitations in terms of single points of failure, data privacy, security, robustness, and so on. Such limitations reinforce the importance of designing reliable, secure, and privacy-preserving distributed solutions and other novel approaches, such as those based on blockchain technology, owing to its features (e.g., decentralization, transparency, and immutability of data). This is the focus of this special issue.
Funding: The authors acknowledge Jouf University, Saudi Arabia, for its funding support.
Abstract: Internet of Things (IoT) devices work mainly over wireless media, requiring intrusion detection system (IDS) solutions that leverage 802.11 header information for intrusion detection. Wireless-specific traffic features with high information gain are primarily found in the data link layer, rather than in the application layer as in wired networks. This survey investigates some of the complexities and challenges of deploying wireless IDS in terms of data collection methods, IDS techniques, IDS placement strategies, and traffic data analysis techniques. The paper's main finding highlights the lack of available network traces for training modern machine learning models against IoT-specific intrusions. Specifically, the Knowledge Discovery in Databases (KDD) Cup dataset is reviewed to highlight the design challenges of wireless intrusion detection based on current data attributes, and several guidelines are proposed to future-proof traffic capture methods in wireless networks (WN). The paper starts with a review of various intrusion detection techniques, data collection methods, and placement methods. The main goal of this paper is to study the design challenges of deploying an intrusion detection system in a wireless environment, which is not as straightforward as in a wired network owing to architectural complexities. The paper therefore reviews traditional wired intrusion detection deployment methods, discusses how these techniques could be adapted to the wireless environment, and highlights the resulting design challenges. The main wireless environments to consider are wireless sensor networks (WSN), mobile ad hoc networks (MANET), and the IoT, as these are the future trends and many attacks target these networks. It is therefore crucial to design an IDS specifically for wireless networks.
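Information gain, the feature-ranking criterion mentioned above, is the drop in label entropy after splitting the traffic on a feature. A self-contained sketch (with toy attack/normal labels standing in for real 802.11 traces) shows the computation:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a label sequence."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def information_gain(feature_values, labels):
    """H(labels) minus the expected entropy after partitioning the records
    by the feature's value: high gain means the feature separates attack
    traffic from normal traffic well."""
    total = len(labels)
    groups = {}
    for f, y in zip(feature_values, labels):
        groups.setdefault(f, []).append(y)
    remainder = sum(len(g) / total * entropy(g) for g in groups.values())
    return entropy(labels) - remainder
```

A feature that perfectly separates the classes has a gain equal to the full label entropy, while an uninformative feature has a gain of zero; this is how 802.11 header attributes would be ranked for a wireless IDS.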
Funding: Juan Feng acknowledges GRF (General Research Fund) grant 9042133 and CityU SRG grant 7004566; Bin Gu acknowledges the National Natural Science Foundation of China (Grant 71328102).
Abstract: Background: We examine the signaling effect of borrowers' social media behavior, especially self-disclosure behavior, on the default probability of money borrowers on a peer-to-peer (P2P) lending site. Method: We use a unique dataset that combines loan data from a large P2P lending site with the borrowers' social media presence data from a popular social media site. Results: Through a natural experiment enabled by an instrumental variable, we identify two forms of social media information that act as signals of borrowers' creditworthiness: (1) borrowers' choice to disclose their social media account to the P2P lending site, and (2) borrowers' social media behavior, such as their social network scope and social media engagement. Conclusion: This study offers new insights for screening borrowers in P2P lending and a novel use of social media information.
Funding: Supported by the Natural Science Foundation of Guangdong Basic and Applied Basic Research Foundation (2021A1515010965); the General Project of Basic and Applied Basic Research in Guangzhou (202102080241); the Laboratory Opening Project of Guangzhou Medical University (PX-1020423); the Natural Science Foundation of Guangdong Basic and Applied Basic Research Foundation ([2018]105); the Guangdong Provincial Department of Education (S202010570042); and the Communist Youth League Committee of Guangzhou Medical University (2019A060).
Abstract: Accumulating evidence suggests that the gut microbiota plays an important role in the pathogenesis of inflammatory bowel disease (IBD). Carnosic acid (CA) is a major antioxidant component of rosemary and sage. Herein, we investigated the protective effects of dietary CA in a dextran sodium sulfate (DSS)-induced colitis mouse model, with an emphasis on its impact on the composition and metabolic function of the gut microbiota. We found that CA effectively attenuated DSS-stimulated colitis in mice, as evidenced by a reduced disease activity index (DAI) and reduced systemic and colonic inflammation. Additionally, CA restored microbial diversity and improved the composition of the gut microbiota in DSS-treated mice. Moreover, Spearman's correlation coefficient showed a significant correlation between the fecal metabolites and the gut microbiota species. Changes in the gut microbiota and the correlated metabolites might partially explain CA's anti-inflammatory effects against colitis. Future clinical trials are needed to determine the therapeutic effects and mechanisms of CA on IBD in humans.
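Spearman's correlation, used above to link fecal metabolites with microbiota species, is simply the Pearson correlation computed on rank vectors (with average ranks for ties). A from-scratch sketch for illustration; any statistics package provides this directly:

```python
def rankdata(xs):
    """Average ranks (1-based), assigning tied values the mean of their
    rank positions, as Spearman's rho requires."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of the tied rank positions
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho: Pearson correlation of the two rank vectors."""
    rx, ry = rankdata(x), rankdata(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

Because it operates on ranks, the statistic captures any monotone association between a metabolite's abundance and a taxon's abundance, not only linear ones, which suits skewed microbiome data.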
Abstract: One of the most critical objectives of precision farming is to assess the germination quality of seeds. Modern models contribute to this field primarily through the use of artificial intelligence techniques such as machine learning, which presents difficulties in feature extraction and optimization, both critical factors for prediction accuracy with few false alarms; another significant difficulty is assessing germination quality itself. Additionally, the majority of these contributions make use of benchmark classification methods that are either inept or too complex to train with the supplied features. This manuscript addresses these issues by introducing a novel ensemble classification strategy dubbed "Assessing Germination Quality of Seed Samples (AGQSS) by Adaptive Boosting Ensemble Classification" that learns from quantitative phase features as well as universal features in grayscale spectroscopic images. The experimental inquiry illustrates the significance of the proposed model, which outperformed the currently available models in performance analysis.
Funding: The authors extend their appreciation to the Deanship of Scientific Research at King Khalid University for funding this work under Grant Number RGP 2/209/42, and to Princess Nourah bint Abdulrahman University Researchers Supporting Project Number PNURSP2022R77, Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.
Abstract: Mobile edge computing (MEC) provides effective cloud services and functionality at the edge device to improve the quality of service (QoS) of end users by offloading high-computation tasks. Currently, the introduction of deep learning (DL) and hardware technologies paves a way toward detecting the current traffic status, data offloading, and cyberattacks in MEC. This study introduces an artificial intelligence with metaheuristic-based data offloading technique for secure MEC (AIMDO-SMEC) systems. The proposed AIMDO-SMEC technique incorporates an effective traffic prediction module using Siamese neural networks (SNN) to determine the traffic status in the MEC system. Also, an adaptive sampling cross entropy (ASCE) technique is utilized for data offloading in MEC systems. Moreover, the modified salp swarm algorithm (MSSA) with the extreme gradient boosting (XGBoost) technique is implemented to identify and classify the cyberattacks that exist in MEC systems. To examine the enhanced outcomes of the AIMDO-SMEC technique, a comprehensive experimental analysis was carried out, and the results demonstrated the enhanced outcomes of the AIMDO-SMEC technique with a minimal completion time of tasks (CTT) of 0.680.