Wireless technology is transforming the future of transportation through the development of the Internet of Vehicles(IoV).However,intricate security challenges are intertwinedwith technological progress:Vehicular ad h...Wireless technology is transforming the future of transportation through the development of the Internet of Vehicles(IoV).However,intricate security challenges are intertwinedwith technological progress:Vehicular ad hoc Networks(VANETs),a core component of IoV,face security issues,particularly the Black Hole Attack(BHA).This malicious attack disrupts the seamless flow of data and threatens the network’s overall reliability;also,BHA strategically disrupts communication pathways by dropping data packets from legitimate nodes altogether.Recognizing the importance of this challenge,we have introduced a new solution called ad hoc On-Demand Distance Vector-Reputation-based mechanism Local Outlier Factor(AODV-RL).The significance of AODVRL lies in its unique approach:it verifies and confirms the trustworthiness of network components,providing robust protection against BHA.An additional safety layer is established by implementing the Local Outlier Factor(LOF),which detects and addresses abnormal network behaviors.Rigorous testing of our solution has revealed its remarkable ability to enhance communication in VANETs.Specifically,Our experimental results achieve message delivery ratios of up to 94.25%andminimal packet loss ratios of just 0.297%.Based on our experimental results,the proposedmechanismsignificantly improves VANET communication reliability and security.These results promise a more secure and dependable future for IoV,capable of transforming transportation safety and efficiency.展开更多
The concept of smart houses has grown in prominence in recent years.Major challenges linked to smart homes are identification theft,data safety,automated decision-making for IoT-based devices,and the security of the d...The concept of smart houses has grown in prominence in recent years.Major challenges linked to smart homes are identification theft,data safety,automated decision-making for IoT-based devices,and the security of the device itself.Current home automation systems try to address these issues but there is still an urgent need for a dependable and secure smart home solution that includes automatic decision-making systems and methodical features.This paper proposes a smart home system based on ensemble learning of random forest(RF)and convolutional neural networks(CNN)for programmed decision-making tasks,such as categorizing gadgets as“OFF”or“ON”based on their normal routine in homes.We have integrated emerging blockchain technology to provide secure,decentralized,and trustworthy authentication and recognition of IoT devices.Our system consists of a 5V relay circuit,various sensors,and a Raspberry Pi server and database for managing devices.We have also developed an Android app that communicates with the server interface through an HTTP web interface and an Apache server.The feasibility and efficacy of the proposed smart home automation system have been evaluated in both laboratory and real-time settings.It is essential to use inexpensive,scalable,and readily available components and technologies in smart home automation systems.Additionally,we must incorporate a comprehensive security and privacy-centric design that emphasizes risk assessments,such as cyberattacks,hardware security,and other cyber threats.The trial results support the proposed system and demonstrate its potential for use in everyday life.展开更多
The massive growth of diversified smart devices and continuous data generation poses a challenge to communication architectures.To deal with this problem,communication networks consider fog computing as one of promisi...The massive growth of diversified smart devices and continuous data generation poses a challenge to communication architectures.To deal with this problem,communication networks consider fog computing as one of promising technologies that can improve overall communication performance.It brings on-demand services proximate to the end devices and delivers the requested data in a short time.Fog computing faces several issues such as latency,bandwidth,and link utilization due to limited resources and the high processing demands of end devices.To this end,fog caching plays an imperative role in addressing data dissemination issues.This study provides a comprehensive discussion of fog computing,Internet of Things(IoTs)and the critical issues related to data security and dissemination in fog computing.Moreover,we determine the fog-based caching schemes and contribute to deal with the existing issues of fog computing.Besides,this paper presents a number of caching schemes with their contributions,benefits,and challenges to overcome the problems and limitations of fog computing.We also identify machine learning-based approaches for cache security and management in fog computing,as well as several prospective future research directions in caching,fog computing,and machine learning.展开更多
Cloud computing promises the advent of a new era of service boosted by means of virtualization technology.The process of virtualization means creation of virtual infrastructure,devices,servers and computing resources ...Cloud computing promises the advent of a new era of service boosted by means of virtualization technology.The process of virtualization means creation of virtual infrastructure,devices,servers and computing resources needed to deploy an application smoothly.This extensively practiced technology involves selecting an efficient Virtual Machine(VM)to complete the task by transferring applications from Physical Machines(PM)to VM or from VM to VM.The whole process is very challenging not only in terms of computation but also in terms of energy and memory.This research paper presents an energy aware VM allocation and migration approach to meet the challenges faced by the growing number of cloud data centres.Machine Learning(ML)based Artificial Bee Colony(ABC)is used to rank the VM with respect to the load while considering the energy efficiency as a crucial parameter.The most efficient virtual machines are further selected and thus depending on the dynamics of the load and energy,applications are migrated fromoneVMto another.The simulation analysis is performed inMatlab and it shows that this research work results in more reduction in energy consumption as compared to existing studies.展开更多
In this paper, the generalized Dodd-Bullough-Mikhailov equation is studied. The existence of periodic wave and unbounded wave solutions is proved by using the method of bifurcation theory of dynamical systems. Under d...In this paper, the generalized Dodd-Bullough-Mikhailov equation is studied. The existence of periodic wave and unbounded wave solutions is proved by using the method of bifurcation theory of dynamical systems. Under different parametric conditions, various sufficient conditions to guarantee the existence of the above solutions are given.Some exact explicit parametric representations of the above travelling solutions are obtained.展开更多
The introduction of the Internet of Things(IoT)paradigm serves as pervasive resource access and sharing platform for different real-time applications.Decentralized resource availability,access,and allocation provide a...The introduction of the Internet of Things(IoT)paradigm serves as pervasive resource access and sharing platform for different real-time applications.Decentralized resource availability,access,and allocation provide a better quality of user experience regardless of the application type and scenario.However,privacy remains an open issue in this ubiquitous sharing platform due to massive and replicated data availability.In this paper,privacy-preserving decision-making for the data-sharing scheme is introduced.This scheme is responsible for improving the security in data sharing without the impact of replicated resources on communicating users.In this scheme,classification learning is used for identifying replicas and accessing granted resources independently.Based on the trust score of the available resources,this classification is recurrently performed to improve the reliability of information sharing.The user-level decisions for information sharing and access are made using the classification of the resources at the time of availability.This proposed scheme is verified using the metrics access delay,success ratio,computation complexity,and sharing loss.展开更多
There are many ways of describing a solid,porous or fluid region of the computational domain when solving the Navier-Stokes equations(NSE)for flow motions.Amongst these the porous cell method is one of the most flexib...There are many ways of describing a solid,porous or fluid region of the computational domain when solving the Navier-Stokes equations(NSE)for flow motions.Amongst these the porous cell method is one of the most flexible approaches.In this method,a parameter is defined as a ratio of the volume open to water and air in a calculation cell to its cell volume.In the calculation,the same numerical procedure is applied to every cell and no explicit boundary conditions are needed at solid boundaries.The method is used to simulate flow through porous media,around solid bodies and over a moving seabed.The results compare well with experimental data and other numerical results.In our future work the porous cell method will be applied to more complex fluid-solid interaction situations.展开更多
The COVID-19 pandemic has caused hundreds of thousands of deaths,millions of infections worldwide,and the loss of trillions of dollars for many large economies.It poses a grave threat to the human population with an e...The COVID-19 pandemic has caused hundreds of thousands of deaths,millions of infections worldwide,and the loss of trillions of dollars for many large economies.It poses a grave threat to the human population with an excessive number of patients constituting an unprecedented challenge with which health systems have to cope.Researchers from many domains have devised diverse approaches for the timely diagnosis of COVID-19 to facilitate medical responses.In the same vein,a wide variety of research studies have investigated underlying medical conditions for indicators suggesting the severity and mortality of,and role of age groups and gender on,the probability of COVID-19 infection.This study aimed to review,analyze,and critically appraise published works that report on various factors to explain their relationship with COVID-19.Such studies span a wide range,including descriptive analyses,ratio analyses,cohort,prospective and retrospective studies.Various studies that describe indicators to determine the probability of infection among the general population,as well as the risk factors associated with severe illness and mortality,are critically analyzed and these ndings are discussed in detail.A comprehensive analysis was conducted on research studies that investigated the perceived differences in vulnerability of different age groups and genders to severe outcomes of COVID-19.Studies incorporating important demographic,health,and socioeconomic characteristics are highlighted to emphasize their importance.Predominantly,the lack of an appropriated dataset that contains demographic,personal health,and socioeconomic information implicates the efcacy and efciency of the discussed methods.Results are overstated on the part of both exclusion of quarantined and patients with mild symptoms and inclusion of the data from hospitals where the majority of the cases are potentially ill.展开更多
With the rise of internet facilities,a greater number of people have started doing online transactions at an exponential rate in recent years as the online transaction system has eliminated the need of going to the ba...With the rise of internet facilities,a greater number of people have started doing online transactions at an exponential rate in recent years as the online transaction system has eliminated the need of going to the bank physically for every transaction.However,the fraud cases have also increased causing the loss of money to the consumers.Hence,an effective fraud detection system is the need of the hour which can detect fraudulent transactions automatically in real-time.Generally,the genuine transactions are large in number than the fraudulent transactions which leads to the class imbalance problem.In this research work,an online transaction fraud detection system using deep learning has been proposed which can handle class imbalance problem by applying algorithm-level methods which modify the learning of the model to focus more on the minority class i.e.,fraud transactions.A novel loss function named Weighted Hard-Reduced Focal Loss(WH-RFL)has been proposed which has achieved maximum fraud detection rate i.e.,True PositiveRate(TPR)at the cost of misclassification of few genuine transactions as high TPR is preferred over a high True Negative Rate(TNR)in fraud detection system and same has been demonstrated using three publicly available imbalanced transactional datasets.Also,Thresholding has been applied to optimize the decision threshold using cross-validation to detect maximum number of frauds and it has been demonstrated by the experimental results that the selection of the right thresholding method with deep learning yields better results.展开更多
Africa is a developing economy and as such, emphasis has been placed on the achievement of revolutionary goals that will place her on a similar rank as the developed economies. Pertaining to this objective, Heads of S...Africa is a developing economy and as such, emphasis has been placed on the achievement of revolutionary goals that will place her on a similar rank as the developed economies. Pertaining to this objective, Heads of States and government all over Africa instigated the African Union (AU) Agenda 2063, which is a framework put in place to achieve a continental transformation over the next 40 years. The use of satellites has been proven to be a major influence on economic growth since it facilitates the exchange of information. Environmental hazards such as climate changes, pollution, and inefficient waste management can be classified as one of the drawbacks to achieving this economic growth we hope to accomplish. The purpose of this paper is to analyze and examine satellite communication as a tool for the attainment of an integrated, prosperous and peaceful Africa by means of combatting environmental hazards in the continent.展开更多
In several countries,the ageing population contour focuses on high healthcare costs and overloaded health care environments.Pervasive health care monitoring system can be a potential alternative,especially in the COVI...In several countries,the ageing population contour focuses on high healthcare costs and overloaded health care environments.Pervasive health care monitoring system can be a potential alternative,especially in the COVID-19 pandemic situation to help mitigate such problems by encouraging healthcare to transition from hospital-centred services to self-care,mobile care and home care.In this aspect,we propose a pervasive system to monitor the COVID’19 patient’s conditions within the hospital and outside by monitoring their medical and psychological situation.It facilitates better healthcare assistance,especially for COVID’19 patients and quarantined people.It identies the patient’s medical and psychological condition based on the current context and activities using a fuzzy context-aware reasoning engine based model.Fuzzy reasoning engine makes decisions using linguistic rules based on inference mechanisms that support the patient condition identication.Linguistics rules are framed based on the fuzzy set attributes belong to different context types.The fuzzy semantic rules are used to identify the relationship among the attributes,and the reasoning engine is used to ensure precise real-time context interpretation and current evaluation of the situation.Outcomes are measured using a fuzzy logic-based context reasoning system under simulation.The results indicate the usefulness of monitoring the COVID’19 patients based on the current context.展开更多
Decision making in case of medical diagnosis is a complicated process.A large number of overlapping structures and cases,and distractions,tiredness,and limitations with the human visual system can lead to inappropriat...Decision making in case of medical diagnosis is a complicated process.A large number of overlapping structures and cases,and distractions,tiredness,and limitations with the human visual system can lead to inappropriate diagnosis.Machine learning(ML)methods have been employed to assist clinicians in overcoming these limitations and in making informed and correct decisions in disease diagnosis.Many academic papers involving the use of machine learning for disease diagnosis have been increasingly getting published.Hence,to determine the use of ML to improve the diagnosis in varied medical disciplines,a systematic review is conducted in this study.To carry out the review,six different databases are selected.Inclusion and exclusion criteria are employed to limit the research.Further,the eligible articles are classied depending on publication year,authors,type of articles,research objective,inputs and outputs,problem and research gaps,and ndings and results.Then the selected articles are analyzed to show the impact of ML methods in improving the disease diagnosis.The ndings of this study show the most used ML methods and the most common diseases that are focused on by researchers.It also shows the increase in use of machine learning for disease diagnosis over the years.These results will help in focusing on those areas which are neglected and also to determine various ways in which ML methods could be employed to achieve desirable results.展开更多
Drought is the least understood natural disaster due to the complex relationship of multiple contributory factors. Itsbeginning and end are hard to gauge, and they can last for months or even for years. India has face...Drought is the least understood natural disaster due to the complex relationship of multiple contributory factors. Itsbeginning and end are hard to gauge, and they can last for months or even for years. India has faced many droughtsin the last few decades. Predicting future droughts is vital for framing drought management plans to sustainnatural resources. The data-driven modelling for forecasting the metrological time series prediction is becomingmore powerful and flexible with computational intelligence techniques. Machine learning (ML) techniques havedemonstrated success in the drought prediction process and are becoming popular to predict the weather, especiallythe minimum temperature using backpropagation algorithms. The favourite ML techniques for weather forecastinginclude support vector machines (SVM), support vector regression, random forest, decision tree, logistic regression,Naive Bayes, linear regression, gradient boosting tree, k-nearest neighbours (KNN), the adaptive neuro-fuzzyinference system, the feed-forward neural networks, Markovian chain, Bayesian network, hidden Markov models,and autoregressive moving averages, evolutionary algorithms, deep learning and many more. This paper presentsa recent review of the literature using ML in drought prediction, the drought indices, dataset, and performancemetrics.展开更多
Faster internet, IoT, and social media have reformed the conventional web into a collaborative web resulting in enormous user-generated content. Several studies are focused on such content;however, they mainly focus o...Faster internet, IoT, and social media have reformed the conventional web into a collaborative web resulting in enormous user-generated content. Several studies are focused on such content;however, they mainly focus on textual data, thus undermining the importance of metadata. Considering this gap, we provide a temporal pattern mining framework to model and utilize user-generated content's metadata. First, we scrap 2.1 million tweets from Twitter between Nov-2020 to Sep-2021 about 100 hashtag keywords and present these tweets into 100 User-Tweet-Hashtag (UTH) dynamic graphs. Second, we extract and identify four time-series in three timespans (Day, Hour, and Minute) from UTH dynamic graphs. Lastly, we model these four time-series with three machine learning algorithms to mine temporal patterns with the accuracy of 95.89%, 93.17%, 90.97%, and 93.73%, respectively. We demonstrate that user-generated content's metadata contains valuable information, which helps to understand the users' collective behavior and can be beneficial for business and research. Dataset and codes are publicly available;the link is given in the dataset section.展开更多
Game player modeling is a paradigm of computational models to exploit players’behavior and experience using game and player analytics.Player modeling refers to descriptions of players based on frameworks of data deri...Game player modeling is a paradigm of computational models to exploit players’behavior and experience using game and player analytics.Player modeling refers to descriptions of players based on frameworks of data derived from the interaction of a player’s behavior within the game as well as the player’s experience with the game.Player behavior focuses on dynamic and static information gathered at the time of gameplay.Player experience concerns the association of the human player during gameplay,which is based on cognitive and affective physiological measurements collected from sensors mounted on the player’s body or in the player’s surroundings.In this paper,player experience modeling is studied based on the board puzzle game“Candy Crush Saga”using cognitive data of players accessed by physiological and peripheral devices.Long Short-Term Memory-based Deep Neural Network(LSTM-DNN)is used to predict players’effective states in terms of valence,arousal,dominance,and liking by employing the concept of transfer learning.Transfer learning focuses on gaining knowledge while solving one problem and using the same knowledge to solve different but related problems.The homogeneous transfer learning approach has not been implemented in the game domain before,and this novel study opens a new research area for the game industry where the main challenge is predicting the significance of innovative games for entertainment and players’engagement.Relevant not only from a player’s point of view,it is also a benchmark study for game developers who have been facing problems of“cold start”for innovative games that strengthen the game industrial economy.展开更多
The unavailability of sufficient information for proper diagnosis,incomplete or miscommunication between patient and the clinician,or among the healthcare professionals,delay or incorrect diagnosis,the fatigue of clin...The unavailability of sufficient information for proper diagnosis,incomplete or miscommunication between patient and the clinician,or among the healthcare professionals,delay or incorrect diagnosis,the fatigue of clinician,or even the high diagnostic complexity in limited time can lead to diagnostic errors.Diagnostic errors have adverse effects on the treatment of a patient.Unnecessary treatments increase the medical bills and deteriorate the health of a patient.Such diagnostic errors that harm the patient in various ways could be minimized using machine learning.Machine learning algorithms could be used to diagnose various diseases with high accuracy.The use of machine learning could assist the doctors in making decisions on time,and could also be used as a second opinion or supporting tool.This study aims to provide a comprehensive review of research articles published from the year 2015 to mid of the year 2020 that have used machine learning for diagnosis of various diseases.We present the various machine learning algorithms used over the years to diagnose various diseases.The results of this study show the distribution of machine learning methods by medical disciplines.Based on our review,we present future research directions that could be used to conduct further research.展开更多
Microblogging, a popular social media service platform, has become a new information channel for users to receive and exchange the most up-to-date information on current events. Consequently, it is a crucial platform ...Microblogging, a popular social media service platform, has become a new information channel for users to receive and exchange the most up-to-date information on current events. Consequently, it is a crucial platform for detecting newly emerging events and for identifying influential spreaders who have the potential to actively disseminate knowledge about events through microblogs. However, traditional event detection models require human intervention to detect the number of topics to be explored, which significantly reduces the efficiency and accuracy of event detection. In addition, most existing methods focus only on event detection and are unable to identify either influential spreaders or key event-related posts, thus making it challenging to track momentous events in a timely manner. To address these problems, we propose a Hypertext-Induced Topic Search(HITS) based Topic-Decision method(TD-HITS), and a Latent Dirichlet Allocation(LDA) based Three-Step model(TS-LDA). TDHITS can automatically detect the number of topics as well as identify associated key posts in a large number of posts. TS-LDA can identify influential spreaders of hot event topics based on both post and user information.The experimental results, using a Twitter dataset, demonstrate the effectiveness of our proposed methods for both detecting events and identifying influential spreaders.展开更多
Industrial IoT(IIoT)aims to enhance services provided by various industries,such as manufacturing and product processing.IIoT suffers from various challenges,and security is one of the key challenge among those challe...Industrial IoT(IIoT)aims to enhance services provided by various industries,such as manufacturing and product processing.IIoT suffers from various challenges,and security is one of the key challenge among those challenges.Authentication and access control are two notable challenges for any IIoT based industrial deployment.Any IoT based Industry 4.0 enterprise designs networks between hundreds of tiny devices such as sensors,actuators,fog devices and gateways.Thus,articulating a secure authentication protocol between sensing devices or a sensing device and user devices is an essential step in IoT security.In this paper,first,we present cryptanalysis for the certificate-based scheme proposed for a similar environment by Das et al.and prove that their scheme is vulnerable to various traditional attacks such as device anonymity,MITM,and DoS.We then put forward an interdevice authentication scheme using an ECC(Elliptic Curve Cryptography)that is highly secure and lightweight compared to other existing schemes for a similar environment.Furthermore,we set forth a formal security analysis using the random oracle-based ROR model and informal security analysis over the Doleve-Yao channel.In this paper,we present comparison of the proposed scheme with existing schemes based on communication cost,computation cost and security index to prove that the proposed EBAKE-SE is highly efficient,reliable,and trustworthy compared to other existing schemes for an inter-device authentication.At long last,we present an implementation for the proposed EBAKE-SE using MQTT protocol.展开更多
Purpose-The purpose of this paper is to propose two operators for diversity and mutation control in artificial immune systems(AISs).Design/methodology/approach-The proposed operators are applied in substitution to the...Purpose-The purpose of this paper is to propose two operators for diversity and mutation control in artificial immune systems(AISs).Design/methodology/approach-The proposed operators are applied in substitution to the suppression and mutation operators used in AISs.The proposed mechanisms were tested in the opt-aiNet,a continuous optimization algorithm inspired in the theories of immunology.The traditional opt-aiNet uses a suppression operator based on the immune network principles to remove similar cells and add random ones to control the diversity of the population.This procedure is computationally expensive,as the Euclidean distances between every possible pair of candidate solutions must be computed.This work proposes a self-organizing suppression mechanism inspired by the self-organizing criticality(SOC)phenomenon,which is less dependent on parameter selection.This work also proposes the use of the q-Gaussian mutation,which allows controlling the form of the mutation distribution during the optimization process.The algorithms were tested in a well-known benchmark for continuous optimization and in a bioinformatics problem:the rigid docking of proteins.Findings-The proposed suppression operator presented some limitations in unimodal functions,but some interesting results were found in some highly multimodal functions.The proposed q-Gaussian mutation presented good performance in most of the test cases of the benchmark,and also in the docking problem.Originality/value-First,the self-organizing suppression operator was able to reduce the complexity of the suppression stage in the opt-aiNet.Second,the use of q-Gaussian mutation in AISs presented better compromise between exploitation and exploration of the search space and,as a consequence,a better performance when compared to the traditional Gaussian mutation.展开更多
Objective:To validate two proposed coronavirus disease 2019(COVID-19)prognosis models,analyze the characteristics of different models,consider the performance of models in predicting different outcomes,and provide new...Objective:To validate two proposed coronavirus disease 2019(COVID-19)prognosis models,analyze the characteristics of different models,consider the performance of models in predicting different outcomes,and provide new insights into the development and use of artificial intelligence(AI)predictive models in clinical decision-making for COVID-19 and other diseases.Materials and Methods:We compared two proposed prediction models for COVID-19 prognosis that use a decision tree and logistic regression modeling.We evaluated the effectiveness of different model-building strategies using laboratory tests and/or clinical record data,their sensitivity and robustness to the timings of records used and the presence of missing data,and their predictive performance and capabilities in single-site and multicenter settings.Results:The predictive accuracies of the two models after retraining were improved to 93.2% and 93.9%,compared with that of the models directly used,with accuracies of 84.3% and 87.9%,indicating that the prediction models could not be used directly and require retraining based on actual data.In addition,based on the prediction model,new features obtained by model comparison and literature evidence were transferred to integrate the new models with better performance.Conclusions:Comparing the characteristics and differences of datasets used in model training,effective model verification,and a fusion of models is necessary in improving the performance of AI models.展开更多
文摘Wireless technology is transforming the future of transportation through the development of the Internet of Vehicles(IoV).However,intricate security challenges are intertwinedwith technological progress:Vehicular ad hoc Networks(VANETs),a core component of IoV,face security issues,particularly the Black Hole Attack(BHA).This malicious attack disrupts the seamless flow of data and threatens the network’s overall reliability;also,BHA strategically disrupts communication pathways by dropping data packets from legitimate nodes altogether.Recognizing the importance of this challenge,we have introduced a new solution called ad hoc On-Demand Distance Vector-Reputation-based mechanism Local Outlier Factor(AODV-RL).The significance of AODVRL lies in its unique approach:it verifies and confirms the trustworthiness of network components,providing robust protection against BHA.An additional safety layer is established by implementing the Local Outlier Factor(LOF),which detects and addresses abnormal network behaviors.Rigorous testing of our solution has revealed its remarkable ability to enhance communication in VANETs.Specifically,Our experimental results achieve message delivery ratios of up to 94.25%andminimal packet loss ratios of just 0.297%.Based on our experimental results,the proposedmechanismsignificantly improves VANET communication reliability and security.These results promise a more secure and dependable future for IoV,capable of transforming transportation safety and efficiency.
基金funded by Princess Nourah bint Abdulrahman University Researchers Supporting Project Number(PNURSP2024R333)Princess Nourah bint Abdulrahman University,Riyadh,Saudi Arabia.
文摘The concept of smart houses has grown in prominence in recent years.Major challenges linked to smart homes are identification theft,data safety,automated decision-making for IoT-based devices,and the security of the device itself.Current home automation systems try to address these issues but there is still an urgent need for a dependable and secure smart home solution that includes automatic decision-making systems and methodical features.This paper proposes a smart home system based on ensemble learning of random forest(RF)and convolutional neural networks(CNN)for programmed decision-making tasks,such as categorizing gadgets as“OFF”or“ON”based on their normal routine in homes.We have integrated emerging blockchain technology to provide secure,decentralized,and trustworthy authentication and recognition of IoT devices.Our system consists of a 5V relay circuit,various sensors,and a Raspberry Pi server and database for managing devices.We have also developed an Android app that communicates with the server interface through an HTTP web interface and an Apache server.The feasibility and efficacy of the proposed smart home automation system have been evaluated in both laboratory and real-time settings.It is essential to use inexpensive,scalable,and readily available components and technologies in smart home automation systems.Additionally,we must incorporate a comprehensive security and privacy-centric design that emphasizes risk assessments,such as cyberattacks,hardware security,and other cyber threats.The trial results support the proposed system and demonstrate its potential for use in everyday life.
基金Provincial key platforms and major scientific research projects of universities in Guangdong Province,Peoples R China under Grant No.2017GXJK116.
文摘The massive growth of diversified smart devices and continuous data generation poses a challenge to communication architectures.To deal with this problem,communication networks consider fog computing as one of promising technologies that can improve overall communication performance.It brings on-demand services proximate to the end devices and delivers the requested data in a short time.Fog computing faces several issues such as latency,bandwidth,and link utilization due to limited resources and the high processing demands of end devices.To this end,fog caching plays an imperative role in addressing data dissemination issues.This study provides a comprehensive discussion of fog computing,Internet of Things(IoTs)and the critical issues related to data security and dissemination in fog computing.Moreover,we determine the fog-based caching schemes and contribute to deal with the existing issues of fog computing.Besides,this paper presents a number of caching schemes with their contributions,benefits,and challenges to overcome the problems and limitations of fog computing.We also identify machine learning-based approaches for cache security and management in fog computing,as well as several prospective future research directions in caching,fog computing,and machine learning.
文摘Cloud computing promises the advent of a new era of service boosted by means of virtualization technology.The process of virtualization means creation of virtual infrastructure,devices,servers and computing resources needed to deploy an application smoothly.This extensively practiced technology involves selecting an efficient Virtual Machine(VM)to complete the task by transferring applications from Physical Machines(PM)to VM or from VM to VM.The whole process is very challenging not only in terms of computation but also in terms of energy and memory.This research paper presents an energy aware VM allocation and migration approach to meet the challenges faced by the growing number of cloud data centres.Machine Learning(ML)based Artificial Bee Colony(ABC)is used to rank the VM with respect to the load while considering the energy efficiency as a crucial parameter.The most efficient virtual machines are further selected and thus depending on the dynamics of the load and energy,applications are migrated fromoneVMto another.The simulation analysis is performed inMatlab and it shows that this research work results in more reduction in energy consumption as compared to existing studies.
基金Supported by the NNSF of China(60464001) Guangxi Science Foundation(0575092).
文摘In this paper, the generalized Dodd-Bullough-Mikhailov equation is studied. The existence of periodic wave and unbounded wave solutions is proved by using the method of bifurcation theory of dynamical systems. Under different parametric conditions, various sufficient conditions to guarantee the existence of the above solutions are given.Some exact explicit parametric representations of the above travelling solutions are obtained.
基金supported by the Deanship of Scientific Research(DSR),King Abdulaziz University,Jeddah,under grant No.(DF-203-611-1441)。
文摘The introduction of the Internet of Things(IoT)paradigm serves as pervasive resource access and sharing platform for different real-time applications.Decentralized resource availability,access,and allocation provide a better quality of user experience regardless of the application type and scenario.However,privacy remains an open issue in this ubiquitous sharing platform due to massive and replicated data availability.In this paper,privacy-preserving decision-making for the data-sharing scheme is introduced.This scheme is responsible for improving the security in data sharing without the impact of replicated resources on communicating users.In this scheme,classification learning is used for identifying replicas and accessing granted resources independently.Based on the trust score of the available resources,this classification is recurrently performed to improve the reliability of information sharing.The user-level decisions for information sharing and access are made using the classification of the resources at the time of availability.This proposed scheme is verified using the metrics access delay,success ratio,computation complexity,and sharing loss.
文摘There are many ways of describing a solid,porous or fluid region of the computational domain when solving the Navier-Stokes equations(NSE)for flow motions.Amongst these the porous cell method is one of the most flexible approaches.In this method,a parameter is defined as a ratio of the volume open to water and air in a calculation cell to its cell volume.In the calculation,the same numerical procedure is applied to every cell and no explicit boundary conditions are needed at solid boundaries.The method is used to simulate flow through porous media,around solid bodies and over a moving seabed.The results compare well with experimental data and other numerical results.In our future work the porous cell method will be applied to more complex fluid-solid interaction situations.
基金supported by the Researchers Supporting Project Number(RSP-2020/250),King Saud University,Riyadh,Saudi Arabia.
文摘The COVID-19 pandemic has caused hundreds of thousands of deaths,millions of infections worldwide,and the loss of trillions of dollars for many large economies.It poses a grave threat to the human population with an excessive number of patients constituting an unprecedented challenge with which health systems have to cope.Researchers from many domains have devised diverse approaches for the timely diagnosis of COVID-19 to facilitate medical responses.In the same vein,a wide variety of research studies have investigated underlying medical conditions for indicators suggesting the severity and mortality of,and role of age groups and gender on,the probability of COVID-19 infection.This study aimed to review,analyze,and critically appraise published works that report on various factors to explain their relationship with COVID-19.Such studies span a wide range,including descriptive analyses,ratio analyses,cohort,prospective and retrospective studies.Various studies that describe indicators to determine the probability of infection among the general population,as well as the risk factors associated with severe illness and mortality,are critically analyzed and these ndings are discussed in detail.A comprehensive analysis was conducted on research studies that investigated the perceived differences in vulnerability of different age groups and genders to severe outcomes of COVID-19.Studies incorporating important demographic,health,and socioeconomic characteristics are highlighted to emphasize their importance.Predominantly,the lack of an appropriated dataset that contains demographic,personal health,and socioeconomic information implicates the efcacy and efciency of the discussed methods.Results are overstated on the part of both exclusion of quarantined and patients with mild symptoms and inclusion of the data from hospitals where the majority of the cases are potentially ill.
基金This research was supported by Korea Institute for Advancement of Technology(KIAT)grant funded by the Korea Government(MOTIE)(P0012724,The Competency Development Program for Industry Specialist)and the Soonchunhyang University Research Fund.
文摘With the rise of internet facilities,a greater number of people have started doing online transactions at an exponential rate in recent years as the online transaction system has eliminated the need of going to the bank physically for every transaction.However,the fraud cases have also increased causing the loss of money to the consumers.Hence,an effective fraud detection system is the need of the hour which can detect fraudulent transactions automatically in real-time.Generally,the genuine transactions are large in number than the fraudulent transactions which leads to the class imbalance problem.In this research work,an online transaction fraud detection system using deep learning has been proposed which can handle class imbalance problem by applying algorithm-level methods which modify the learning of the model to focus more on the minority class i.e.,fraud transactions.A novel loss function named Weighted Hard-Reduced Focal Loss(WH-RFL)has been proposed which has achieved maximum fraud detection rate i.e.,True PositiveRate(TPR)at the cost of misclassification of few genuine transactions as high TPR is preferred over a high True Negative Rate(TNR)in fraud detection system and same has been demonstrated using three publicly available imbalanced transactional datasets.Also,Thresholding has been applied to optimize the decision threshold using cross-validation to detect maximum number of frauds and it has been demonstrated by the experimental results that the selection of the right thresholding method with deep learning yields better results.
文摘Africa is a developing economy and as such, emphasis has been placed on the achievement of revolutionary goals that will place her on a similar rank as the developed economies. Pertaining to this objective, Heads of States and government all over Africa instigated the African Union (AU) Agenda 2063, which is a framework put in place to achieve a continental transformation over the next 40 years. The use of satellites has been proven to be a major influence on economic growth since it facilitates the exchange of information. Environmental hazards such as climate changes, pollution, and inefficient waste management can be classified as one of the drawbacks to achieving this economic growth we hope to accomplish. The purpose of this paper is to analyze and examine satellite communication as a tool for the attainment of an integrated, prosperous and peaceful Africa by means of combatting environmental hazards in the continent.
基金funding by the University of Malta’s Internal Research Grants。
文摘In several countries,the ageing population contour focuses on high healthcare costs and overloaded health care environments.Pervasive health care monitoring system can be a potential alternative,especially in the COVID-19 pandemic situation to help mitigate such problems by encouraging healthcare to transition from hospital-centred services to self-care,mobile care and home care.In this aspect,we propose a pervasive system to monitor the COVID’19 patient’s conditions within the hospital and outside by monitoring their medical and psychological situation.It facilitates better healthcare assistance,especially for COVID’19 patients and quarantined people.It identies the patient’s medical and psychological condition based on the current context and activities using a fuzzy context-aware reasoning engine based model.Fuzzy reasoning engine makes decisions using linguistic rules based on inference mechanisms that support the patient condition identication.Linguistics rules are framed based on the fuzzy set attributes belong to different context types.The fuzzy semantic rules are used to identify the relationship among the attributes,and the reasoning engine is used to ensure precise real-time context interpretation and current evaluation of the situation.Outcomes are measured using a fuzzy logic-based context reasoning system under simulation.The results indicate the usefulness of monitoring the COVID’19 patients based on the current context.
基金supported in part by the MSIT(Ministry of Science and ICT),Korea,under the ITRC(Information Technology Research Center)support program(IITP-2020-2016-0-00312)supervised by the IITP(Institute for Information&Communications Technology Planning&Evaluation)in part by the MSIP(Ministry of Science,ICT&Future Planning),Korea,under the National Program for Excellence in SW(2015-0-00938)supervised by the IITP(Institute for Information&communications Technology Planning&Evaluation).
文摘Decision making in case of medical diagnosis is a complicated process.A large number of overlapping structures and cases,and distractions,tiredness,and limitations with the human visual system can lead to inappropriate diagnosis.Machine learning(ML)methods have been employed to assist clinicians in overcoming these limitations and in making informed and correct decisions in disease diagnosis.Many academic papers involving the use of machine learning for disease diagnosis have been increasingly getting published.Hence,to determine the use of ML to improve the diagnosis in varied medical disciplines,a systematic review is conducted in this study.To carry out the review,six different databases are selected.Inclusion and exclusion criteria are employed to limit the research.Further,the eligible articles are classied depending on publication year,authors,type of articles,research objective,inputs and outputs,problem and research gaps,and ndings and results.Then the selected articles are analyzed to show the impact of ML methods in improving the disease diagnosis.The ndings of this study show the most used ML methods and the most common diseases that are focused on by researchers.It also shows the increase in use of machine learning for disease diagnosis over the years.These results will help in focusing on those areas which are neglected and also to determine various ways in which ML methods could be employed to achieve desirable results.
文摘Drought is the least understood natural disaster due to the complex relationship of multiple contributory factors. Itsbeginning and end are hard to gauge, and they can last for months or even for years. India has faced many droughtsin the last few decades. Predicting future droughts is vital for framing drought management plans to sustainnatural resources. The data-driven modelling for forecasting the metrological time series prediction is becomingmore powerful and flexible with computational intelligence techniques. Machine learning (ML) techniques havedemonstrated success in the drought prediction process and are becoming popular to predict the weather, especiallythe minimum temperature using backpropagation algorithms. The favourite ML techniques for weather forecastinginclude support vector machines (SVM), support vector regression, random forest, decision tree, logistic regression,Naive Bayes, linear regression, gradient boosting tree, k-nearest neighbours (KNN), the adaptive neuro-fuzzyinference system, the feed-forward neural networks, Markovian chain, Bayesian network, hidden Markov models,and autoregressive moving averages, evolutionary algorithms, deep learning and many more. This paper presentsa recent review of the literature using ML in drought prediction, the drought indices, dataset, and performancemetrics.
基金supported by the National Natural Science Foundation of China(grant no.61573328).
文摘Faster internet, IoT, and social media have reformed the conventional web into a collaborative web resulting in enormous user-generated content. Several studies are focused on such content;however, they mainly focus on textual data, thus undermining the importance of metadata. Considering this gap, we provide a temporal pattern mining framework to model and utilize user-generated content's metadata. First, we scrap 2.1 million tweets from Twitter between Nov-2020 to Sep-2021 about 100 hashtag keywords and present these tweets into 100 User-Tweet-Hashtag (UTH) dynamic graphs. Second, we extract and identify four time-series in three timespans (Day, Hour, and Minute) from UTH dynamic graphs. Lastly, we model these four time-series with three machine learning algorithms to mine temporal patterns with the accuracy of 95.89%, 93.17%, 90.97%, and 93.73%, respectively. We demonstrate that user-generated content's metadata contains valuable information, which helps to understand the users' collective behavior and can be beneficial for business and research. Dataset and codes are publicly available;the link is given in the dataset section.
基金This study was supported by the BK21 FOUR project(AI-driven Convergence Software Education Research Program)funded by the Ministry of Education,School of Computer Science and Engineering,Kyungpook National University,Korea(4199990214394).This work was also supported by the Institute of Information&communications Technology Planning&Evaluation(IITP)grant funded by the Korea Government(MSIT)under Grant 2017-0-00053(A Technology Development of Artificial Intelligence Doctors for Cardiovascular Disease).
文摘Game player modeling is a paradigm of computational models to exploit players’behavior and experience using game and player analytics.Player modeling refers to descriptions of players based on frameworks of data derived from the interaction of a player’s behavior within the game as well as the player’s experience with the game.Player behavior focuses on dynamic and static information gathered at the time of gameplay.Player experience concerns the association of the human player during gameplay,which is based on cognitive and affective physiological measurements collected from sensors mounted on the player’s body or in the player’s surroundings.In this paper,player experience modeling is studied based on the board puzzle game“Candy Crush Saga”using cognitive data of players accessed by physiological and peripheral devices.Long Short-Term Memory-based Deep Neural Network(LSTM-DNN)is used to predict players’effective states in terms of valence,arousal,dominance,and liking by employing the concept of transfer learning.Transfer learning focuses on gaining knowledge while solving one problem and using the same knowledge to solve different but related problems.The homogeneous transfer learning approach has not been implemented in the game domain before,and this novel study opens a new research area for the game industry where the main challenge is predicting the significance of innovative games for entertainment and players’engagement.Relevant not only from a player’s point of view,it is also a benchmark study for game developers who have been facing problems of“cold start”for innovative games that strengthen the game industrial economy.
基金supported in part by Zayed University,office of research under Grant No.R17089.
文摘The unavailability of sufficient information for proper diagnosis,incomplete or miscommunication between patient and the clinician,or among the healthcare professionals,delay or incorrect diagnosis,the fatigue of clinician,or even the high diagnostic complexity in limited time can lead to diagnostic errors.Diagnostic errors have adverse effects on the treatment of a patient.Unnecessary treatments increase the medical bills and deteriorate the health of a patient.Such diagnostic errors that harm the patient in various ways could be minimized using machine learning.Machine learning algorithms could be used to diagnose various diseases with high accuracy.The use of machine learning could assist the doctors in making decisions on time,and could also be used as a second opinion or supporting tool.This study aims to provide a comprehensive review of research articles published from the year 2015 to mid of the year 2020 that have used machine learning for diagnosis of various diseases.We present the various machine learning algorithms used over the years to diagnose various diseases.The results of this study show the distribution of machine learning methods by medical disciplines.Based on our review,we present future research directions that could be used to conduct further research.
基金supported by the National Natural Science Foundation of China(Nos.61502209 and 61502207)the Natural Science Foundation of Jiangsu Province of China(No.BK20130528)Visiting Research Fellow Program of Tongji University(No.8105142504)
文摘Microblogging, a popular social media service platform, has become a new information channel for users to receive and exchange the most up-to-date information on current events. Consequently, it is a crucial platform for detecting newly emerging events and for identifying influential spreaders who have the potential to actively disseminate knowledge about events through microblogs. However, traditional event detection models require human intervention to detect the number of topics to be explored, which significantly reduces the efficiency and accuracy of event detection. In addition, most existing methods focus only on event detection and are unable to identify either influential spreaders or key event-related posts, thus making it challenging to track momentous events in a timely manner. To address these problems, we propose a Hypertext-Induced Topic Search(HITS) based Topic-Decision method(TD-HITS), and a Latent Dirichlet Allocation(LDA) based Three-Step model(TS-LDA). TDHITS can automatically detect the number of topics as well as identify associated key posts in a large number of posts. TS-LDA can identify influential spreaders of hot event topics based on both post and user information.The experimental results, using a Twitter dataset, demonstrate the effectiveness of our proposed methods for both detecting events and identifying influential spreaders.
基金supported by the Researchers Supporting Project(No.RSP-2021/395)King Saud University,Riyadh,Saudi Arabia.
文摘Industrial IoT(IIoT)aims to enhance services provided by various industries,such as manufacturing and product processing.IIoT suffers from various challenges,and security is one of the key challenge among those challenges.Authentication and access control are two notable challenges for any IIoT based industrial deployment.Any IoT based Industry 4.0 enterprise designs networks between hundreds of tiny devices such as sensors,actuators,fog devices and gateways.Thus,articulating a secure authentication protocol between sensing devices or a sensing device and user devices is an essential step in IoT security.In this paper,first,we present cryptanalysis for the certificate-based scheme proposed for a similar environment by Das et al.and prove that their scheme is vulnerable to various traditional attacks such as device anonymity,MITM,and DoS.We then put forward an interdevice authentication scheme using an ECC(Elliptic Curve Cryptography)that is highly secure and lightweight compared to other existing schemes for a similar environment.Furthermore,we set forth a formal security analysis using the random oracle-based ROR model and informal security analysis over the Doleve-Yao channel.In this paper,we present comparison of the proposed scheme with existing schemes based on communication cost,computation cost and security index to prove that the proposed EBAKE-SE is highly efficient,reliable,and trustworthy compared to other existing schemes for an inter-device authentication.At long last,we present an implementation for the proposed EBAKE-SE using MQTT protocol.
基金support in this work under grants 2010/09273-1 and 2009/12944-8Renato Tinòs also thanks CNPq.
文摘Purpose-The purpose of this paper is to propose two operators for diversity and mutation control in artificial immune systems(AISs).Design/methodology/approach-The proposed operators are applied in substitution to the suppression and mutation operators used in AISs.The proposed mechanisms were tested in the opt-aiNet,a continuous optimization algorithm inspired in the theories of immunology.The traditional opt-aiNet uses a suppression operator based on the immune network principles to remove similar cells and add random ones to control the diversity of the population.This procedure is computationally expensive,as the Euclidean distances between every possible pair of candidate solutions must be computed.This work proposes a self-organizing suppression mechanism inspired by the self-organizing criticality(SOC)phenomenon,which is less dependent on parameter selection.This work also proposes the use of the q-Gaussian mutation,which allows controlling the form of the mutation distribution during the optimization process.The algorithms were tested in a well-known benchmark for continuous optimization and in a bioinformatics problem:the rigid docking of proteins.Findings-The proposed suppression operator presented some limitations in unimodal functions,but some interesting results were found in some highly multimodal functions.The proposed q-Gaussian mutation presented good performance in most of the test cases of the benchmark,and also in the docking problem.Originality/value-First,the self-organizing suppression operator was able to reduce the complexity of the suppression stage in the opt-aiNet.Second,the use of q-Gaussian mutation in AISs presented better compromise between exploitation and exploration of the search space and,as a consequence,a better performance when compared to the traditional Gaussian mutation.
基金financially supported by the Natural Science Foundation of Beijing(No.M21012)National Natural Science Foundation of China(No.82174533)Key Technologies R and D Program of the China Academy of Chinese Medical Sciences(No.CI2021A00920).
文摘Objective:To validate two proposed coronavirus disease 2019(COVID-19)prognosis models,analyze the characteristics of different models,consider the performance of models in predicting different outcomes,and provide new insights into the development and use of artificial intelligence(AI)predictive models in clinical decision-making for COVID-19 and other diseases.Materials and Methods:We compared two proposed prediction models for COVID-19 prognosis that use a decision tree and logistic regression modeling.We evaluated the effectiveness of different model-building strategies using laboratory tests and/or clinical record data,their sensitivity and robustness to the timings of records used and the presence of missing data,and their predictive performance and capabilities in single-site and multicenter settings.Results:The predictive accuracies of the two models after retraining were improved to 93.2% and 93.9%,compared with that of the models directly used,with accuracies of 84.3% and 87.9%,indicating that the prediction models could not be used directly and require retraining based on actual data.In addition,based on the prediction model,new features obtained by model comparison and literature evidence were transferred to integrate the new models with better performance.Conclusions:Comparing the characteristics and differences of datasets used in model training,effective model verification,and a fusion of models is necessary in improving the performance of AI models.