Funding: This work was supported by the Deanship of Scientific Research, King Khalid University, Kingdom of Saudi Arabia, under research grant number R.G.P.2/100/41.
Abstract: Human Adaptive Mechatronics (HAM) places the human and the computer system in a closed loop. Elderly persons with disabilities normally carry out their daily routines with some assistance to move their limbs. Given the shortfall of human caretakers, mechatronic devices such as exoskeletons and exosuits are used to assist them. Rehabilitation and occupational therapy equipment uses electromyography (EMG) signals to measure muscle activity potential. This paper focuses on optimizing the HAM model to predict the intended motion of the upper limb with high accuracy and to improve the response time of the system. Extraction of limb characteristics from the EMG signal and prediction of optimal controller parameters are modeled. Time- and frequency-domain approaches to the EMG signal are considered for feature extraction. The models used for estimating motion and muscle parameters from the EMG signal to predict limb movement are validated. Based on the extracted features, optimal parameters are selected by Modified Lion Optimization (MLO) for controlling the HAM system. Finally, supervised machine learning makes predictions at different points in time for individual sensing using a Support Vector Neural Network (SVNN). The model is also evaluated, alongside different optimization models, on the optimal parameters of motion estimation and on accuracy for various upper limb movements. The proposed human adaptive controller predicts limb movement with 96% accuracy.
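The time- and frequency-domain feature extraction step can be illustrated with a short sketch. The window length, sampling rate, and the specific features below (MAV, RMS, waveform length, zero crossings, mean and median frequency) are common EMG choices assumed for illustration; they are not taken from the paper.

```python
# Minimal sketch of time- and frequency-domain EMG feature extraction.
# Sampling rate and feature set are assumptions, not the paper's values.
import numpy as np

FS = 1000  # assumed sampling rate of the EMG channel, in Hz

def emg_features(window: np.ndarray) -> dict:
    """Compute common time- and frequency-domain EMG features for one window."""
    # Time-domain features
    mav = np.mean(np.abs(window))                    # mean absolute value
    rms = np.sqrt(np.mean(window ** 2))              # root mean square
    wl = np.sum(np.abs(np.diff(window)))             # waveform length
    zc = int(np.sum(window[:-1] * window[1:] < 0))   # zero crossings

    # Frequency-domain features from the power spectrum
    power = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(window.size, d=1.0 / FS)
    mnf = np.sum(freqs * power) / np.sum(power)      # mean frequency
    cumulative = np.cumsum(power)
    mdf = freqs[np.searchsorted(cumulative, cumulative[-1] / 2.0)]  # median frequency

    return {"MAV": mav, "RMS": rms, "WL": wl, "ZC": zc, "MNF": mnf, "MDF": mdf}

# Example: features for one 200-sample (200 ms at 1 kHz) window
window = np.random.default_rng(0).standard_normal(200)
print(emg_features(window))
```

Feature vectors of this kind would then feed the MLO parameter selection and the SVNN classifier described above.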
Funding: The authors extend their appreciation to the Deanship of Scientific Research at King Khalid University for funding this work under grant number RGP 2/209/42. This research was also funded by the Deanship of Scientific Research at Princess Nourah bint Abdulrahman University through the Fast-Track Path of Research Funding Program.
Abstract: Internet of Things (IoT) adoption has risen significantly owing to the development of broadband access networks, machine learning (ML), big data analytics (BDA), cloud computing (CC), and so on. The development of IoT technologies has resulted in a massive quantity of data, as many people connect through distinct physical components that report the status of the CC environment. In the IoT, load scheduling is a practical technique across distinct data centers that safeguards network suitability by reducing hardware and software failures and by using resources properly. An ideal load balancer improves many factors of Quality of Service (QoS), such as resource performance, scalability, response time, error tolerance, and efficiency. Load scheduling is therefore considered a vital problem in IoT environments, and many techniques are available to address it. With this motivation, this paper presents an improved deer hunting optimization algorithm with Type II fuzzy logic (IDHOA-T2F) model for load scheduling in the IoT environment. The goal of IDHOA-T2F is to diminish the energy utilization of the integrated circuits of IoT nodes and to enhance load scheduling in IoT environments. The IDHOA technique is derived by integrating the concepts of Nelder-Mead (NM) with the DHOA. The proposed model also synthesizes Type II fuzzy logic (FL) systems to counterbalance the load distribution, and it proves useful for improving the efficiency of the IoT system. To validate the enhanced load scheduling performance of the IDHOA-T2F technique, a series of simulations was performed. The experimental outcomes demonstrate the superior performance of the IDHOA-T2F technique over recent techniques.
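The hybridization pattern described here, a population-based metaheuristic whose incumbent best is refined by Nelder-Mead, can be sketched as follows. The objective function and the simplified global-search update are placeholders, not the paper's deer hunting equations; only the call to `scipy.optimize.minimize` with `method="Nelder-Mead"` reflects the NM component.

```python
# Hedged sketch of the IDHOA hybridization pattern: population search plus
# periodic Nelder-Mead refinement of the best candidate. The objective and
# the drift-based update are placeholders, not the paper's DHOA equations.
import numpy as np
from scipy.optimize import minimize

def objective(x: np.ndarray) -> float:
    """Stand-in cost (e.g., estimated energy use of a load assignment)."""
    return float(np.sum(x ** 2))

rng = np.random.default_rng(1)
dim, pop_size, iterations = 5, 20, 50
population = rng.uniform(-5.0, 5.0, size=(pop_size, dim))

for _ in range(iterations):
    # Placeholder global-search step: drift toward the current best solution.
    fitness = np.apply_along_axis(objective, 1, population)
    best_idx = int(np.argmin(fitness))
    best = population[best_idx].copy()
    population += 0.5 * (best - population) + rng.normal(0.0, 0.1, population.shape)

    # Nelder-Mead local refinement of the incumbent best (the NM component).
    result = minimize(objective, best, method="Nelder-Mead")
    if result.fun < objective(best):
        population[best_idx] = result.x

fitness = np.apply_along_axis(objective, 1, population)
print("best cost after hybrid search:", fitness.min())
```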
Funding: This research was supported by a Korea Institute for Advancement of Technology (KIAT) grant funded by the Korea Government (MOTIE) (P0012724, The Competency Development Program for Industry Specialist) and by the Soonchunhyang University Research Fund.
Abstract: The Internet of Things (IoT) is transforming the technical setting of conventional systems and finds applicability in smart cities, smart healthcare, smart industry, etc. Moreover, the application areas of IoT-enabled models are resource-limited and require crisp responses, low latencies, and high bandwidth, which are beyond their abilities. Cloud computing (CC) is treated as a resource-rich solution to the above-mentioned challenges, but its intrinsic high latency makes it nonviable: longer latency degrades the outcome of IoT-based smart systems. CC is an emergent, dispersed, inexpensive computing paradigm with a massive assembly of heterogeneous autonomous systems. Effective task scheduling minimizes the energy utilization of the cloud infrastructure and raises the income of service providers by minimizing the processing time of user jobs. With this motivation, this paper presents an intelligent Chaotic Artificial Immune Optimization Algorithm for Task Scheduling (CAIOA-RS) in the IoT-enabled cloud environment. The proposed CAIOA-RS algorithm solves the issue of resource allocation in the IoT-enabled cloud environment. It also satisfies the makespan objective by carrying out the optimal task scheduling process with distinct strategies for incoming tasks. The design of the CAIOA-RS technique incorporates chaotic maps into the conventional AIOA to enhance its performance. A series of experiments was carried out on the CloudSim platform. The simulation results demonstrate that the CAIOA-RS technique outperforms the original version as well as other heuristics and metaheuristics.
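Two ingredients named in this abstract can be sketched: a chaotic map used to generate candidate schedules, and the makespan objective for a task-to-VM assignment. The logistic map and the toy task/VM sizes below are assumptions for illustration; the paper's actual chaotic map and solution encoding may differ.

```python
# Hedged sketch: a logistic chaotic map (an assumed choice) generating one
# candidate task-to-VM schedule, scored by makespan. Sizes are illustrative.
import numpy as np

def logistic_sequence(n: int, x0: float = 0.7, r: float = 4.0) -> np.ndarray:
    """Generate n values of the logistic map x_{k+1} = r * x_k * (1 - x_k)."""
    seq = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)
        seq[i] = x
    return seq

def makespan(assignment: np.ndarray, task_len: np.ndarray,
             vm_speed: np.ndarray) -> float:
    """Makespan: finish time of the most heavily loaded virtual machine."""
    load = np.zeros(vm_speed.size)
    for task, vm in enumerate(assignment):
        load[vm] += task_len[task] / vm_speed[vm]
    return float(load.max())

# Toy instance: 8 tasks, 3 VMs (illustrative values only, not from the paper)
task_len = np.array([40, 25, 60, 10, 35, 50, 20, 45], dtype=float)
vm_speed = np.array([1.0, 1.5, 2.0])

# Map chaotic values in (0, 1) to VM indices to obtain one candidate schedule.
chaos = logistic_sequence(task_len.size)
assignment = np.minimum((chaos * vm_speed.size).astype(int), vm_speed.size - 1)
print("candidate makespan:", makespan(assignment, task_len, vm_speed))
```

In a full CAIOA-style optimizer, such chaotic sequences would seed or perturb the immune population, and makespan would serve as (part of) the affinity function.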