Abstract: Brain-computer interfaces (BCIs) record brain activity in the form of EEG signals using electroencephalogram (EEG) headsets; these signals can be recorded, processed and classified into different hand movements, which can be used to control other IoT devices. Classifying hand movements brings these algorithms one step closer to real-life use with EEG headsets. This paper uses different feature extraction techniques and sophisticated machine learning algorithms to classify hand movements from EEG brain signals in order to control prosthetic hands for amputees. To achieve good classification accuracy, denoising and feature extraction of EEG signals are significant steps. We saw a considerable increase in accuracy across all the machine learning models when a moving average filter was applied to the raw EEG data. Feature extraction techniques such as the fast Fourier transform (FFT) and continuous wavelet transform (CWT) were used in this study; three types of features were extracted, i.e., FFT features, CWT coefficients and CWT scalogram images. We trained and compared different machine learning (ML) models, such as logistic regression, random forest, k-nearest neighbors (KNN), light gradient boosting machine (LightGBM) and XGBoost, on the FFT and CWT features, as well as deep learning (DL) models, namely VGG-16, DenseNet-201 and ResNet-50, trained on the CWT scalogram images. XGBoost with FFT features gave the maximum accuracy of 88%.
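As a rough illustration of the pipeline the abstract describes (not the authors' code), the sketch below applies a moving-average filter to raw EEG trials, extracts FFT magnitude features, and trains an XGBoost classifier. The synthetic data, window size, number of frequency bins and hyperparameters are placeholder assumptions.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from xgboost import XGBClassifier

def moving_average(signal, window=5):
    """Smooth a 1-D EEG channel with a simple moving-average filter."""
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode="same")

def fft_features(trial, n_bins=32):
    """Flattened magnitude spectrum of each channel, keeping the first n_bins bins."""
    spectrum = np.abs(np.fft.rfft(trial, axis=-1))[:, :n_bins]
    return spectrum.reshape(-1)

# Synthetic stand-in for real EEG recordings: (n_trials, n_channels, n_samples).
rng = np.random.default_rng(0)
X_raw = rng.standard_normal((200, 8, 256))
y = rng.integers(0, 2, size=200)  # two hand-movement classes

# Denoise every channel of every trial, then build the FFT feature matrix.
X_filtered = np.apply_along_axis(moving_average, -1, X_raw)
X_feat = np.stack([fft_features(trial) for trial in X_filtered])

X_train, X_test, y_train, y_test = train_test_split(
    X_feat, y, test_size=0.2, random_state=0)

clf = XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.1)
clf.fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

Any of the other classifiers named in the abstract (logistic regression, random forest, KNN, LightGBM) could be swapped in for `XGBClassifier` with the same feature matrix.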
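The CWT scalogram images fed to the CNNs (VGG-16, DenseNet-201, ResNet-50) can be produced along the lines of the sketch below, assuming the PyWavelets library and a Morlet mother wavelet; the scale range and wavelet choice are illustrative, not necessarily the paper's settings.

```python
import numpy as np
import pywt  # PyWavelets

# One EEG channel from a single trial (synthetic stand-in for real data).
rng = np.random.default_rng(1)
channel = rng.standard_normal(256)

# Continuous wavelet transform over a range of scales; the matrix of absolute
# coefficients is the scalogram that would be rendered/resized into an image
# for a CNN such as VGG-16.
scales = np.arange(1, 65)
coefficients, frequencies = pywt.cwt(channel, scales, "morl")
scalogram = np.abs(coefficients)  # shape: (len(scales), len(channel))
```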