Brain-computer interface for tetraplegic patients using EEG signals of motor imagery (21792)
Modern-day wheelchairs lack accessibility for individuals with tetraplegia. Brain-computer interfaces (BCIs) can use motor imagery (MI) decoded from motor cortical activity to control external devices, offering a means of control, and a route to independence, for individuals with few other options.
We have designed a minimally interactive BCI-controlled wheelchair that requires only that the user retain control of eye movement and a functioning motor cortex. The proposed system combines gaze tracking for directional commands, MI for start/stop initiation, and observation error-related potentials (OErrPs) to maintain safety in the event of an error by either the user or the BCI.
MI experiments were recorded via wireless electroencephalography (EEG) using only channels C3, Cz and C4. A linear discriminant analysis (LDA) model classified the presence or absence of MI from power spectral density (PSD) features of the signal in the 7-30 Hz band, spanning the mu and beta rhythms. Offline decoding accuracies were obtained using 5-fold cross-validation. Using publicly available OErrP datasets, we developed a principal component analysis (PCA) and support vector machine (SVM) pipeline to decode OErrPs. With leave-one-out cross-validation, we showed that adding PCA improved OErrP decoding accuracy from ~80% to >90%.
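As a minimal sketch of the MI decoding stage described above (band power features from C3/Cz/C4 fed to an LDA classifier, scored with 5-fold cross-validation), the following uses synthetic EEG-like signals; the sampling rate, epoch length, and signal model are illustrative assumptions, not the authors' recording parameters:

```python
# Sketch of the MI pipeline: 7-30 Hz PSD features per channel -> LDA,
# evaluated with 5-fold cross-validation. Data is SYNTHETIC: "MI" trials
# simulate mu-rhythm suppression (event-related desynchronization).
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
fs = 250                                   # assumed sampling rate (Hz)
n_trials, n_channels, n_samples = 60, 3, 2 * fs   # C3, Cz, C4; 2 s epochs

X_raw = rng.standard_normal((n_trials, n_channels, n_samples))
y = np.repeat([0, 1], n_trials // 2)       # 0 = rest, 1 = motor imagery
t = np.arange(n_samples) / fs
mu = np.sin(2 * np.pi * 10 * t)            # 10 Hz mu rhythm
X_raw[y == 0] += 2.0 * mu                  # rest: strong mu power
X_raw[y == 1] += 0.5 * mu                  # MI: suppressed mu power

def psd_features(epochs, fs, fmin=7.0, fmax=30.0):
    """Mean Welch PSD in the 7-30 Hz band, one feature per channel."""
    freqs, psd = welch(epochs, fs=fs, nperseg=fs, axis=-1)
    band = (freqs >= fmin) & (freqs <= fmax)
    return psd[..., band].mean(axis=-1)    # shape (n_trials, n_channels)

X = psd_features(X_raw, fs)
clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, X, y, cv=5)  # 5-fold CV, as in the abstract
print(f"MI decoding accuracy: {scores.mean():.2f}")
```

On real data the same skeleton applies, with the synthetic array replaced by epoched EEG trials.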
Preliminary results indicate that a BCI-controlled wheelchair can be commanded with a high level of accuracy. Future work will involve continued development of the BCI system by recording pilot OErrP experiments and implementing parallel online decoding of MI and OErrPs.