Arnau Dillen, Mohsen Omidi, Fakhreddine Ghaffari, Bram Vanderborght, Bart Roelands, Olivier Romain, Ann Nowé, Kevin De Pauw
Objective: Brain-computer interface (BCI) control systems monitor neural activity to detect the user's intentions, enabling device control through mental imagery. Despite their potential, decoding neural activity in real-world conditions poses significant challenges, making BCIs currently impractical compared to traditional interaction methods. This study introduces a novel motor imagery (MI) BCI control strategy for operating a physically assistive robotic arm, addressing the difficulties of MI decoding from electroencephalogram (EEG) signals, which are inherently non-stationary and vary across individuals. Approach: A proof-of-concept BCI control system was developed using commercially available hardware, integrating MI with eye tracking in an augmented reality (AR) user interface to facilitate a shared control approach. This system proposes actions based on the user's gaze, enabling selection through imagined movements. A user study was conducted to evaluate the system's usability, focusing on its effectiveness and efficiency. Main results: Participants performed tasks that simulated everyday activities with the robotic arm, demonstrating the shared control system's feasibility and practicality in real-world scenarios. Despite low online decoding performance (mean accuracy: 0.52, F1: 0.29, Cohen's Kappa: 0.12), participants achieved a mean success rate of 0.83 in the final phase of the user study when given 15 minutes to complete the evaluation tasks. The success rate dropped below 0.5 when a 5-minute cutoff time was selected. Significance: These results indicate that integrating AR and eye tracking can significantly enhance the usability of BCI systems, despite the complexities of MI-EEG decoding. While efficiency is still low, the effectiveness of our approach was verified. This suggests that BCI systems have the potential to become a viable interaction modality for everyday applications in the future.
Dillen, A, Omidi, M, Ghaffari, F, Vanderborght, B, Roelands, B, Romain, O, Nowé, A & De Pauw, K 2024, 'A shared robot control system combining augmented reality and motor imagery brain-computer interfaces with eye tracking', Journal of Neural Engineering, vol. 21, no. 5, 056028. https://doi.org/10.1088/1741-2552/ad7f8d
Dillen, A., Omidi, M., Ghaffari, F., Vanderborght, B., Roelands, B., Romain, O., Nowé, A., & De Pauw, K. (2024). A shared robot control system combining augmented reality and motor imagery brain-computer interfaces with eye tracking. Journal of Neural Engineering, 21(5), Article 056028. https://doi.org/10.1088/1741-2552/ad7f8d
@article{1e554a6de46747979c7489582deb9006,
title = "A shared robot control system combining augmented reality and motor imagery brain-computer interfaces with eye tracking",
abstract = "Objective: Brain-computer interface (BCI) control systems monitor neural activity to detect the user's intentions, enabling device control through mental imagery. Despite their potential, decoding neural activity in real-world conditions poses significant challenges, making BCIs currently impractical compared to traditional interaction methods. This study introduces a novel motor imagery (MI) BCI control strategy for operating a physically assistive robotic arm, addressing the difficulties of MI decoding from electroencephalogram (EEG) signals, which are inherently non-stationary and vary across individuals. Approach: A proof-of-concept BCI control system was developed using commercially available hardware, integrating MI with eye tracking in an augmented reality (AR) user interface to facilitate a shared control approach. This system proposes actions based on the user's gaze, enabling selection through imagined movements. A user study was conducted to evaluate the system's usability, focusing on its effectiveness and efficiency. Main results: Participants performed tasks that simulated everyday activities with the robotic arm, demonstrating the shared control system's feasibility and practicality in real-world scenarios. Despite low online decoding performance (mean accuracy: 0.52, F1: 0.29, Cohen's Kappa: 0.12), participants achieved a mean success rate of 0.83 in the final phase of the user study when given 15 minutes to complete the evaluation tasks. The success rate dropped below 0.5 when a 5-minute cutoff time was selected. Significance: These results indicate that integrating AR and eye tracking can significantly enhance the usability of BCI systems, despite the complexities of MI-EEG decoding. While efficiency is still low, the effectiveness of our approach was verified. This suggests that BCI systems have the potential to become a viable interaction modality for everyday applications in the future.",
author = "Arnau Dillen and Mohsen Omidi and Fakhreddine Ghaffari and Bram Vanderborght and Bart Roelands and Olivier Romain and Ann Now{\'e} and {De Pauw}, Kevin",
note = "Publisher Copyright: {\textcopyright} 2024 The Author(s). Published by IOP Publishing Ltd.",
year = "2024",
month = sep,
day = "25",
doi = "10.1088/1741-2552/ad7f8d",
language = "English",
volume = "21",
pages = "056028",
journal = "Journal of Neural Engineering",
issn = "1741-2552",
publisher = "IOP Publishing",
number = "5",
}