Details
| Original language | English |
| --- | --- |
| Article number | 5728 |
| Number of pages | 17 |
| Journal | Applied Sciences (Switzerland) |
| Volume | 13 |
| Issue number | 9 |
| Publication status | Published - 6 May 2023 |
Abstract
Human grasping is a relatively fast process and control signals for upper limb prosthetics cannot be generated and processed in a sufficiently timely manner. The aim of this study was to examine whether discriminating between different grasping movements at a cortical level can provide information prior to the actual grasping process, allowing for more intuitive prosthetic control. EEG datasets were captured from 13 healthy subjects who repeatedly performed 16 activities of daily living. Common classifiers were trained on features extracted from the waking-state frequency and total-frequency time domains. Different training scenarios were used to investigate whether classifiers can already be pre-trained by base networks for fine-tuning with data of a target person. A support vector machine algorithm with spatial covariance matrices as EEG signal descriptors based on Riemannian geometry showed the highest balanced accuracy (0.91 ± 0.05 SD) in discriminating five grasping categories according to the Cutkosky taxonomy in an interval from 1.0 s before to 0.5 s after the initial movement. Fine-tuning did not improve any classifier. No significant accuracy differences between the two frequency domains were apparent (p > 0.07). Neurofunctional representations enabled highly accurate discrimination of five different grasping movements. Our results indicate that, for upper limb prosthetics, it is possible to use them in a sufficiently timely manner and to predict the respective grasping task as a discrete category to kinematically prepare the prosthetic hand.
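The best-performing classifier described in the abstract (an SVM operating on spatial covariance matrices mapped into a Riemannian tangent space) can be sketched roughly as follows. This is an illustrative reconstruction on synthetic data, not the authors' implementation: the reference point used here is the arithmetic mean of the covariance matrices, whereas a full Riemannian pipeline would use the geometric (Fréchet) mean, and real EEG trials would replace the random stand-in signals.

```python
import numpy as np
from scipy.linalg import logm, sqrtm, inv
from sklearn.svm import SVC

def spd_covariances(X, eps=1e-6):
    """Per-trial spatial covariance matrices, regularized to stay SPD.
    X: array of shape (n_trials, n_channels, n_samples)."""
    n_trials, n_channels, n_samples = X.shape
    covs = np.einsum('ncs,nds->ncd', X, X) / (n_samples - 1)
    return covs + eps * np.eye(n_channels)

def tangent_features(covs, ref):
    """Log-map each SPD matrix to the tangent space at `ref` and
    vectorize its upper triangle as a feature vector."""
    ref_isqrt = np.real(inv(sqrtm(ref)))
    feats = []
    for c in covs:
        s = np.real(logm(ref_isqrt @ c @ ref_isqrt))
        feats.append(s[np.triu_indices_from(s)])
    return np.array(feats)

# Synthetic stand-in for two grasp classes: class 1 has one "EEG
# channel" with inflated variance, so the covariances differ.
rng = np.random.default_rng(0)
X0 = rng.normal(size=(40, 4, 128))
X1 = rng.normal(size=(40, 4, 128))
X1[:, 0, :] *= 3.0
X = np.concatenate([X0, X1])
y = np.array([0] * 40 + [1] * 40)

covs = spd_covariances(X)
ref = covs.mean(axis=0)  # arithmetic-mean reference (simplification)
feats = tangent_features(covs, ref)

clf = SVC(kernel='rbf').fit(feats, y)
acc = clf.score(feats, y)
```

A production pipeline would iterate to the Riemannian mean, cross-validate rather than score on training data, and extend to the five grasp categories; libraries such as pyriemann provide this covariance-estimation and tangent-space chain directly.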
Keywords
- activities of daily living, brain–computer interface, electroencephalography, movement decoding, prosthetic control
ASJC Scopus subject areas
- Materials Science(all)
- Physics and Astronomy(all)
- Instrumentation
- Engineering(all)
- Chemical Engineering(all)
- Process Chemistry and Technology
- Computer Science(all)
- Computer Science Applications
- Fluid Flow and Transfer Processes
Cite this
In: Applied Sciences (Switzerland), Vol. 13, No. 9, 5728, 06.05.2023.
Research output: Contribution to journal › Article › Research › peer review
TY - JOUR
T1 - Early Predictability of Grasping Movements by Neurofunctional Representations
T2 - A Feasibility Study
AU - Jakubowitz, Eike
AU - Feist, Thekla
AU - Obermeier, Alina
AU - Gempfer, Carina
AU - Hurschler, Christof
AU - Windhagen, Henning
AU - Laves, Max Heinrich
N1 - Funding Information: This study received funding from the European Union’s Horizon 2020 Research and Innovation Programme under Grant Agreement No. 688857, called “SoftPro”.
PY - 2023/5/6
Y1 - 2023/5/6
N2 - Human grasping is a relatively fast process and control signals for upper limb prosthetics cannot be generated and processed in a sufficiently timely manner. The aim of this study was to examine whether discriminating between different grasping movements at a cortical level can provide information prior to the actual grasping process, allowing for more intuitive prosthetic control. EEG datasets were captured from 13 healthy subjects who repeatedly performed 16 activities of daily living. Common classifiers were trained on features extracted from the waking-state frequency and total-frequency time domains. Different training scenarios were used to investigate whether classifiers can already be pre-trained by base networks for fine-tuning with data of a target person. A support vector machine algorithm with spatial covariance matrices as EEG signal descriptors based on Riemannian geometry showed the highest balanced accuracy (0.91 ± 0.05 SD) in discriminating five grasping categories according to the Cutkosky taxonomy in an interval from 1.0 s before to 0.5 s after the initial movement. Fine-tuning did not improve any classifier. No significant accuracy differences between the two frequency domains were apparent (p > 0.07). Neurofunctional representations enabled highly accurate discrimination of five different grasping movements. Our results indicate that, for upper limb prosthetics, it is possible to use them in a sufficiently timely manner and to predict the respective grasping task as a discrete category to kinematically prepare the prosthetic hand.
AB - Human grasping is a relatively fast process and control signals for upper limb prosthetics cannot be generated and processed in a sufficiently timely manner. The aim of this study was to examine whether discriminating between different grasping movements at a cortical level can provide information prior to the actual grasping process, allowing for more intuitive prosthetic control. EEG datasets were captured from 13 healthy subjects who repeatedly performed 16 activities of daily living. Common classifiers were trained on features extracted from the waking-state frequency and total-frequency time domains. Different training scenarios were used to investigate whether classifiers can already be pre-trained by base networks for fine-tuning with data of a target person. A support vector machine algorithm with spatial covariance matrices as EEG signal descriptors based on Riemannian geometry showed the highest balanced accuracy (0.91 ± 0.05 SD) in discriminating five grasping categories according to the Cutkosky taxonomy in an interval from 1.0 s before to 0.5 s after the initial movement. Fine-tuning did not improve any classifier. No significant accuracy differences between the two frequency domains were apparent (p > 0.07). Neurofunctional representations enabled highly accurate discrimination of five different grasping movements. Our results indicate that, for upper limb prosthetics, it is possible to use them in a sufficiently timely manner and to predict the respective grasping task as a discrete category to kinematically prepare the prosthetic hand.
KW - activities of daily living
KW - brain–computer interface
KW - electroencephalography
KW - movement decoding
KW - prosthetic control
UR - http://www.scopus.com/inward/record.url?scp=85159325216&partnerID=8YFLogxK
U2 - 10.3390/app13095728
DO - 10.3390/app13095728
M3 - Article
AN - SCOPUS:85159325216
VL - 13
JO - Applied Sciences (Switzerland)
JF - Applied Sciences (Switzerland)
SN - 2076-3417
IS - 9
M1 - 5728
ER -