Conference Papers, Year: 2021

Minimizing Subject-dependent Calibration for BCI with Riemannian Transfer Learning

Abstract

Calibration remains an important issue for user experience in Brain-Computer Interfaces (BCI). Common experimental designs often involve a lengthy training period that increases cognitive fatigue before the BCI can even be used. Reducing or suppressing this subject-dependent calibration is possible by relying on advanced machine learning techniques, such as transfer learning. Building on Riemannian BCI, we present a simple and effective scheme to train a classifier on data recorded from different subjects, reducing the calibration while preserving good performance. The main novelty of this paper is to propose a single approach that can be applied to very different paradigms. To demonstrate the robustness of this approach, we conducted a meta-analysis on multiple datasets for three BCI paradigms: motor imagery, event-related potentials (P300) and SSVEP. Relying on the MOABB open-source framework to ensure the reproducibility of the experiments and of the statistical analysis, the results clearly show that the proposed approach can be applied to any kind of BCI paradigm and, in most cases, significantly improves classifier reliability. We point out some key features to further improve transfer learning methods.
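The kind of cross-subject scheme described in the abstract can be sketched with MOABB and pyRiemann. The snippet below is a minimal illustration, not the authors' exact pipeline: it assumes a leave-one-subject-out split on the BNCI2014001 motor-imagery dataset, pools the remaining subjects as training data, and classifies trials in the tangent space of their spatial covariance matrices. The dataset, paradigm, split and classifier choices are illustrative assumptions.

```python
# Minimal sketch of cross-subject Riemannian transfer with MOABB + pyRiemann.
# Assumptions: BNCI2014001 dataset, LeftRightImagery paradigm, leave-one-subject-out split.
from moabb.datasets import BNCI2014001
from moabb.paradigms import LeftRightImagery
from pyriemann.estimation import Covariances
from pyriemann.tangentspace import TangentSpace
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

dataset = BNCI2014001()
paradigm = LeftRightImagery()

# Epochs (n_trials x n_channels x n_times), labels and per-trial metadata
# (including the subject id) for every subject of the dataset.
X, y, metadata = paradigm.get_data(dataset=dataset, subjects=dataset.subject_list)

target = dataset.subject_list[-1]                    # held-out "new" user
train = (metadata["subject"] != target).to_numpy()   # pooled source subjects
test = (metadata["subject"] == target).to_numpy()

# Riemannian pipeline: spatial covariances -> tangent-space projection -> linear model.
clf = make_pipeline(
    Covariances(estimator="oas"),
    TangentSpace(metric="riemann"),
    LogisticRegression(max_iter=1000),
)
clf.fit(X[train], y[train])
print("Accuracy on the unseen subject (no calibration):", clf.score(X[test], y[test]))
```

Evaluating the score on the held-out subject without using any of that subject's data for training is what approximates a calibration-free setting in this sketch.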
Main file: NER2020_transferMOABB-vendredisoir.pdf (262.17 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03202360, version 1 (19-04-2021)

Identifiers

HAL Id: hal-03202360
DOI: 10.1109/NER49283.2021.9441279

Cite

Salim Khazem, Sylvain Chevallier, Quentin Barthélemy, Karim Haroun, Camille Noûs. Minimizing Subject-dependent Calibration for BCI with Riemannian Transfer Learning. NER (Neural Engineering), May 2021, Rome (virtual), Italy. pp. 523-526, ⟨10.1109/NER49283.2021.9441279⟩. ⟨hal-03202360⟩