Denoising Autoencoder for Biosignals
Student project: Master's thesis and HD graduation project
- Simon Anielski Barsøe Jensen
- Rasmus Hjelm Rasmussen
4th semester, Software, Master's (Master's programme)
This paper introduces a self-supervised framework for pre-training on EEG data. The goal is to
create a model that produces good features, so that it can be used for transfer learning.
Our framework is based on a denoising autoencoder architecture: the model receives an input
corrupted by token masking and tries to reconstruct the original input. Pre-training is done on a
subset of the Temple University Hospital EEG Data Corpus (TUEG). Our model is transformer-based,
inspired by models such as BERT. For our results we ran benchmarks on the three datasets we
fine-tuned on, comparing against several supervised methods. The results show that our model,
though not the best, is comparable to the supervised models, indicating that it has learned
transferable features.
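The token-masking corruption described above can be sketched in plain Python. This is an illustrative sketch, not the thesis's actual implementation: the token length, mask ratio, and zero-fill masking value are assumptions, and a real pipeline would operate on multi-channel EEG tensors and feed the corrupted tokens to the transformer encoder.

```python
import math
import random

def token_mask(signal, token_len=16, mask_ratio=0.15, seed=None):
    """Split a 1-D signal into fixed-length tokens and zero out a random
    subset. This is the input corruption for a denoising autoencoder:
    the model is trained to reconstruct the original values of the
    masked tokens. Hyperparameters here are illustrative assumptions."""
    rng = random.Random(seed)
    n_tokens = len(signal) // token_len
    n_masked = max(1, round(mask_ratio * n_tokens))
    masked_idx = set(rng.sample(range(n_tokens), n_masked))
    corrupted = []
    for i in range(n_tokens):
        token = signal[i * token_len:(i + 1) * token_len]
        # Masked tokens are replaced with zeros; the rest pass through.
        corrupted.extend([0.0] * token_len if i in masked_idx else token)
    return corrupted, sorted(masked_idx)

# Example: a 1-second synthetic sine segment standing in for one EEG
# channel sampled at 256 Hz (hypothetical values for illustration).
signal = [math.sin(2 * math.pi * 4 * t / 256) for t in range(256)]
corrupted, masked = token_mask(signal, token_len=16, mask_ratio=0.25, seed=0)
```

The reconstruction loss would then be computed only between the model's output and the original values at the masked positions, which is the BERT-style objective the abstract alludes to.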
Language | English
---|---
Publication date | 17 Jun 2021
Number of pages | 16
Keywords | Deep learning, EEG