Convolutional Neural Networks for Sleep Stage Scoring on a Two-Channel EEG Signal

UDC.coleccion: Investigación
UDC.departamento: Ciencias da Computación e Tecnoloxías da Información
UDC.endPage: 4079
UDC.grupoInv: Redes de Neuronas Artificiais e Sistemas Adaptativos - Informática Médica e Diagnóstico Radiolóxico (RNASA - IMEDIR)
UDC.grupoInv: RNASA - IMEDIR (INIBIC)
UDC.institutoCentro: INIBIC - Instituto de Investigacións Biomédicas de A Coruña
UDC.issue: 24
UDC.journalTitle: Soft Computing
UDC.startPage: 4067
dc.contributor.author: Fernández-Blanco, Enrique
dc.contributor.author: Rivero, Daniel
dc.contributor.author: Pazos, A.
dc.date.accessioned: 2021-04-05T11:25:19Z
dc.date.available: 2021-04-05T11:25:19Z
dc.date.issued: 2019-06-26
dc.description: This is a pre-print of an article published in Soft Computing. The final authenticated version is available online at: https://doi.org/10.1007/s00500-019-04174-1
dc.description.abstract: [Abstract] Sleep disorders have become one of the major health problems worldwide. To address them, the basic tool used by specialists is the polysomnogram, a collection of different signals recorded during sleep. After recording, specialists must score the signals according to one of the standard guidelines. This process is carried out manually, which is highly time-consuming and prone to annotation errors. Therefore, over the years, many approaches have been explored to support specialists in this task. In this paper, an approach based on convolutional neural networks is presented, with an in-depth comparison performed to determine whether using more than one signal simultaneously as input is advantageous. Additionally, the models were combined into an ensemble to check whether processing a single signal at a time can extract useful information that the dual-signal model cannot identify. Tests were performed on the well-known expanded Sleep-EDF dataset, the most commonly used benchmark for this problem. The tests were carried out with a leave-one-out cross-validation over the patients, which ensures that there is no possible contamination between training and testing. The resulting proposal is a network smaller than previously published ones, but which surpasses the results of all previous models on the same dataset. The best result shows an accuracy of 92.67% and a Cohen's Kappa value over 0.84 compared to human experts.
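The patient-wise leave-one-out validation described in the abstract can be sketched as follows. This is a minimal illustration using scikit-learn's `LeaveOneGroupOut`, not the paper's actual pipeline; the data, labels, and patient IDs are dummy stand-ins. Each fold holds out every epoch from one patient, so no patient's data appears on both sides of the split.

```python
# Hypothetical sketch of leave-one-out cross-validation over patients.
# X, y, and patient IDs are random stand-ins for the paper's EEG data.
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut

rng = np.random.default_rng(0)
n_epochs, n_samples = 60, 3000               # e.g. 30-s epochs at 100 Hz
X = rng.normal(size=(n_epochs, n_samples))   # dummy EEG epochs
y = rng.integers(0, 5, size=n_epochs)        # dummy sleep-stage labels (5 stages)
patients = np.repeat(np.arange(6), 10)       # 6 dummy patients, 10 epochs each

logo = LeaveOneGroupOut()
for train_idx, test_idx in logo.split(X, y, groups=patients):
    # No patient contributes epochs to both training and testing.
    assert set(patients[train_idx]).isdisjoint(patients[test_idx])
    # train a model on X[train_idx] and evaluate on X[test_idx] here
```

Grouping the split by patient, rather than by epoch, is what prevents the contamination the abstract refers to: neighbouring epochs from the same recording are highly correlated, so a plain random split would inflate test scores.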
dc.description.sponsorship: Instituto de Salud Carlos III; PI17/01826
dc.description.sponsorship: Xunta de Galicia; ED431D 2017/23
dc.description.sponsorship: Xunta de Galicia; ED431D 2017/16
dc.description.sponsorship: Xunta de Galicia; ED431G/01
dc.identifier.citation: Fernandez-Blanco E, Rivero D, Pazos A. Convolutional neural networks for sleep stage scoring on a two-channel EEG signal. Soft Comput. 2020;24:4067-4079
dc.identifier.issn: 1432-7643
dc.identifier.uri: http://hdl.handle.net/2183/27654
dc.language.iso: eng
dc.publisher: Springer Nature
dc.relation.uri: https://doi.org/10.1007/s00500-019-04174-1
dc.rights.accessRights: open access
dc.subject: Convolutional neural networks
dc.subject: Deep learning
dc.subject: Electroencephalography
dc.subject: Polysomnography
dc.subject: Signal processing
dc.title: Convolutional Neural Networks for Sleep Stage Scoring on a Two-Channel EEG Signal
dc.type: journal article
dspace.entity.type: Publication
relation.isAuthorOfPublication: 244a6828-de1c-45f3-86b6-69bb81250814
relation.isAuthorOfPublication: d8e10433-ea19-4a35-8cc6-0c7b9f143a6d
relation.isAuthorOfPublication: fa192a4c-bffd-4b23-87ae-e68c29350cdc
relation.isAuthorOfPublication.latestForDiscovery: 244a6828-de1c-45f3-86b6-69bb81250814

Files

Original bundle

Name: FernandezBlanco_2019_Convolutional_neural_networks_sleep_stage_scoring_two_channel_EEG_signal
Size: 358.69 KB
Format: Adobe Portable Document Format